CS 898: Kolmogorov complexity and its applications
Winter 2021
Instructor: Ming Li
Contact me by email: mli@uwaterloo.ca
Course time and location:  The official course
time is Fridays 2:00-4:50pm. We will hold the first meeting
at this slot on Jan. 15, 2pm EST, on Zoom (meeting ID:
863 4846 7016; there is no passcode).
The regular meeting time for Q and A will be Fridays from
2:00pm until we are done, at the same Zoom
meeting ID: 863 4846 7016, with no passcode.
The students will view the recorded lectures online at
their convenience (but please do this in a timely manner,
watching the proper lectures for each week).

Office hours:  Any time, by email or
phone 519-500-3026 (I am back in Waterloo now) 
Textbook:  Ming Li and Paul Vitanyi, An
Introduction to Kolmogorov
Complexity and Its Applications, 4th edition, Springer,
2019 (other editions are fine too) 
The course material
will be drawn mainly from the textbook, research papers, the ppt lecture
notes, and recorded Zoom session presentations.
The students will do three homework assignments and a research project at
the end of the term, and present the project via Zoom (recorded).
Kolmogorov complexity is a modern theory of information and
randomness.
The goal of this course is to teach the students the basic concepts of
Kolmogorov complexity and how to use this tool in their own research.
Topics include: plain Kolmogorov complexity, randomness,
prefix Kolmogorov complexity, the incompressibility method,
information distance, and applications in fields ranging from
average-case analysis of algorithms to the Lovász Local Lemma
and zero-shot learning. It is impossible to cover
the complete textbook; rather, we will focus on recent
developments and on applications. In particular, I wish
to explore potential applications of Kolmogorov complexity
in deep learning and zero-shot learning.
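K(x) itself is uncomputable, but the length of any lossless encoding of x is a computable upper bound on it, which is the practical handle behind many of these applications. A minimal Python sketch of this intuition (the strings and thresholds are illustrative only, not from the course material):

```python
import random
import zlib

def compressed_length(s: bytes) -> int:
    # The length of any lossless encoding of s upper-bounds K(s)
    # up to an additive constant; zlib serves as the compressor here.
    return len(zlib.compress(s, level=9))

regular = b"01" * 500  # 1000 highly regular bytes
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # 1000 "typical" bytes

print(compressed_length(regular))  # far below 1000: compressible, hence non-random
print(compressed_length(noisy))    # around 1000: incompressible-looking
```

A string that compresses well is provably non-random, while a string drawn uniformly at random is incompressible with high probability.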
Marking Scheme: Each student is expected to do three homework assignments (20%
each)
and a final project with a presentation in class (via Zoom recording),
which can be either theoretical or programming (40%).
Course announcements and lecture notes will appear on this page.
You are responsible for reading it regularly.
Assignments:
(All exercise numbers refer to the 4th edition
of the textbook unless otherwise stated) Please submit your assignments via
email mli@uwaterloo.ca before the deadline.

Assignment 1 (Due Jan. 31)
contains the following problems from the textbook (4th edition):
Exercises 2.1.1[a], 2.1.2, 2.1.3, 2.1.5 (page 113) and
2.2.5, 2.2.6 (page 122).

Assignment 2 (Due Feb. 28).
Do Exercises 2.4.2 (page 141) and 2.5.1, 2.5.2, 2.5.3 (page 160) from the
textbook (4th edition),
and prove the following statement: if a binary sequence x
of length n has exactly n/2 0's, then x is not c-incompressible, for
large enough n.

Assignment 3 (Due March 25). Problem 1: Use Kolmogorov
complexity
to directly prove that a one-head two-way DFA cannot accept
L = {w#w | w in {0,1}*}. Note: please do not use other theorems; give a
direct Kolmogorov complexity proof.
Problem 2: Using Kolmogorov complexity, obtain
the average-case lower bound for binary search.
Problem 3: Using Kolmogorov complexity (directly, without using
theorems from Chapter 6), prove that {0^n 1^m | m > n} is not regular.
Problem 4: Do Exercise 6.8.5 (page 504 in the textbook, 4th
edition).
Problem 5: ShakerSort sorts a list of numbers by
comparing each adjacent pair of items in the list in turn, swapping
them if necessary, alternately passing through the list from the
beginning to the end and then from the end to the beginning. It stops when
a pass does no swaps. Using an incompressibility argument, prove a tight
average-case lower bound for ShakerSort.
Problem 6: Do Exercise
6.3.1 (page 465 of the textbook, 4th edition),
using Kolmogorov complexity.
Problem 7: Do Exercise
6.3.3 (page 465, 4th edition), using Kolmogorov complexity.
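For concreteness, the passes described in Problem 5 can be rendered in Python as follows. This is a sketch of the algorithm itself (directly from the problem statement), not of the required lower-bound proof:

```python
def shaker_sort(a):
    """ShakerSort as described in Problem 5: alternate full passes,
    left-to-right then right-to-left, swapping adjacent out-of-order
    pairs, and stop when a pass makes no swaps."""
    a = list(a)
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):         # forward pass
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:
            break
        swapped = False
        for i in range(n - 1, 0, -1):  # backward pass
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
    return a
```

Each pass moves elements at most one position closer per comparison, which is the fact an incompressibility argument exploits.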
Possible project topics:

Review and extend Bill Gasarch's recent work:
his recent manuscript on context-free grammar
size using Kolmogorov complexity.

Kolmogorov complexity and zero-shot learning

Simplicity of deep learning models.
Take a look at this paper.

Replace the mutual information argument in this paper and the next one by Kolmogorov
complexity?
This paper extends the use of mutual information to
NLP.

Do an average-case analysis of an algorithm, using Kolmogorov complexity.

Interesting applications of Kolmogorov complexity to LLL-type
problems.

Simplify and improve the average-case analysis of
Quicksort using Kolmogorov complexity (I will provide the papers).

Improve the Shellsort average-case lower bound (open). This is very
difficult.

Close the gap between the log n upper bound
and the (1/2) log n lower bound for the
following problem: given a sequence of k stacks linearly connected
from left to right, the input is a permutation of n integers fed to the
leftmost stack. The integers move in order (not passing each other
unless they go into stacks) from left to right; any of them can go
into any stack when it arrives at that stack and be popped out at some
subsequent step. The output is from the rightmost stack and should be sorted
in increasing order. How many stacks do we need?
Or demonstrate why Kolmogorov complexity cannot be used for further
improvement.
Note: Luke Schaeffer has improved (1/2) log n to 0.53 log n.
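As a warm-up for thinking about this model, here is a sketch of the k = 1 special case (my own illustration, not part of the open problem): with a single stack, the greedy strategy of popping whenever the top is the next value needed is optimal, so a permutation of 1..n can be sorted iff the greedy empties the stack (the classic stack-sortable permutations).

```python
def sortable_with_one_stack(perm):
    # Greedy simulation of the k = 1 special case: push each arriving
    # integer onto the single stack; pop whenever the top is the next
    # value the sorted output needs. Delaying a forced pop can never
    # help, so the permutation is sortable iff the stack empties.
    stack, need = [], 1
    for x in perm:
        stack.append(x)
        while stack and stack[-1] == need:
            stack.pop()
            need += 1
    return not stack
```

For example, [2, 1, 3] passes through one stack, while [2, 3, 1] does not; counting how many permutations survive k stacks is where the incompressibility argument enters.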

Please do not hesitate to discuss with me both finding and solving
your project problems. Please let me know what you wish to
work on. I generally prefer students to work
on different projects.
A project can be: 1) original theoretical research; 2) an application of
Kolmogorov complexity (proving a theorem, analyzing an algorithm,
or implementing a program); 3) an in-depth analysis of one potential
application of Kolmogorov complexity, if 1) or 2) fails;
4) an in-depth survey/review of one area.
Lectures (The videos will be uploaded gradually):

Lecture 1. History, definition of Kolmogorov complexity, basics
(Chapters 1 and 2). 1 week.
Lecture 1 ppt. 
Lecture 1 video, Part 1 
Lecture 1 video, Part 2

Lecture 2: Randomness. Definitions and Martin-Löf tests
(Sections 2.4, 2.5). 1 week.
Lecture 2, Part 1 ppt 
Part 2 ppt. 
Lecture 2 video, Part 1 
Lecture 2 video, Part 2.

Lecture 3: Symmetry of Information (Section 2.8). 1 week.
Lecture 3 ppt 
Lecture 3 video
CNCC video

Lecture 4: Information Distance
and zero shot learning.
(Sections 8.2, 8.3, 8.4, 8.6
this
paper,
this
paper, and
this paper). 2 weeks.
Lecture 4 ppt 
Lecture 4 video
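The information distance of this lecture has a well-known computable approximation, the normalized compression distance (NCD), in which K is replaced by the output length of a real compressor. A minimal sketch using zlib (the example strings are illustrative only):

```python
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: the normalized information
    # distance with K(.) replaced by zlib's compressed length, so
    # the result is a heuristic approximation, not the true distance.
    c = lambda s: len(zlib.compress(s, level=9))
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

text = b"the quick brown fox jumps over the lazy dog " * 25
random.seed(1)
noise = bytes(random.getrandbits(8) for _ in range(len(text)))

print(ncd(text, text))   # near 0: a string adds no information to itself
print(ncd(text, noise))  # near 1: unrelated data shares almost nothing
```

Values near 0 indicate the two inputs share most of their information; values near 1 indicate they share almost none.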

Lecture 5: The incompressibility method and its applications in
average case analysis and lower bounds (Chapter 6). 2 weeks.
Lecture 5 ppt 
Lecture 5 video, part 1
Lecture 5 video, part 2
Lecture 5 video, part 3
Lecture 5 video, part 4
Some nice additional notes
by Tao Jiang.

Lecture 6: Prefix Kolmogorov complexity (Chapter 3)
Lecture 6 ppt 
Lecture 6 video

Lecture 7: Inductive Inference (Chapter 5)
Lecture 7 ppt 
Lecture 7 video

Lecture 8: Kolmogorov complexity and nature (Chapter 8)
Lecture 8 ppt 
Lecture 8 video

Lecture 9: Resource bounded Kolmogorov complexity (Chapter 7).
Lecture 9 ppt 
Lecture 9 video 
For those who are interested in reading more, you can look at
these notes by Lance Fortnow.

Student Zoom presentations (45 min. per student):
Watching the other students' presentations is also part of this
course, so please make sure you watch them as well.
Here are the presentations by the students:
Ben Baral: https://drive.google.com/file/d/1ka8NVk6gzWHtG3VrC4nBQJKWGQ5yHyRu/view?usp=sharing,
Stan Ivashkevich: https://drive.google.com/file/d/1pujUc7iAIxiPPsgimfY60jf_bYYd1U6/view?usp=sharing,
Adam Jaffe: https://www.icloud.com/attachment/?u=https%3A%2F%2Fcvws.icloudcontent.com%2FB%2FARWRLCcelqd05DLofbpcBUd3OVnqAc0bG5xzK5A7MldMT3muJFpn435o%2F%24%7Bf%7D%3Fo%3DArsXCUBOyO8IrshPxrW49_tY0raom7QM8PNLzGQkRjgH%26v%3D1%26x%3D3%26a%3DCAog0yAe8pylW9VEzK2sa9LLz9CqnVa2yqWMD7wQG_x1ZiYSeBDu5ayEkS8Y7vWn2JovIgEAKgkC6AMA_1CCRkVSBHc5WepaBGfjfmhqJhBC14toUn9kA7NJFXwCnfPCEMTBhl9byMWFuZaAC4TroX_JZwXqciYMTjnhQodwrLgibVmkbX3Cql8P6XtVaQ2NA78rjPds1kvBywj3A%26e%3D1622072228%26fl%3D%26r%3D59285E6919024879A3F50E1FBD699B9D1%26k%3D%24%7Buk%7D%26ckc%3Dcom.apple.largeattachment%26ckz%3D3C01E2633A024A6587BD5F5505BDBB89%26p%3D32%26s%3D3XrQK1dGEWih_orgAvj6H_hXERY&uk=99qgI8ndRMjBdTEEF6tESw&f=ajaffe_presentation.mp4&sz=83509966
Zhiying Jiang: https://drive.google.com/file/d/1jQKB0cBTTGKe7LZpyj8XGtt4ne38STor/view?usp=sharing,
Josh Lemmon: https://drive.google.com/file/d/1_6Y4gb0JbiFJnPdMcEYEhEvhfkOVBFZV/view?usp=sharing,
Joseph Meleshko: https://drive.google.com/drive/folders/13h6WHsb4j4WpwLmSLKqTQPpu8yd8zYri?usp=sharing,
Haolin Yu https://drive.google.com/drive/folders/1TXsktk8a9bzr9FId_bHaOettjBB6Dq_8?usp=sharing
Announcements:
Maintained by Ming Li