Algorithmic Randomness and Complexity: Rod Downey, Victoria University of Wellington, New Zealand
Melbourne, 2011
Algorithmic
Etymology: al-Khwarizmi, Persian astronomer and mathematician, wrote a treatise in 825 AD, On Calculation with Hindu Numerals; the word "algorithm" comes to us via an error in the Latin translation. What we intuitively mean: from a set of basic instructions (ingredients), specify a mechanical method to obtain the desired result.
Already you can see that I plan to be sloppy, but you should try to get the feel of the subject.
Euclid's Algorithm
To find gcd(1001, 357):
1001 = 357 · 2 + 287
357 = 287 · 1 + 70
287 = 70 · 4 + 7
70 = 7 · 10
7 = gcd(1001, 357).
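A minimal sketch of Euclid's algorithm in Python (the function name is mine):

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b);
        # the last nonzero remainder is the greatest common divisor.
        while b:
            a, b = b, a % b
        return a

    print(gcd(1001, 357))   # 7, matching the worked example above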
Commonly accepted is Church's Thesis: the intuitively computable functions are the same as those defined by Turing machines (or your favourite programming language, such as Java, C++, etc.). Things are trickier when we talk about complexity theory (feasible is usually identified with polynomial time on a Turing machine).
Randomness
How dare we speak of the laws of chance? Is not chance the antithesis of all law? Joseph Bertrand, Calcul des Probabilités, 1889.
Intuitive Randomness
Which of the following binary sequences seem random?
A 000000000000000000000000000000000000000000000000000000000000
B 001101001101001101001101001101001101001101001101001101001101
C 010001101100000101001110010111011100000001001000110100010101
D 001001101101100010001111010100111011001001100000001011010100
E 010101110110111101110010011010110111001101101000011011110111
F 011101111100110110011010010000111111001101100000011011010101
G 000001100010111000100000000101000010110101000000100000000100
H 010100110111101101110101010000010111100000010101110101010001
Some answers:
Non-randomness: increasingly complex patterns.
Randomness: bits coming from atmospheric noise.
Partial randomness: mixing random and nonrandom sequences.
Randomness relative to other measures: biased coins.
Ville's Theorem
Theorem (Ville)
Given any countable collection of selection functions, there is a real A passing every test in the collection, yet the number of 0s in A ↾ n (the first n bits of the real A) is always less than or equal to the number of 1s.
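A toy Python sketch of applying a selection function (the particular rule, and the short bit sequence, are my own illustration):

    def select(bits, rule):
        # A selection rule sees only the bits read so far and decides
        # whether to select (bet on) the next bit.
        return [bits[i] for i in range(len(bits)) if rule(bits[:i])]

    # Toy rule: select a bit whenever the previous bit was a 1.
    rule = lambda prefix: bool(prefix) and prefix[-1] == 1

    bits = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
    chosen = select(bits, rule)
    # For a stochastic sequence, the selected subsequence should still
    # have limiting frequency of 1s equal to 1/2.
    print(chosen, sum(chosen) / len(chosen))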
Martin-Löf
Martin-Löf (1966) suggested using shrinking effective null sets to represent effective tests. This is the basis of modern effective randomness theory.
A c.e. open set is one of the form ∪_i (q_i, r_i), where {q_i : i ∈ ℕ} and {r_i : i ∈ ℕ} are c.e.; equivalently, U = {[σ] : σ ∈ W} for a c.e. set of strings W.
A Martin-Löf test is a uniformly c.e. sequence U_1, U_2, ... of c.e. open sets such that μ(U_i) ≤ 2^(−i) for all i. (Computably shrinking to measure 0.)
α is Martin-Löf random if for every Martin-Löf test, α ∉ ∩_{i>0} U_i.
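A concrete toy instance in Python (my own example, not from the slides): take U_i to be the basic open set [0^i] of reals beginning with i zeros, so μ(U_i) = 2^(−i); the only real in every U_i is 000..., which this test therefore declares nonrandom.

    def in_U(i, prefix):
        # U_i is the cylinder [0^i]: all reals whose first i bits are 0.
        # Its measure is exactly 2**(-i), so U_1, U_2, ... satisfies
        # the Martin-Lof bound mu(U_i) <= 2**(-i).
        return len(prefix) >= i and all(b == 0 for b in prefix[:i])

    prefix = [0, 0, 0, 0, 1, 0, 1, 1]   # first 8 bits of some real
    for i in range(1, 9):
        print(i, in_U(i, prefix))       # in U_1..U_4, escapes from U_5 on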
Universal Tests
Enumerate all c.e. tests {W_{e,j,s} : e, j, s ∈ ℕ}, stopping should one threaten to exceed its bound. Let U_n = ∪_{e∈ℕ} W_{e,n+e+1}; then μ(U_n) ≤ Σ_e 2^(−(n+e+1)) = 2^(−n). A real passes this test iff it passes all tests: it is a universal Martin-Löf test. (Martin-Löf)
Reals
From this point of view, we should want all the initial segments of a real to be random. First try: a real α is random iff for all n, C(α ↾ n) ≥ n − d. By complexity oscillations, no such real can exist. The reason is that C lacks the intentional meaning of Kolmogorov complexity: a description τ encodes information beyond its bits, because C really uses τ together with |τ|, since we know where τ halts. The standard repair is prefix-free complexity K: a real α is K-random if there is a d such that K(α ↾ n) ≥ n − d for all n.
And...
They all give the same class of randoms!
Theorem (Schnorr)
A is Martin-Löf random iff A is K-random.
Similar ideas use martingales, where you bet on the next bit. A is random iff no effective martingale succeeds in achieving infinite winnings betting on the bits of A.
f(σ) = (f(σ0) + f(σ1)) / 2 (fairness)
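A toy Python sketch of betting on bits (the fixed-fraction strategy is my own illustration, not a universal one):

    def run_martingale(bits, frac=0.5):
        # Start with capital 1. Before each bit, wager frac of the
        # current capital on the next bit being 1; a correct guess wins
        # the stake, a wrong one loses it. This payoff satisfies the
        # fairness condition f(s) = (f(s0) + f(s1)) / 2.
        capital = 1.0
        for b in bits:
            stake = frac * capital
            capital += stake if b == 1 else -stake
        return capital

    print(run_martingale([1, 1, 1, 1, 1, 1]))  # grows fast on all 1s
    print(run_martingale([0, 1, 0, 1, 1, 0]))  # shrinks on balanced bits:
                                               # this naive strategy fails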
Many variations, depending on the sensitivity of the tests. Implementations approximate the truth: ZIP, GZIP, RAR and other text compression programmes. Notice: no claims about randomness in nature. But there are very interesting questions as to, e.g., how much randomness is needed for physics. Interesting experiments can be done, e.g. with ants (or children) (Reznikova and Ryabko, 1986).
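For instance, a minimal sketch with Python's zlib, run on two of the sequences from the earlier slide; compressed sizes of such short strings are only suggestive:

    import zlib

    # Two sequences from the earlier slide: B is periodic, C looks random.
    B = "001101001101001101001101001101001101001101001101001101001101"
    C = "010001101100000101001110010111011100000001001000110100010101"

    for name, seq in (("B", B), ("C", C)):
        size = len(zlib.compress(seq.encode(), 9))
        # A shorter compressed size crudely approximates lower
        # Kolmogorov complexity.
        print(name, len(seq), "->", size)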
and Complexity
How hard is it to compute the solution?
Two examples.
CDs
Algebraic coding (Hamming, 1950): take σ (something to be coded) to τ (longer, with redundancies for decoding). E.g. a parity check on 101001010100101 can decide if there is likely a single error; ISBNs etc. More complex decoding uses algebra (specifically, algebra following a long line of work around Fermat's Last Theorem (Kummer, 1840s) and the non-solvability of the quintic (Galois, 1830)). Like the Yellow Pages: instead of letting your fingers do the walking, algebra talks the talk. (Mixed metaphor.) CDs first take 00000000 ... 11111111 and amplify to something of length 256, so there are 2^256 many possible codewords, which are decoded in real time.
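A minimal sketch of the parity-check idea in Python (a toy single-error detector, not the actual CD code):

    def add_parity(bits):
        # Append a parity bit so the total number of 1s is even.
        return bits + [sum(bits) % 2]

    def check_parity(word):
        # An odd number of 1s means an odd number of bits were flipped.
        return sum(word) % 2 == 0

    word = add_parity([1, 0, 1, 0, 0, 1])
    print(check_parity(word))   # True: no error
    word[3] ^= 1                # flip a single bit in transit
    print(check_parity(word))   # False: the single error is detected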
Beer Delivery
Take a big map and plan a tour costing the least amount (beer delivery, or the Travelling Salesman Problem). Yet beer delivery (TSP) for 256 cities is computationally impossible: no way is known except to try all possibilities! (P =? NP) (Cook, 1971, Karp, 1972, Levin: who knows?) This is called computational intractability. Sometimes intractability is good, e.g. RSA and credit cards: if factorization were easy, modern banking would break down!
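A brute-force sketch in Python (the five city coordinates are made up) showing why this scales hopelessly: fixing the start city still leaves (n−1)! tours, and 255! is astronomically large.

    from itertools import permutations
    from math import dist

    cities = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]  # 5 made-up cities

    def tour_cost(order):
        # Length of the closed tour visiting the cities in this order.
        return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    # Fix city 0 as the start and try every ordering of the rest.
    # For n cities this is (n-1)! tours: fine for 5, hopeless for 256.
    best = min(((0,) + p for p in permutations(range(1, len(cities)))),
               key=tour_cost)
    print(best, tour_cost(best))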
Some applications
Using chaos and randomness enables us to treat dynamical systems like the weather: replace statistical tools by computational ones. Speeding up algorithms, e.g. supplying primes for things like RSA. (Of course, it is open whether BPP = P.) Phylogeny and the evolution of languages etc. (something of a dream). Understanding how levels of randomness relate to performance, etc. Differential geometry, reverse mathematics, Brownian motion, sampling randoms, etc. (And misuses, such as by creationists!)
My work
What is random? What level of randomness is necessary for applications? Suppose I have a source of weak randomness: how can I amplify this to get better randomness? How can we calibrate levels of randomness? Among randoms? Among non-randoms? How does this relate to classical computability notions, which calibrate levels of computational complexity? If a real is random, does it have strong or weak computational power?
There has been a lot of popular press about the halting probability Ω, "the number of knowledge" etc., which is random but has high computational power.
A musical example.
Excerpt A: from Music of Changes by John Cage.
Excerpt B: from Structures for Two Pianos by Pierre Boulez.
Cage's piece is an example of aleatory music. Boulez's piece is an example of total serialism.
Theorem (Stephan)
A random real A can compute a DNC function (we say the real has PA degree) iff A computes the halting problem. Here f is DNC iff for all x, f(x) ≠ φ_x(x), and f(x) ∈ {0, 1}. If we remove the 0,1 restriction then f is called fixed-point free, and any random can compute one.
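The diagonalization behind DNC functions can be illustrated with a finite toy in Python (the table of outputs below is invented; the real values φ_x(x) are of course not computable):

    # Pretend phi[x] is program x run on input x (None = diverges).
    # These values are invented; real phi_x(x) is not computable.
    phi = [0, 1, None, 0, 1]

    # Diagonalize: wherever phi_x(x) converges, let f(x) be the other bit.
    f = [1 - v if v is not None else 0 for v in phi]

    for v, fx in zip(phi, f):
        assert v is None or fx != v     # f(x) != phi_x(x) whenever defined
    print(f)                            # a {0,1}-valued DNC function (toy)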
Halting probabilities
One would think, therefore, that halting probabilities have nothing to do with most randoms, but:
Theorem (Kurtz)
Almost every random A is computably enumerable relative to some B <_T A.
K-triviality
Theorem (Chaitin)
If C(A ↾ n) ≤ C(n) + O(1) for all n, then A is computable. This is proven using the fact that a Π⁰₁ class with a finite number of paths has only computable paths, combined with the Counting Theorem: |{σ : C(σ) ≤ C(n) + d ∧ |σ| = n}| ≤ c · 2^d for a constant c. (The Loveland technique.)
What if instead K(A ↾ n) ≤ K(n) + O(1) for all n? We call such reals K-trivial. Does A K-trivial imply A computable?
K-triviality
Theorem (Solovay)
There are noncomputable K-trivial reals.
A = {⟨e, n⟩ : ∃s (W_{e,s} ∩ A_s = ∅ ∧ ⟨e, n⟩ ∈ W_{e,s} ∧ Σ_{⟨e,n⟩ ≤ j ≤ s} 2^(−K(j)[s]) < 2^(−(e+2)))}.
Thank You