DBerger NKS
Notion of Computation
- Underlies all of NKS
- A computation is an operation that begins with some initial conditions and produces
an output by following a definite set of rules. The most familiar examples are
computations performed by computers, where the fixed set of rules may be the functions
provided by a particular programming language (a short sketch follows this list).
- classify systems according to what types of computations they can perform
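A minimal Python sketch (not from the notes) of what "a computation" means here: a definite rule, encoded as a Wolfram-style elementary CA rule number, applied to an initial condition to produce an output. The function names, the rule number 30, the lattice width, and the step count are illustrative choices, not anything specified in NKS.

# --- sketch: an elementary cellular automaton as a computation (Python) ---
def step(cells, rule):
    """One update of elementary CA `rule` on tuple `cells` (cyclic boundary)."""
    n = len(cells)
    new = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right    # value 0..7
        new.append((rule >> neighborhood) & 1)                # read that bit of the rule number
    return tuple(new)

def run(rule, initial, steps):
    """The computation: evolve `initial` for `steps` updates under `rule`."""
    history = [tuple(initial)]
    for _ in range(steps):
        history.append(step(history[-1], rule))
    return history

if __name__ == "__main__":
    width = 41
    initial = [0] * width
    initial[width // 2] = 1                  # simplest initial condition: one black cell
    for row in run(30, initial, 20):         # rule 30: a very simple rule, complex output
        print("".join("#" if c else "." for c in row))

The same run function, fed a different rule number or initial condition, covers every elementary CA mentioned in these notes; that interchangeability is what classifying systems by the computations they perform is getting at.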
Implications
- universality is associated with essentially all instances of generally complex behavior
- a wide variety of systems (CAs, mobile CAs, substitution systems, Turing
machines…), though internally different, are all computationally equivalent
- thus studying computation at an abstract level allows one to say meaningful things
about a wide range of systems
- there is a threshold of complexity that is relatively low; once this threshold is
passed, nothing more is gained in a computational sense
- because the threshold is so low, almost any system can generate an arbitrary
amount of complexity. Wolfram believes the threshold lies between class 2 and class 3
behavior (a rough sketch of this threshold follows below).
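A rough, hedged illustration of the class 2 / class 3 threshold in Python. This is a crude heuristic of my own, not Wolfram's classification procedure: rule 250 is a standard class 2 example and rule 30 a standard class 3 example, and the sketch looks for a short repeating period in each rule's center column. With these parameters rule 250 settles into period 2 almost immediately, while rule 30 shows no short period over the sampled run. The step count and period bound are arbitrary choices.

# --- sketch: a crude look at the class 2 vs. class 3 threshold (Python) ---
def center_column(rule, steps):
    """Center-column values of an elementary CA started from a single black cell.
    The lattice is wide enough that the boundary cannot influence the center."""
    width = 2 * steps + 3
    cells = [0] * width
    cells[width // 2] = 1
    column = [cells[width // 2]]
    for _ in range(steps):
        cells = [(rule >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % width])) & 1
                 for i in range(width)]
        column.append(cells[width // 2])
    return column

def shortest_period(seq, max_period=50):
    """Smallest p <= max_period with seq[t] == seq[t + p] throughout, else None."""
    for p in range(1, max_period + 1):
        if all(seq[t] == seq[t + p] for t in range(len(seq) - p)):
            return p
    return None

if __name__ == "__main__":
    for rule in (250, 30):   # 250: class 2 (repetitive); 30: class 3 (random-looking)
        print(f"rule {rule}: shortest period of center column =",
              shortest_period(center_column(rule, 200)))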
“it is possible to think of any process that follows definite rules as being a computation-
regardless of the kinds of elements involved.” (716)
- thus we can view processes in nature as computations
Therefore “almost all processes that are not obviously simple can be viewed as
computations of equivalent sophistication.” (717)
- this follows because Wolfram believes that all systems showing class 3 or class 4
behavior are universal and are therefore computationally equivalent
- implies that there is an upper limit to computational sophistication, and that the
threshold for reaching it is relatively low
More specifically, the principle of computational equivalence says that systems found in
the natural world can perform computations up to a maximal ("universal") level of
computational power, and that most systems do in fact attain this maximal level of
computational power. Consequently, most systems are computationally equivalent. For
example, the workings of the human brain or the evolution of weather systems can, in
principle, compute the same things as a computer. Computation is therefore simply a
question of translating inputs and outputs from one system to another.
Our processes of perception and analysis are of equivalent sophistication; even the most
powerful computers, and our own minds, must obey the PoCE. “Can these (computer and
brain) be more sophisticated? Presumably they cannot, at least if we want actual results,
and not just generalities.” The difference is that actual results require definite physical
processes and are therefore subject to the same limitations as any other process.
Generation of Randomness
1. Externally imposed
2. Random initial conditions
3. Intrinsically generated
- Wolfram believes that the third option underlies most of the randomness and complexity
generated in the physical world, because of the ease with which nature seems to do it. Now
that we know that enormous complexity can be generated from very simple rules, it seems
plausible that this is what nature does as well (a sketch follows below). This fits with the
idea of natural selection: nature simply generates all possible outcomes, and the ones that
are feasible survive over time.
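A small Python sketch of intrinsic randomness generation (an illustration, not Wolfram's code): rule 30 is started from the simplest possible initial condition, a single black cell, with no external noise injected, and its center column is read off as a bit stream. Wolfram points to exactly this column as a practical pseudorandom source. The step count and the simple frequency check below are arbitrary choices for the sketch.

# --- sketch: intrinsic randomness from rule 30's center column (Python) ---
def rule30_center_bits(steps):
    """Bits of rule 30's center column, starting from a single black cell.
    No randomness is put in: the initial condition is fixed and the rule is deterministic."""
    width = 2 * steps + 3                  # wide enough that the edges never reach the center
    cells = [0] * width
    cells[width // 2] = 1
    bits = [1]
    for _ in range(steps):
        cells = [cells[i - 1] ^ (cells[i] | cells[(i + 1) % width])   # rule 30: left XOR (center OR right)
                 for i in range(width)]
        bits.append(cells[width // 2])
    return bits

if __name__ == "__main__":
    bits = rule30_center_bits(1000)
    ones = sum(bits)
    print(f"{len(bits)} bits, {ones} ones ({ones / len(bits):.1%})")   # roughly balanced
    # pack the first 64 bits into an integer, e.g. to feed some other process
    word = 0
    for b in bits[:64]:
        word = (word << 1) | b
    print(f"first 64 bits as hex: {word:016x}")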
Problem of Randomness:
- how can one have a deterministic emergence of randomness? Does the answer lie
in interactions between localized structures? No, because there are no localized
structures to speak of.
- Randomness is colloquially defined as “lots of complicated stuff going on” and
“without any patterns.” Though the complexity in rule 30 fits this definition, as
well as the standard statistical, mathematical, and cryptographic definitions of
randomness, it seems that there is something more to be explained.
- This implies that even if one knows the underlying rules of a system, predicting
what the system will do can still take an irreducible amount of computational work.
- To make meaningful predictions, “it must at some level be the case that the
system making the predictions outrun the system it is trying to predict. But for
this to happen the system making the predictions must be able to perform more
sophisticated computations than the system it is trying to predict.” (741)
The PoCE says this is not possible; thus there are many systems for which systematic
prediction cannot be done, because their behavior is computationally irreducible.
- implies that explicit simulation may be the only way to attack a large class of
problems, rather than just a convenient alternative to working out mathematical
formulas (see the sketch after this list)
- helps explain how one can have a deterministic emergence of randomness: even
though the rules are simple, it takes an irreducible amount of computational work
to predict the system's evolution
- the general question of what the system will do can be considered formally
undecidable. Undecidability is common (755)
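A Python sketch contrasting reducible and irreducible behavior (the choice of rule 254 as the "reducible" example is mine, not from the notes): starting from a single black cell, rule 254 has an obvious closed-form shortcut (cell x is black at step t exactly when |x| <= t), so its state at any step can be written down without running the intermediate steps; for rule 30 no comparable shortcut is known, so, as far as anyone knows, obtaining step t requires doing all t updates.

# --- sketch: computational reducibility (rule 254) vs. irreducibility (rule 30) ---
def evolve(rule, steps):
    """Rows 0..steps of an elementary CA started from a single black cell,
    on a lattice wide enough that the boundary never matters."""
    width = 2 * steps + 3
    cells = [0] * width
    cells[width // 2] = 1
    rows = [list(cells)]
    for _ in range(steps):
        cells = [(rule >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % width])) & 1
                 for i in range(width)]
        rows.append(list(cells))
    return rows

def rule254_shortcut(t, width):
    """Closed-form 'prediction' of rule 254's row t: cell x is black iff |x - center| <= t."""
    center = width // 2
    return [1 if abs(x - center) <= t else 0 for x in range(width)]

if __name__ == "__main__":
    steps = 40
    rows = evolve(254, steps)
    width = len(rows[0])
    # The shortcut reproduces every simulated row without doing the intermediate updates:
    assert all(rows[t] == rule254_shortcut(t, width) for t in range(steps + 1))
    print("rule 254: row", steps, "predicted by a formula, no simulation needed")
    # For rule 30 no such formula is known; to get row `steps` we simply have to run it:
    print("rule 30: row", steps, "obtained only by simulating all", steps, "updates")
    _ = evolve(30, steps)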
Implications
- though one can know the underlying rules for a system's behavior, there will
almost never be an easy theory for any behavior that seems complex (748)
- essential features of complex behavior can be captured with models that have
simple underlying structures; thus one should use models whose underlying rules
are simple
- Even though a system follows definite underlying rules, its overall behavior can
still have aspects that cannot be described by reasonable laws
- Free Will? “For even though all the components of our brains presumably follow
definite laws, I strongly suspect that their overall behavior corresponds to an
irreducible computation whose outcome can never in effect be found by
reasonable laws.” (750)
- Strong A.I. is possible