IT Assignment

Evolution of Computers and IT

Can we build computers that are intelligent and alive? This question has been on the minds

of computer scientists since the dawn of the computer age and remains a most compelling

line of inquiry. Some would argue that the question makes sense only if we put scare quotes

around "intelligent" and "alive," since we're talking about computers, after all, not biological

organisms. My own view is that the answer is unequivocally yes, no scare quotes or other

punctuation needed, but that to get there our notions of life, intelligence, and computation

will have to be deepened considerably.

You can ask ten biologists what the ten (or 20, or 100) key requisites for life are and you'll

get a different list each time. But most are likely to include autonomy, metabolism, self-reproduction,
survival instinct, and evolution and adaptation. As a start, can we understand

these processes mechanistically and capture them in computers?

"lines of code" in DNA are executed when the enzymes are created and act on the DNA

itself, interpreting it as data to be split up and copied.

A major difference between the self-copying program above and DNA self-reproduction is

that the self-copying program required an interpreter to execute it: an instruction pointer

to move one by one down the lines of computer code and a computer operating system to

carry them out (e.g., actually perform the storing and retrieving of internal variables like

mem and L, actually print strings of characters, and so on). The interpreter is completely

external to the program itself. However, in the case of DNA, the instructions for building the

"interpreter" (the messenger RNA, transfer RNA, ribosomes, and all the other machinery

of protein synthesis) are encoded along with everything else in the DNA. Von Neumann's

original self-reproducing automaton also contained not only a self-copying program but also

the machinery needed for its own interpretation. Thus it was truly a self-reproducing machine. That
it was formulated in the 1950s, before the details of biological self-reproduction

were well understood, is testament to von Neumann's insight. Von Neumann's design and

mathematical proofs of its correctness were eventually published in 1966 as a book, Theory

of Self-Reproducing Automata [28], completed and edited by his colleague Arthur Burks.
(See [4] and [25] for descriptions of von Neumann's self-replicating automaton. See Chapter

16 of [14] for an account of self-replication in DNA and how it relates to mathematical logic

and self-copying computer programs.)
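
The self-copying program Mitchell refers to appears earlier in the original paper and is elided from this excerpt, but the idea can be sketched in a few lines of Python. The two statements below form a "quine": running them prints their own source text exactly, with the Python interpreter supplying the external instruction pointer and operating-system services described above. This is an illustration rather than Mitchell's original program, and the comment is annotation only, not part of the copied text.

    # A minimal self-copying program: the two statements below print
    # their own source text exactly when run.
    s = 's = %r\nprint(s %% s)'
    print(s % s)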

Von Neumann's design for a self-reproducing automaton was one of the first real advances

in the science of artificial life. He recognized it as such, and accordingly took it very seriously,

saying that he wanted no mention of the "reproductive potentialities of the machines of the

future" made to the mass media.

Evolution in Computers

After he answered the question "Can a machine reproduce itself?" in the affirmative, von

Neumann wanted to take the next logical step and have computers (or computer programs)

reproduce themselves with mutations and compete for resources to survive in some environment.
This would address the "survival instinct" and "evolution and adaptation" requisites

mentioned above. However, von Neumann died in 1957 without being able to work on the

evolution problem.

Others quickly took up where he left off. By the early 1960s, several groups of researchers

were experimenting with evolution in computers. Such work has come to be known collectively as
"evolutionary computation" [11]. The most widely known of these efforts today

is the work on genetic algorithms done by John Holland and his students and colleagues

at the University of Michigan [12, 15, 24].

A genetic algorithm (GA) is an idealized computational version of Darwinian evolution. In

Darwinian evolution, organisms reproduce at differential rates, with fitter organisms producing
more offspring than less fit ones. Offspring inherit traits from their parents; those traits

are inherited with variation via random mutation, sexual recombination, and other sources

of variation. Thus traits that lead to higher reproductive rates get preferentially spread in

the population, and new traits can arise via variation. In GAs, computer "organisms" (e.g.,

computer programs encoded as strings of ones and zeros, or bit strings) reproduce in proportion to
their fitness in the environment, where fitness is a measure of how well an organism

solves a given problem. Offspring inherit traits from their parents with variation coming
from random mutation, in which parts of an organism are changed at random, and sexual

reproduction, in which an organism is made up of recombined parts coming from its parents.

Assume the individuals in the population are computer programs encoded as bit strings.

The following is a simple genetic algorithm.

1. Generate a random initial population of M individuals.

Repeat the following for N generations:

2. Calculate the fitness of each individual in the population. (The user must define a

function assigning a numerical fitness to each individual. For example, if the individuals

represent computer programs, the fitness of an individual is calculated by running the

corresponding computer program and seeing how well it does on a given task.)

3. Repeat until the new population has M individuals:

(a) Choose two parent individuals from the current population probabilistically as a

function of fitness.

(b) Cross them over at a randomly chosen locus to produce an offspring. That is,

choose a position in each bit string, form one offspring by taking the bits before

that position from one parent and after that position from the other parent.

(c) Mutate each locus in the offspring with a small probability.

(d) Put the offspring in the new population.

4. Go to step 2 with the new population.

This process is iterated for many generations, at which point hopefully one or more high-fitness
individuals have been created.
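
As an illustration, the following is a minimal sketch of the algorithm above in Python. It assumes the individuals are bit strings and that the user supplies the fitness function; the function name evolve, the parameter values, and the one-max example at the end are illustrative choices, not part of the text.

    import random

    def evolve(fitness, m=50, n_generations=100, genome_len=20, mutation_rate=0.01):
        # Step 1: generate a random initial population of M individuals.
        pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(m)]
        for _ in range(n_generations):
            # Step 2: calculate the fitness of each individual.
            scores = [fitness(ind) for ind in pop]
            new_pop = []
            # Step 3: repeat until the new population has M individuals.
            while len(new_pop) < m:
                # (a) choose two parents probabilistically as a function of
                #     fitness (fitness-proportionate selection).
                mom, dad = random.choices(pop, weights=scores, k=2)
                # (b) cross over at a randomly chosen locus: bits before the
                #     point come from one parent, bits after it from the other.
                point = random.randrange(1, genome_len)
                child = mom[:point] + dad[point:]
                # (c) mutate each locus with a small probability.
                child = [b ^ 1 if random.random() < mutation_rate else b for b in child]
                # (d) put the offspring in the new population.
                new_pop.append(child)
            # Step 4: go to step 2 with the new population.
            pop = new_pop
        return max(pop, key=fitness)

    # Toy usage: evolve bit strings toward all ones ("one-max"), using the
    # number of ones (the sum of the bits) as the fitness.
    best = evolve(fitness=sum)
    print(best, sum(best))

Fitness-proportionate selection is only one of several selection schemes used in practice; rank-based and tournament selection are common alternatives.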

Notice that reproduction in the simple GA consists merely of copying parts of bit strings;
von Neumann's complicated design for self-reproduction is avoided by having an external

copying routine. This is because research on GAs is typically aimed at solving computational
problems via evolution and at investigating evolutionary dynamics in an idealized

setting rather than trying to capture all the facets of evolution in biological systems
(self-reproduction, complex genotype-to-phenotype mappings via development, and so on).

The simple GA given above is simple indeed, but versions of it that are only slightly more

complex have been used to solve problems in many scientific and engineering disciplines.

GAs in one form or another have been used for numerical optimization, circuit design,
factory scheduling, drug design, telecommunications network optimization, robot navigation,

financial-market prediction, and models of the immune system, the economy, and population

genetics, to name a few of the areas in which these algorithms have been applied.

Mitchell, M. (2001). Life and evolution in computers. History and Philosophy of the Life Sciences, pp. 361-383.

Computers actually date back to the 1930s. Here's how they've changed.

By Frank Olito, Sep 13, 2019

Computers have changed drastically since the 1930s. AP

From the 1930s to today, the computer has changed dramatically.

The first modern computer, the Z1, was created in the 1930s; it was followed by large machines that
took up entire rooms.

In the '60s, computers evolved from professional use to personal use, as the first personal computer was
introduced to the public.

In the 1980s, Apple introduced the Macintosh, and it has been a dominant force in the computer
industry ever since with laptops and tablets.


Although computers seem like a relatively modern invention, computing dates back to the early 1800s.

Throughout computing history, there has not been a lone inventor or a single first computer. The
invention of the computer was incremental, with dozens of scientists and mathematicians building on
their predecessors. The modern computer, however, can be traced back to the 1930s.

Keep reading to learn how the computer has changed throughout the decades.

The 1930s marked the arrival of calculating machines that are considered the first programmable
computers.
A calculating machine in the 1930s. AP

Konrad Zuse created what became known as the first programmable computer, the Z1, in 1936 in his
parents' living room in Berlin. He assembled metal plates, pins, and old film, creating a machine that
could easily add and subtract. Although his early models were destroyed in World War II, Zuse is
credited with creating the first digital computer.

In the 1940s, computers took up entire rooms, like the ENIAC, which was once called a "mathematical
robot."

A computer room. AP

John Mauchly and J. Presper Eckert created the ENIAC during World War II to help the Army with
ballistics analysis. The machine could calculate thousands of problems each second. The large-scale
ENIAC weighed 30 tons and needed a 1,500-square-foot room to house the 40 cabinets, 6,000 switches,
and 18,000 vacuum tubes that made up the machine.

Some call this invention the beginning of the computer age.

In the 1950s, computers were used mainly for scientific and engineering research, like the JOHNNIAC,
which was once described as a "helpful assistant" for mathematicians.

A man working at a computer in the '50s. AP

The JOHNNIAC was completed in 1954 and was used by RAND researchers. The massive machine
weighed just over two tons with over 5,000 vacuum tubes. This early computer operated for 13 years or
51,349 hours before being dismantled.

In the 1960s, everything changed when the Programma 101 became the first desktop computer sold to
the average consumer.

Programma 101. Pierce Fuller/Wikimedia Commons


Up until 1965, computers were reserved for mathematicians and engineers in a lab setting. The
Programma 101 changed everything by offering the general public a desktop computer that anyone
could use. The 65-pound machine was the size of a typewriter and had 37 keys and a built-in printer.

The Italian invention ushered in the idea of the personal computer, one that endures to this day.

As personal computers became popular in the 1970s, the Xerox Alto helped pave the way for Steve Jobs'
Apple.

Xerox Alto. Francisco Antunes/Flickr

The Xerox Alto was created in the '70s as a personal computer that could print documents and send
emails. What was most notable about the computer was its design, which included a mouse, keyboard,
and screen. This state-of-the-art design would later influence Apple designs in the following decade.

The Alto computers were also designed to be kid-friendly so that everyone — no matter the age — could
operate a personal computer.

In the '80s, Apple's Macintosh was described as a game-changer for the computer industry.

The Macintosh. Raneko/Flickr

When Steve Jobs introduced the first Macintosh computer in 1984, Consumer Reports called it a
"dazzling display of technical wizardry." Like the Xerox Alto, the Macintosh had a keyboard, a mouse,
and a small 9-inch screen. The computer — which weighed in at 22 pounds and cost $2,495 — was
applauded for its interface of windows and icons.

As the '90s marked a period of self-expression, Apple released the famous iMac G3, which was
customizable.

iMac G3. Marcin Wichary/Flickr

The iMac G3 was launched in 1998 after Steve Jobs' return to Apple in 1997. The computer quickly
became known for its translucent Bondi-blue casing. The 38-pound iMac included USB ports, a keyboard, and
a mouse. It was meant to be portable and customizable.

The company sold 800,000 computers in the first five months, saving Apple from extinction. The iMac is
also notable because it marked the first time Apple used the "i" prefix to name its products, explaining
that it stood for "internet," "innovation," and "individuality."

In the early 2000s, laptops became increasingly popular, especially after Apple launched its MacBook
Air.

MacBook Air. Tim Malabuyo/Flickr

In 2008, Steve Jobs slid the first MacBook Air from a manila envelope and shocked the audience at
Apple's Macworld with how thin the laptop was. Measuring only 0.76 inches thick, the expertly designed
laptop changed the industry forever. Apple got rid of the CD drive and only included a USB port and a
headphone jack. At the time, the minimalistic device cost $1,799.

Today, computers come in all shapes and sizes, including tablets.

A Lenovo Yoga tablet. Hollis Johnson

Today's most innovative computers are tablets, which are simple touchscreens without a keyboard or a
mouse. Although tablet sales are on the decline, 33 million tablets were sold in 2018.

The market is also filled with other computer models, including the MacBook Pro, iMac, Dell XPS, and
iPhones.
