Evolution of Computing


Before 1944, computers were mechanical devices that did not require electricity to operate. There had been a few attempts to build hybrid computers that used both electricity and mechanical components, but these were never as successful as purely mechanical or purely electrical designs.
It took the outbreak of war, as is so often the case, to really stimulate innovation. Toward the end of World War II, the Allied Forces used a computer called "Colossus," which ran solely on electricity, to assist in decoding messages. The design of Colossus may have been influenced by the Zuse Z3, a machine used by the Axis Forces, but the Z3 was electro-mechanical. Later, the United States developed its own electronic computer, called ENIAC, which arrived several years later.

Computers used to be enormous, requiring a whole room just to house them. As we'll see in a moment, computers do their work by switching electric current, and the technology of the day required vacuum tubes to do so. These massive computers were retroactively dubbed "mainframes," and they are still built today, though with modern components that make modern mainframes far more powerful.
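To make "switching current" a little more concrete, here is a minimal sketch (my own illustration, not from the original article): it treats each switching element, whether vacuum tube or transistor, as a simple on/off value, and composes such switches into logic gates and a one-bit adder.

```python
# A switching element (vacuum tube or transistor) is, at this level of
# abstraction, just something that either conducts current (True) or
# does not (False). Computation emerges from wiring switches into gates.

def and_gate(a: bool, b: bool) -> bool:
    # Two switches in series: current flows only if both are closed.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # Two switches in parallel: current flows if either is closed.
    return a or b

def not_gate(a: bool) -> bool:
    # An inverting stage: the output conducts exactly when the input does not.
    return not a

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    # Adds two binary digits; XOR is built here from AND/OR/NOT.
    total = and_gate(or_gate(a, b), not_gate(and_gate(a, b)))
    carry = and_gate(a, b)
    return total, carry

print(half_adder(True, True))  # (False, True): 1 + 1 = binary 10
```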

A working computer required thousands of these enormous vacuum tubes, which got extremely hot when they were in use. They also drew a significant amount of power: running an ENIAC in your house today would cost about $20 per hour in electricity.
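As a rough sanity check on that figure, here is a back-of-the-envelope sketch; both inputs are assumptions on my part (roughly 150 kW is the commonly cited power draw for ENIAC, and $0.13 per kWh is a ballpark residential electricity rate).

```python
# Back-of-the-envelope estimate of ENIAC's hourly electricity cost.
power_kw = 150           # assumed: ENIAC's commonly cited power draw
rate_usd_per_kwh = 0.13  # assumed: ballpark residential electricity rate

cost_per_hour = power_kw * rate_usd_per_kwh
print(f"~${cost_per_hour:.2f} per hour")  # ~$19.50, in line with the ~$20 figure
```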

Transistors, which are smaller, more efficient, and significantly less expensive than vacuum tubes, began to be
used in computers in the 1960s. As a result, a computer no longer required the space of an entire room, but rather
one the size of a small closet. The term "minicomputer" was coined to describe the diminutive size of these
machines.

The microprocessor was the next major advancement. It made it possible to integrate an entire circuit onto a single chip, hence the term "integrated circuit chip" for a microprocessor. These chips have shrunk in size while increasing in power at the same time, though as they became smaller and more powerful they also became harder to keep cool.
Small computers that could fit on top of a desk now became feasible for the first time. These were the first microcomputers, a name that is a bit misleading because they weren't really all that small. Although early microcomputers were impressive for their size reduction, they were actually a step backwards in terms of processing power.
For comparison, the electro-mechanical Zuse Z3, built more than 30 years earlier, was a 22-bit computer ("bit" in computing is an abbreviation for "binary digit"), whereas many early microcomputers were only 8-bit machines. Thanks to other technological advancements incorporated into their design, though, they were still faster and more efficient than a Z3.
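To put those word sizes in perspective, here is a quick illustrative sketch of how many distinct values an n-bit word can hold, treating the word as an unsigned integer for simplicity (the Z3 actually used its 22 bits for floating-point values).

```python
# An n-bit word can represent 2**n distinct values.
for bits in (8, 22):
    values = 2 ** bits
    print(f"{bits}-bit word: {values:,} distinct values "
          f"(unsigned range 0..{values - 1:,})")

# 8-bit word: 256 distinct values (unsigned range 0..255)
# 22-bit word: 4,194,304 distinct values (unsigned range 0..4,194,303)
```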
The Apple II computer, released in 1977, marked the next major advancement in computer technology (Apple used the Roman numeral II for 2; it should not be read as 11). The revolutionary feature of this computer was its integrated keyboard and VDU (visual display unit) with a resolution of 40 by 24 characters (originally monochrome, later evolving to color). When 5.25-inch floppy drives became available, Apple II users could move their data off cassette tapes and onto floppy disks.
Due to its exorbitant price, the Apple II was unaffordable for the majority of home users, despite the company's
claims that it was designed specifically for them. This is a tradition that Apple has carried on to this day. Most of
the early customers were businesses and schools, the few exceptions being very wealthy individuals.
Apple's mistake (or avarice) in setting the price of their "home computer" so high allowed many competitors to
enter the market at significantly lower price points and fiercely compete against it. Commodore was Apple's main
competitor in the early 1980s home computing market, with the C64 costing less than a third of what an Apple II
cost while using much of the same technology. Even though the C64 had superior graphics and sound for gaming,
Apple had a much wider selection of software geared toward businesses and schools.
One thing, however, made IBM's PC, released in 1981, a significant step forward in the evolution of computers: the decision to use an open architecture. This meant it was built from standard parts available on the open market, making it easy for technicians to repair and replace faulty components. Because computers were so expensive, this was a big hit with corporate clients.
Other manufacturers could build machines from those same standard parts. These machines were dubbed "PC clones," and because their makers were free of IBM's legacy costs, they could be sold for much less than a genuine IBM PC. Apart from Apple's Apple IIe and Macintosh computer line-ups, this was the beginning of the end for virtually every other home computer manufacturer in the personal computer market.

When the PC clone became the industry standard in the early 1990s, the number of software titles available for the PC grew even further, until nearly all of the software on the market was made exclusively for the PC, with only a few titles being converted to be compatible with the newly rebranded Apple Mac.
After Windows 95's success, all software and hardware had to be Windows compatible, allowing Microsoft to
further consolidate its position as the dominant player in the operating system market. A major factor in this was
the widespread adoption of the Visual Studio programming suite, which made it easier and faster for programmers
to create Windows-compatible software.

That dominance is what shaped the early success of Linux, a newcomer to the operating system market. Although Windows software continues to dominate the market, Linux has become much easier to use, and because it is free, the range of Linux software keeps growing; hardware manufacturers such as Hewlett Packard have also started ensuring their products are Linux compatible.
Toward the end of the 1990s, cell phones began to gain computing capabilities, including the ability to access web pages. Initially, they looked like PDAs, such as the once widely used (but now obsolete) BlackBerry. Then came the iPhone and iPad from Apple, and everything changed again.

It was fascinating to see how these new phones and tablets did away with the traditional integrated keypad in
favor of a touchscreen interface. Yes, the same company that first integrated a keyboard into a microcomputer
was also the first to remove it from a cellphone. "The future" had finally arrived... a world where almost anyone
could carry a computer in their pocket, thanks to Google and Microsoft following in Apple's footsteps.
Which option, if either, would you prefer: having technology implanted directly into your body or having it worn
on your person? Consider some of the dangers of integrating that technology into your body, as well as the
benefits of not having to carry or wear anything.
How Computers Evolved | Juniors Coders (juniorcoders.ca)
The advancement of the computer has been an astounding one. There have been remarkable accomplishments within the computer industry, which dates back nearly 2,000 years. The earliest ancestors of the computer date to the first century, but the electronic computer has only been around for just over half a century. Throughout the last 40 years, computers have changed drastically, and they have greatly affected the American way of life. A computer can be found in almost every business and in one out of every two households (Hall, 156). Our society depends heavily on computers for nearly all of its everyday operations and processes. Only once in a lifetime does a new innovation like the computer come about.
Computers are currently used in the majority of jobs and throughout society, including, but not limited to, agriculture, architecture, art, commerce and global trade, communication, education, governance, law, music, politics, science, transportation, and writing. There aren't many workplaces where computers aren't used, because they are useful for so many different things: organization; storage and retrieval of large amounts of data, such as library catalogs or bank records; communication; and so on (McIver 2002). Some jobs, such as those in information technology rooms, were eliminated, but others, such as that of the IT specialist, were created.
Even though the shift I have described was from mechanical to electronic, the underlying goal has stayed the same: to do more with less effort. We started with tools like the abacus for arithmetic, and then much better tools came along that revolutionized the way we compute, communicate, and interact with one another in the 20th and 21st centuries. To complete tasks quickly, both storage capacity and the speed at which data is read from and written to that storage have improved over time.
