
Generation of Computers

First Generation Computer (1946-1959)

The period of the first generation was from 1946 to 1959. First-generation computers used vacuum tubes as the basic components for memory and for the circuitry of the CPU (Central Processing Unit). These tubes, like electric bulbs, produced a great deal of heat, and the installations frequently burned out.

Figure 1. Public domain U.S. Army photo of the ENIAC

Short for Electronic Numerical Integrator and Computer, the ENIAC was the first electronic computer used for general purposes, such as solving numerical problems. It was invented by John Presper Eckert and John Mauchly at the University of Pennsylvania to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory.
Its construction began in 1943 and was not completed until 1946. Although it was not finished until after the end of World War II, the ENIAC was created to help with the war effort against German forces.
In 1953, the Burroughs Corporation built a 100-word magnetic-core memory,
which was added to the ENIAC to provide it with memory capabilities. By 1956, the end
of its operation, the ENIAC occupied about 1,800 square feet and consisted of almost
20,000 vacuum tubes, 1,500 relays, 10,000 capacitors, and 70,000 resistors. It also used
200 kilowatts of electricity, weighed over 30 tons, and cost about $487,000.
The main features of the first generation were vacuum tube technology, unreliability, support for machine language only, very high cost, a great deal of heat generation, slow input and output devices, huge size, the need for air conditioning (AC), non-portability, and high electricity consumption.

Second Generation Computer (1959-1965)

The period of the second generation was from 1959 to 1965. In this generation, transistors were used; they were cheaper, consumed less power, and were more compact, more reliable, and faster than the first-generation machines built with vacuum tubes. In this generation, magnetic cores were used as primary memory, and magnetic tape and magnetic disks served as secondary storage devices.
Figure 2. Transistors

In this generation, assembly language and high-level programming languages like FORTRAN and COBOL were used. The computers used batch processing and multiprogramming operating systems.
The main features of the second generation were the use of transistors; greater reliability, smaller size, less heat, lower electricity consumption, and higher speed compared with first-generation computers; still very high cost; the need for AC; and support for machine and assembly languages.

Third Generation Computer (1965-1971)

The period of the third generation was from 1965 to 1971. Third-generation computers used Integrated Circuits (ICs) in place of transistors. A single IC has many transistors, resistors, and capacitors along with the associated circuitry.
The IC was invented by Jack Kilby. This development made computers smaller, more reliable, and more efficient. In this generation, remote processing, time-sharing, and multiprogramming operating systems were used. High-level languages (FORTRAN II to IV, COBOL, PASCAL, PL/1, BASIC, ALGOL-68, etc.) were used during this generation.
The main features of the third generation were the use of ICs; greater reliability than the previous two generations; smaller size; less heat; higher speed; less maintenance; high cost; the need for AC; lower electricity consumption; and support for high-level languages.
Figure 3. Third-Generation Computer using Integrated Circuits
Fourth Generation Computer (1971-1980)

The period of the fourth generation was from 1971 to 1980. Fourth-generation computers used Very Large Scale Integration (VLSI) circuits. VLSI circuits, containing about 5,000 transistors and other circuit elements with their associated circuitry on a single chip, made it possible to build the microcomputers of the fourth generation.
The main features of the fourth generation were the use of VLSI technology; very low cost; portability and reliability; the use of PCs; very small size; pipeline processing; no need for AC; the introduction of the concept of the Internet; and great developments in the field of networks.

Figure 4. Fourth Generation Computer

Fourth-generation computers became more powerful, compact, reliable, and affordable. As a result, they gave rise to the Personal Computer (PC) revolution. In this generation, time-sharing, real-time networks, and distributed operating systems were used. High-level languages like C, C++, dBASE, etc., were used in this generation.
Fifth Generation Computer (1980-present)

The period of the fifth generation is from 1980 to the present. In the fifth generation, VLSI technology became ULSI (Ultra Large Scale Integration) technology, resulting in the production of microprocessor chips with ten million electronic components.
This generation is based on parallel processing hardware and AI (Artificial Intelligence) software. AI is an emerging branch of computer science that explores the means and methods of making computers think like human beings. High-level languages like C, C++, Java, .NET, etc., are used in this generation.
AI includes robotics, neural networks, game playing, the development of expert systems that make decisions in real-life situations, and natural language understanding and generation.
The main features of the fifth generation are ULSI technology; the development of true artificial intelligence; the development of natural language processing; advancements in parallel processing and superconductor technology; more user-friendly interfaces with multimedia features; and the availability of very powerful and compact computers at lower prices.

Figure 5. Fifth Generation Computer

Classification of Computers

Computers can be broadly classified by their speed and computing power.

Personal Computer
A PC can be defined as a small, relatively inexpensive computer designed for an individual user. PCs are based on microprocessor technology, which enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular uses for personal computers are playing games and surfing the Internet.
Figure 6. Personal Computer
Although personal computers are designed as single-user systems, they are normally linked together to form a network. In terms of power, today's high-end models of the Macintosh and PC offer the same computing power and graphics capability as low-end workstations from Sun Microsystems, Hewlett-Packard, and Dell.
Workstation

Figure 7. Workstation

The workstation is a computer used for engineering applications (CAD/CAM), desktop publishing, software development, and other applications that require a moderate amount of computing power and relatively high-quality graphics capabilities.
Workstations generally come with a large, high-resolution graphics screen, a large amount of RAM, built-in network support, and a graphical user interface. Most workstations also have a mass storage device such as a disk drive, but a special type of workstation, called a diskless workstation, comes without one.
Common operating systems for workstations are UNIX and Windows NT. Like PCs, workstations are single-user computers, but they are typically linked together to form a local area network, although they can also be used as stand-alone systems.

Minicomputer

A minicomputer is a midsize multiprocessing system capable of supporting up to 250 users simultaneously.
Figure 8. Minicomputer
Mainframe

The mainframe is a very large, expensive computer capable of supporting hundreds or even thousands of users simultaneously. A mainframe executes many programs concurrently and supports the simultaneous execution of multiple programs.

Figure 9. Mainframe
Supercomputer

Supercomputers are the fastest computers currently available. They are very expensive and are employed for specialized applications that require an immense amount of mathematical calculation (number crunching).

Figure 10. Supercomputer


Examples include weather forecasting, scientific simulations, (animated) graphics, fluid dynamics calculations, nuclear energy research, electronic design, and the analysis of geological data (e.g., in petrochemical prospecting).

Media in the Digital Age

New media and digital convergence may often seem synonymous with the
Internet and World Wide Web. Online newspapers, downloadable music and video,
bloggers, and podcasts are among the most familiar examples of new, or digital, media.
But in truth, a wide range of technologies composes the full spectrum of media in the digital age. Among them are not only the Internet and the Web, but also wireless and mobile media, digital television and satellite radio, digital cameras, digital music players, and other new or emerging technologies for mediated public communication.
The Internet changed most of the paradigms that helped us describe and understand the public communication ecosystem.
The digital age arrives with a set of big communication challenges for traditional mainstream media: new relations with audiences (interactivity), new languages (multimedia), and a new grammar (hypertext). But this media revolution not only changes the communication landscape for the usual players; more importantly, it opens the mass communication system to a wide range of new players.
As enterprises, institutions, administrations, organizations, groups, families, and individuals establish their own online presence, they become "media" in their own right; they also become "sources" for traditional media, and in many cases they produce strong "media criticism": opinions about how issues are covered by legacy media and the delivery of alternative coverage.
Blogs and social media represent the ultimate challenge for the old communication system because they integrate both the new features of the digital world and a broad democratization of access to media with a universal scope.
The global process could be understood as a big shift from the classical mass media models to the new media paradigms: the user becomes the axis of the communication process, content is the identity of media, multimedia is the new language, real time is the only time, hypertext is the grammar, and knowledge is the new name for information.
1. From audience to user. During the 1980s, the merging of satellite and cable technologies enabled broadcast media to deliver content to thematically segmented target audiences, evolving from broadcasting to narrowcasting. From the 1990s on, the Internet opened the way to the next step: from narrowcasting to pointcasting. Online content provision could not only fit niche targets but go even further: it could be arranged to meet the specific interests and time constraints of every individual user. The demassification of public communication arrives with the personal configuration options of online media and services. The passive, unidirectional mode of media consumption is replaced by the concept of an active user seeking content, exploring and navigating info-spaces. Users also become content producers in many web environments, mainly the social web.

2. From media to content. The focus shifts from industrial production constraints (press, radio, television) to content authority as the way to define media. National Geographic and CNN, for example, are not particular kinds of media but brands that represent authority over an area of content (natural life) or expertise in current affairs content management (journalism). The convergence of media toward digital resets media identity, shifting from platforms to content and foregrounding brand image in relation to a type of content rather than a media format. Media brand image is one of the most valuable assets of media companies in the new environment: a source of credibility and prestige for digital content. Today, media companies are starting to understand that their business is selling the content, not the carrier: multi-platform services to be accessed by users from a range of terminals according to the user's situation and needs.

3. From monomedia to multimedia. One of the strongest features of digitalization is that text, audio, video, graphics, photos, and animation can be arranged together and interactively on a single medium for the first time in history. This multimedia identity of the current environment has allowed all media industries to converge online (press, broadcast, movies), and this is the reason why media distinctions based on the use of a single language (textual, audio, visual) tend to be erased. Online media are multimedia, and multimedia is a new language.

4. From periodicity to real time. Regular frequency was a strong paradigm of the old scenario, to the point that many media were defined in relation to their time constraints (daily, weekly, monthly). Online media (whether they are digital versions of a daily newspaper or of a weekly or monthly magazine) assume that they must be updated in real time to survive in the new environment. What we lose on the road from periodicity to real time is reflection. What we gain is dynamism and conversational styles. Sharing news and opinions with the ability to interact in real time is the seed of cyber communities.

5. From scarcity to abundance. Space for the print media and time for broadcast media have ceased to be the limits on content, and now the user's time is the new scarce resource. One of the strong effects of "readers becoming writers" is the proliferation of online information without clear attribution of source authority and with heterogeneous content quality. The overflow of information calls for new skills and tools to manage data, news, and opinions.

6. From editor-mediated to non-mediated. The gatekeeper paradigm, together with agenda-setting theory, was broadly used to explain the role of media editors and to describe the function of media in defining the issues of the day. This intermediation function should be revisited today in light of the decentralized nature of the net. Together with legacy media, many other informal sources become relevant in establishing the agendas (because the single agenda does not exist anymore). Worldwide publishing without editors, but with a close daily peer-review process and in most cases open to comments from readers, is the nature of social web publishing. As a result, the agenda of relevant current affairs goes beyond the established media landscape and is now shared with a wide variety of new sources, most of them not media, including social web portals, mailing lists, e-bulletins, search engines, newsgroups, forums, and weblogs, with their respective feeds when available.
7. From distribution to access. The broadcasting paradigm of one-to-many unilateral distribution is replaced by both many-to-one access and many-to-many communication. The client-server architecture of the Internet started a new model based on the decisions of users. The access paradigm is complementary to the user-centered paradigm, and both explain the strongly interactive nature of the new environment. Access means to seek, search, navigate, surf, and decide: an active attitude, a will to connect and communicate, the opposite of the passive reception of media content. "My daily visits", "My homepage", "My favourites", walls, and timelines are expressions of this personal way of seeking content, and the latest attempts at contextual advertising show how the old dynamics have changed: now advertisers are looking for targets outside the media arena, testing ways toward a personal approach based on keyword searching and database mining.

8. From one-way to interactivity. Far from the single-direction, point-to-multipoint, asymmetrical distribution model of legacy media, with the net emerges a bilateral inverse many-to-one model based on the client-server architecture of the Internet, but also a multilateral, horizontal, and symmetrical many-to-many model. The fact that content providers and users access the same channel to communicate enables users to establish a bilateral relationship with media and also a multilateral relationship with other users of the system. Second, by the same rule, users can become content providers.

9. From linear to hypertext. Analogue media narrative construction is linear, and narrators have the power to control the story's organization and tempo. Digital platforms enable narrators to organize content by fragmenting it into small units (nodes) with multiple paths between them (links); a brief illustrative sketch of this node-and-link structure follows this list. Hypertextual narratives empower the user by shifting control of the narrative from the narrator to the reader.

10. From data to knowledge. The extraordinary amount of data available in the Digital Age brings back the strategic role of media as social managers of knowledge, a role to be shared with an increasing number of new players.
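
Point 9 above describes hypertext as content fragmented into small units (nodes) joined by multiple paths (links). The short Python sketch below is only an illustration of that idea, not something taken from the original text; the names Node, build_story, and traverse are hypothetical. It shows how the same set of fragments can yield different narratives depending on which links each reader chooses to follow.

# Minimal illustrative sketch: a hypertext modeled as content nodes connected by links.

class Node:
    """A small unit of content with outgoing links to other nodes."""
    def __init__(self, title, text, links=None):
        self.title = title
        self.text = text
        self.links = links or []  # titles of the nodes the reader may jump to next

def build_story():
    """The narrator fragments the story into nodes and offers multiple paths."""
    return {
        "start": Node("start", "A report opens with the main facts.",
                      links=["context", "interview"]),
        "context": Node("context", "Background on the issue.",
                        links=["interview"]),
        "interview": Node("interview", "A source gives a first-hand account.",
                          links=["start"]),
    }

def traverse(story, path):
    """The reader controls the narrative by choosing which links to follow."""
    for title in path:
        node = story[title]
        print(f"{node.title}: {node.text} -> links: {node.links}")

# Two readers, two different narratives built from the same fragments.
story = build_story()
traverse(story, ["start", "interview"])
traverse(story, ["start", "context", "interview"])

Running the sketch prints two different reading paths over the same three nodes, which is the sense in which control of the narrative shifts from the narrator to the reader.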
