
Computer science

From Wikipedia, the free encyclopedia



"Computer sciences" redirects here. For the American corporation, see Computer Sciences
Corporation.

Not to be confused with computational science or software engineering.

Computer science deals with the theoretical foundations of computation and practical techniques for their
application.


Computer science is the study of processes that interact with data and that can be represented
as data in the form of programs. It enables the use of algorithms to manipulate, store,
and communicate digital information. A computer scientist studies the theory of computation and
the design of software systems.[1]
Its fields can be divided into theoretical and practical disciplines. Computational complexity
theory is highly abstract, while computer graphics emphasizes real-world
applications. Programming language theory considers approaches to the description of
computational processes, while software engineering involves the use of programming
languages and complex systems. Human–computer interaction considers the challenges in
making computers useful, usable, and accessible.
Contents

 1 History
 2 Etymology
 3 Philosophy
 4 Fields
o 4.1 Theoretical computer science
 4.1.1 Data structures and algorithms
 4.1.2 Theory of computation
 4.1.3 Information and coding theory
 4.1.4 Programming language theory
 4.1.5 Formal methods
o 4.2 Computer systems
 4.2.1 Computer architecture and computer engineering
 4.2.2 Computer performance analysis
 4.2.3 Concurrent, parallel and distributed systems
 4.2.4 Computer networks
 4.2.5 Computer security and cryptography
 4.2.6 Databases
o 4.3 Computer applications
 4.3.1 Computer graphics and visualization
 4.3.2 Human–computer interaction
 4.3.3 Scientific computing and simulation
 4.3.4 Artificial intelligence
o 4.4 Software engineering
 5 Discoveries
 6 Programming paradigms
 7 Academia
 8 Education
o 8.1 Women in Computer Science
 9 See also
 10 Notes
 11 References
 12 Further reading
o 12.1 Overview
o 12.2 Selected literature
o 12.3 Articles
o 12.4 Curriculum and classification
 13 External links
o 13.1 Bibliography and academic search engines
o 13.2 Professional organizations
o 13.3 Misc

History
Main article: History of computer science

Charles Babbage, sometimes referred to as the "father of computing".[2]

Ada Lovelace is often credited with publishing the first algorithm intended for processing on a computer.[3]

The earliest foundations of what would become computer science predate the invention of the
modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have
existed since antiquity, aiding in computations such as multiplication and division. Algorithms for
performing computations have existed since antiquity, even before the development of
sophisticated computing equipment.
Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623.[4] In
1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped
Reckoner.[5] He may be considered the first computer scientist and information theorist, for,
among other reasons, documenting the binary number system. In 1820, Thomas de
Colmar launched the mechanical calculator industry[note 1] when he invented his
simplified arithmometer, which was the first calculating machine strong enough and reliable
enough to be used daily in an office environment. Charles Babbage started the design of the
first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him
the idea of the first programmable mechanical calculator, his Analytical Engine.[6] He started
developing this machine in 1834, and "in less than two years, he had sketched out many of
the salient features of the modern computer".[7] "A crucial step was the adoption of a punched
card system derived from the Jacquard loom",[7] making it infinitely programmable.[note 2] In 1843,
during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of
the many notes she included, an algorithm to compute the Bernoulli numbers, which is
considered to be the first published algorithm ever specifically tailored for implementation on a
computer.[8] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to
process statistical information; eventually his company became part of IBM. Following Babbage,
although unaware of his earlier work, Percy Ludgate in 1909 published[9] the second of only two
designs for mechanical analytical engines in history. In 1937, one hundred years after Babbage's
impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card
equipment and was also in the calculator business,[10] to develop his giant programmable
calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used
cards and a central computing unit. When the machine was finished, some hailed it as
"Babbage's dream come true".[11]
During the 1940s, as new and more powerful computing machines such as the Atanasoff–Berry
computer and ENIAC were developed, the term computer came to refer to the machines rather
than their human predecessors.[12] As it became clear that computers could be used for more than
just mathematical calculations, the field of computer science broadened to study computation in
general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia
University in New York City. The renovated fraternity house on Manhattan's West Side was IBM's
first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division,
which today operates research facilities around the world.[13] Ultimately, the close relationship
between IBM and the university was instrumental in the emergence of a new scientific discipline,
with Columbia offering one of the first academic-credit courses in computer science in 1946.[14]
Computer science began to be established as a distinct academic discipline in the 1950s and
early 1960s.[15][16] The world's first computer science degree program, the Cambridge Diploma in
Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first
computer science department in the United States was formed at Purdue University in 1962.[17]
Since practical computers became available, many applications of computing have become
distinct areas of study in their own right.
Although many initially believed it was impossible that computers themselves could actually be a
scientific field of study, in the late fifties it gradually became accepted among the greater
academic population.[18][19] It is the now well-known IBM brand that formed part of the computer
science revolution during this time. IBM (short for International Business Machines) released the
IBM 704[20] and later the IBM 709[21] computers, which were widely used during the exploration
period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had
misplaced as much as one letter in one instruction, the program would crash, and you would
have to start the whole process over again".[18] During the late 1950s, the computer science
discipline was very much in its developmental stages, and such issues were commonplace.[19]
The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John
Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first
working transistor, the point-contact transistor, in 1947.[22][23] In 1953, the University of
Manchester built the first transistorized computer, called the Transistor Computer.[24] However,
early junction transistors were relatively bulky devices that were difficult to manufacture on a
mass-production basis, which limited them to a number of specialised applications.[25] The metal–
oxide–silicon field-effect transistor (MOSFET, or MOS transistor) was invented by Mohamed
Atalla and Dawon Kahng at Bell Labs in 1959.[26][27] It was the first truly compact transistor that
could be miniaturised and mass-produced for a wide range of uses.[25] The MOSFET made it
possible to build high-density integrated circuit chips,[28][29] leading to what is known as
the computer revolution[30] or microcomputer revolution.[31]
Time has seen significant improvements in the usability and effectiveness of computing
technology.[32] Modern society has seen a significant shift in the users of computer technology,
from usage only by experts and professionals, to a near-ubiquitous user base. Initially,
computers were quite costly, and some degree of human assistance was needed for efficient use,
in part from professional computer operators. As computer adoption became more widespread
and affordable, less human assistance was needed for common usage.
See also: History of computing and History of informatics

Etymology
See also: Informatics § Etymology

Although first proposed in 1956,[19] the term "computer science" appears in a 1959 article
in Communications of the ACM,[33] in which Louis Fein argues for the creation of a Graduate
School in Computer Sciences analogous to the creation of Harvard Business School in 1921,[34]
justifying the name by arguing that, like management science, the subject is applied and
interdisciplinary in nature, while having the characteristics typical of an academic discipline.[33] His
efforts, and those of others such as numerical analyst George Forsythe, were rewarded:
universities went on to create such departments, starting with Purdue in 1962.[35] Despite its
name, a significant amount of computer science does not involve the study of computers
themselves. Because of this, several alternative names have been proposed.[36] Certain
departments of major universities prefer the term computing science, to emphasize precisely that
difference. Danish scientist Peter Naur suggested the term datalogy,[37] to reflect the fact that the
scientific discipline revolves around data and data treatment, while not necessarily involving
computers. The first scientific institution to use the term was the Department of Datalogy at the
University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy.
The term is used mainly in the Scandinavian countries. An alternative term, also proposed by
Naur, is data science; this is now used for a multi-disciplinary field of data analysis, including
statistics and databases.
Also, in the early days of computing, a number of terms for the practitioners of the field of
computing were suggested in the Communications of the ACM—turingineer, turologist, flow-
charts-man, applied meta-mathematician, and applied epistemologist.[38] Three months later in
the same journal, comptologist was suggested, followed next year by hypologist.[39] The
term computics has also been suggested.[40] In Europe, terms derived from contracted
translations of the expression "automatic information" (e.g. "informazione automatica" in Italian)
or "information and mathematics" are often used,
e.g. informatique (French), Informatik (German), informatica (Italian,
Dutch), informática (Spanish, Portuguese), informatika (Slavic languages and Hungarian)
or pliroforiki (πληροφορική, which means informatics) in Greek. Similar words have also been
adopted in the UK (as in the School of Informatics of the University of Edinburgh).[41] "In the U.S.,
however, informatics is linked with applied computing, or computing in the context of another
domain."[42]
A folkloric quotation, often attributed to—but almost certainly not first formulated by—Edsger
Dijkstra, states that "computer science is no more about computers than astronomy is about
telescopes."[note 3] The design and deployment of computers and computer systems is generally
considered the province of disciplines other than computer science. For example, the study of
computer hardware is usually considered part of computer engineering, while the study of
commercial computer systems and their deployment is often called information technology
or information systems. However, there has been much cross-fertilization of ideas between the
various computer-related disciplines. Computer science research also often intersects other
disciplines, such as philosophy, cognitive
science, linguistics, mathematics, physics, biology, statistics, and logic.
Computer science is considered by some to have a much closer relationship with mathematics
than many scientific disciplines, with some observers saying that computing is a mathematical
science.[15] Early computer science was strongly influenced by the work of mathematicians such
as Kurt Gödel, Alan Turing, John von Neumann, Rózsa Péter and Alonzo Church and there
continues to be a useful interchange of ideas between the two fields in areas such
as mathematical logic, category theory, domain theory, and algebra.[19]
The relationship between Computer Science and Software Engineering is a contentious issue,
which is further muddied by disputes over what the term "Software Engineering" means, and how
computer science is defined.[43] David Parnas, taking a cue from the relationship between other
engineering and science disciplines, has claimed that the principal focus of computer science is
studying the properties of computation in general, while the principal focus of software
engineering is the design of specific computations to achieve practical goals, making the two
separate but complementary disciplines.[44]
The academic, political, and funding aspects of computer science tend to depend on whether a
department is formed with a mathematical emphasis or with an engineering emphasis. Computer
science departments with a mathematics emphasis and with a numerical orientation consider
alignment with computational science. Both types of departments tend to make efforts to bridge
the field educationally if not across all research.
Philosophy
Main article: Philosophy of computer science

A number of computer scientists have argued for the distinction of three separate paradigms in
computer science. Peter Wegner argued that those paradigms are science, technology, and
mathematics.[45] Peter Denning's working group argued that they are theory, abstraction
(modeling), and design.[46] Amnon H. Eden described them as the "rationalist paradigm" (which
treats computer science as a branch of mathematics, which is prevalent in theoretical computer
science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be
found in engineering approaches, most prominently in software engineering), and the "scientific
paradigm" (which approaches computer-related artifacts from the empirical perspective of natural
sciences, identifiable in some branches of artificial intelligence).[47] Computer science focuses on
methods involved in design, specification, programming, verification, implementation and testing
of human-made computing systems.[48]

Fields
Computer science is no more about computers than astronomy is about telescopes.

— Michael Fellows
Further information: Outline of computer science
As a discipline, computer science spans a range of topics from theoretical studies of algorithms
and the limits of computation to the practical issues of implementing computing systems in
hardware and software.[49][50] CSAB, formerly called Computing Sciences Accreditation Board—
which is made up of representatives of the Association for Computing Machinery (ACM), and
the IEEE Computer Society (IEEE CS)[51]—identifies four areas that it considers crucial to the
discipline of computer science: theory of computation, algorithms and data
structures, programming methodology and languages, and computer elements and architecture.
In addition to these four areas, CSAB also identifies fields such as software engineering, artificial
intelligence, computer networking and communication, database systems, parallel computation,
distributed computation, human–computer interaction, computer graphics, operating systems,
and numerical and symbolic computation as being important areas of computer science.[49]

Theoretical computer science


Main article: Theoretical computer science

Theoretical computer science is mathematical and abstract in spirit, but it derives its motivation
from practical, everyday computation. Its aim is to understand the nature of computation and, as
a consequence of this understanding, provide more efficient methodologies. All studies related to
mathematical, logical, and formal concepts and methods could be considered theoretical
computer science, provided that the motivation is clearly drawn from the field of computing.
Data structures and algorithms
Main articles: Data structure and Algorithm

Data structures and algorithms are the studies of commonly used computational methods and
their computational efficiency.

Analysis of algorithms · Algorithms · Data structures · Combinatorial optimization · Computational geometry
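
To make the pairing of data structures with algorithms concrete, here is a minimal sketch in Python (our illustrative choice; the function names are hypothetical, not drawn from any cited source) contrasting two search algorithms over the same data structure, a sorted list:

# Linear search inspects every element: O(n) comparisons in the worst case.
def linear_search(items, target):
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Binary search halves the remaining range at each step: O(log n)
# comparisons, but it requires the list to be kept sorted.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [2, 3, 5, 7, 11, 13, 17]
assert linear_search(data, 11) == binary_search(data, 11) == 4

The trade-off is typical of the field: maintaining a stronger invariant on the data structure (sortedness) buys an asymptotically faster algorithm.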
Theory of computation
Main article: Theory of computation

According to Peter Denning, the fundamental question underlying computer science is, "What
can be (efficiently) automated?"[15] Theory of computation is focused on answering fundamental
questions about what can be computed and what amount of resources are required to perform
those computations. In an effort to answer the first question, computability theory examines
which computational problems are solvable on various theoretical models of computation. The
second question is addressed by computational complexity theory, which studies the time and
space costs associated with different approaches to solving a multitude of computational
problems.
The famous P = NP? problem, one of the Millennium Prize Problems,[52] is an open problem in the
theory of computation.

Automata theory · Computability theory · Computational complexity theory · Cryptography · Quantum computing theory
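
The asymmetry at the heart of P versus NP can be felt in a small sketch (ours, in Python; subset sum is a standard NP-complete problem): finding a solution by brute force takes exponential time in the worst case, while checking a proposed solution is fast.

from itertools import combinations

# Subset sum: is there a subset of nums whose elements add up to target?
# Brute-force search may try up to 2^n subsets (exponential time) ...
def find_subset(nums, target):
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# ... but verifying a proposed certificate takes only linear time.
# (For simplicity this check ignores duplicate elements.)
def verify(nums, target, certificate):
    return all(x in nums for x in certificate) and sum(certificate) == target

nums = [3, 34, 4, 12, 5, 2]
certificate = find_subset(nums, 9)   # finds (4, 5)
assert verify(nums, 9, certificate)

Whether every problem whose solutions can be verified this quickly can also be solved this quickly is exactly the open P = NP? question.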
Information and coding theory
Main articles: Information theory and Coding theory

Information theory is related to the quantification of information. This was developed by Claude
Shannon to find fundamental limits on signal processing operations such as compressing data
and on reliably storing and communicating data.[53] Coding theory is the study of the properties
of codes (systems for converting information from one form to another) and their fitness for a
specific application. Codes are used for data compression, cryptography, error detection and
correction, and more recently also for network coding. Codes are studied for the purpose of
designing efficient and reliable data transmission methods.[54]
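
For a concrete taste of quantifying information, the sketch below (ours, in Python) computes Shannon entropy, which lower-bounds the average number of bits per symbol that any lossless code can achieve:

from collections import Counter
from math import log2

# Shannon entropy in bits per symbol: H = sum over symbols of p * log2(1/p).
def entropy(message):
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(entropy("aaaa"))   # 0.0 bits: a constant source carries no information
print(entropy("abab"))   # 1.0 bit per symbol
print(entropy("abcd"))   # 2.0 bits per symbol: maximal for a 4-symbol alphabet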
Programming language theory
Main article: Programming language theory

Programming language theory is a branch of computer science that deals with the design,
implementation, analysis, characterization, and classification of programming languages and
their individual features. It falls within the discipline of computer science, both depending on and
affecting mathematics, software engineering, and linguistics. It is an active research area, with
numerous dedicated academic journals.
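One way to see what "describing computational processes" means is a toy interpreter; the sketch below (ours, in Python, with an invented tuple representation) gives a miniature expression language its semantics:

# A toy abstract syntax tree for arithmetic: a number, or a tuple
# ("+" | "*", left, right). The evaluator defines what expressions mean.
def evaluate(expr):
    if isinstance(expr, (int, float)):
        return expr                       # a literal evaluates to itself
    op, left, right = expr
    if op == "+":
        return evaluate(left) + evaluate(right)
    if op == "*":
        return evaluate(left) * evaluate(right)
    raise ValueError("unknown operator: " + repr(op))

# (1 + 2) * 4 == 12
assert evaluate(("*", ("+", 1, 2), 4)) == 12

Design questions in programming language theory (typing, evaluation order, scoping) are questions about precisely such definitions, scaled up.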

Type theory · Compiler design · Programming languages
Formal methods
Main article: Formal methods

Formal methods are a particular kind of mathematically based technique for the specification,
development, and verification of software and hardware systems.[55] The use of formal methods for
software and hardware design is motivated by the expectation that, as in other engineering
disciplines, performing appropriate mathematical analysis can contribute to the reliability and
robustness of a design. They form an important theoretical underpinning for software
engineering, especially where safety or security is involved. Formal methods are a useful adjunct
to software testing since they help avoid errors and can also give a framework for testing. For
industrial use, tool support is required. However, the high cost of using formal methods means
that they are usually only used in the development of high-integrity and life-critical systems,
where safety or security is of utmost importance. Formal methods are best described as the
application of a fairly broad variety of theoretical computer science fundamentals, in
particular logic calculi, formal languages, automata theory, and program semantics, but also type
systems and algebraic data types to problems in software and hardware specification and
verification.
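
The flavor of the approach can be sketched with a loop invariant. A formal method would prove the invariant below for all inputs; this Python sketch (ours) merely asserts it at runtime, the testing-style approximation the paragraph contrasts with proof:

# Integer division by repeated subtraction, annotated with its invariant.
def divide(dividend, divisor):
    assert dividend >= 0 and divisor > 0                    # precondition
    quotient, remainder = 0, dividend
    while remainder >= divisor:
        # Invariant: dividend == quotient * divisor + remainder
        assert dividend == quotient * divisor + remainder
        quotient += 1
        remainder -= divisor
    # Postcondition: the invariant still holds and the remainder is in range.
    assert dividend == quotient * divisor + remainder
    assert 0 <= remainder < divisor
    return quotient, remainder

assert divide(17, 5) == (3, 2)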

Computer systems
Computer architecture and computer engineering
Main articles: Computer architecture and Computer engineering

Computer architecture, or digital computer organization, is the conceptual design and
fundamental operational structure of a computer system. It focuses largely on the way by which
the central processing unit performs internally and accesses addresses in memory.[56] The field
often involves disciplines of computer engineering and electrical engineering, selecting and
interconnecting hardware components to create computers that meet functional, performance,
and cost goals.

Digital logic · Microarchitecture · Multiprocessing · Ubiquitous computing · Systems architecture · Operating systems
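
At the digital-logic end of this list, the basic building blocks are small enough to simulate directly; a half adder, sketched here in Python with bitwise operators, is the gate-level circuit from which multi-bit adders are composed:

# A half adder: the sum bit is the XOR of the inputs, the carry bit the AND.
def half_adder(a, b):
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print("a=%d b=%d -> sum=%d carry=%d" % (a, b, s, carry))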


Computer performance analysis
Main articles: Computer performance and Benchmark (computing)

Computer performance analysis is the study of work flowing through computers with the general
goals of improving throughput, controlling response time, using resources efficiently,
eliminating bottlenecks, and predicting performance under anticipated peak loads.[57]
Benchmarks are used to compare the performance of systems carrying different chips and/or
system architectures.[58]
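
In its simplest form, performance measurement is a micro-benchmark; the sketch below uses Python's standard timeit module to compare two implementations of the same task. Real performance analysis controls for caching, warm-up, and workload far more carefully than this.

import timeit

# Time two ways of building a list of squares, 1000 repetitions each.
loop_time = timeit.timeit(
    "result = []\nfor i in range(1000): result.append(i * i)",
    number=1000)
comp_time = timeit.timeit(
    "result = [i * i for i in range(1000)]",
    number=1000)

print("explicit loop:      %.4f s" % loop_time)
print("list comprehension: %.4f s" % comp_time)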
Concurrent, parallel and distributed systems
Main articles: Concurrency (computer science) and Distributed computing

Concurrency is a property of systems in which several computations are executing
simultaneously, and potentially interacting with each other.[59] A number of mathematical models
have been developed for general concurrent computation including Petri nets, process
calculi and the Parallel Random Access Machine model.[60] When multiple computers are
connected in a network while using concurrency, this is known as a distributed system.
Computers within that distributed system have their own private memory, and information can be
exchanged to achieve common goals.[61]
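
A minimal sketch of concurrent execution, using Python's standard concurrent.futures module: the tasks below are independent, so no synchronization is needed; coordinating tasks that share state is where the mathematical models named above earn their keep.

from concurrent.futures import ThreadPoolExecutor

# An embarrassingly parallel job: each input is processed independently.
def sum_of_squares(n):
    return sum(i * i for i in range(n))

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(sum_of_squares, [10_000, 20_000, 30_000]))

print(results)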
Computer networks
Main article: Computer network

This branch of computer science aims to design and manage the networks that connect computers worldwide.
Computer security and cryptography
Main articles: Computer security and Cryptography

Computer security is a branch of computer technology with the objective of protecting information
from unauthorized access, disruption, or modification while maintaining the accessibility and
usability of the system for its intended users. Cryptography is the practice and study of hiding
information (encryption) and of recovering it again (decryption). Modern cryptography is largely
related to computer science, for many encryption and decryption algorithms are based on their
computational complexity.
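
A cryptographic hash function illustrates that computational-complexity foundation: computing a digest is cheap, while inverting it is believed to be infeasible. A minimal sketch with Python's standard hashlib module:

import hashlib

# SHA-256 maps any input to a fixed-size digest. Recovering an input that
# produces a given digest is believed to be computationally infeasible.
print(hashlib.sha256(b"attack at dawn").hexdigest())

# Changing a single character yields a completely unrelated digest.
print(hashlib.sha256(b"attack at dusk").hexdigest())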
Databases
Main article: Database


A database is intended to organize, store, and retrieve large amounts of data easily. Digital
databases are managed using database management systems to store, create, maintain, and
search data, through database models and query languages.
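
The store-and-query cycle the paragraph describes looks like this with Python's built-in sqlite3 module (the table and data are invented for the example):

import sqlite3

# Create an in-memory database, define a table, insert rows, and query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, year INTEGER)")
conn.executemany("INSERT INTO books VALUES (?, ?)",
                 [("SICP", 1985), ("TAOCP", 1968)])

# The query language (SQL) describes what to retrieve, not how.
for title, year in conn.execute(
        "SELECT title, year FROM books WHERE year < 1980"):
    print(title, year)

conn.close()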

Computer applications
Computer graphics and visualization
Main article: Computer graphics (computer science)

Computer graphics is the study of digital visual content and involves the synthesis and
manipulation of image data. The study is connected to many other fields in computer science,
including computer vision, image processing, and computational geometry, and is heavily applied
in the fields of special effects and video games.
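
At its simplest, "manipulation of image data" is arithmetic on arrays of pixel values; a sketch (ours, on an invented 2 x 3 grayscale image):

# A grayscale image as a 2-D list of intensities (0 = black, 255 = white).
image = [[  0,  64, 128],
         [192, 255,  32]]

# Inverting the image is a pointwise operation on every pixel.
inverted = [[255 - pixel for pixel in row] for row in image]
print(inverted)   # [[255, 191, 127], [63, 0, 223]]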
Human–computer interaction
Main article: Human–computer interaction

Human–computer interaction research develops theories, principles, and guidelines for user
interface designers, so they can create satisfactory user experiences with desktop, laptop, and
mobile devices.
Scientific computing and simulation
Scientific computing (or computational science) is the field of study concerned with
constructing mathematical models and quantitative analysis techniques and using computers to
analyze and solve scientific problems. A major usage of scientific computing is simulation of
various processes, including computational fluid dynamics, physical, electrical, and electronic
systems and circuits, as well as societies and social situations (notably war games) along with
their habitats, among many others. Modern computers enable optimization of such designs as
complete aircraft. Notable in electrical and electronic circuit design are SPICE,[62] as well as
software for physical realization of new (or modified) designs. The latter includes essential design
software for integrated circuits.[citation needed]
Numerical analysis · Computational physics · Computational chemistry · Bioinformatics
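
The core of simulation is stepping a mathematical model forward in time. Euler's method for exponential decay, sketched below in Python, is the simplest such scheme; production solvers are higher-order and adaptive, so treat this only as an illustration:

from math import exp

# Euler's method for dy/dt = -k * y (exponential decay).
def euler_decay(y0, k, dt, steps):
    y = y0
    for _ in range(steps):
        y += dt * (-k * y)        # advance the model one time step
    return y

approx = euler_decay(1.0, k=0.5, dt=0.01, steps=200)   # simulate to t = 2
exact = exp(-0.5 * 2.0)
print(approx, exact)   # the approximation tracks the closed-form solution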
Artificial intelligence
Main article: Artificial intelligence

Artificial intelligence (AI) aims to or is required to synthesize goal-orientated processes such as
problem-solving, decision-making, environmental adaptation, learning, and communication found
in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956),
artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of
expertise such as applied mathematics, symbolic logic, semiotics, electrical
engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the
popular mind with robotic development, but the main field of practical application has been as an
embedded component in areas of software development, which require computational
understanding. The starting point in the late 1940s was Alan Turing's question "Can computers
think?", and the question remains effectively unanswered, although the Turing test is still used to
assess computer output on the scale of human intelligence. But the automation of evaluative and
predictive tasks has been increasingly successful as a substitute for human monitoring and
intervention in domains of computer application involving complex real-world data.

Machine learning · Computer vision · Image processing · Pattern recognition · Data mining · Evolutionary computation · Knowledge representation and reasoning · Natural language processing · Robotics
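
Machine learning, the first subfield listed above, can be made concrete with one of its simplest algorithms; the sketch below (ours, with an invented toy data set) classifies a point by the label of its nearest training example:

# 1-nearest-neighbor classification: copy the label of the closest example.
def classify(point, examples):
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(examples, key=lambda ex: sq_dist(point, ex[0]))
    return label

training = [((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
            ((5.0, 5.0), "dog"), ((5.5, 4.5), "dog")]

print(classify((1.1, 0.9), training))   # cat
print(classify((5.2, 4.8), training))   # dog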
