What Is A Programming Language
In the simplest terms, a programming language is a formal system of rules and symbols used to write
computer programs. It acts as an intermediary or link between humans and machines, allowing programmers
to express their intentions through code that a computer can execute.
These languages provide a structured and consistent way to write instructions that machines can understand
and execute. By writing code in a format that is easier to read and understand, programmers can design
reliable, maintainable, and scalable programs. Various programming languages have different strengths and
weaknesses, making them suitable for different tasks.
The first programming languages were the machine and assembly languages of the earliest computers,
beginning in the 1940s. Hundreds of programming languages and dialects have been developed since that
time. Most have had a limited life span and utility, while a few have enjoyed widespread success in one or
more application domains. Many have played an important role in influencing the design of future languages.
A snapshot of the historical development of several influential programming languages appears in Figure 1.2.
While it is surely not complete, Figure 1.2 identifies some of the most influential events and trends. Each
arrow in Figure 1.2 indicates a significant design influence from an older language to a successor.
The 1950s marked the beginning of the age of "higher-order languages" (HOLs for short). A HOL
distinguishes itself from a machine or assembly language because its programming style is independent of any
particular machine architecture. The first higher-order languages were Fortran, Cobol, Algol, and Lisp. Both
Fortran and Cobol have survived and evolved greatly since their emergence in the late 1950s. These languages
built a large following and carry with them an enormous body of legacy code that today's programmers
maintain. On the other hand, Lisp has substantially declined in use and Algol has disappeared altogether.
However, the innovative designs of these early languages have influenced their successors. For
example, Fortran's demonstration that algebraic notation could be translated to efficient code is now taken for
granted, as are Cobol's introduction of the record structure, Pascal's design for one-pass compiling, and
Algol's demonstration that a linguistic grammar could formally define its syntax.
Perhaps the greatest motivator for the development of programming languages over the last several decades is
the rapidly evolving demand for computing power and new applications by large and diverse communities of
users. The following user communities can claim a major stake in the programming language landscape:
• Artificial intelligence
• Education
• Science and engineering
• Information systems
• Systems and networks
• World Wide Web
The computational problem domains of these communities are all different, and so are the major
programming languages that developed around them. Below we sketch the major computational goals and
language designs that have served each of these communities.
Artificial Intelligence
The artificial intelligence programming community has been active since the early 1960s. This community is
concerned with developing programs that model human intelligent behavior, logical deduction, and
cognition. Symbol manipulation, functional expressions, and the design of logical proof systems have been
central goals in this ongoing effort.
The paradigms of functional programming and logic programming have evolved largely through the efforts of
artificial intelligence programmers. Prominent functional programming languages over the years include Lisp,
Scheme, ML, and Haskell. The prominent logic programming languages include Prolog and CLP.
The first AI language, Lisp (an acronym for "List Processor"), was designed by John McCarthy in 1960.
Figure 1.2 suggests that Lisp was dominant in early years and has become less dominant in recent years.
However, Lisp's core features have motivated the development of more recent languages such as Scheme,
ML, and Haskell. The strong relationship between Lisp and the lambda calculus (a formalism for modeling
the nature of mathematical functions) provides a firm mathematical basis for the later evolution of these
successors.
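The functional style that Lisp drew from the lambda calculus — programs built from function definitions and applications rather than sequences of assignments — can be sketched in a few lines. The sketch below uses Python lambda expressions purely for illustration; the names are our own, not taken from any Lisp standard.

```python
# Functions are values: they can be passed, returned, and combined.
# Function composition expressed purely with lambda expressions:
compose = lambda f: lambda g: lambda x: f(g(x))

square = lambda x: x * x
increment = lambda x: x + 1

# Composing square with increment and applying the result to 3
# computes (3 + 1) squared.
result = compose(square)(increment)(3)
print(result)  # 16
```

Everything here is an expression that yields a value; there is no statement-by-statement mutation of variables, which is the essence of the functional paradigm these languages pioneered.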
Education
In the 1960s and 1970s, several key languages were designed with a primary goal of teaching students about
programming. For example, Basic was designed in the 1960s by John Kemeny to facilitate the learning of
programming through time sharing, an architecture in which a single computer is directly connected to several
terminals at one time. Each terminal user shares time on the computer by receiving a small "time slice" of
computational power on a regular basis. Basic has enjoyed great popularity over the years, especially as a
teaching language in secondary schools and college-level science programs.
The language Pascal, a derivative of Algol, was designed in the 1970s for the purpose of teaching
programming. Pascal served for several years as the main teaching language in college-level computer science
curricula. During the last decade, these languages have been largely replaced in educational programs by such
"industrial strength" languages as C, C++, and Java. This change has both benefits and liabilities. On the one
hand, learning an industrial strength language provides graduates with a programming tool that they can use
immediately when they enter the computing profession. On the other hand, such a language is inherently more
complex to learn as a first language in undergraduate course work.
Syntax: The syntax of a language describes what constitutes a structurally correct program. Syntax answers
many questions. What is the grammar for writing programs in the language? What is the basic set of words
and symbols that programmers use to write structurally correct programs? Most of the syntactic structure of
modern programming languages is defined using a linguistic formalism called the context-free grammar.
Other elements of syntax are outside the realm of context-free grammars, and are defined by other means. A
study of language syntax raises many questions. How does a compiler analyze the syntax of a program? How
are syntax errors detected? How does a context-free grammar facilitate the development of a syntactic
analyzer?
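One answer to the last question can be sketched briefly: a recursive-descent syntactic analyzer mirrors each rule of a context-free grammar with one function, so the grammar directly shapes the analyzer's structure. The toy grammar below is our own illustration, not that of any particular language.

```python
# Hypothetical grammar (for illustration only):
#   Expr -> Term { '+' Term }
#   Term -> digit
# Each grammar rule becomes one parsing function.

def parse_expr(tokens, pos=0):
    """Parse Expr; return (ok, next_pos)."""
    ok, pos = parse_term(tokens, pos)
    if not ok:
        return False, pos
    while pos < len(tokens) and tokens[pos] == '+':
        ok, pos = parse_term(tokens, pos + 1)
        if not ok:
            return False, pos
    return True, pos

def parse_term(tokens, pos):
    """Parse Term: a single digit token."""
    if pos < len(tokens) and tokens[pos].isdigit():
        return True, pos + 1
    return False, pos  # a syntax error is detected at this point

def is_syntactically_correct(source):
    tokens = list(source.replace(' ', ''))
    ok, pos = parse_expr(tokens)
    return ok and pos == len(tokens)

print(is_syntactically_correct("1 + 2 + 3"))  # True
print(is_syntactically_correct("1 + + 3"))    # False
```

A failed match in `parse_term` is exactly where a syntax error is detected: the input does not fit any rule of the grammar at that position.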
Names and Types: The vocabulary of a programming language includes a carefully designed set of rules for
naming entities: variables, functions, classes, parameters, and so forth. Names of entities also have other
properties during the life of a program, such as their scope, visibility, and binding.
A language's types denote the kinds of values that programs can manipulate: simple types, structured types,
and more complex types. Among the simple types are integers, decimal numbers, characters, and Boolean
values. Structured types include character strings, lists, trees, and hash tables. More complex types include
functions and classes.
A type system enables the programmer to understand and properly implement operations on values of various
types. A carefully specified type system allows the compiler to perform rigorous type checking on a program
before run time, thus heading off run-time errors that may occur because of inappropriately typed operands.
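How a type checker heads off such errors before run time can be sketched on a tiny expression language. The representation and typing rules below are our own illustration, not those of any real compiler.

```python
# Expressions are literals or nested tuples (op, left, right);
# types are just the strings 'int' and 'bool'.

def type_of(expr):
    """Return the type of expr, or raise TypeError before any evaluation."""
    if isinstance(expr, bool):      # test bool first: bool is a subtype of int in Python
        return 'bool'
    if isinstance(expr, int):
        return 'int'
    op, left, right = expr
    lt, rt = type_of(left), type_of(right)
    if op == '+':                   # '+' requires two ints and yields an int
        if (lt, rt) != ('int', 'int'):
            raise TypeError(f"'+' applied to {lt} and {rt}")
        return 'int'
    if op == '<':                   # '<' requires two ints and yields a bool
        if (lt, rt) != ('int', 'int'):
            raise TypeError(f"'<' applied to {lt} and {rt}")
        return 'bool'
    raise TypeError(f"unknown operator {op!r}")

print(type_of(('<', ('+', 1, 2), 4)))   # bool
# type_of(('+', 1, True)) raises TypeError -- the error is caught
# by inspecting the program, with no evaluation performed.
```

The point is that `type_of` never computes a value; it rejects the ill-typed operand combination purely by analysis, which is what compile-time type checking does at scale.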
Semantics: The meaning of a program is defined by its semantics. That is, when a program is run, the effect
of each statement on the values of the variables in the program is given by the semantics of the language.
Thus, when we write a program, we must understand such basic ideas as the exact effect that an assignment
has on the program's variables. If we have a semantic model that is independent of any particular platform, we
can apply that model to a variety of machines on which that language may be implemented.
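Such a platform-independent semantic model can be made concrete by representing the program state as a mapping from variable names to values, with each assignment producing a new state. The encoding below is our own sketch of this idea.

```python
# The state maps variable names to their current values;
# an assignment is a function from states to states.

def assign(state, name, value):
    """Semantics of 'name = value': a new state with name bound to value."""
    new_state = dict(state)
    new_state[name] = value
    return new_state

state = {}
state = assign(state, 'x', 5)               # models: x = 5
state = assign(state, 'y', state['x'] + 1)  # models: y = x + 1
print(state)  # {'x': 5, 'y': 6}
```

Nothing in this model refers to registers, memory addresses, or any particular machine, which is precisely why the same model applies to every implementation of the language.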
Functions represent the key element of procedural abstraction in any language. An understanding of the
semantics of function definition and call is central to any study of programming languages. The
implementation of functions also requires an understanding of the static and dynamic elements of memory,
including the run-time stack. The stack also helps us understand other ideas like the scope of a name and the
lifetime of an object.
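The run-time stack's behavior during function call and return can be simulated directly: each call pushes a frame holding the function's parameters, and each return pops it. The frame layout below is our own simplification of what a real implementation maintains.

```python
stack = []  # the simulated run-time stack

def factorial(n):
    stack.append({'function': 'factorial', 'n': n})  # push a frame on call
    result = 1 if n <= 1 else n * factorial(n - 1)
    stack.pop()                                      # pop the frame on return
    return result

print(factorial(4))  # 24
print(stack)         # [] -- every frame is popped by the time the
                     # outermost call returns
```

While `factorial(4)` runs, the stack briefly holds four frames, one per active call; a parameter's lifetime is exactly the lifetime of its frame, which is why the stack also explains scope and lifetime.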
The stack implementation of function call and return is a central topic deserving deeper study. Moreover,
strategies for the management of another memory area, called the heap, are important to the understanding of
dynamic objects like arrays. Heap management techniques called "garbage collection" are strongly related to
the implementation of these dynamic objects.
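One classic garbage collection technique, mark-and-sweep, can be sketched on a toy heap. The cell names and heap representation below are our own illustration.

```python
# The heap is a dict of cells; each cell lists the cells it references.
heap = {
    'a': ['b'],   # a references b
    'b': [],
    'c': ['c'],   # c references itself but nothing reachable references c
}
roots = ['a']     # names still in scope refer to these cells

def mark(cell, marked):
    """Mark phase: recursively mark every cell reachable from cell."""
    if cell in marked:
        return
    marked.add(cell)
    for ref in heap[cell]:
        mark(ref, marked)

def collect():
    """Mark everything reachable from the roots, then sweep the rest."""
    marked = set()
    for root in roots:
        mark(root, marked)
    for cell in list(heap):
        if cell not in marked:
            del heap[cell]   # reclaim the unreachable cell

collect()
print(sorted(heap))  # ['a', 'b'] -- 'c' was garbage and has been reclaimed
```

Note that `c` is reclaimed even though it references itself: reachability from the roots, not reference counts, decides what survives, which is why mark-and-sweep handles cyclic structures that simple reference counting cannot.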