
Computer Language

A programming language is a formal language used to communicate instructions to a machine, particularly a computer, to perform specific tasks. Programming languages allow humans to write programs that control the behavior of machines and to express algorithms precisely. There are thousands of programming languages that have been created for different purposes and domains, with many new ones still being developed. Programming languages can be categorized based on their design and features, such as whether they are imperative, functional, static or dynamically typed.

A programming language is an artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that control the behavior of a machine, to express algorithms precisely, or as a mode of human communication.

The earliest programming languages predate the invention of the computer, and were used to
direct the behavior of machines such as Jacquard looms and player pianos. Thousands of
different programming languages have been created, mainly in the computer field, with many
more being created every year. Most programming languages describe computation in an
imperative style, i.e., as a sequence of commands, although some languages, such as those that
support functional programming or logic programming, use alternative forms of description.

A programming language is usually split into the two components of syntax (form) and semantics (meaning), and many programming languages have some kind of written specification of their syntax and/or semantics. Some languages are defined by a specification document; for example, the C programming language is specified by an ISO standard, while other languages, such as Perl, have a dominant implementation that serves as a reference.

A programming language is a notation for writing programs, which are specifications of a computation or algorithm.[1] Some, but not all, authors restrict the term "programming language" to those languages that can express all possible algorithms.[1][2] Traits often considered important for what constitutes a programming language include:

 Function and target: A computer programming language is a language[3] used to write computer programs, which involve a computer performing some kind of computation[4] or algorithm and possibly controlling external devices such as printers, disk drives, robots,[5] and so on. For example, PostScript programs are frequently created by another program to control a computer printer or display. More generally, a programming language may describe computation on some, possibly abstract, machine. It is generally accepted that a complete specification for a programming language includes a description, possibly idealized, of a machine or processor for that language.[6] In most practical contexts, a programming language involves a computer; consequently, programming languages are usually defined and studied this way.[7] Programming languages differ from natural languages in that natural languages are only used for interaction between people, while programming languages also allow humans to communicate instructions to machines.
 Abstractions: Programming languages usually contain abstractions for defining and manipulating data structures or controlling the flow of execution. The practical necessity that a programming language support adequate abstractions is expressed by the abstraction principle;[8] this principle is sometimes formulated as a recommendation to the programmer to make proper use of such abstractions.[9] (A small sketch follows this list.)
 Expressive power: The theory of computation classifies languages by the computations
they are capable of expressing. All Turing complete languages can implement the same
set of algorithms. ANSI/ISO SQL and Charity are examples of languages that are not
Turing complete, yet often called programming languages.[10][11]
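
As a rough illustration of the abstraction point above, here is a minimal TypeScript sketch (the names Point and midpoint are invented for this example). It pairs a data abstraction with a control-flow abstraction, rather than manipulating raw memory and explicit jumps:

    // A data abstraction: a structure with named fields instead of raw bits.
    interface Point { x: number; y: number; }

    function midpoint(a: Point, b: Point): Point {
      return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
    }

    // A control-flow abstraction: map expresses "apply this to every element"
    // without an explicit loop counter or jump instructions.
    const points: Point[] = [{ x: 0, y: 0 }, { x: 2, y: 4 }];
    const shifted = points.map(p => midpoint(p, { x: 1, y: 1 }));
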
Markup languages like XML, HTML or troff, which define structured data, are not generally considered programming languages.[12][13][14] Programming languages may, however, share their syntax with markup languages if a computational semantics is defined. XSLT, for example, is a Turing complete XML dialect.[15][16][17] Moreover, LaTeX, which is mostly used for structuring documents, also contains a Turing complete subset.[18][19]

The term computer language is sometimes used interchangeably with programming language.[20]
However, the usage of both terms varies among authors, including the exact scope of each. One
usage describes programming languages as a subset of computer languages.[21] In this vein,
languages used in computing that have a different goal than expressing computer programs are
generically designated computer languages. For instance, markup languages are sometimes
referred to as computer languages to emphasize that they are not meant to be used for
programming.[22] Another usage regards programming languages as theoretical constructs for
programming abstract machines, and computer languages as the subset thereof that runs on
physical computers, which have finite hardware resources.[23] John C. Reynolds emphasizes that
formal specification languages are just as much programming languages as are the languages
intended for execution. He also argues that textual and even graphical input formats that affect
the behavior of a computer are programming languages, despite the fact they are commonly not
Turing-complete, and remarks that ignorance of programming language concepts is the reason
for many flaws in input formats.[24]

A language is typed if the specification of every operation defines types of data to which the
operation is applicable, with the implication that it is not applicable to other types.[29] For
example, the data represented by "this text between the quotes" is a string. In most
programming languages, dividing a number by a string has no meaning. Most modern
programming languages will therefore reject any program attempting to perform such an
operation. In some languages, the meaningless operation will be detected when the program is
compiled ("static" type checking), and rejected by the compiler, while in others, it will be
detected when the program is run ("dynamic" type checking), resulting in a runtime exception.
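
A minimal sketch of the two checking styles, using TypeScript for the static case (the dynamic behavior described in the comments is that of a language such as Python):

    // Static checking: the compiler rejects the program before it ever runs.
    const s: string = "this text between the quotes";
    // const bad = s / 2;  // compile-time error: a string is not a valid
    //                     // operand for division, so the program is rejected

    // Dynamic checking: a dynamically typed language such as Python accepts
    // the same program and raises an error (a TypeError) only when the
    // division is actually executed.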

A special case of typed languages is the class of single-type languages. These are often scripting or markup languages, such as REXX or SGML, and have only one data type—most commonly character strings, which are used for both symbolic and numeric data.
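
As a rough sketch of the single-type idea (this is not actual REXX syntax; the helper add is invented for illustration), a language whose only type is the character string has to convert to and from numbers around each arithmetic operation:

    // Every value is a string; arithmetic converts on the way in and out.
    function add(a: string, b: string): string {
      return String(Number(a) + Number(b));
    }

    add("2", "3");      // "5": numeric data, still carried as a string
    add("2", "three");  // "NaN": the conversion of "three" fails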

In contrast, an untyped language, such as most assembly languages, allows any operation to be performed on any data, which are generally considered to be sequences of bits of various lengths.[29] High-level languages which are untyped include BCPL and some varieties of Forth.

In practice, while few languages are considered typed from the point of view of type theory
(verifying or rejecting all operations), most modern languages offer a degree of typing.[29] Many
production languages provide means to bypass or subvert the type system.

Static versus dynamic typing


In static typing all expressions have their types determined prior to the program being run
(typically at compile-time). For example, 1 and (2+2) are integer expressions; they cannot be
passed to a function that expects a string, or stored in a variable that is defined to hold dates.[29]
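
A small TypeScript sketch of the point (the function shout is invented for illustration):

    function shout(message: string): string {
      return message.toUpperCase() + "!";
    }

    shout("hello");   // fine: the argument is a string
    // shout(2 + 2);  // rejected before the program runs: (2 + 2) is an
    //                // integer expression, not a string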

Statically typed languages can be either manifestly typed or type-inferred. In the first case, the
programmer must explicitly write types at certain textual positions (for example, at variable
declarations). In the second case, the compiler infers the types of expressions and declarations
based on context. Most mainstream statically typed languages, such as C++, C# and Java, are
manifestly typed. Complete type inference has traditionally been associated with less mainstream
languages, such as Haskell and ML. However, many manifestly typed languages support partial
type inference; for example, Java and C# both infer types in certain limited cases.[30]
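
Both styles can be seen in TypeScript, which is manifestly typed but also infers types where annotations are omitted:

    let count: number = 1;  // manifest: the type is written out explicitly
    let total = 2 + 2;      // inferred: the compiler deduces number from context

    // Either way the checking is static:
    // total = "four";      // rejected at compile time in both cases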

Dynamic typing, also called latent typing, determines the type-safety of operations at runtime; in
other words, types are associated with runtime values rather than textual expressions.[29] As with
type-inferred languages, dynamically typed languages do not require the programmer to write
explicit type annotations on expressions. Among other things, this may permit a single variable
to refer to values of different types at different points in the program execution. However, type
errors cannot be automatically detected until a piece of code is actually executed, making
debugging more difficult. Ruby, Lisp, JavaScript, and Python are dynamically typed.
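
A sketch of this behavior, using TypeScript's any type, which opts a variable out of static checking and so behaves like the dynamically typed languages named above:

    let x: any = 42;            // x currently refers to a number
    console.log(x.toFixed(1));  // "42.0": fine at this point in execution

    x = "hello";                // the same variable now refers to a string
    console.log(x.toFixed(1));  // TypeError at runtime: the error surfaces
                                // only when this line actually executes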

Weak and strong typing

Weak typing allows a value of one type to be treated as another, for example treating a string as a
number.[29] This can occasionally be useful, but it can also allow some kinds of program faults to
go undetected at compile time and even at runtime.

Strong typing prevents the above. An attempt to perform an operation on the wrong type of value
raises an error.[29] Strongly typed languages are often termed type-safe or safe.
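
In a statically and strongly checked language such as TypeScript, the faulty operation is rejected outright; in a strongly but dynamically typed language such as Python, the same expression instead raises an error at runtime:

    // const n = 2 * "abc";  // rejected: a string is not a valid arithmetic
    //                       // operand, so the fault cannot go undetected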

An alternative definition for "weakly typed" refers to languages, such as Perl and JavaScript,
which permit a large number of implicit type conversions. In JavaScript, for example, the
expression 2 * x implicitly converts x to a number, and this conversion succeeds even if x is
null, undefined, an Array, or a string of letters. Such implicit conversions are often useful, but
they can mask programming errors.
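
The conversions described above, with the values JavaScript actually produces (shown via TypeScript's any type, since the statically checked dialect would reject most of these operands):

    let x: any;

    x = "5";        console.log(2 * x);  // 10:  the string parses as a number
    x = null;       console.log(2 * x);  // 0:   null converts to 0
    x = undefined;  console.log(2 * x);  // NaN: undefined has no numeric value
    x = [3];        console.log(2 * x);  // 6:   the array converts via "3"
    x = "abc";      console.log(2 * x);  // NaN: the conversion yields NaN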

Strong and static are now generally considered orthogonal concepts, but usage in the literature
differs. Some use the term strongly typed to mean strongly, statically typed, or, even more
confusingly, to mean simply statically typed. Thus C has been called both strongly typed and
weakly, statically typed.[31][32]
