
Chapter 1 Introduction

Spring 2003
Wook-Shin Han

1
Contents
 Art of language design
 Programming language spectrum
 Why study programming languages?
 Compilation and interpretation
 Programming environments
 Overview of compilation

2
Art of Language Design
 Why are there so many programming
languages?
 What makes a language successful?
 Why do we have programming
languages? -- what is a language for?

3
Why are there so many
programming languages?
 evolution -- we've learned better ways of
doing things over time
 orientation toward special purposes
 orientation toward personal preference

4
What makes a language successful?
 easy to learn (BASIC, Pascal, LOGO, Scheme)
 easy to express things -- easy to use once fluent -- "powerful" (C++, Common Lisp, APL, Algol-68, perl)
 easy to implement (BASIC, Forth)
 possible to compile to very good (fast/small) code (Fortran)
 backing of a powerful sponsor (COBOL, PL/1, Ada, Visual Basic)
 wide dissemination at minimal cost (Pascal, Turing, Java)
5
Why do we have programming languages? -- what is a language for?
 languages from the user's point of view
 way of thinking -- way of expressing algorithms
 languages from the implementor's point of view
 abstraction of virtual machine -- way of specifying what you want the hardware to do without getting down into the bits

6
Why study programming languages?
 Help you choose a language
 C vs Modula-3 vs C++ for systems programming
 Fortran vs APL vs Ada for numerical computations
 C vs Ada vs Modula-2 for embedded systems
 Common Lisp vs Scheme vs ML for symbolic data manipulation
 Java vs C/CORBA for networked PC programs
 Make it easier to learn new languages
 some languages are similar; easy to walk down family tree
 concepts have even more similarity

7

 Help you make better use of whatever language you use
 understand obscure features
 understand implementation costs: choose between alternative ways of doing things, based on knowledge of what will be done underneath

8
Programming language spectrum (taxonomy)
 Imperative
 von Neumann (Fortran, Pascal, Basic, C, ...)
 object-oriented (Smalltalk, Eiffel, C++/Java?)
 Declarative
 functional (Scheme, ML, pure Lisp, Haskell, FP)
 logic, constraint-based (Prolog, VisiCalc, RPG)

 Imperative languages, particularly the von Neumann languages, predominate.
 They will occupy the bulk of our attention this semester.

9
Compiler vs. Interpreter (1/5)
 Compilers: translate a source (human-writable) program into an executable (machine-readable) program
 Interpreters: translate and execute a source program at the same time

10
Compiler vs. Interpreter (2/5)

Ideal concept:

Compilation:     Source code -> Compiler -> Executable
                 Input data -> Executable -> Output data

Interpretation:  Source code + Input data -> Interpreter -> Output data

11
Compiler vs. Interpreter (3/5)
 Actually, no sharp boundary between
them. General situation is a combo:

Source code -> Translator -> Intermediate code

Intermediate code + Input data -> Virtual machine -> Output
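One concrete illustration of this combination (a minimal sketch; the sample source string is only an example): CPython's compile() translates source text into bytecode, and exec() runs that bytecode on the Python virtual machine.

# Sketch: the translator + virtual machine combination, using CPython itself.
# compile() translates the source into bytecode (the intermediate code);
# exec() then runs that bytecode on the Python virtual machine.
import dis

source = "x = 6 * 7\nprint(x)"

code_obj = compile(source, "<example>", "exec")   # translation step
dis.dis(code_obj)                                 # show the intermediate (byte)code
exec(code_obj)                                    # the virtual machine executes it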

12
Compiler vs. Interpreter (4/5)
 Most languages are usually thought of as
using either one or the other:
 Compilers: FORTRAN, COBOL, C, C++, Pascal, PL/1
 Interpreters: Lisp, Scheme, BASIC, APL, Perl, Python, Smalltalk
 BUT: not always implemented this way

13
Compiler vs. Interpreter (5/5)
Examples of fuzzy boundary:
 Preprocessor to interpreter (e.g. BASIC)
 Sometimes can both compile and interpret
 C++ -> C conversion (AT&T compiler)
 Linking of executables at runtime
 Pascal and P-code
 Java and bytecode
 Just-in-time compiling

14
Programming environment tools
 editors (structure?)
 vi, emacs ...
 pretty printers (formatting conventions)
 cb, indent ...
 pre-processors
 cpp, m4, watfor ...
 debuggers (need the symbol table)
 adb, sdb, dbx, gdb ...
 style checkers
 lint, purify, ...
 module management
 make
 version management
 sccs, rcs
 assemblers
 as
 link editors, loaders
 ld, ld.so

15
Programming environment tools
 The better, state-of-the-art environments integrate these tools more tightly.
 Visual Studio on PCs
 Cedar, Smalltalk, Interlisp at Xerox PARC
 Common Lisp on Symbolics
 Sabre C on Unix
 MPW on Macs

16
Phases of compilation

17
Scanning/Lexical analysis
 Break the program down into its smallest meaningful symbols (tokens, atoms)
 Tools for this include lex, flex
 Tokens include e.g.:
 “Reserved words”: do if float while
 Special characters: ( { , + - = ! /
 Names & numbers: myValue 3.07e02
 Start the symbol table with new symbols found (see the scanner sketch below)
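A minimal scanner sketch in Python, for illustration only: the token classes, regular expressions, and sample input are assumptions, and a real project would normally use lex or flex instead.

import re

# One regular expression per token class; order matters (reserved words
# must be tried before general names).
TOKEN_SPEC = [
    ("NUMBER",   r"\d+(?:\.\d+)?(?:[eE][+-]?\d+)?"),   # e.g. 3.07e02
    ("RESERVED", r"\b(?:do|if|float|while)\b"),        # "reserved words"
    ("NAME",     r"[A-Za-z_]\w*"),                     # identifiers, e.g. myValue
    ("SPECIAL",  r"[(){},+\-=!/]"),                    # special characters
    ("SKIP",     r"\s+"),                              # whitespace: discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def scan(text):
    """Break the input down into its smallest meaningful symbols (tokens)."""
    for match in MASTER.finditer(text):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(scan("if myValue = 3.07e02")))
# [('RESERVED', 'if'), ('NAME', 'myValue'), ('SPECIAL', '='), ('NUMBER', '3.07e02')]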

18
Parsing
 Construct a parse tree from symbols
 A pattern-matching problem
 The language grammar is defined by a set of rules that identify legal (meaningful) combinations of symbols
 Each application of a rule results in a node in the parse tree
 The parser applies these rules repeatedly to the program until the leaves of the parse tree are “atoms”
 If no pattern matches, it’s a syntax error
 yacc, bison are tools for this (they generate C code that parses the specified language); a hand-written sketch follows below
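For illustration, a hand-written recursive-descent sketch for a toy expression grammar. The grammar, node shapes, and token stream are assumptions; yacc and bison would instead generate a table-driven parser in C from a grammar file.

# Toy grammar (an assumption for this sketch):
#   expr -> term { ("+" | "-") term }
#   term -> NUMBER | NAME

def parse_expr(tokens, pos=0):
    """Each successful rule application builds one node of the parse tree."""
    node, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos][1] in ("+", "-"):
        op = tokens[pos][1]
        right, pos = parse_term(tokens, pos + 1)
        node = ("expr", node, op, right)
    return node, pos

def parse_term(tokens, pos):
    kind, text = tokens[pos]
    if kind in ("NUMBER", "NAME"):
        return ("term", text), pos + 1               # leaves are the scanner's atoms
    raise SyntaxError(f"no rule matches {text!r}")   # a syntax error

tokens = [("NAME", "i"), ("SPECIAL", "+"), ("NUMBER", "1")]
tree, _ = parse_expr(tokens)
print(tree)   # ('expr', ('term', 'i'), '+', ('term', '1'))

The nested tuples mirror the parse tree: the root covers the whole expression, and the leaves are the tokens produced during lexical analysis.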
19
Parse tree
 Output of parsing
 Top-down description of program syntax
 Root node is entire program
 Constructed by repeated application of
rules in Context Free Grammar (CFG)
 Leaves are tokens that were identified
during lexical analysis

20
Example: parsing rules for Pascal
 These are like the following:
 program → PROGRAM identifier ( identifier more_identifiers ) ; block .
 more_identifiers → , identifier more_identifiers
 block → variables BEGIN statement more_statements END
 statement → do_statement | if_statement | assignment | …
 if_statement → IF logical_expression THEN statement ELSE …

21
Pascal code example
program gcd (input, output);
var i, j : integer;
begin
  read (i, j);
  while i <> j do
    if i > j then i := i - j
    else j := j - i;
  writeln (i)
end.

22
Semantic analysis
 Discovery of meaning in a program, using the symbol table
 Do the static semantics check
 Simplify the structure of the parse tree (from parse tree to abstract syntax tree (AST))
 Static semantics checks (see the sketch below):
 Making sure identifiers are declared before use
 Type checking for assignments and operators
 Checking types and number of parameters to subroutines
 Making sure functions contain return statements
 Making sure there are no repeats among switch statement labels
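A minimal sketch of one such check, assuming the symbol table is a Python dictionary and the type of the right-hand side has already been computed; the names and values are illustrative, not the book's algorithm.

# Sketch: declaration-before-use and assignment type checking against a
# symbol table. The table contents are assumptions for the example.
symbol_table = {"i": "integer", "j": "integer"}

def check_assignment(target, expr_type):
    """Static semantic checks for 'target := <expression of expr_type>'."""
    if target not in symbol_table:
        raise NameError(f"identifier {target!r} used before declaration")
    if symbol_table[target] != expr_type:
        raise TypeError(f"cannot assign {expr_type} to {target!r} "
                        f"(declared {symbol_table[target]})")

check_assignment("i", "integer")        # passes silently
try:
    check_assignment("k", "integer")    # k was never declared
except NameError as error:
    print("static semantic error:", error)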

23
Example: parse tree (pp. 20~21)

24
Example: AST

25
(Intermediate) Code generation
 Go through the parse tree from the bottom up, turning rules into code
 e.g. a sum expression results in the code that computes the sum and saves the result (see the sketch below)
 Result: inefficient code in a machine-independent language
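As an illustration (the tree encoding and temporary-name scheme are assumptions), a bottom-up walk that emits three-address intermediate code for an expression tree:

# Sketch: bottom-up intermediate code generation for expression trees.
# Leaves are variable names; interior nodes are (operator, left, right).
temp_count = 0

def generate(node, code):
    """Return the name that holds the node's value, appending code as needed."""
    global temp_count
    if isinstance(node, str):            # leaf: already a named value
        return node
    op, left, right = node
    left_name = generate(left, code)
    right_name = generate(right, code)
    temp_count += 1
    temp = f"t{temp_count}"
    code.append(f"{temp} := {left_name} {op} {right_name}")
    return temp

code = []
generate(("+", ("*", "b", "c"), "a"), code)   # the tree for b * c + a
print("\n".join(code))
# t1 := b * c
# t2 := t1 + a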

26
Machine-independent optimization
 Perform various transformations that improve the code, e.g.:
 Find and reuse common subexpressions (see the sketch below)
 Take calculations out of loops if possible
 Eliminate redundant operations

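For instance, a toy common-subexpression pass over three-address code: a sketch under the assumption that every instruction has the form "t := a op b", the operators are pure, and no operand is reassigned in between.

# Sketch: reuse a result instead of recomputing an identical right-hand side.
def reuse_common_subexpressions(code):
    seen = {}     # right-hand side -> name of the temporary that holds it
    out = []
    for line in code:
        target, rhs = (part.strip() for part in line.split(":="))
        if rhs in seen:
            out.append(f"{target} := {seen[rhs]}")   # reuse the earlier result
        else:
            seen[rhs] = target
            out.append(line)
    return out

before = ["t1 := b * c", "t2 := t1 + a", "t3 := b * c", "t4 := t3 + d"]
print(reuse_common_subexpressions(before))
# ['t1 := b * c', 't2 := t1 + a', 't3 := t1', 't4 := t3 + d']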
27
Target code generation
 Convert the intermediate code to machine instructions on the intended target machine (see the sketch below)
 Determine storage addresses for entries in the symbol table
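A sketch of this lowering step for one three-address instruction, targeting an imaginary two-register load/store machine; the opcodes and register names are assumptions, not a real instruction set.

# Sketch: turn "t := a op b" into load/operate/store pseudo-instructions.
def lower(instruction):
    target, rhs = (part.strip() for part in instruction.split(":="))
    left, op, right = rhs.split()
    opcode = {"+": "ADD", "-": "SUB", "*": "MUL"}[op]
    return [
        f"LOAD  r1, {left}",     # operand addresses come from the symbol table
        f"LOAD  r2, {right}",
        f"{opcode}   r1, r2",
        f"STORE r1, {target}",
    ]

print("\n".join(lower("t1 := b * c")))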

28
Machine-dependent optimization
 Make improvements that require specific
knowledge of machine architecture, e.g.
 Optimize use of available registers
 Reorder instructions to avoid waits

29
Homework
1. Summarize what was covered in this class on one A4 page.

Deadline: one week from today; only summaries placed under the lectern before the end of that class will be accepted.

30
Thank You! Any More Questions?

31
