Programming Paradigms PPT - 01.04.2023

The document discusses different programming paradigms and languages. It covers imperative, procedural, object-oriented, functional, and logic programming paradigms. Example languages for each paradigm like C, Java, ML, and Prolog are provided. Factorial functions in different languages like C, Scheme, and Haskell are shown as examples.

Uploaded by Dhruv Rastogi

Programming Paradigms

Abhimanyu Sahu

Assistant Professor
Dept. of Computer Science and Engineering
Motilal Nehru National Institute of Technology Allahabad
Syllabus:
• UNIT 1: Introduction: Role of Programming Languages: Why Programming Languages, Towards Higher-Level
Languages, Programming Paradigms, Programming Environments; Language Description: Syntactic
Structure; Language Translation Issues: Programming Language Syntax, Stages in Translation, Formal
Translation Models

• UNIT II: Data, Data Types, and Basic Statements: Names, Variables, Binding, Type Checking, Scope, Scope
Rules, Lifetime and Garbage Collection, Primitive Data Types, Strings, Array Types, Associative Arrays,
Record Types, Union Types, Pointers and References, Arithmetic Expressions, Overloaded Operators, Type
Conversions, Relational and Boolean Expressions, Assignment Statements, Mixed-Mode Assignments,
Control Structures, Selection, Iteration, Branching, Guarded Statements

• UNIT III: Subprograms and Implementations: Subprograms, Design Issues, Local Referencing, Parameter
Passing, Overloaded Methods, Generic Methods, Design Issues for Functions, Semantics of Call and Return,
Implementing Simple Subprograms, Stack and Dynamic Local Variables, Nested Subprograms, Dynamic
Scoping.
Syllabus:
• UNIT IV: Object Orientation, Concurrency, and Event Handling: Grouping of Data and Operations:
Constructs for Programming Structures, Abstraction, Information Hiding, Program Design with Modules,
Defined Types; Object-Oriented Programming: Concept of Object, Inheritance, Derived Classes and
Information Hiding, Templates; Semaphores, Monitors, Message Passing, Threads, Statement-Level
Concurrency, Exception Handling (using C++ and Java as example languages).

• UNIT V: Functional and Logic Programming Languages: Fundamentals of Functional Programming
Languages, Programming with ML, Introduction to Logic and Logic Programming, Introduction to
Programming with HASKELL, SCHEME, and SCALA
Books:
• Text Books:
1. “Programming Languages: Design and Implementation”, Terrence Pratt, Marvin V. Zelkowitz,
T.V. Gopal, Fourth Ed., Prentice Hall
2. “Programming in Haskell” by Graham Hutton, Cambridge University Press
3. “The Scheme Programming Language” by R. Kent Dybvig
4. “Programming in Scala” by Bill Venners & Martin Odersky, Artima Press, Mountain View, California

• Reference Books:
1. “Concepts of Programming Languages”, Robert W. Sebesta, 10th Ed., Pearson
2. “Programming Language Design Concepts”, David A. Watt, Wiley India
3. “Programming Languages: Concepts and Constructs”, Ravi Sethi, Second Ed., Pearson
4. “Types and Programming Languages”, Benjamin C. Pierce, The MIT Press, Cambridge, Massachusetts
Why Study Programming Languages:
History:
• Hundreds of different programming languages have been designed and
implemented.
• In 1969, Sammet listed 120 that were fairly widely used, and many others
have been developed.
• The particular language used, such as Java, C, or Ada, depends on the application.
• Six primary reasons :
• To improve your ability to develop effective algorithms
• e.g., understanding the proper and improper use of recursion
• e.g., object-oriented programming, logic programming, concurrent programming

• To improve your use of your existing programming language

• Data structures for arrays, strings, lists, records; malloc()
• Implementation details of recursion, object classes, subroutine calls, ...
Why Study Programming Languages:
• Six primary reasons :
• To increase your vocabulary of useful programming constructs
• Increase your programming vocabulary and knowledge of implementation techniques
• e.g., coroutines

• To allow a better choice of programming language


• Numeric Computation: C, FORTRAN, Ada
• AI: LISP, Prolog
• Internet applications: Perl, Java

• To make it easier to learn a new language

• To make it easier to design a new language


Development of Early Languages:
• Numerically based languages:
• FORTRAN (FORmula TRANslator)-1957, FORTRAN II 1958, FORTRAN IV-1966, FORTRAN 77 and
FORTRAN 90
• Business language-
• FLOWMATIC (1955), COBOL (COmmon Business Oriented language)-1960
• Artificial-intelligence language-
• IPL (Information Processing Language)-mid-1950s
• LISP (LISt Processing)-designed for general-purpose list-processing applications
• Prolog-special-purpose language (basic control structure and implementation strategy based on
concepts from mathematical logic)
• System languages-
• CPL and BCPL (never widely used, but led to C)
• UNIX (written mostly in C, circa 1970-one of the first operating systems written largely in a high-level language)
Evolution of software architecture:
• 1950s-Large expensive mainframe computers run single programs (Batch processing)

• 1960s-Interactive programming (time-sharing) on mainframes

• 1970s-Development of Minicomputers and first microcomputers.

• 1980s-Personal computers: the microprocessor, IBM PC and Apple Macintosh; use of windows, icons, and the mouse
• 1990s-Client-server computing -Networking, The internet, the World Wide Web

• 2000-??? Peer-to-peer (P2P) computing
Organization of Programming Languages:
• Understand how languages are designed and implemented
• Syntax--What a program looks like
• Semantics--What a program means
• Implementation--How a program executes

• Understand the most appropriate language for solving specific problems, for example:
• Pascal, C-procedural, statement oriented
• C++, Java, Smalltalk-Object oriented
• ML, Lisp-Functional
• Prolog-Rule-based
Language Goals:
• During the 1950s--1960s: compile programs so that they execute efficiently.
• There is a direct connection between language features and hardware-
integers, reals, goto statements
• Programmers cheap; Machine expensive; Keep the machine busy

• But today
• Compile programs so that they can be built efficiently
• CPU power and memory very cheap
• Direct connection between language features and design concepts-
encapsulation, records, inheritance, functionality, assertions
Languages for various application domains:
Factorial Functions (Example):
• C, C++, Java:
int fact (int n) { return (n == 0) ? 1 : n * fact (n-1); }

• Scheme:
(define fact
(lambda (n) (if (= n 0) 1 (* n (fact (- n 1))))))

• ML:
fun fact n = if n=0 then 1 else n*fact(n-1);

• Haskell:
fact :: Integer -> Integer
fact 0 = 1
fact n = n * fact (n-1)
Role of Programming Language:
• Some of these influences on programming language development include
the following:
• Computer capabilities
• Applications
• Programming methods
• Implementation methods
• Theoretical studies
Attributes of a good language:
• Clarity, simplicity, and unity-Provides both a framework for thinking about algorithms and a means of
expressing those algorithms
• Has a minimum number of different concepts, with simple and regular rules for their combination
(conceptual integrity)
• Readability
• Orthogonality-Being able to combine various features of a language in all possible combinations
• every combination of features is meaningful
• Logical errors and inefficiency
• Naturalness for the application-program structure reflects the logical structure of algorithm
• Sequential algorithm, concurrent algorithms, logic algorithm, non-deterministic algorithm
• Appropriate data structures, operations, control structures, natural syntax
• Support for abstraction-program data reflects the problem being solved
• Data abstraction
• Encapsulation
Attributes of a good language:
• Ease of program verification-verifying that a program correctly performs its required function
• Verification/validation
• Comments, assert()
• Design specification
• Programming environment-external support for the language
• Debugger, syntax-directed editor
• Supporting function, platforms
• Smalltalk
• Supporting all the software lifecycle phases
• Portability of programs-transportability of the resulting programs from the computer on which they are
developed to other computer systems
• Transportability
• C, C++, Pascal<=> Java
• ML: Single source implementation
• Cost of use
• Cost of program execution
• Cost of program translation
• Cost of program creation
• Cost of program maintenance
Influences on Programming languages:
Choosing a language for an application:
• Pascal, C-procedural, statement oriented
• C++, Java, Smalltalk-Object oriented
• ML, Lisp-Functional
• Prolog-Rule-based
Language paradigms:
Imperative programming paradigm:
• Examples of Imperative programming paradigm:
• C : developed by Dennis Ritchie at Bell Labs
• Fortran : developed by John Backus for IBM
• Basic : developed by John G. Kemeny and Thomas E. Kurtz
 Procedural programming paradigm – Examples of Procedural programming paradigm:
• C : developed by Dennis Ritchie at Bell Labs
• C++ : developed by Bjarne Stroustrup
• Java : developed by James Gosling at Sun Microsystems
• ColdFusion : developed by J J Allaire
• Pascal : developed by Niklaus Wirth
 Object oriented programming – Examples of Object Oriented programming paradigm:
 Simula : first OOP language
 Java : developed by James Gosling at Sun Microsystems
 C++ : developed by Bjarne Stroustrup
 Objective-C : designed by Brad Cox
 Visual Basic .NET : developed by Microsoft
 Python : developed by Guido van Rossum
 Ruby : developed by Yukihiro Matsumoto
 Smalltalk : developed by Alan Kay, Dan Ingalls, Adele Goldberg
 Parallel processing approach – Examples include NESL (one of the oldest) and C/C++, which also support
parallelism through library functions.
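The contrast between the imperative and functional styles above can be sketched in Python, which supports both (Python is used here only as a neutral illustration; the function names are made up for this sketch):

```python
from functools import reduce

# Imperative style: explicit state (total) mutated step by step.
def sum_imperative(xs):
    total = 0
    for x in xs:
        total += x
    return total

# Functional style: no mutation; the result is built by composing functions.
def sum_functional(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

print(sum_imperative([1, 2, 3, 4]))  # 10
print(sum_functional([1, 2, 3, 4]))  # 10
```

The imperative version describes *how* the state changes; the functional version describes *what* the result is in terms of function application.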
Language paradigms :
Generality of Computational Model:
Declarative programming paradigm:
• It is divided into Logic, Functional, and Database paradigms.
 Logic programming paradigms – Prolog

 Functional programming paradigms –Examples of Functional programming paradigm:


 JavaScript : developed by Brendan Eich
 Haskell : developed by Lennart Augustsson, Dave Barton
 Scala : developed by Martin Odersky
 Erlang : developed by Joe Armstrong, Robert Virding
 Lisp : developed by John McCarthy
 ML : developed by Robin Milner
 Clojure : developed by Rich Hickey

 Database/Data-driven programming approach – For example, SQL. It is applied to streams of
structured data, for filtering, transforming, aggregating (such as computing statistics), or calling other
programs, so it has wide application.
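The data-driven style described above (filtering and aggregating streams of structured data) can be sketched in Python; the records and field names below are hypothetical, invented for this illustration:

```python
# In SQL this query would be roughly:
#   SELECT AVG(marks) FROM students WHERE dept = 'CSE';
students = [
    {"name": "A", "dept": "CSE", "marks": 81},
    {"name": "B", "dept": "ECE", "marks": 74},
    {"name": "C", "dept": "CSE", "marks": 91},
]

cse = [s["marks"] for s in students if s["dept"] == "CSE"]  # filter
average = sum(cse) / len(cse)                               # aggregate
print(average)  # 86.0
```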
Language Translation Issues :
Programming Language Syntax:

• Syntax: the arrangement of words as elements in a sentence to show their relationship
• In C, X = Y + Z represents a valid sequence of symbols; XY +- does not
• Syntax provides significant information for:
• understanding a program
• translation into an object program
• Precedence rules: 2 + 3 x 4 is 14, not 20; writing (2+3) x 4 gives 20 - the syntax specifies the
interpretation and guides the translator
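How syntax fixes the interpretation of 2 + 3 * 4 can be observed with Python's ast module (a sketch, using Python's grammar as a stand-in for the general point): multiplication binds tighter, so the parse tree groups the expression as 2 + (3 * 4).

```python
import ast

tree = ast.parse("2 + 3 * 4", mode="eval")
# The root node is an Add whose right operand is the Mult node,
# i.e. the grammar groups the input as 2 + (3 * 4).
assert isinstance(tree.body.op, ast.Add)
assert isinstance(tree.body.right.op, ast.Mult)
print(eval(compile(tree, "<expr>", "eval")))  # 14
```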
Language Translation Issues :
Readability
• Algorithm is apparent from inspection of text
• self-documenting
• natural statement formats
• liberal use of key words and noise words
• provision for embedded comments
• unrestricted length identifiers
• mnemonic operator symbols
• COBOL design emphasizes readability often at the expense of ease of
writing and translation
Language Translation Issues :
Writability:
• Enhanced by concise and regular structures (in contrast, readability favors verbose, distinct
constructs that help us tell programming features apart)

• FORTRAN - implicit naming does not help us catch misspellings (indx and index are both valid
integer variables, even though the programmer meant index)

• Redundancy can be good

• It makes programs easier to read and allows for error checking

Language translation:
• Native-code compiler: produces machine code
• Compiled languages: Fortran, C, C++, SML …
• Interpreter: translates into internal form and immediately executes
(read-eval-print loop)
• Interpreted languages: Scheme, Haskell, Python …
• Byte-code compiler: produces portable bytecode, which is executed on
virtual machine (e.g., Java)
• Hybrid approaches
• Source-to-source translation (early C++ was translated to C, then compiled)
• Just-in-time Java compilers convert bytecode into native machine code when first
executed
Language Compilation:
• Compiler: program that translates a source language into a target language
Or
A compiler is a program that translates a high-level language (PL,
FORTRAN, COBOL) into a functionally equivalent low-level language
(machine language or assembly language)

• Target language is often, but not always, the assembly language for a particular
machine
Checks During Compilation:
• Syntactically invalid constructs
• Invalid type conversions
• A value is used in the “wrong” context, e.g., assigning a float to an int
• Static determination of type information is also used to generate more
efficient code
• Know what kind of values will be stored in a given memory region during program
execution
• Some programmer logic errors
• Can be subtle: if (a = b) … instead of if (a == b) …
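A compiler's warning for this mistake can be mimicked with a toy Python script that scans C-like source for a lone `=` inside an `if` condition. This is a rough, hypothetical heuristic for illustration, not a real parser:

```python
import re

def check_assignment_in_if(source):
    """Return line numbers whose if-condition contains '='
    that is not part of '==', '!=', '<=', or '>='."""
    warnings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        m = re.search(r"if\s*\((.*)\)", line)
        if m and re.search(r"(?<![=!<>])=(?!=)", m.group(1)):
            warnings.append(lineno)
    return warnings

code = "if (a = b) { x = 1; }\nif (a == b) { x = 2; }"
print(check_assignment_in_if(code))  # [1]
```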
Different stages in language translation:
• Language translation is done by the compiler
• Logically, language translation divides into two major parts:
• the analysis of the input source program
• the synthesis of the executable object code
• Structure of a compiler: a compiler takes a source program as input and
produces an equivalent sequence of machine instructions as output.
• The compilation process is divided into a series of sub-processes called phases.
A phase is a logically cohesive operation that takes one representation
of the source program as input and produces another representation
(ultimately, the target program) as output.
Different stages in language translation:
• A simple compiler typically uses two passes
• First pass:- The first analysis pass decomposes
the program into its constituent components
and derives information, such as variable name
usage, from the program
• Second pass:- The second pass typically
generates an object code from this collected
information.
• If compilation speed is important, a one-pass
strategy may be employed.
-Pascal was designed so that a one-pass
compiler could be developed for the language
• If execution speed is paramount, a three (or
more) pass compiler may be developed
-first pass analyzes the source program
-second pass rewrites the source program
into a more efficient form using various well-
defined optimization algorithms
-third pass generates the object code
Figure: Structure of a compiler
Phases of Compilation:
• First Phase: the lexical analyzer, or scanner, separates the characters of the source language into groups that
logically belong together. These groups are called tokens. The usual tokens are keywords such as do or if, identifiers
such as a or b, operator symbols such as =, +, <, >, and punctuation symbols such as parentheses and commas. The
output is a stream of tokens passed to the remaining phases.
• Second Phase: the syntax analyzer, or parser, groups tokens together into syntactic structures; for example, the 3
tokens of A+B are grouped into the syntactic structure of an expression. Expressions may further be combined into
statements. We form a tree whose leaves are tokens and whose interior nodes represent strings of tokens that
logically belong together. For example, the statement – if (5 equ max) goto 100 – has 8 tokens.
• Third Phase: the intermediate code generator uses the structure produced by the syntax analyzer to create a stream
of simple instructions (different styles exist, such as macros).
• Fourth Phase: code optimization is an optional phase designed to improve the intermediate code so that the object
program runs faster and takes less space. Its output is also intermediate code.
• Fifth Phase: the final phase, code generation, produces the object code by deciding on memory locations for data,
selecting code, and selecting the registers in which each computation is done.
• Table management, or bookkeeping, keeps track of names and records all essential information
about each data type. The data structure used to record this information is called the symbol table.
Analysis of the Source Program:
• It consists of three steps:
• Lexical Analysis (Linear or Scanning) : reads from left-to-right and grouped into tokens that are
sequences of characters having a collective meaning
• Syntax Analysis (Hierarchical or Parsing) : characters or tokens are grouped hierarchically into nested
collections with collective meaning
• Semantic Analysis : Certain checks are performed to ensure that the components of a program fit
together meaningfully

• Lexical analysis (Scanning) :


• The initial phase of any translation is to group this sequence of characters into its elementary
constituents: -identifiers, delimiters, operator symbols, numbers, noise word, blanks, comments, and
so on.
• This phase is termed lexical analysis, and the basic program units that result from lexical analysis are
termed lexical items (or tokens).
• Typically the lexical analyzer (or scanner) is the input routine for the translator, reading successive lines
of the input program and breaking them down into individual lexical items, called lexemes.
Lexical analysis (Scanning) :
• Example 1:
• Input: count = 123
• Tokens:
identifier : Rule: “letter followed by …”
Lexeme: count
assg_op : Rule: =
Lexeme: =
integer_const : Rule: “digit followed by …”
Lexeme: 123
Lexical analysis (Scanning) :
• Example 2:
• Input string: size := r * 32 + c
<token, lexeme> pairs:
• <id, size>
• <assign, :=>
• <id, r>
• <arith_symbol, *>
• <integer, 32>
• <arith_symbol, +>
• <id, c>
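A minimal scanner for Example 2 can be sketched in Python using regular expressions. The token names follow the slide; real scanners are typically generated from such regular definitions rather than written this way:

```python
import re

# Regular definitions for the token classes used in Example 2.
TOKEN_SPEC = [
    ("assign",       r":="),
    ("arith_symbol", r"[+\-*/]"),
    ("integer",      r"\d+"),
    ("id",           r"[A-Za-z_]\w*"),
    ("skip",         r"\s+"),
]
PATTERN = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def scan(source):
    """Break the input into <token, lexeme> pairs, discarding whitespace."""
    return [(m.lastgroup, m.group()) for m in PATTERN.finditer(source)
            if m.lastgroup != "skip"]

print(scan("size := r * 32 + c"))
# [('id', 'size'), ('assign', ':='), ('id', 'r'), ('arith_symbol', '*'),
#  ('integer', '32'), ('arith_symbol', '+'), ('id', 'c')]
```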
Syntactic analysis (parsing) :
• The second stage in translation is syntactic analysis, or parsing.
• First, the syntactic analyzer identifies a sequence of
lexical items forming a syntactic unit such as an
expression, statement, subprogram call, or declaration.
• Commonly, the syntactic and semantic analyzers
communicate using a stack.
• The syntactic analyzer enters in the stack the various
elements of the syntactic unit found, and these are
retrieved and processed by the semantic analyzer.
• Efficient syntactic analysis techniques exist, particularly techniques
based on the use of context-free grammars.
Figure: parse tree for the statement A = (B + C)
Syntactic analysis (parsing) :
• The syntax analyzer (parser) checks whether a given source program satisfies
the rules implied by a context-free grammar.
• If it does, the parser creates the parse tree of that program
• Otherwise, the parser gives an error message
• The parser works on a stream of tokens
• The smallest item is a token.
Syntactic analysis (parsing) :
• Why do you need Syntax Analyzer?
• Check if the code is valid grammatically
• The syntactical analyzer helps you to apply rules to the code
• Helps you to make sure that each opening brace has a corresponding
closing brace
• Checks that each declaration has a type and that the type exists
Synthesis of the Object Program:
• The final stages of translation are concerned with
the construction of the executable program from
the outputs produced by the semantic analyzer.
• This phase necessarily involves code generation
and may also include optimization of the
generated program.
• The semantic analyzer ordinarily produces as
output the translated program represented in
intermediate code: an internal
representation such as a string of operators and
operands, or a table of operator-operand
sequences.
• One strategy is to allow the semantic analyzer to
generate poor code sequences and then, during
optimization, replace these sequences with better
ones that avoid obvious inefficiencies.
Code generation:
• The compiler can generate the final code
in machine-readable form.
• After the translated program in the
internal representation has been
optimized, it must be formed into
assembly language statements, machine
code, or another object program.
• The output code may be directly
executable, or there may be other
translation steps to follow (e.g., assembly, or
linking and loading).
• The code generator takes an intermediate
representation of the source program and
maps it into the target code.
Linking and loading :
• Linking generates an executable module of a program by combining the object codes produced
by the assembler.
• A loader, on the other hand, loads these executable modules to the main memory for execution.
• The output of the preceding translation phases typically consists of executable programs in almost final
form, except where the programs reference external data or other subprograms.
• These incomplete locations in the code are specified in attached loader tables produced by the translator.
• The linking loader (or link editor) loads the various segments of translated code into memory and then
uses the attached loader tables to link them together properly filling in data and subprogram addresses in
the code as needed.
• The result is the final executable program ready to be run.
• Linking of object modules is done at both compile as well as load time. Compile-time linking is done
when the source code is translated to machine code. The load-time linking is done while the program is
loaded into memory by the loader.
• Loading is the process of loading the program from secondary memory to the main memory for
execution.
Bootstrapping:
• Translator for a new language is written in that language.
• Example: The initial Pascal compiler was written in Pascal and designed to execute on a
P-code virtual machine.
• If the original Pascal compiler was written in Pascal, how did this first compiler get
compiled? This is called bootstrapping.
• The first Pascal compiler was written in Fortran, the first C compiler was written in
B (a descendant of BCPL), the first Java compiler was written in C, and so on.
• Bootstrapping is used to produce a self-hosting compiler. Self-hosting compiler is a type
of compiler that can compile its own source code.
• If the source language and the implementation language are the same, we are
essentially compiling a new version of the compiler. A process known as bootstrapping
Formal Translation Models :
• Based on the context-free theory of languages
• The formal definition of the syntax of a programming language is called a
grammar
• A grammar consists of a set of rules (productions) that specify the
sequences of characters (lexical items) forming allowable programs in the
language being defined.
• The two classes of grammars useful in compiler design technology include
the BNF grammar (or context-free grammar) and the regular grammar.
BNF Grammars:
• In English, a simple sentence is often given as:
subject / verb / object Example: The girl / ran / home , The boy / cooks / dinner
• Each category can be further subdivided:
Examples: subject is represented by article noun
i.e. article / noun / verb / object
• Simple interrogative sentences (questions) often have a syntax of:
auxiliary verb / subject / predicate Example: Is / the boy / cooking dinner ?
• We can represent these sentences by a set of rules: <sentence> ::= <declarative> | <interrogative>
Note: ::= signifies "is defined as" and | signifies "or"
• Each sentence type can be further defined as follows:
<declarative>::=<subject> <verb> < object>
<subject>::=<article> <noun>
<interrogative>::=<auxiliary verb> <subject> <predicate>
• This specific notation is called BNF (Backus-Naur Form)
BNF Grammars:
• Backus-Naur Form (1959).
• Invented by John Backus to describe Algol 58
• BNF is equivalent to context-free grammars
• BNF is a metalanguage used to describe another language
• In BNF, abstractions are used to represent classes of syntactic structures--they act
like syntactic variables (also called non terminal symbols)
• BNF and Context-Free Grammars
• Context-Free Grammars
• Developed by Noam Chomsky in the mid-1950s
• Language generators, meant to describe the syntax of natural languages
• Define a class of language called context-free language
BNF Grammars:
• BNF Fundamentals
• Non-Terminals: BNF abstractions
• Terminals: lexemes and tokens
• Grammar: a collection of rules
Example of BNF rules: <ident_list> -> identifier | identifier, <ident_list>
<if_stmt> -> if <logic_expr> then <stmt>
• BNF Rules
• A rule has a left-hand side (LHS) and a right-hand side (RHS), and consists of terminal and
nonterminal symbols
• A grammar is a finite nonempty set of rules
• An abstraction (or nonterminal symbol) can have more than one RHS
<stmt>-> <single_stmt>
| begin <stmt_list> end
What does BNF look like? :
• Like this:
<number> ::= <digit> | <number> <digit>
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
• “::=” means “is defined as” (some variants use “:=” instead)
• “|” means “or”
• Angle brackets mean a nonterminal
• Symbols without angle brackets are terminals

• More BNF Examples


<while loop> ::= while ( <condition> ) <statement>
<assignment statement> ::= <variable> = <expression>
BNF Grammars:
• Describing Lists
• Syntactic lists are described using recursion
• <ident_list>->ident
| ident, <ident_list>
• A derivation is a repeated application of rules, starting with the start symbol and
ending with a sentence (all terminal symbols)
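Using the <number> grammar shown earlier, a derivation of the sentence 472 and a matching recursive recognizer can be sketched in Python (the recognizer is an illustrative assumption, not part of the slides):

```python
# <number> ::= <digit> | <number> <digit>
# <digit>  ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
#
# Derivation of 472:
#   <number> => <number><digit> => <number><digit><digit>
#            => <digit><digit><digit> => 4<digit><digit> => 47<digit> => 472

DIGITS = set("0123456789")

def is_number(s):
    """Recognize sentences of <number>: a nonempty string of digits."""
    if not s:
        return False
    if len(s) == 1:                 # <number> ::= <digit>
        return s in DIGITS
    # <number> ::= <number> <digit> (peel the last digit, recurse on the rest)
    return s[-1] in DIGITS and is_number(s[:-1])

print(is_number("472"), is_number("47a"))  # True False
```

The two recursive cases of `is_number` mirror the two alternatives of the <number> rule, which is why syntactic lists are naturally described (and recognized) using recursion.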
BNF Grammars:
Parse Tree:
• A hierarchical representation of a derivation.
• Strings obtained at each derivation step, may contain both terminal and non-terminal symbols.
• Parsing: determines whether a string is a correct sentence or not; the result can be displayed in the form
of a parse tree.

• Ambiguity in Grammars:
- A grammar is ambiguous iff it generates a sentential form
that has two or more distinct parse trees
Example (ambiguous): S -> SS | 0 | 1    (compare the unambiguous: T -> 0T | 1T | 0 | 1)
Ambiguous Grammars:
• Example:
Parse Tree:
Example: Generate the parse tree for the assignment statement W = Y * (U + V) using the BNF grammar
* Draw the corresponding parse tree
* Draw the corresponding leftmost derivation
