Students’ Brochure
PART II
Master of Technology (M.Tech.) in Computer Science
(Effective from 2020-21 Academic Year)
(See PART I for general information, rules and regulations)
INDIAN STATISTICAL INSTITUTE
The Headquarters is at
203 BARRACKPORE TRUNK ROAD
KOLKATA 700108
Master of Technology (M.Tech.) in Computer Science
Contents
1 Structure of the Programme 1
1.1 Structure for CS Stream . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Structure for non-CS Stream . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Semester-wise layout of the compulsory and formative courses . . . . . . . 3
1.3.1 Odd semester . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3.2 Even semester . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Elective Courses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
• Introduction to Programming
This requirement can be waived if a student passes a programming test, which would
be designed by the Mentor Committee and conducted in the first week of the first
semester.
2. Two compulsory courses from the list of compulsory courses for the CS stream:
3. Five Formative Courses from the list of formative courses for the CS stream:
Pool A:
• Probability and Stochastic Processes
• Statistical Methods
• Linear Algebra
• Elements of Algebraic Structures

Pool B:
• Automata Theory, Languages and Computation
• Operating Systems
• Database Management Systems
• Compiler Construction
• Computer Networks
• Principles of Programming Languages
• Computing Laboratory
• Computer Architecture
5. Dissertation (equivalent to three courses).
2. Five compulsory courses from the list of compulsory courses for the non-CS stream:
3. Four Formative Courses from the list of formative courses for the non-CS stream:
Pool A:
• Probability and Stochastic Processes
• Statistical Methods
• Linear Algebra
• Elements of Algebraic Structures

Pool B:
• Computer Organization
• Automata Theory, Languages and Computation
• Database Management Systems
• Compiler Construction
• Computer Networks
• Principles of Programming Languages
• Computer Architecture
The following restrictions would be applicable for choosing the formative courses:
• For students without a master's degree in Mathematics/Statistics: At least
two courses from Pool A and at least one from Pool B.
The following subjects are offered in the odd semester (the first semester at the beginning
of an academic year):
The following subjects are offered in the even semester (the semester at the end of an
academic year):
Subject                                        Course Type
                                               CS-stream       non-CS stream
Design and Analysis of Algorithms              Compulsory      Compulsory
Statistics                                     Formative       Formative
Automata Theory, Languages and Computation     Formative       Formative
Operating Systems                              Formative       Compulsory
Database Management Systems                    Formative       Formative
Principles of Programming Languages            Formative       Formative
Compiler Construction                          Formative       Formative
Computer Architecture                          Formative       Formative
Computer Networks                              Formative       Formative
Course Theory Data Sc. Crypto & Sec. Systems
Advanced Operating Systems X
Advanced Logic and Automata Theory X
Algorithms for Big Data X X
Algorithms for Electronic Design Automation X
Coding Theory X X
Computational Algebra and Number Theory X X
Computational Complexity X X
Computational Finance X X
Computational Game Theory X X
Computational Geometry X
Computational Molecular Biology and Bioinformatics X
Computational Topology X X
Computing Systems Security I X X
Computing Systems Security II X X
Computer Graphics X X
Computer Vision X X
Cryptology I X X
Cryptology II X X
Cyber-Physical Systems X
Digital Signal Processing X X
Discrete and Combinatorial Geometry X
Distributed Computing X X
Fault Tolerance and Testing X
Foundations of Data Science X
Graph Algorithms X
Image Processing I X X
Image Processing II X X
Information Retrieval X
Information Theory X X X
Learning Theory X X X
Logic for Computer Science X
Machine Learning I X X
Machine Learning II X X
Mobile Computing X
Natural Language Processing X
Neural Networks X X
Optimization Techniques X X X
Quantum Computation X X
Quantum Information Theory X X
Randomized and Approximation Algorithms X X
Specification and Verification of Programs X
Statistical Computing X
Topics in Privacy X
2 Information/rules specific to M.Tech. in CS
The information given in this section supplements, and occasionally supersedes (but is
never superseded by), the information given in the general brochure for all non-JRF degree
courses. The sections of the general brochure on promotions and repeating a year do not
apply to this course.
• If a student opts for a waiver of class attendance, (s)he would be required to seek
permission from the Mentor Committee.
• The teacher of the course and the Dean’s Office need to be informed before the mid-
semester week.
• The usual attendance requirement for the student in such cases would be completely
waived for the specific course.
• Under this option, the student has to obtain at least 60% to pass.
2.4 Dissertation
A student is required to work for a dissertation on a topic assigned/approved by the Project
and Dissertation Committee under the supervision of a suitable ISI faculty member [see
also Internship/Industrial training].
The work for a dissertation should be substantial and relate to some important problem
in an area of computer science and/or its applications and should have substantial theoret-
ical or practical significance. A critical review of recent advances in an area of computer
science and/or its applications with some contribution by the student is also acceptable as
a dissertation. The work should be commenced at the beginning of the third semester and
be completed along with the courses of the fourth semester. The dissertation should be
submitted as indicated in the academic calendar (tentatively by the middle of July of the
year of completion). The dissertation will be evaluated by a committee consisting of the
supervisor and an external expert. The student has to defend his/her dissertation in an open
seminar. The dissertation is considered to be equivalent to three credit courses.
Joint supervision of a dissertation is possible, with permission from the Mentor Com-
mittee. In such a case, the student is allowed to spend considerable time outside the insti-
tute, provided his/her course requirements are fulfilled. The primary supervisor of a jointly
supervised dissertation needs to be an ISI faculty member.
2.5 Minor Project

A student is eligible to opt for the minor project only if the following condition is satisfied:

• The aggregate score of the best nine courses taken in the first two semesters is at least
75%.
An eligible student choosing not to do the minor project, as well as a student who is not
eligible for the minor project, has to do two additional courses from the list of formative or
elective courses.
A student who opts for the minor project would decide a topic for the project within
three weeks from the start of a semester. The topic has to be approved by the Project
and Dissertation Committee, and should be significantly different from the topic/problem
of the dissertation as judged by the Project and Dissertation Committee. The Project and
Dissertation Committee will also be responsible for the evaluation of the project.
2.6 Internship/Industrial Training
There would be a mandatory 12-week gap between the first and the second year in the
academic calendar. Students are allowed to pursue internship/industrial training outside
the Institute during this period. The training may be organized at research institutes or in
public/private sector organizations. However, the internship itself is not mandatory.
A student who undergoes internship/industrial training somewhere in India during the
training period may receive his/her usual monthly stipend/remuneration/emoluments either
from the Institute (ISI) or from the host organization at his/her own discretion. The students
who are placed outside Kolkata for training will be reimbursed to-and-fro sleeper-class
train fare between Kolkata and the place of internship.
Training may also be arranged at a research/R&D organization abroad. In case a student
undergoes practical training at such a place abroad, the Institute (ISI) will not provide any
financial support including the monthly stipend for that period.
2.7 Specialisation
Among the eight elective courses, if a student passes at least five elective courses from
a specific track and does his/her dissertation on a topic that falls under that track, (s)he
graduates with a specialisation in that track. The classification of a dissertation into a track
would be done by the Project and Dissertation Committee during or before the mid-term evaluation.
A student would be eligible to obtain a double specialisation if (s)he fulfils the follow-
ing:
• Passes at least ten elective courses, with at least five in each of the two separate tracks
in which (s)he wishes to obtain the specialisations. An elective course cannot be
counted towards two different specialisations.
• The minor project and the dissertation are in two different tracks for which (s)he
wishes to obtain the specialisation.
• For a student who does not opt for a specialisation, the total of the scores in the best
seventeen courses passed and the dissertation would be the final score. If the student
has done a minor project, then (s)he can use the score in the minor project in lieu of
two courses. The total marks for a student in this case is 2000.
• For a student who opts for a single specialisation, the final score would be the total of
the best seventeen courses passed as chosen by the student and the dissertation. If the
student has done a minor project, then (s)he can use the score in the minor project
in lieu of two courses. The seventeen courses should include at least five elective
courses from the track in which the student desires to obtain the specialisation. The
total marks for a student in this case is 2000.
• For a student who opts for a double specialisation, the final score would be the total
of the best seventeen courses passed as chosen by the student, the minor project and
the dissertation. The seventeen courses should include at least ten elective courses
with at least five in each track in which the student opts for the specialisations. The
total marks for a student in this case is 2200.
In all cases, the scores of all the courses successfully completed by a student would be
reflected in the final mark sheet. The division indicated in the final mark sheet would be
determined as per the rules described in the general brochure.
2.9 Stipend
On admission, each student would receive the institute specified stipend, subject to the
rules described in the general brochure for all non-JRF degree courses, with the following
modification in respect of ‘Performance in coursework’ criterion. A student is eligible for
a full stipend
• in the first semester, only if (s)he registers for at least four courses;
• in the second semester, only if (s)he obtains at least 45% in each of at least four
courses in the first semester with an average score of 60% in the best four courses;
• in the third semester, only if (s)he obtains at least 45% in each of at least nine courses
in the first two semesters with an average score of 60% in the best nine courses;
• in the fourth semester, only if (s)he obtains at least 45% in each of at least fourteen
courses (or twelve courses and minor project) in the three semesters with an average
score of 60% in the best fourteen courses. Additionally, (s)he must pass the mid-term
evaluation for the dissertation.
Further, a student who is ineligible for a full stipend in the second, third or fourth semester,
may be eligible for a half stipend in that semester (i.e., second, third or fourth semester) if
(s)he gets at least 45% score in each of the minimum number of individual courses as per
the schedule listed above.
• Pool a list of Projects and Dissertations from the faculty members and circulate the
list among the students. However, the students will be free to choose a topic outside
the list, provided some faculty member agrees to supervise such a project.
• Ascertain for each student that the problems of Minor project and Dissertation are
different. An approval of the committee would be mandatory before a student is
assigned a Minor Project.
• Ascertain that the topic of the Minor Project of a student is in the area of the second
specialisation (s)he is opting for.
2.13 Prizes
At the end of four semesters of a particular batch, the Mentor Committee may nominate
a few students, for their outstanding academic performance during this period, for cash
awards as prize money. There will be no consideration of prizes for semester-specific
performance.
Prerequisites. Many courses will have suggested prerequisite(s). The prerequisite courses
have to have been completed either at ISI or as part of the undergraduate and postgraduate
degrees earned before joining ISI. Students need to confirm with the concerned teacher
whether their prerequisite courses satisfy the demands of the particular course being taught
at ISI. It is the responsibility of the students to verify with the concerned teacher that they
satisfy the prerequisites.
Marks distribution. The marks distribution given against each course is suggestive. The
teacher of a course will announce the explicit marks distribution for the course within two
weeks of its start. The marks distribution for each course has two components –
Examination: This includes the mid-semestral and end-semestral examinations. Except for
the Computing Laboratory course, which has no mid- or end-semestral examination, all
other courses will have at least 50% weightage for the end-semestral examination.
Lab/assignment: This component constitutes the internal assessment marks and will include
programming assignments, written assignments, class tests, quizzes, etc.
Automata Theory, Languages and Computation

(a) Automata and Languages: Finite automata, regular languages, regular expressions,
deterministic and non-deterministic finite automata, minimization of finite automata,
closure properties, Kleene’s Theorem, pumping lemma and its application, Myhill-
Nerode theorem and its uses; Context-free grammars, context-free languages, Chom-
sky normal form, closure properties, pumping lemma for context-free languages,
pushdown automata.
Computability: Computable functions, primitive and recursive functions, universal-
ity, halting problem, recursive and recursively enumerable sets, parameter theorem,
diagonalisation, reducibility, Rice’s Theorem and its applications. Turing machines
and variants; Equivalence of different models of computation and Church-Turing
thesis.
Introduction to Complexity: Discussions on time and space complexities; P and NP,
NP-completeness, Cook’s Theorem, other NP-Complete problems; PSPACE; poly-
nomial hierarchy.
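As a small illustration of the finite-automata material above (not part of the syllabus text), the sketch below simulates a two-state DFA over the alphabet {0, 1} that accepts exactly the binary strings containing an even number of 1s; the transition table and the test strings are arbitrary choices.

    /* Illustrative sketch: simulate a 2-state DFA over {0,1} accepting
     * strings with an even number of 1s.  State 0 = even (accepting),
     * state 1 = odd.  delta is indexed by [state][input symbol]. */
    #include <stdio.h>
    #include <string.h>

    static const int delta[2][2] = {
        /* on '0'  on '1' */
        {    0,      1   },   /* from state 0 */
        {    1,      0   }    /* from state 1 */
    };

    static int dfa_accepts(const char *w)
    {
        int state = 0;                               /* start state */
        for (size_t i = 0; i < strlen(w); i++) {
            if (w[i] != '0' && w[i] != '1') return 0; /* reject bad symbols */
            state = delta[state][w[i] - '0'];
        }
        return state == 0;                           /* accept iff final state is 0 */
    }

    int main(void)
    {
        const char *tests[] = {"", "1", "11", "1010", "111"};
        for (int i = 0; i < 5; i++)
            printf("%-6s -> %s\n", tests[i],
                   dfa_accepts(tests[i]) ? "accept" : "reject");
        return 0;
    }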
Examination: 80%
Laboratory/assignment: 20%
(e) References:
2. M. D. Davis, R. Sigal and E. J. Weyuker, Complexity, Computability and Lan-
guages, Academic Press, New York, 1994.
3. J. E. Hopcroft and J. D. Ullman, Introduction to Automata Theory, Languages
and Computation, Addison-Wesley, California, 1979.
4. J. E. Hopcroft, R. Motwani and J. D. Ullman, Introduction to Automata Theory,
Languages and Computation, Addison-Wesley, California, 2001.
5. H. R. Lewis and C. H. Papadimitriou, Elements of The Theory of Computation,
Prentice Hall, Englewood Cliffs, 1981.
6. M. Sipser, Introduction to The Theory of Computation, PWS Pub. Co., New
York, 1999.
7. M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to The
Theory of NP- Completeness, Freeman, New York, 1979.
Compiler Construction
Misc. topics (depending on time available): basic concepts of compiling object-
oriented and functional languages; just in time compiling; interpreting byte code;
garbage collection.
Examination: 60%
Laboratory/assignment: 40% (inclusive of 10% for assignments and 30% for projects)
(e) References:
Computer Architecture
(a) Introduction and Basics, Design for Performance, Fundamental Concepts and ISA,
ISA Trade-offs, Case Study
Introduction to Microarchitecture Design, Single Cycle Microarchitecture, Micro-
programmed Microarchitecture, Case Study
Pipelining, Data and Control Dependence Handling, Data and Control Dependence
Handling, Branch Prediction, Branch Handling and Branch Prediction II, Precise
Exceptions, State Maintenance and State Recovery, Case Study
Out-of-order execution, Out-of-order execution and Data Flow
SIMD Processing (Vector and Array Processors), GPUs, VLIW, DAE, Case Study:
Nvidia GPUs, Cray-1
Memory Hierarchy and Caches, Advanced Caches, Virtual Memory, DRAM, Mem-
ory Controllers, Memory Management, Memory Latency Tolerance, Prefetching and
Runahead execution, Emerging Memory Technologies, Case Study
Multiprocessors, Memory Consistency and Cache Coherence
Interconnection Networks
Examination: 60%
Laboratory/assignment: 40% (inclusive of projects and a mandatory laboratory
component)
(e) References:
Computer Networks
(a) Introduction: Use of computer networks, Network hardware and software, Classi-
fications of computer networks, Layered network structures, Reference models and
their comparison.
Data transmission fundamentals: Analog and digital transmissions, Channel charac-
teristics, Various transmission media, Different transmission impairments, Different
modulation techniques.
Communication networks: Introduction to LANs, MANs, and WANs; Switching
techniques: Circuit-switching and Packet-switching; Topological design of a net-
work, LAN topologies, Ethernet, Performance of Ethernet, Repeaters and bridges,
Asynchronous Transfer Mode.
Data link layer: Services and design issues, Framing techniques, Error detection
and correction, Flow control: Stop-and-wait and Sliding window; MAC Protocols:
ALOHA, CSMA, CSMA/CD, Collision free protocols, Limited contention protocol;
Wireless LAN protocols: MACA, CSMA/CA;
Network Layer: Design issues, Organization of the subnet, Routing, Congestion con-
trol, IP protocol, IP addressing.
Transport Layer: Design issues, Transport service, elements of transport protocol,
Connection establishment and release, TCP, UDP, TCP congestion control, QoS.
Application Layer: Email, DNS, WWW.
Labs: Interprocess communications and socket programming: Implementation and
realization of simple echo client-server over TCP and UDP, proxy web server, FTP,
TELNET, Chat programs, DNS and HTTP. Implementation of client-server applica-
tions using remote procedure call. Create sockets for handling multiple connections
and concurrent servers. Simulating PING and TRACEROUTE commands.
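For instance, the echo client-server exercise mentioned above can be started from a minimal sketch along the following lines; the port number 5555 and the buffer size are arbitrary illustrative choices, and error handling is kept to a minimum.

    /* Minimal TCP echo server sketch for the laboratory exercise above.
     * It accepts one client at a time on port 5555 and echoes back
     * whatever it receives. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int main(void)
    {
        int listener = socket(AF_INET, SOCK_STREAM, 0);
        if (listener < 0) { perror("socket"); exit(1); }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof addr);
        addr.sin_family      = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port        = htons(5555);          /* arbitrary port */

        if (bind(listener, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind"); exit(1);
        }
        if (listen(listener, 5) < 0) { perror("listen"); exit(1); }

        for (;;) {
            int client = accept(listener, NULL, NULL);
            if (client < 0) { perror("accept"); continue; }

            char buf[1024];
            ssize_t n;
            while ((n = read(client, buf, sizeof buf)) > 0)
                write(client, buf, (size_t)n);        /* echo the bytes back */
            close(client);
        }
    }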
Examination: 70%
Laboratory/assignment: 30% (inclusive of a mandatory laboratory component)
(e) References:
Computer Organization
(a) Binary Systems: Information representation, number systems – binary, octal and hex-
adecimal numbers; number base conversion; complements, binary codes.
Boolean algebra: Postulates and fundamental theorems, Representation of Boolean
functions using Karnaugh maps, truth tables, duality and complementation, canonical
forms, fundamental Boolean operations - AND, OR, NAND, NOR, XOR, Universal
Gates.
Minimization of Boolean functions: Using fundamental theorems, Karnaugh Maps,
McCluskey method.
Combinational Logic: Adders, Subtractors, code conversion, comparator, decoder,
multiplexer, ROM, PLA.
Sequential Logic: Finite state models for sequential machines, pulse, level and clocked
operations; flip-flops, registers, shift register, ripple counters, synchronous counters;
state diagrams, characteristics and excitation tables of various memory elements,
state minimization for synchronous and asynchronous sequential circuits.
ALU Design: Addition of numbers – carry look-ahead and pre-carry vector ap-
proaches, carry propagation-free addition. Multiplication - using ripple carry adders,
carry save adders, redundant number system arithmetic, Booth’s algorithm. Division
- restoring and non-restoring techniques, using repeated multiplication. Floating-
point arithmetic – IEEE 754-1985 format, multiplication and addition algorithms.
ALU design, instruction formats, addressing modes.
Processor Design: ISA and Microarchitecture design, hardware control unit de-
sign, hardware programming language, microprogramming, horizontal, vertical and
encoded-control microprogramming, microprogrammed control unit design, pipelin-
ing.
Memory Organization: Random and serial access memories, static and dynamic
RAMs, ROM, Associative memory.
I/O Organization: Different techniques of addressing I/O devices, data transfer tech-
niques, programmed interrupt, DMA, I/O channels, channel programming, data trans-
fer over synchronous and asynchronous buses, bus control.
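As one illustration of the ALU design topics above, the following sketch simulates Booth's multiplication algorithm on 8-bit two's-complement operands; the register widths and the test values are arbitrary, and an arithmetic right shift of signed integers is assumed (true on common compilers).

    /* Illustrative sketch of Booth's multiplication on 8-bit two's-complement
     * operands.  A is the accumulator, Q holds the multiplier bits and q_1 is
     * the extra Q(-1) bit.  Each step inspects (Q0, Q-1), optionally adds or
     * subtracts the multiplicand, then shifts (A, Q, Q-1) right arithmetically. */
    #include <stdio.h>
    #include <stdint.h>

    static int booth_multiply(int8_t m, int8_t q)
    {
        int A = 0;                  /* accumulator (kept wider than 8 bits) */
        unsigned Q = (uint8_t)q;    /* multiplier bits */
        int q_1 = 0;                /* the Q(-1) bit */

        for (int i = 0; i < 8; i++) {
            int q0 = Q & 1;
            if (q0 == 1 && q_1 == 0) A -= m;   /* pair 10: subtract multiplicand */
            if (q0 == 0 && q_1 == 1) A += m;   /* pair 01: add multiplicand      */
            q_1 = q0;                          /* arithmetic right shift of (A,Q,Q-1) */
            Q = (Q >> 1) | ((unsigned)(A & 1) << 7);
            A >>= 1;                           /* assumes sign-preserving shift  */
        }
        int bits = ((A & 0xFF) << 8) | (int)Q; /* 16-bit product bit pattern     */
        return (bits & 0x8000) ? bits - 0x10000 : bits;  /* sign-extend          */
    }

    int main(void)
    {
        printf("%d\n", booth_multiply(7, -6));    /* expected -42 */
        printf("%d\n", booth_multiply(-8, -8));   /* expected  64 */
        return 0;
    }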
Examination: 70%
Laboratory/assignment: 30%
(e) References:
1. Z. Kohavi, Switching and Finite Automata Theory, 2nd ed., McGraw Hill, New
York, 1978.
2. E. J. McCluskey, Logic Design Principles, Prentice Hall International, New York,
1986.
3. N. N. Biswas, Logic Design Theory, Prentice-Hall of India, New Delhi, 1994.
4. A. D. Freedman and P. R. Menon, Theory and Design of Switching Circuits,
Computer Science Press, California, 1975.
5. T. C. Bartee, Digital Computer Fundamentals, 6th ed., McGraw Hill, New York,
1985.
6. J. P. Hayes, Computer Architecture and Organization, 2nd ed., McGraw Hill,
New York, 1988
7. P. Pal Choudhury, Computer Organization and Design, Prentice Hall of India,
New Delhi, 1994.
8. M. M. Mano, Computer System Architecture, 3rd ed., Prentice Hall of India,
New Delhi, 1993.
9. Y. Chu, Computer Organization and Micro-Programming, Prentice Hall, En-
glewood Cliffs, 1972.
10. W. Stallings, Computer Organization and Architecture: Principles of Structure
and Function, 2nd ed., Macmillan, New York, 1990.
Computing Laboratory
(a) This laboratory course has to be run in coordination with the Data and File Structures
course. The assignments are to be designed based on the coverage in that course. The
initial programming language can be C or can be decided by the instructor based on
the background of the students. The laboratory sessions should include but are not
limited to:
Programming techniques: Problem solving techniques like divide-and-conquer,
dynamic programming, recursion, etc. are to be covered.
Data Structures:
Arrays: Implementation of array operations
Stacks and Queues, Circular Queues: Adding, deleting elements
Merging Problem: Evaluation of expressions, operations on multiple stacks and
queues.
Linked lists: Implementation of linked lists, inserting, deleting, and inverting a linked
list. Implementation of stacks and queues using linked lists. Polynomial addition and
multiplication. Sparse Matrix multiplication and addition.
Trees: Recursive and non-recursive traversal of trees; implementation of balanced
search trees, e.g. AVL tree, Red-Black tree, etc.
Hashing: Hash table implementation, searching, inserting and deleting
Searching and sorting techniques
Object oriented programming: Introduction to object oriented programming, classes
and methods, polymorphism, inheritance.
Introduction to other programming languages: Python, R, etc.
In addition, the following concepts need to be covered during the course of the lab
session: (i) testing the program, developing test-plan, developing tests, concept of re-
gression; (ii) version management, concept of CVS/SVN; (iii) concept of debugging;
(iv) concept of writing automation scripts, using bash/tcsh; (v) concept of makefiles;
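By way of illustration, a typical early assignment on the linked-list topics above might start from a sketch such as the following; the node type and the printed test sequence are arbitrary.

    /* Illustrative sketch for the linked-list assignments above: build a
     * singly linked list, insert at the front, delete a key, and print. */
    #include <stdio.h>
    #include <stdlib.h>

    struct node {
        int key;
        struct node *next;
    };

    static struct node *push_front(struct node *head, int key)
    {
        struct node *n = malloc(sizeof *n);
        if (!n) { perror("malloc"); exit(1); }
        n->key = key;
        n->next = head;
        return n;                        /* new head of the list */
    }

    static struct node *delete_key(struct node *head, int key)
    {
        struct node **pp = &head;        /* pointer to the link we may rewrite */
        while (*pp) {
            if ((*pp)->key == key) {
                struct node *victim = *pp;
                *pp = victim->next;      /* unlink (also fixes head if needed) */
                free(victim);
                break;
            }
            pp = &(*pp)->next;
        }
        return head;
    }

    static void print_list(const struct node *head)
    {
        for (; head; head = head->next)
            printf("%d ", head->key);
        printf("\n");
    }

    int main(void)
    {
        struct node *list = NULL;
        for (int i = 1; i <= 5; i++)
            list = push_front(list, i);  /* list: 5 4 3 2 1 */
        print_list(list);
        list = delete_key(list, 3);      /* list: 5 4 2 1 */
        print_list(list);
        return 0;
    }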
Examination: –
Laboratory/assignment: 100% (inclusive of assignments (50%) and two/three lab-
oratory tests (50%))
(e) References:
8. A. Aho, J. Hopcroft, and J. Ullman: Data Structures and Algorithms, Addison-
Wesley, Reading, Mass., 1983.
9. B. Salzberg: File Structures: An Analytical Approach, Prentice Hall, New Jer-
sey, 1988.
10. T. Harbron: File System Structure and Algorithms, Prentice Hall, New Jersey,
1987.
11. P. E. Livadas: File Structure: Theory and Practice, Prentice Hall, New Jersey,
1990.
12. T. Cormen, C. Leiserson, R. Rivest and C. Stein: Introduction to Algorithms,
PHI Learning Pvt. Ltd., New Delhi, 2009.
13. S. Sahni: Data Structure, Algorithms and Applications in JAVA, Universities
Press (India) Pvt. Ltd., New York, 2005.
14. D. Wood: Data Structure, Algorithms and Performance, Addison-Wesley, Read-
ing, Mass., 1993.
15. M. T. Goodrich, R. Tamassia and David Mount: Data Structures and Algorithms
in C++, 2nd ed., Wiley, 2011.
16. B. W. Kernighan and D. M. Ritchie: The C Programming Language, Prentice
Hall of India, 1994.
17. B. Gottfried: Programming in C, Schaum Outline Series, New Delhi, 1996.
18. B. W. Kernighan and R. Pike: The Unix Programming Environment, Prentice
Hall of India, 1996.
19. R. G. Dromey: How to Solve it by Computers, Pearson, 2008.
Data and File Structures

(a) Introduction: Asymptotic notations; Idea of data structure design (in terms of static
and dynamic data), and the basic operations needed; Initial ideas of algorithms and its
resource usage in terms of space and time complexity; ideas of worst case, average
case and amortized case analysis; Initial ideas of memory model, RAM model,
memory hierarchy;
Construction and manipulation of basic data structures: Idea of Abstract Data Types
and its concrete implementation; Basic data structures – List, Array, Stack, Queue,
Dequeue, Linked lists; binary tree and traversal algorithms, threaded tree, m-ary tree,
its construction and traversals; Priority Queue and heap;
Data Structures for searching: Binary search trees, Height-Balanced binary search
trees; Weight-Balanced binary search tree; Red-Black Tree; Binomial Heap; Splay
Tree; Skip list; Trie; Hashing – separate chaining, linear probing, quadratic probing.
Advanced data structures: Suffix array and suffix tree; Union Find for set operations;
Data structures used in geometric searching – Kd tree, Range tree; Quadtree; Data
structures used for graphs;
External memory data structures: B tree; B+ tree.
Programming practices: Apart from theoretical analysis of data structures, program-
ming implementations should be done as assignments. The Computing Laboratory
course acts as a supplement to this course.
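As one concrete instance of the hashing material above, a minimal separate-chaining hash table might be sketched as follows; the table size, the modular hash function and the stored key type are arbitrary illustrative choices.

    /* Illustrative sketch of hashing with separate chaining: each bucket is
     * a singly linked list of keys; insert adds at the head of the chain,
     * search walks the chain. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <stdbool.h>

    #define NBUCKETS 101

    struct entry {
        int key;
        struct entry *next;
    };

    static struct entry *table[NBUCKETS];    /* all buckets start empty */

    static unsigned hash(int key)
    {
        return ((unsigned)key) % NBUCKETS;
    }

    static void insert(int key)
    {
        unsigned b = hash(key);
        struct entry *e = malloc(sizeof *e);
        if (!e) { perror("malloc"); exit(1); }
        e->key = key;
        e->next = table[b];                  /* chain in at the head */
        table[b] = e;
    }

    static bool search(int key)
    {
        for (struct entry *e = table[hash(key)]; e; e = e->next)
            if (e->key == key)
                return true;
        return false;
    }

    int main(void)
    {
        insert(42); insert(143); insert(7);  /* 42 and 143 collide mod 101 */
        printf("143 %s\n", search(143) ? "found" : "missing");
        printf("99  %s\n", search(99)  ? "found" : "missing");
        return 0;
    }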
Examination: 80%
Laboratory/assignment: 20%
(e) References:
8. A. Aho, J. Hopcroft, and J. Ullman: Data Structures and Algorithms, Addison-
Wesley, Reading, Mass., 1983.
9. B. Salzberg: File Structures: An Analytical Approach, Prentice Hall, New Jer-
sey, 1988.
10. T. Harbron: File System Structure and Algorithms, Prentice Hall, New Jersey,
1987.
11. P. E. Livadas: File Structure: Theory and Practice, Prentice Hall, New Jersey,
1990.
12. T. Cormen, C. Leiserson, R. Rivest and C. Stein: Introduction to Algorithms,
PHI Learning Pvt. Ltd., New Delhi, 2009.
13. S. Sahni: Data Structure, Algorithms and Applications in JAVA, Universities
Press (India) Pvt. Ltd., New York, 2005.
14. D. Wood: Data Structure, Algorithms and Performance, Addison-Wesley, Read-
ing, Mass., 1993.
15. M. T. Goodrich, R. Tamassia and David Mount: Data Structures and Algorithms
in C++, 2nd ed., Wiley, 2011.
Database Management Systems

(a) Introduction: Purpose of database systems, data abstraction and modelling, instances
and schemes, database manager, database users and their interactions, data definition
and manipulation language, data dictionary, overall system structure.
Relational model: Structure of a relational database, operation on relations, relational
algebra, tuple and domain relational calculus, salient feature of a query language.
SQL: domain types, construction, alteration and deletion of tables, query structure
and examples, natural joins and other set operations, aggregations, nested subqueries,
inserting, modifying and deleting data, advanced joins, views, transactions, integrity
constraints, cascading actions, authorization and roles. Hands-on and practical assign-
ments.
Entity - relationship model: Entities and entity sets, relationships and relationship
sets, mapping constraints, E - R diagram, primary keys, strong and weak entities,
reducing E - R diagrams to tables.
Introduction to hierarchical and network model: Data description and tree structure
diagram for hierarchical model, retrieval and update facilities, limitations; Database
Task Group (DBTG) model, record and set constructs, retrieval and update facilities,
limitations.
Databases in application development: cursors, database APIs, JDBC and ODBC,
JDBC drivers, Connections, Statements, ResultSets, Exceptions and Warnings. Prac-
tical case studies.
Normalization: Anomalies in RDBMS, importance of normalization, functional,
multi-valued and join dependencies, closures of functional dependencies and at-
tribute sets, 1NF, 2NF, 3NF and BCNF; (Optionally) 4NF and 5NF; Discussion on
tradeoff between performance and normalization. Database tuning: Index selection
and clustering, tuning of conceptual schema, denormalization, tuning queries and
views;
Query optimization: Importance of query processing, equivalence of queries, join
ordering, cost estimation, cost estimation for complex queries and joins, optimizing
nested subqueries, I/O cost models, external sort.
Crash recovery: Failure classification, transactions, log maintenance, check point
implementation, shadow paging, example of an actual implementation. Concurrency
Control in RDBMS: Testing for serializability, lock based and time - stamp based
protocols; Deadlock detection and Recovery.
NoSQL: Introduction to noSQL databases, ACID vs BASE requirements, practical
exercises with one noSQL system (for example MongoDB).
MapReduce and Hadoop: Basics of MapReduce, Basics of Hadoop, Matrix-vector
multiplication using MapReduce, relational algebra using MapReduce, matrix mul-
tiplication using MapReduce, combiners, cost of MapReduce algorithms, basics of
Spark, practical exercises using Spark.
Examination: 60%
Laboratory/assignment: 40%
(e) References:
2. Database Management Systems, Third Edition: by Raghu Ramakrishnan and
Johannes Gehrke. http://pages.cs.wisc.edu/˜dbbook/
3. Mining of Massive Datasets: by Jure Leskovec, Anand Rajaraman, Jeff Ullman.
http://www.mmds.org
Design and Analysis of Algorithms

(a) Introduction and basic concepts: Complexity measures, worst-case and average-case
complexity functions, problem complexity, quick review of basic data structures and
algorithm design principles.
Sorting and selection: Finding maximum and minimum, k largest elements in order;
Sorting by selection, tournament and heap sort methods, lower bound for sorting,
other sorting algorithms – radix sort, quick sort, merge sort; Selection of k-th largest
element.
Searching and set manipulation: Searching in static table – binary search, path
lengths in binary trees and applications, optimality of binary search in worst case and
average case, binary search trees, construction of optimal weighted binary search
trees; Searching in dynamic table – randomly grown binary search trees, AVL and (a,
b) trees.
Hashing: Basic ingredients, analysis of hashing with chaining and with open ad-
dressing.
Union-Find problem: Tree representation of a set, weighted union and path compression –
analysis and applications.
Graph problems: Graph searching – BFS, DFS, shortest first search, topological sort;
connected and biconnected components; minimum spanning trees, Kruskal’s and
Prim’s algorithms, Johnson’s implementation of Prim’s algorithm using priority queue
data structures.
Algebraic problems: Evaluation of polynomials with or without preprocessing. Wino-
grad’s and Strassen’s matrix multiplication algorithms and applications to related
problems, FFT, simple lower bound results.
String processing: String searching and Pattern matching, Knuth-Morris-Pratt algo-
rithm and its analysis.
NP-completeness: Informal concepts of deterministic and nondeterministic algo-
rithms, P and NP, NP-completeness, statement of Cook’s theorem, some standard
NP-complete problems, approximation algorithms.
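To illustrate the Union-Find topic above, here is a minimal sketch of the disjoint-set forest with union by size and path compression; the fixed universe size and the test unions are arbitrary choices.

    /* Illustrative sketch of the Union-Find data structure with union by
     * size and path compression, as discussed above. */
    #include <stdio.h>

    #define N 10                 /* size of the universe (arbitrary) */

    static int parent[N];
    static int set_size[N];

    static void init(void)
    {
        for (int i = 0; i < N; i++) {
            parent[i] = i;       /* every element is its own root */
            set_size[i] = 1;
        }
    }

    static int find(int x)
    {
        if (parent[x] != x)
            parent[x] = find(parent[x]);   /* path compression */
        return parent[x];
    }

    static void unite(int x, int y)
    {
        int rx = find(x), ry = find(y);
        if (rx == ry) return;
        if (set_size[rx] < set_size[ry]) { int t = rx; rx = ry; ry = t; }
        parent[ry] = rx;                   /* attach smaller tree under larger */
        set_size[rx] += set_size[ry];
    }

    int main(void)
    {
        init();
        unite(0, 1); unite(1, 2); unite(3, 4);
        printf("%s\n", find(0) == find(2) ? "same set" : "different sets");
        printf("%s\n", find(0) == find(4) ? "same set" : "different sets");
        return 0;
    }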
(b) Prerequisites: Nil
(c) Hours: Three lectures and one two-hour tutorial per week
Examination: 70%
Laboratory/assignment: 30% (at least one assignment involving implementation
of several algorithms of same asymptotic complexity for a problem and their
empirical comparisons.)
(e) References:
Discrete Mathematics
non-homogeneous relations and examples, generating functions and their applica-
tion to linear homogeneous recurrence relations, non-linear recurrence relations, ex-
ponential generating functions, brief introduction to Polya theory of counting.
Graph Theory: Graphs and digraphs, complement, isomorphism, connectedness and
reachability, adjacency matrix, Eulerian paths and circuits in graphs and digraphs,
Hamiltonian paths and circuits in graphs and tournaments, trees; Minimum spanning
tree, rooted trees and binary trees, planar graphs, Euler’s formula, statement of Kura-
towski’s theorem, dual of a planar graph, independence number and clique number,
chromatic number, statement of Four-color theorem, dominating sets and covering
sets.
Logic: Propositional calculus – propositions and connectives, syntax; semantics –
truth assignments and truth tables, validity and satisfiability, tautology; Adequate set
of connectives; Equivalence and normal forms; Compactness and resolution; Formal
deducibility, natural deduction system and axiom system; Soundness and complete-
ness. Introduction to Predicate Calculus: Syntax of first order language; Semantics
– structures and interpretation; Formal deducibility; First order theory, models of a
first order theory (definition only), validity, soundness, completeness, compactness
(statement only), outline of resolution principle.
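As a worked illustration of solving a linear recurrence with generating functions (one of the topics above), consider the standard textbook recurrence a_n = 2a_{n-1} + 1 with a_0 = 0; the short derivation below is illustrative only.

    % Worked example: a_n = 2 a_{n-1} + 1, a_0 = 0.
    % Let A(x) = \sum_{n \ge 0} a_n x^n.  Multiplying the recurrence by x^n
    % and summing over n \ge 1 gives:
    \begin{align*}
    A(x) - a_0 &= 2x\,A(x) + \frac{x}{1-x},\\
    A(x) &= \frac{x}{(1-x)(1-2x)} = \frac{1}{1-2x} - \frac{1}{1-x}
           \quad\text{(partial fractions)},\\
    a_n &= 2^n - 1.
    \end{align*}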
Examination: 80%
Laboratory/assignment: 20%
(e) References:
6. J. A. Bondy and U. S. R. Murty: Graph Theory with Applications, Macmillan
Press, London, 1976.
7. N. Deo: Graph Theory with Applications to Engineering and Computer Sci-
ence, Prentice Hall, Englewood Cliffs, 1974.
8. Douglas B. West: Introduction to Graph Theory, Pearson, 2000.
9. Reinhard Diestel: Graph Theory, Springer, 2010.
10. Frank Harary: Graph Theory, Narosa Publishing House, 2001.
11. E. Mendelson: Introduction to Mathematical Logic, 2nd ed., Van-Nostrand,
London, 1979.
12. L. Zhongwan: Mathematical Logic for Computer Science, World Scientific,
Singapore, 1989.
13. Fred S. Roberts, Barry Tesman: Applied Combinatorics, Chapman and Hall/CRC;
2 edition, 2008.
14. Lewis and Papadimitriou: Elements of Theory of Computation (relevant chap-
ter on Logic), Prentice Hall, New Jersey, 1981.
Elements of Algebraic Structures

(a) Introduction: Sets, operations on sets, relations, equivalence relation and partitions,
functions, induction and inductive definitions and proofs, cardinality of a set, count-
able and uncountable sets, diagonalisation argument.
Groups: Binary operations, groupoids, semi-groups and monoids, groups, subgroups
and cosets, Lagrange’s theorem, cyclic group, order of an element, normal subgroups
and quotient groups, homomorphism and isomorphism, permutation groups and di-
rect product.
Rings and sub-rings: Introduction to rings, sub-rings, ideals and quotient rings, ho-
momorphism and isomorphism, integral domains and fields, field of fractions, ring
of polynomials.
Field extensions: Finite dimensional, algebraic and transcendental; splitting field of
a polynomial, existence and uniqueness of finite fields, application to Coding Theory.
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Introduction to Programming

Examination: 50% (including programming tests)
Laboratory/assignment: 50%
1. References:
(a) B.W. Kernighan and D.M. Ritchie: The C Programming Language, Prentice
Hall, 1980.
(b) B. Gottfried: Programming in C, Schaum Outline Series, 1996.
(c) B. Stroustrup: The C++ Programming Language, 2nd ed., Addison-Wesley,
1995.
(d) Cay S. Horstmann: Core Java Volume I – Fundamentals, 11th ed., Prentice
Hall, 2018.
(e) Joshua Bloch: Effective Java, 3rd ed., Addison-Wesley, 2018.
(f) B.W. Kernighan and R. Pike: The Practice of Programming, Addison-Wesley.
(g) B.W. Kernighan and P.J. Plauger: The Elements of Programming Style, McGraw-
Hill.
(h) J. Bentley: Programming Pearls, Addison-Wesley, 1986.
(i) J. Bentley: More Programming Pearls, Addison-Wesley, 1988.
(j) B.W. Kernighan and R. Pike: The Unix Programming Environment, Prentice
Hall.
Linear Algebra
(a) Matrices and System of Equations: System of Linear Equations, Row Echelon Form,
Matrix Arithmetic, Matrix Algebra, Elementary Matrices, Partitioned Matrices, De-
terminant and its properties.
Vector Spaces: Definition and Examples, Subspaces, Linear Independence, Basis
and Dimension, Change of Basis, Row Space, Column Space, Null space
Inner product spaces: The Euclidean dot product, Orthogonal Subspaces, Least
Squares Problems, Orthonormal Sets, The Gram-Schmidt Orthogonalization Pro-
cess, Orthogonal Polynomials
Linear Transformations: Definition and Examples, Matrix Representations of Linear
Transformations, Similarity
Eigenvalues and Eigenvectors: System of Linear Differential Equations, Diagonal-
ization, Hermitian Matrices, Singular Value Decomposition.
Quadratic Forms: Classification and characterisations, Optimisation of quadratic forms.
Algorithms: Gaussian Elimination with different Pivoting Strategies; Matrix Norms
and Condition Numbers; Orthogonal Transformations; The Eigenvalue Problem;
Least Squares Problems.
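To illustrate the algorithms listed above, here is a compact sketch of Gaussian elimination with partial pivoting followed by back substitution; the 3x3 test system is an arbitrary example with solution (2, 3, -1).

    /* Illustrative sketch: solve A x = b by Gaussian elimination with
     * partial pivoting and back substitution. */
    #include <stdio.h>
    #include <math.h>

    #define N 3

    static void solve(double A[N][N], double b[N], double x[N])
    {
        for (int k = 0; k < N; k++) {
            /* partial pivoting: bring the largest |A[i][k]| to row k */
            int piv = k;
            for (int i = k + 1; i < N; i++)
                if (fabs(A[i][k]) > fabs(A[piv][k])) piv = i;
            for (int j = 0; j < N; j++) {
                double t = A[k][j]; A[k][j] = A[piv][j]; A[piv][j] = t;
            }
            double tb = b[k]; b[k] = b[piv]; b[piv] = tb;

            /* eliminate column k below the pivot */
            for (int i = k + 1; i < N; i++) {
                double m = A[i][k] / A[k][k];
                for (int j = k; j < N; j++) A[i][j] -= m * A[k][j];
                b[i] -= m * b[k];
            }
        }
        /* back substitution */
        for (int i = N - 1; i >= 0; i--) {
            double s = b[i];
            for (int j = i + 1; j < N; j++) s -= A[i][j] * x[j];
            x[i] = s / A[i][i];
        }
    }

    int main(void)
    {
        double A[N][N] = {{2, 1, -1}, {-3, -1, 2}, {-2, 1, 2}};
        double b[N] = {8, -11, -3};
        double x[N];
        solve(A, b, x);
        printf("x = (%.2f, %.2f, %.2f)\n", x[0], x[1], x[2]);
        return 0;
    }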
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Operating Systems
methods, file protection, file allocation strategies.
Protection: Goals, policies and mechanisms, domain of protection, access matrix and
its implementation, access lists, capability lists, Lock/Key mechanisms, passwords,
dynamic protection scheme, security concepts and public and private keys, RSA en-
cryption and decryption algorithms.
A case study: UNIX OS file system, shell, filters, shell programming, programming
with the standard I/O, UNIX system calls.
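As a tiny illustration of the UNIX system-call material above, the following sketch runs one external command the way a simple shell would, using fork, execvp and waitpid; the command "ls -l" is an arbitrary example.

    /* Illustrative sketch: run one command as a simple shell would, using
     * the fork / exec / wait system calls discussed above. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        char *argv[] = {"ls", "-l", NULL};   /* arbitrary example command */

        pid_t pid = fork();                  /* duplicate the process */
        if (pid < 0) {
            perror("fork");
            exit(1);
        }
        if (pid == 0) {                      /* child: replace its image */
            execvp(argv[0], argv);
            perror("execvp");                /* reached only on failure */
            _exit(127);
        }

        int status;                          /* parent: wait for the child */
        if (waitpid(pid, &status, 0) < 0)
            perror("waitpid");
        else if (WIFEXITED(status))
            printf("child exited with status %d\n", WEXITSTATUS(status));
        return 0;
    }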
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Principles of Programming Languages

Functional Languages: Typed lambda-calculus, higher order functions and types, evalua-
tion strategies, type checking, implementation.
Logic Programming: Computing with relation, first-order logic, SLD-resolution,
unification, sequencing of control, negation, implementation, case study.
Concurrency: Communication and synchronization, shared memory and message
passing, safety and liveness properties, multithreaded program.
Formal Semantics: Operational, denotational and axiomatic semantics, languages
with higher order constructs and types, recursive type, subtype, semantics of non-
determinism and concurrency.
Assignments: Using one or more of the following, as much as time permits: C++ / Java
/ OCAML / Lisp / Haskell / Prolog.
Examination: 70%
Laboratory/assignment: 30%
(e) References:
Probability and Stochastic Processes

(a) Introduction: Sample space, probabilistic models and axioms, conditional probabil-
ity, Total probability theorem and Bayes’ rule, independence.
Discrete random variables: basics, probability mass functions, functions of random
variables, expectation, variance, idea of moments, joint probability mass functions,
conditioning and independence, notions of Bernoulli, Binomial, Poisson, Geometric,
etc., covariance and correlation, conditional expectation and variance, some intro-
duction to probabilistic methods.
Continuous random variables: basics, probability density functions, cumulative dis-
tribution function, normal random variable.
Moments and deviations: Markov’s inequality, Chebyshev’s inequality, Chernoff
bounds
Limit theorems: Weak law of large numbers, central limit theorem, strong law of
large numbers (proofs under some restrictive setups, if possible.)
Stochastic processes: Bernoulli and Poisson processes, branching processes. Markov
chains, classification of states, ideas of stationary distributions. Introduction to Mar-
tingales and stopping times.
Applications to computer science: Balls and bins, birthday paradox, hashing, sorting,
random walks, etc.
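As an illustration of the birthday-paradox application above, the following sketch estimates by simulation the probability that at least two people among 23 share a birthday; the number of trials and the fixed random seed are arbitrary choices, and the estimate should come out close to the exact value of about 0.507.

    /* Illustrative sketch: Monte Carlo estimate of the birthday-paradox
     * probability that among 23 people at least two share a birthday. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        const int people = 23, days = 365, trials = 100000;
        int collisions = 0;
        srand(12345);                        /* fixed seed, arbitrary */

        for (int t = 0; t < trials; t++) {
            int seen[365];
            memset(seen, 0, sizeof seen);
            for (int p = 0; p < people; p++) {
                int d = rand() % days;       /* uniform-ish random birthday */
                if (seen[d]++) { collisions++; break; }
            }
        }
        printf("estimated P(shared birthday) = %.3f\n",
               (double)collisions / trials);
        return 0;
    }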
Examination: 80%
Laboratory/assignment: 20%
(e) References:
6. Michael Mitzenmacher and Eli Upfal: Probability and Computing (2nd Edi-
tion), Cambridge University Press, 2017
Statistical Methods
Examination: 70%
Laboratory/assignment: 30%
(e) References:
2. D. Freedman, R. Pisani and R. Purves: Statistics.
3. M. Tanner: An Investigation for a Course in Statistics.
4. M. G. Kendall and A. Stuart: The Advanced Theory of Statistics, Vol. I and II.
5. J. F. Kenney and E. S. Keeping: Mathematics of Statistics.
6. G. U. Yule and M. G. Kendall: An Introduction to the Theory of Statistics.
7. C. R. Rao: Linear Statistical Inference and its Applications.
8. C. E. Croxton and D. J. Cowden: Applied General Statistics.
9. A. M. Goon, M. Gupta and B. Dasgupta: Fundamentals of Statistics, Vol I.
10. P. G. Hoel, S. C Port and Charles J. Stone: Introduction to Statistical Theory.
11. W. A. Wallis and H. V. Roberts: Statistics: A New Approach.
12. P. J. Bickel and K. A. Doksum: Mathematical Statistics.
13. L. Wasserman: All of Statistics: A Concise Course in Statistical Inference.
14. C. Casella and R. L. Berger: Statistical Inference
CSMA/CD, Collision free protocols, Limited contention protocol; Wireless LAN
protocols: MACA, CSMA/CA; Comparative analysis of different MAC protocols.
Internetworking and IP: Design issues, Organization of the subnet, Routing: Static
and dynamic routing, Shortest path routing, Flooding, Unicast and multicast routing,
Distance-vector routing, Link-state routing; Congestion control: choke packets, leaky
bucket, token bucket; IP protocol, IPV4, IPV6, IP addressing, CIDR, NAT, Internet
control protocols: ICMP, ARP, RARP.
Transport and Reliable Delivery: Design issues, Port and socket, Connection estab-
lishment and release, TCP, UDP, TCP congestion control, TCP timer management,
RPC.
Examination: 60%
Laboratory/assignment: 40%
(e) References:
Advanced Logic and Automata Theory

(a) Monadic second order logic: syntax, semantics, truth, definability, relationship be-
tween logic and languages, Büchi-Elgot-Trakhtenbrot theorem.
Automata on infinite words: Büchi automata, closure properties, Müller automata,
Rabin automata, Streett automata, determinization, decision problems, Linear tem-
poral logic and Büchi automata, Finite and infinite tree automata, closure properties,
decision problems, complementation problem for automata on infinite trees, alter-
nation, Rabin’s theorem. Modal mu-calculus: syntax, semantics, truth, finite model
property, decidability, Parity Games, model checking problem, memoryless determi-
nacy, algorithmic issues, bisimulation, Janin/Walukiewicz theorem.
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Advanced Operating Systems

(a) The instructor may select only some of the following topics, and include other topics
of current interest.
Operating systems structures: monolithic, microkernel, exokernel, multikernel.
System calls, interrupts, exceptions.
Symmetric Multiprocessor (SMP) systems: scheduling, load balancing, load sharing,
process migration; synchronisation in SMP systems.
Interprocess communication: signals, message passing.
Naming in distributed systems: directory services, DNS.
Remote Procedure Calls (RPC): model, stub generation, server management, param-
eter passing, call semantics, communication protocols, client-server binding, excep-
tion handling, security, optimization.
Distributed shared memory: architecture, consistency model, replacement strategy,
thrashing, coherence.
File systems: Fast File System (FFS), Virtual File System (VFS), log-structured file
systems and journalling, RAID; Distributed File Systems (DFS), stateless and state-
ful DFS, Andrew File System (AFS), Network File Systems (NFS).
Virtualisation: introduction, nested virtualisation, case study.
Device drivers.
Fault tolerance.
Clusters, cloud computing.
Protection and security.
Projects and real systems implementations
Examination: 60%
Laboratory/assignment: 40%
(e) References:
(a) Thomas Anderson and Michael Dahlin: Operating Systems Principles and Prac-
tice, 2nd ed., Recursive Books, 2014.
(b) Daniel P. Bovet and Marco Cesati: Understanding the Linux Kernel, 3rd ed.,
O’Reilly 2005/2008.
(c) Robert Love: Linux Kernel Development, 3rd ed., Addison-Wesley Profes-
sional, 2010.
(d) Jonathan Corbet, Alessandro Rubini and Greg Kroah-Hartman: Linux Device
Drivers, 3rd ed., O’Reilly, 2005.
(e) Research articles as prescribed by the instructor.
Algorithms for Big Data

Examination: 60%
Laboratory/assignment: 40%
(e) References:
1. Avrim Blum, John Hopcroft, and Ravindran Kannan: Foundations of Data Sci-
ences by, https://www.cs.cornell.edu/jeh/book.pdf
2. Dimitri P. Bertsekas and John N. Tsitsiklis: Introduction to Probability (2nd
Edition), Athena Scientific, 2008.
3. Michael Mitzenmacher and Eli Upfal: Probability and Computing (2nd Edi-
tion), Cambridge University Press, 2017
4. Noga Alon and Joel Spencer: The Probabilistic Method (4th Edition), Wiley-
Blackwell, 2016
Algorithms for Electronic Design Automation
(a) Introduction: VLSI design, design styles and parameters, popular technologies.
Logic synthesis: PLA minimization, folding, testing. Role of BDDs. Logic design
tools – ESPRESSO, SIS, OCTOOLS.
High level synthesis: Design description languages – introduction to features in
VHDL, Verilog; Scheduling algorithms; Allocation and Functional binding.
Layout synthesis: Design rules, partitioning, placement and floor planning, routing
in ASICs, FPGAs; CAD tools
Advanced Topics: Design for Hardware Security and IP Protection; Design for Manu-
facturability
Examination: 75%
Laboratory/assignment: 25%
(e) References:
9. B. T. Preas and M. Lorenzetti: Physical Design automation of VLSI Systems,
Benjamin Cummings Pub., 1988.
10. T. Ohtsuki (ed): Layout Design and Verification, North Holland, Amsterdam,
1986.
11. S. Bhunia, S. Ray and S. Sur-Kolay (Eds.): Fundamentals of IP and SoC
Security: Design, Verification, and Debug.
Coding Theory
(a) Introduction: Basic definitions: codes, dimension, distance, rate, error correction,
error detection.
Linear Codes: Properties of linear codes; Hamming codes; Efficient decoding of
Hamming codes; Dual of a linear code
Gilbert Varshamov bound; Singleton bound; Plotkin bound
Shannon’s Theorems: Noiseless coding; Noisy Coding; Shannon Capacity
Algebraic codes: Reed-Solomon codes; Concatenated codes; BCH codes; Reed-
Muller codes; Hadamard codes; Dual BCH codes.
Algorithmic issues in coding: Decoding Reed-Solomon Codes; Decoding Concate-
nated Codes
List Decoding: List decoding; Johnson bound; List decoding capacity; List decoding
from random errors. List decoding of Reed-Solomon codes.
Advanced Topics: Graph Theoretic Codes; Locality in coding: Locally decodable
codes, locally testable codes; codes and derandomization.
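As a small illustration of the Hamming-code material above, the following sketch encodes 4 data bits into the (7,4) Hamming code, injects a single-bit error and corrects it via the syndrome; the bit ordering follows the usual positions 1..7 with parity bits at positions 1, 2 and 4, and the test data are arbitrary.

    /* Illustrative sketch of the (7,4) Hamming code: encode 4 data bits,
     * flip one bit, and use the syndrome to locate and correct the error.
     * Codeword positions are 1..7; parity bits sit at positions 1, 2, 4. */
    #include <stdio.h>

    /* c[1..7] holds the codeword; d[0..3] are the data bits d1..d4. */
    static void encode(const int d[4], int c[8])
    {
        c[3] = d[0]; c[5] = d[1]; c[6] = d[2]; c[7] = d[3];
        c[1] = d[0] ^ d[1] ^ d[3];          /* parity over positions 3,5,7 */
        c[2] = d[0] ^ d[2] ^ d[3];          /* parity over positions 3,6,7 */
        c[4] = d[1] ^ d[2] ^ d[3];          /* parity over positions 5,6,7 */
    }

    /* Returns the error position (0 if none) and corrects it in place. */
    static int correct(int c[8])
    {
        int s1 = c[1] ^ c[3] ^ c[5] ^ c[7];
        int s2 = c[2] ^ c[3] ^ c[6] ^ c[7];
        int s4 = c[4] ^ c[5] ^ c[6] ^ c[7];
        int pos = s1 + 2 * s2 + 4 * s4;     /* syndrome value = error position */
        if (pos) c[pos] ^= 1;
        return pos;
    }

    int main(void)
    {
        int d[4] = {1, 0, 1, 1}, c[8];
        encode(d, c);
        c[6] ^= 1;                          /* inject a single-bit error */
        printf("corrected error at position %d\n", correct(c));
        printf("data bits: %d %d %d %d\n", c[3], c[5], c[6], c[7]);
        return 0;
    }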
Examination: 80%
Laboratory/assignment: 20%
(e) References:
1. Venkatesan Guruswami, Atri Rudra, Madhu Sudan: Essential Coding The-
ory, internet draft available at: https://cse.buffalo.edu/faculty/
atri/courses/coding-theory/book/web-coding-book.pdf
2. F.J. MacWilliams and Neil J.A. Sloane: Theory of Error-Correcting Codes,
North Holland Publishing Co., 1977.
3. Jacobus H. van Lint: Introduction to Coding Theory, Springer, 1973.
4. Vera S. Pless and W. Cary Huffman (Eds.): Handbook of Coding Theory
Computational Algebra and Number Theory

Examination: 80%
Laboratory/assignment: 20%
(e) References:
15. S. Winograd: The Arithmetic Complexity of Computations, SIAM, 1980.
16. R. Zippel: Effective Polynomial Computation, Kluwer Academic Press, Boston,
1993.
Computational Complexity
(a) Introduction: Review of machine models, Turing machines and its variants, reduction
between problems and completeness, time and space complexity classes
Structural Results: Time and space hierarchy theorems, polynomial hierarchy, Lad-
ner’s theorem, relativization, Savitch’s theorem
Circuit Complexity: Circuits and non-uniform models of computation, parallel com-
putation and NC, P-completeness, circuit lower bounds, AC^0 and parity not in AC^0,
Håstad’s Switching Lemma, introduction to natural proof barrier
Random Computation: Probabilistic computation and complexity classes and their
relations with other complexity classes, BPP=P?
Interactive proofs: Introduction to Arthur-Merlin Games, IP=PSPACE, multiprover
interactive proofs, introduction to PCP theorem
Complexity of counting: Complexity of optimization problems and counting classes,
Toda’s theorem, inapproximability, application of PCP theorem to inapproximability
and introduction to unique games conjecture
Cryptography: Public-key cryptosystems, one-way functions, trapdoor-functions,
application to derandomization
Examination: 80%
Laboratory/assignment: 20%
(e) References:
3. M. Sipser: Introduction to Theory of Computation, PWS Pub.Co, New York,
1999.
4. C. H. Papadimitriou: Computational Complexity, Addison-Wesley, Reading,
Mass., 1994.
5. J. E. Hopcroft and J. D. Ullman: Introduction to Automata Theory, Languages
and Computation, Addison-Wesley, Reading, Mass., 1979.
6. O. Goldreich: Lecture Notes on Computational Complexity, Tel Aviv Univ.,
1999.
7. S. Arora and B. Barak: Computational Complexity: A Modern Approach, Cam-
bridge University Press, 2009.
Computational Finance
(a) Basic Concepts: (i) Arbitrage, Principle of no arbitrage, Law of one price; Friction-
less / Efficient market, Transaction cost, Contingent contracts, Concept of complete
market (ii) Time value of money, discounting: deterministic and stochastic; Mar-
tingale, Risk neutral valuation, Equivalent martingale measure; (iii) Mean Variance
utility / Normally distributed returns; Capital Asset Pricing Model (CAPM), Exten-
sions, test for efficiency
Contracts: Forwards, Futures, Options (Call, Put, European, American, Exotics),
Combinations; Risk neutral portfolio construction
Valuation of contracts in discrete time models. Computation using Binomial tree.
Link with the continuous time model: Brownian motion, Black Scholes option pric-
ing and hedging.
High frequency trading (Machine learning, Neural networks), Algorithmic trading
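As one concrete instance of valuation on a binomial tree (mentioned above), the sketch below prices a European call under the Cox-Ross-Rubinstein model by risk-neutral backward induction; all market parameters in main are arbitrary illustrative values.

    /* Illustrative sketch: price a European call on a Cox-Ross-Rubinstein
     * binomial tree by risk-neutral backward induction. */
    #include <stdio.h>
    #include <math.h>

    static double crr_call(double S0, double K, double r, double sigma,
                           double T, int steps)
    {
        double dt = T / steps;
        double u  = exp(sigma * sqrt(dt));       /* up factor   */
        double d  = 1.0 / u;                     /* down factor */
        double p  = (exp(r * dt) - d) / (u - d); /* risk-neutral up probability */
        double disc = exp(-r * dt);
        double v[steps + 1];                     /* option values at maturity */

        for (int i = 0; i <= steps; i++) {
            double ST = S0 * pow(u, i) * pow(d, steps - i);
            v[i] = ST > K ? ST - K : 0.0;        /* call payoff max(ST - K, 0) */
        }
        for (int n = steps - 1; n >= 0; n--)     /* roll back through the tree */
            for (int i = 0; i <= n; i++)
                v[i] = disc * (p * v[i + 1] + (1.0 - p) * v[i]);
        return v[0];
    }

    int main(void)
    {
        /* arbitrary illustrative parameters: spot 100, strike 100, r = 5%,
         * volatility 20%, maturity 1 year, 200 time steps */
        printf("call price = %.4f\n", crr_call(100, 100, 0.05, 0.20, 1.0, 200));
        return 0;
    }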
Examination: 80%
Laboratory/assignment: 20%
(e) References:
2. Hull, J.C.: Options, Futures, and Other Derivatives
3. Prisman, E.: Pricing Derivative Securities
4. Oksendal, B.: Stochastic Differential Equations, An Introduction with Applica-
tions
5. Selected research papers
Computational Game Theory

(a) Computing in Games: Basic Solution Concepts and Computational Issues, Strate-
gies, Costs and Payoffs, Basic Solution Concepts, Equilibria and Learning in Games.
Refinement of Nash: Games with Turns and Subgame Perfect Equilibrium. Nash
Equilibrium without Full Information: Bayesian Games. Cooperative Games. Markets
and Their Algorithmic Issues.
The Complexity of Finding Nash Equilibria
Equilibrium Computation for Two-Player Games in Strategic and Extensive Form
Learning, Regret Minimization, and Equilibria: External Regret Minimization, Generic
Reduction from External to Swap Regret, Partial Information Model
Combinatorial Algorithms for Market Equilibria
Introduction to Mechanism Design: Social Choice, Mechanisms with Money
Cost Sharing: Cooperative Games and Cost Sharing, Group-Strategy proof Mecha-
nisms and Cross-Monotonic Cost-Sharing Schemes, The Shapley Value and the Nash
Bargaining Solution
Online Mechanisms
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Computational Geometry
Examination: 80%
Laboratory/assignment: 20%
(e) References:
3. Geometric Approximation Algorithms, S. Har-Peled, American Mathematical
Society, 2010.
4. Lectures on Discrete Geometry, J. Matousek, Springer, 2002.
5. David Mount’s Lecture notes on Computational Geometry (CMSC 754)
Computational Molecular Biology and Bioinformatics

Examination: 70%
Laboratory/assignment: 30%
(e) References:
1. C. Setubal and J. Meidanis: Introduction to Computational Molecular Biology,
PWS Publishing Company, Boston, 1997.
2. P. A. Pevzner: Computational Molecular Biology – An Algorithmic Approach,
MIT Press, 2000.
3. R. Durbin, S. R. Eddy, A. Krogh and G. Mitchison: Biological Sequence Analy-
sis – Probabilistic Models of Proteins and Nucleic Acids, Cambridge University
Press, 1998.
4. D. Gusfield: Algorithms on Strings, Trees, and Sequences, Cambridge Univer-
sity Press, USA, 1997.
5. H. Lodish, A. Berk, S. L. Zipursky, P. Matsudaira, D. Baltimore and J. Darnell:
Molecular Cell Biology, W. H. Freeman, USA, 2000.
6. C.-I. Branden, J. Tooze: Introduction to Protein Structure, Garland Publishing,
1998.
7. A. Kowald, C. Wierling, E. Klipp and W. Liebermeister: Systems Biology,
Wiley-VCH, 2016.
8. B. O. Palsson: Systems Biology – Constraint based Reconstruction and Analysis,
Cambridge University Press, 2015.
Computational Topology
(a) Topics:
i. Planar graphs
ii. Introduction to classification of surfaces, and graphs embedded on surfaces.
iii. Introduction to homotopy and homology
iv. Computational problems in homotopy and homology
v. Introduction to computational 3-manifold theory
vi. Complexity issues in high dimensional computational topology
vii. Persistent homology and its applications
viii. Distance to a measure
(b) Prerequisites: Design and Analysis of Algorithms, Probability and Stochastic Pro-
cesses
(d) Marks Distribution:
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Computer Graphics
(a) Overview of Graphics Systems: displays, input devices, hard copy devices, GPU,
graphics software, graphics programming language, e.g., OpenGL
Line drawing algorithms, circle and ellipse drawing algorithms, polygon filling, edge
based fill algorithms, seed fill algorithms
2D and 3D camera geometry, Affine and Projective transformations, Orthographic
and Perspective view transformations, object to image projection, pin-hole camera
model, 3D scene reconstruction, epipolar geometry
2D and 3D clipping, subdivision line-clipping algorithms, line clipping for convex
boundaries. Sutherland-Hodgman algorithm, Liang-Barsky algorithm
Hidden line and hidden surfaces algorithms, ray tracing and z-buffer algorithm, Float-
ing horizon algorithm, list priority and backface culling algorithms
2D and 3D object representation and visualization, Bezier and B-Spline curves and
surfaces, 2D and 3D surface mesh representation and drawing, sweep representa-
tions, constructive solid geometry methods, Octrees, BSP trees, Fractal geometry
methods, Visualization of datasets - visual representations for scalar, vector, and ten-
sor fields
Different colour representations, transformation between colour models, halftoning
Rendering, Illumination models, Gouraud shading, Phong shading, transparency,
shadows, image and texture mapping and synthesis, Radiosity lighting model
Raster animations, key frame systems, inbetweening, morphing, motion and pose
interpolation and extrapolation
Graphical user interface and interactive input methods, interactive picture construc-
tion techniques, virtual reality environments.
Projects and Assignments: At least two assignments and one class project; assign-
ments should include implementation of a graphics algorithm using a programming
language.
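As an example of the line-drawing algorithms listed at the start of this course, the following sketch gives Bresenham's line algorithm in its integer-only, all-octant form; plotting is replaced by a printf stub, since no particular graphics library is assumed.

    /* Illustrative sketch of Bresenham's line-drawing algorithm in its
     * integer-only, all-octant form.  plot() is a stub that prints the
     * pixel coordinates instead of writing to a frame buffer. */
    #include <stdio.h>
    #include <stdlib.h>

    static void plot(int x, int y)
    {
        printf("(%d, %d)\n", x, y);          /* stand-in for setting a pixel */
    }

    static void bresenham_line(int x0, int y0, int x1, int y1)
    {
        int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy;                   /* combined error term */

        for (;;) {
            plot(x0, y0);
            if (x0 == x1 && y0 == y1) break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
            if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
        }
    }

    int main(void)
    {
        bresenham_line(0, 0, 7, 3);          /* arbitrary endpoints */
        return 0;
    }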
Examination: 80%
Laboratory/assignment: 20%
(e) References:
1. Fundamentals of Computer Graphics, Peter Shirley, et al., CRC Press, 4th Edi-
tion, 2016
2. Computer Graphics Principles and Practice, John F. Hughes, et al., Pearson, 3rd
Edition, 2014
4. Computer Graphics with Open GL, 4th Edition, Hearn, Baker and Carithers,
Pearson, 2014.
Computer Vision
(a) Machine vision systems, introduction to low, mid and high level vision, low and
mid level image processing, edge detection, image segmentation, image and texture
features
Camera geometry, object to image geometric transformations, orthographic and per-
spective view transformations, camera calibration
Binocular vision system, epipolar geometry, 3D scene reconstruction, recovering
shape from stereo
Human vision structure, neurovisual model, scale space representation
Motion estimation and tracking, active contours, recovering shape from motion,
video processing
Reflectance map and photometric stereo, surface reflectance model, recovering albedo
and surface orientation, recovering shape from shading
Machine learning for computer vision, Classification models for vision, deep learn-
ing architectures for vision, Model based recognition system
Object recognition, recognition of arbitrary curved object sensed either by stereo or
by range sensor, Recognition under occlusion, Aspect graph of an arbitrary 3D object
viewed from different directions, Recognition of 3D objects based on 2D projections
Projects and Assignments: At least two assignments and one class project; assign-
ments should include implementation of a computer vision algorithm using a program-
ming language.
Examination: 70%
Laboratory/assignment: 30%
(e) References:
Computing Systems Security I
Examination: 60%
Laboratory/assignment: 40%
(e) References:
1. Ross Anderson: Security Engineering, 2nd ed., Wiley. Available online:
http://www.cl.cam.ac.uk/~rja14/book.html.
2. C.P. Pfleeger, S.L. Pfleeger, J. Margulies: Security in Computing, 5th ed., Pren-
tice Hall, 2015.
3. David Wheeler: Secure Programming HOWTO. Available online: https://www.dwheeler.com/secure-programs/.
4. Michal Zalewski: Browser Security Handbook, Google. Available online: https://code.google.com/archive/p/browsersec/wikis/Main.wiki.
5. B. S. Schneier: Applied Cryptography: Protocols, Algorithms, and Source
Code in C, 2nd Edition, John Wiley and Sons, New York, 1995.
6. A. Menezes, P. C. Van Oorschot and S. A. Vanstone: Handbook of Applied
Cryptography, CRC Press, Boca Raton, 1996.
Computing Systems Security II
(a) Cellular networks, Access Technologies, GSM, CDMA, GPRS, 3G networks, Wireless LAN, WLAN security.
Operating Systems Security: (i) Access Control Fundamentals (ii) Generalized Se-
curity Architectures (iii) Analysis of security in Unix/Linux and problems with the
design of its security architecture (iv) Analysis of security in Windows and prob-
lems with its security architecture (v) Security Kernels: SCOMP design and analy-
sis, GEM-SOS design (vi) Difficulties with securing Commercial Operating Systems
(Retrofitting Security) (vii) Security issues in Virtual Machine Systems (viii) Secu-
rity issues in sandboxing designs: design and analysis of Android.
Database Security: (i) Introduction: Security issues faced by enterprises (ii) Security
architecture (iii) Administration of users (iv) Profiles, password policies, privileges
and roles (v) Database auditing
Examination: 70%
Laboratory/assignment: 30%
(e) References:
1. Ross Anderson: Security Engineering, 2nd ed., Wiley. Available online: http://www.cl.cam.ac.uk/~rja14/book.html.
2. C.P. Pfleeger, S.L. Pfleeger, J. Margulies: Security in Computing, 5th ed., Pren-
tice Hall, 2015.
3. David Wheeler: Secure Programming HOWTO. Available online: https://www.dwheeler.com/secure-programs/.
4. Michal Zalewski: Browser Security Handbook, Google. Available online: https://code.google.com/archive/p/browsersec/wikis/Main.wiki.
5. B. S. Schneier: Applied Cryptography: Protocols, Algorithms, and Source
Code in C, 2nd Edition, John Wiley and Sons, New York, 1995.
6. A. Menezes, P. C. Van Oorschot and S. A. Vanstone: Handbook of Applied
Cryptography, CRC Press, Boca Raton, 1996.
Cryptology I
(a) Classical ciphers and their cryptanalysis; Information Theoretic Security, one time
pad; Stream ciphers; Block ciphers; Cryptanalysis of Block and Stream Ciphers;
Formal models for block and stream ciphers: Pseudorandom generators, Pseudoran-
dom functions and permutations; Symmetric key encryption: Notion of CPA and
CCA security with examples; Cryptographic hash functions; Symmetric key authen-
tication; Modern modes of operations: Authenticated Encryption, Tweakable Enci-
phering schemes.
Introduction to public key encryption; computational security and computational assumptions; The Diffie-Hellman key exchange; The RSA, ElGamal, Rabin and Paillier encryption schemes; Digital Signatures; Introduction to Elliptic Curve Cryptosystems; Public key infrastructure.
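As a worked example of one key-exchange primitive from this list, a toy Diffie-Hellman exchange is sketched below; the parameters p = 23 and g = 5 are textbook-sized illustrations only, and deployed systems use standardised groups of 2048 bits or more.

import secrets

# Public parameters: a toy prime modulus and generator (illustrative values only).
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1     # Alice's private exponent
b = secrets.randbelow(p - 2) + 1     # Bob's private exponent

A = pow(g, a, p)                     # Alice sends g^a mod p
B = pow(g, b, p)                     # Bob sends g^b mod p

shared_alice = pow(B, a, p)          # Alice computes (g^b)^a
shared_bob = pow(A, b, p)            # Bob computes (g^a)^b
assert shared_alice == shared_bob    # both sides hold the same group element
print("shared secret:", shared_alice)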
Examination: 80%
Laboratory/assignment: 20%
(e) References:
3. Dan Boneh, Victor Shoup: A Graduate Course in Applied Cryptography, online
draft available at http://toc.cryptobook.us/.
4. B. S. Schneier: Applied Cryptography: Protocols, Algorithms, and Source
Code in C, 2nd Edition, John Wiley and Sons, New York, 1995.
5. A. Menezes, P. C. Van Oorschot and S. A. Vanstone: Handbook of Applied
Cryptography, CRC Press, Boca Raton, 1996.
Cryptology II
Examination: 80%
Laboratory/assignment: 20%
(e) References:
5. Rafael Pass and Abhi Shelat: A Course in Cryptography, Lecture notes. Available online: https://www.cs.cornell.edu/courses/cs4830/2010fa/lecnotes.pdf
6. Daniele Micciancio, Shafi Goldwasser, Complexity of Lattice Problems: A
Cryptographic Perspective, Kluwer, 2002.
7. Lawrence C. Washington, Elliptic Curves: Number Theory and Cryptography,
Second Edition, CRC Press 2008.
8. S. Chatterjee, P. Sarkar: Identity-Based Encryption, Springer, 2011.
Cyber-Physical Systems
(a) Cyber-Physical Systems (CPS) in the real world, Basic principles of design and validation of CPS, AUTomotive Open System ARchitecture (AUTOSAR), Industrial Internet-of-Things (IIoT) implications, Building Automation, Medical CPS
CPS - Platform components: CPS Hardware platforms - Processors, Sensors, Actuators, CPS Network, Controller Area Network (CAN), Automotive Ethernet, CPS Software stack, Real Time Operating Systems (RTOS), Scheduling Real Time control tasks
Principles of Automated Control Design (basic control theory): Dynamical Systems
and Stability, Controller Design Techniques, Stability Analysis: Common Lyapunov
Function (CLF), Multiple Lyapunov Function (MLF), stability under slow switching,
Performance under Packet drop and Noise
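A minimal numerical illustration of discrete-time stability analysis is given below; the closed-loop matrix A_cl is an invented example, and the test used (all eigenvalues strictly inside the unit circle) is the standard criterion for asymptotic stability of x_{k+1} = A_cl x_k.

import numpy as np

# Hypothetical closed-loop system matrix (values invented for illustration).
A_cl = np.array([[0.9,  0.1],
                 [-0.2, 0.7]])

eigvals = np.linalg.eigvals(A_cl)
# A discrete-time LTI system is asymptotically stable iff every eigenvalue of A_cl
# lies strictly inside the unit circle (equivalently, a quadratic Lyapunov function exists).
print("eigenvalues:", eigvals)
print("stable:", max(abs(eigvals)) < 1.0)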
CPS implementation: From features to software components, Mapping software
components to Electronic Control Units (ECU), CPS Performance Analysis - ef-
fect of scheduling, bus latency, sense and actuation faults on control performance,
network congestion
Safety and Security Assurance of Cyber-Physical Systems: Advanced Automata based
modelling and analysis, Timed and Hybrid Automata, Definition of trajectories, Zenoness,
Formal Analysis, Flow-pipe construction, reachability analysis, Analysis of CPS
Software, Weakest Pre-conditions, Bounded Model checking
Secure Deployment of CPS: Attack models, Secure Task mapping and Partitioning,
State estimation for attack detection, Automotive Case Study
CPS Case studies and Tutorials
• Hybrid Automata Modeling: Flowpipe construction using Flowstar, SpaceEx and PHAVer tools
• CPS Software Verification: Frama-C, CBMC
• Automotive and Avionics : Software controllers for Antilock Braking Systems
(ABS), Adaptive Cruise Control (ACC), Lane Departure Warning, Suspension
Control, Flight Control Systems
• Healthcare: Artificial Pancreas/Infusion Pump/Pacemaker
• Green Buildings: automated lighting, Air-Conditioning (AC) control
Examination: 60%
Laboratory/assignment: 40%
(e) References:
Digital Signal Processing
Transfer function: Poles and zeroes, interpretation of causality and stability, frequency response for rational transfer functions, minimum phase and all-pass systems.
Transform analysis of discrete signals: Discrete Fourier series, discrete Fourier trans-
form, relationships with Fourier transform of sequences.
Structures for discrete time systems: Block diagrams, signal flow graphs, direct, cas-
cade and parallel forms, transposed forms, structures for FIR filters, lattice filters.
Effects of finite precision: Coefficient quantization, round-off noise, analysis of vari-
ous structural forms, limit cycles in IIR filters.
Filter design: Filter specifications, design using analog filters, impulse invariance,
bilinear transformation, frequency transformation of low-pass IIR filters, computer-
aided design, FIR filter design by windowing.
Computation of DFT: Direct computation, FFT and other implementations, finite
precision effects.
Applications of DFT: Fourier analysis of signals using DFT, DFT analysis of sinu-
soidal signals, spectral estimation, analysis of non-stationary signals.
Some advanced topics.
Practical exercises using MATLAB or other software.
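For instance, the DFT analysis of a sinusoid can be reproduced in a few lines of Python/NumPy; the signal length and bin index below are arbitrary choices, not part of the syllabus.

import numpy as np

N, k0 = 256, 32                          # DFT length and the bin the sinusoid sits on
n = np.arange(N)
x = np.cos(2 * np.pi * k0 * n / N)       # sinusoid whose frequency falls exactly on bin k0

X = np.fft.fft(x)                        # N-point DFT computed with the FFT
peak_bins = np.argsort(np.abs(X))[-2:]   # the two largest-magnitude bins
print(np.sort(peak_bins))                # expected: [ 32 224], i.e. bins k0 and N - k0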
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Discrete and Combinatorial Geometry
(a) Topics:
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Distributed Computing
Examination: 80%
Laboratory/assignment: 20%
(e) References:
test generation, D-algorithm, PODEM, FAN, Boolean difference, testability anal-
ysis, fault sampling, random pattern testability, testability-directed test generation,
scan path, syndrome and parity testing, signature analysis; CMOS and PLA testing,
delay fault testing, system-on-a-chip testing, core testing; BDDs. Formal verification:
Introduction, Overview of Digital Design and Verification, Verilog HDL, Simulators,
Test Scenarios and Coverage, Assertions, Binary Decision Diagrams (BDD), State
Machines and Equivalence Checking, Model Checking, Bounded Model Checking,
Counter Example Guided Abstraction Refinement; case studies.
Examination: 75%
Laboratory/assignment: 25%
(e) References:
(a) Topics:
vi. Algorithms for Big Data: Streaming, Sketching, and Sampling
vii. Clustering
viii. Wavelets
(b) Prerequisites: Design and Analysis of Algorithms and Probability and Stochastic
Processes
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Graph Algorithms
(a) Shortest path (SP) problems: Single source SP problem, SP tree, Ford’s labelling
method, labelling and scanning method, efficient scanning orders – topological order
for acyclic networks, shortest first search for non-negative networks (Dijkstra), BFS
search for general networks, correctness and analysis of the algorithms; All pair SP
problem – Edmonds-Karp method, Floyd’s algorithm and its analysis.
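A minimal sketch of the shortest-first search (Dijkstra) with a binary heap is shown below; the graph, node names and weights are illustrative assumptions.

import heapq

def dijkstra(adj, source):
    # Single-source shortest paths for non-negative edge weights.
    # adj maps each node to a list of (neighbour, weight) pairs.
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry; u already has a smaller label
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # labelling step
                heapq.heappush(heap, (nd, v))
    return dist

adj = {"s": [("a", 2), ("b", 5)], "a": [("b", 1), ("t", 6)], "b": [("t", 2)], "t": []}
print(dijkstra(adj, "s"))                 # {'s': 0, 'a': 2, 'b': 3, 't': 5}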
Flows in Networks: Basic concepts, maxflow-mincut theorem, Ford and Fulkerson
augmenting path method, integral flow theorem, maximum capacity augmentation,
Edmonds-Karp method, Dinic’s method and its analysis, Malhotra-Kumar-Maheswari method and its analysis, Preflow-push method (Goldberg-Tarjan) and its analysis;
Better time bounds for simple networks.
Minimum cost flow: Minimum cost augmentation and its analysis.
Matching problems: Basic concepts, bipartite matching – Edmonds’ blossom shrinking algorithm and its analysis; Recent developments.
Planarity: Basic fact about planarity, polynomial time algorithm.
Graph isomorphism: Importance of the problem, backtrack algorithm and its com-
plexity, isomorphism complete problems, polynomial time algorithm for planar graphs,
group theoretic methods.
NP-hard optimization problems: Exponential algorithms for some hard problems –
dynamic programming algorithm for TSP, recursive algorithm for maximum inde-
pendent set problem; Review of NP-completeness of decision problems associated
with TSP, bin packing, knapsack, maximum clique, maximum independent set, min-
imum vertex cover, scheduling with independent tasks, chromatic number, etc.; For-
mulation of the concept of NP-hard optimization problem, perfect graphs and poly-
nomial time algorithms for hard problems on graphs, approximation algorithms and
classification of NP-optimization problems with respect to approximability.
Examination: 80%
Laboratory/assignment: 20%
(e) References:
8. C. M. Hoffman: Group Theoretic Algorithms and Graph Isomorphisms, Springer-
Verlag, Berlin, 1982.
9. C. H. Papadimitriou and K. Steiglitz: Combinatorial Optimization: Algorithms and Complexity, Prentice Hall of India, New Delhi, 1997.
10. R. E. Tarjan: Data Structures and Network Algorithms, SIAM, Philadelphia,
1983
11. E. Horowitz and S. Sahni: Fundamentals of Computer Algorithms, Galgotia
Pub, New Delhi, 1985.
Image Processing I
(d) Marks Distribution:
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Image Processing II
(a) 2-D transformations of images and frequency filtering: Frequency domain analysis,
discrete Fourier transform, fast Fourier transform, convolution and correlation in fre-
quency domain, frequency domain filtering; Walsh transform; Hadamard transform;
Discrete cosine transform; Hotelling transform.
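As an illustration of frequency-domain filtering with the 2-D DFT, the sketch below applies an ideal low-pass mask to an image array; the cut-off radius and the random test image are arbitrary assumptions.

import numpy as np

def lowpass_filter(img, cutoff):
    # Ideal low-pass filtering: keep DFT coefficients within `cutoff` of the
    # (centred) zero frequency, discard the rest, and transform back.
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    r, c = np.ogrid[:rows, :cols]
    mask = (r - rows // 2) ** 2 + (c - cols // 2) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

img = np.random.rand(64, 64)          # random "image" used only to exercise the code
smooth = lowpass_filter(img, cutoff=8)
print(img.shape, smooth.shape)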
Enhancement/restoration: Edge-preserving smoothing, anisotropic diffusion; Least
square restoration, constrained least-squares restoration, Wiener filter; Blind decon-
volution; Superresolution.
Morphological image processing: Erosion, dilation, opening, closing, Hit-or-Miss transformation; Gray-scale morphology, area morphology; Watershed algorithm.
Segmentation: Model-based - facet model, active contour, semantic (saliency based) region grouping; Interactive segmentation - growcut, graphcut.
Image analysis: Pattern spectrum; Structural features - Fourier descriptor, polygonal approximation; Shape matching, template matching, shape metric, image understanding.
Multi-resolution image analysis: Spatial/frequency domain analysis, Gabor transform; Continuous wavelet analysis, dyadic wavelet, fast wavelet analysis, fast inverse wavelet analysis, 1D/2D wavelets; Wavelet packets.
Compression: Transform domain compression; block transform coding, vector quantization, wavelet based compression, JPEG, JBIG.
Image databases: Attribute list, relational attributes, indexing, storage and retrieval.
Some applications (from the following but not restricted to): (i) Biomedical image
processing; (ii) Document image processing; (iii) Fingerprint classification; (iv) Dig-
ital water-marking; (v) Image fusion; (vi) Image dehazing; (vii) Face detection; (viii)
Face recognition; (ix) Content aware resizing; (x) Content based image retrieval.
Examination: 80%
Laboratory/assignment: 20%
(e) References:
5. K. R. Castleman: Digital Image Processing, Prentice Hall, Englewood Cliffs,
1996.
6. A. R. Rao: Taxonomy for Texture Description, Springer-Verlag, Berlin, 1990.
7. R. M. Haralick and L. G. Shapiro: Computer and Robot Vision, Vol. 1 and 2, Addison-Wesley, Reading, Mass., 1992.
8. A. Rosenfeld and A. C. Kak: Digital Picture Processing, 2nd ed., Vol. 1 and 2, Academic Press, New York, 1982.
9. B. B. Chaudhuri and D. Dutta Majumder: Two-tone Image Processing and
Recognition, Wiley-Eastern, New Delhi, 1993.
10. A. Blake and A. Zisserman: Visual Reconstruction, MIT Press, Cambridge,
1987.
11. W. K. Pratt: Digital Image Processing, 2nd ed., John Wiley, New York, 1992.
12. A. N. Netravali and B. G. Haskell: Digital Pictures, 2nd ed., Plenum Press,
1995.
13. K. Sayood: Data Compression, Morgan Kaufmann, San Mateo, 1986.
14. H. C. Andrews and B. R. Hunt: Digital Image Restoration, Prentice Hall, En-
glewood Cliffs, 1977.
15. M. Vetterli and J. Kovacevic: Wavelets and Subband Coding, Prentice Hall, Englewood Cliffs, 1995.
16. A. B. Watson: Digital Images and Human Vision, MIT Press, Cambridge, 1993.
17. C. A. Glasbey and G. W. Horgan: Image Analysis for the Biological Sciences, John Wiley, New York, 1995.
18. S. Khoshafian and A. B. Baker: Multimedia and Imaging Databases, Morgan
Kaufmann, San Mateo, 1996.
19. S. K. Pal, A. Ghosh and M. K. Kundu: Soft Computing for Image Processing,
Physica Verlag (Springer), Heidelberg, 1999.
20. M. Sonka, V. Hlavac and R. Boyle, Image Processing: Analysis and Machine
Vision, PWS Pub. Co., London, 1998.
Information Retrieval
(a) The instructor may select only some of the following topics, and include other topics
of current interest.
Introduction: overview of applications, brief history; text representation, indexing:
tokenisation, stopword removal, stemming, phrase identification; index structures,
index creation.
Models: Boolean retrieval, ranked retrieval, vector space model: term weighting;
probabilistic models for IR; language modeling for IR: query likelihood, KL diver-
gence, smoothing.
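A toy instance of the vector space model with tf-idf weighting and cosine ranking is sketched below; the three documents and the query are invented purely for illustration.

import math
from collections import Counter

docs = {
    "d1": "information retrieval ranks documents",
    "d2": "boolean retrieval returns unranked documents",
    "d3": "language models smooth term probabilities",
}
N = len(docs)
tokenised = {d: text.split() for d, text in docs.items()}
df = Counter(term for toks in tokenised.values() for term in set(toks))

def tfidf(tokens):
    # tf-idf weights; terms unseen in the collection are simply dropped.
    tf = Counter(tokens)
    return {t: tf[t] * math.log(N / df[t]) for t in tf if t in df}

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

doc_vecs = {d: tfidf(toks) for d, toks in tokenised.items()}
query_vec = tfidf("retrieval documents".split())
ranking = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]), reverse=True)
print(ranking)                            # expected: ['d1', 'd2', 'd3']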
Evaluation: recall, precision, average precision, NDCG, other commonly used met-
rics; test collections, evaluation forums, sound experimental methods.
Query expansion: query expansion in the vector space model: relevance feedback,
Rocchio’s method, variations, pseudo relevance feedback; query expansion in the
probabilistic model; relevance based language models and variations; automatic the-
saurus generation; matrix decompositions; latent semantic analysis.
Web search: Web document preprocessing: parsing, segmentation, deduplication,
shingling; crawling, focused crawling, metacrawlers; link analysis: hubs and author-
ities, Google PageRank; query auto completion; search log analysis; search result
diversification; computational advertising.
Machine learning in IR: text categorisation: vector space methods, nearest neigh-
bours, naive Bayes, support vector machines, feature selection; text clustering: ag-
glomerative clustering, k-means, search result clustering; learning to rank.
(c) Hours: Four lectures per week, one lab-session per week during the second half of
the course.
Examination: 60%
Laboratory/assignment: 40%
(e) References:
(c) W. Bruce Croft, D. Metzler, T. Strohman: Search Engines: Information Re-
trieval in Practice, Pearson, 2010.
(d) ChengXiang Zhai and Sean Massung: Text Data Management: A Practical
Introduction to Information Retrieval and Text Mining, ACM and Morgan &
Claypool Publishers, 2016.
(e) ChengXiang Zhai: Statistical Language Models for Information Retrieval A
Critical Review, NOW Publishers.
(f) Christopher Olston, Marc Najork: Web Crawling, NOW Publishers.
(g) Fei Cai, Maarten de Rijke: A Survey of Query Auto Completion in Information
Retrieval, NOW Publishers.
(h) Fabrizio Silvestri: Mining Query Logs: Turning Search Usage Data into Knowl-
edge, NOW Publishers.
(i) R.L.T. Santos, Craig Macdonald, Iadh Ounis: Search Result Diversification,
NOW Publishers.
(j) Jun Wang, Weinan Zhang, Shuai Yuan: Display Advertising with Real-Time
Bidding (RTB) and Behavioural Targeting, NOW Publishers.
(k) Tie-Yan Liu: Learning to Rank for Information Retrieval, NOW Publishers.
(l) Bo Pang, Lillian Lee: Opinion Mining and Sentiment Analysis, NOW Publish-
ers.
(m) Ani Nenkova, Kathleen McKeown: Automatic Summarization, NOW Publish-
ers.
Information Theory
(a) Introduction: Historical perspective; Entropy; Mutual Information; Chain rule; Rel-
ative entropy and its non-negativity
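For example, entropy and relative entropy can be computed directly from their definitions; the distributions p and q below are arbitrary illustrations.

import math

def entropy(p):
    # Shannon entropy H(p) in bits.
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    # Relative entropy D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0.
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(entropy(p))                 # 1.5 bits
print(relative_entropy(p, q))     # non-negative, and 0 only when p == q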
Compression: Asymptotic Equipartition Property (AEP); Markov Models; AEP for Markov Models; Kraft's Inequality; Prefix Codes; Huffman Codes; Arithmetic Codes; Lempel-Ziv Codes
Communication: Communication over noisy channels; Channel capacity; Converse
to the noisy channel coding theorem; Sphere packing view of the coding theorem;
Polar codes; Gaussian channel; Information measures for continuous variables; En-
tropy maximization
Kolmogorov Complexity: Models of computation; Definition and examples of Kol-
mogorov Complexity; Kolmogorov Complexity and Entropy; Algorithmically Ran-
dom and Incompressible sequences; Universal Probability; Minimum description
length principle.
Applications to Statistics and Machine Learning
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Learning Theory
(a) Topics:
i. General introduction
ii. Introduction to Probability theory and Stochastic Processes
iii. PAC model; Occam’s razor
iv. Sample complexity: VC dimension, Sauer-Shelah Lemma, infinite hypothesis
spaces, supervised learning, agnostic learning, lower bounds, and Rademacher
complexity
v. Computational hardness results
vi. Online learning
vii. Boosting and margins theory
viii. Support-vector machines and kernels
ix. Mistake-bounded algorithms, halving algorithm and weighted majority algo-
rithm
x. Learning and game theory
xi. Linear-threshold algorithms
xii. Maximum entropy modeling
xiii. Portfolio selection; Cover’s algorithm
xiv. Introduction to Active learning
xv. Introduction to semi-supervised learning
xvi. Introduction to Distributed learning
xvii. Introduction to Differential privacy and Statistical query learning
(b) Prerequisites: Design and Analysis of Algorithms, Probability and Stochastic Pro-
cesses
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Logic for Computer Science
(a) Syntax and semantics of first order logic; Proof procedures – Hilbert system, natural
deduction and sequent calculus, resolution methods, soundness and completeness;
Prenex normal form and skolemization; Compactness, Löwenheim-Skolem theorem, Herbrand’s theorem, undecidability and incompleteness; Peano and Presburger arithmetic, incompleteness of first order number theory. Introduction to Modal and Temporal Logic with applications.
(d) Marks:
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Machine Learning I
(a) Basic Mathematical and Statistical concepts: (i) Metric, Positive definite matrix, Eigenvalues and eigenvectors, mean, median, mode, variance, covariance, correlation, dispersion matrix, binomial distribution, normal distribution, multivariate normal distribution, basic concepts in probability theory such as Bayes theorem, Chebyshev’s inequality, Laws of large numbers, Central limit theorem, (ii) Unbiased estimate, consistent estimate, maximum likelihood estimate.
Classification: (i) Bayes decision rule, examples, normal distribution cases, training
and test sets, probability of misclassification, estimation of parameters for normal dis-
tribution, minimum distance classifier, standardization, normalization, Mahalanobis
distance, Naive-Bayes rule, (ii) K-NN decision rule, its properties, (iii) Density esti-
mation, (iv) Perceptron (linear separable case), MLP, (v) Assessment of classifiers
Clustering: Similarity measures, minimum within cluster distance criterion, K-means
algorithm, Hierarchical clustering, Density based clustering, FCM, cluster validation.
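A compact sketch of the K-means algorithm (alternating assignment and centre update) is given below; the synthetic data and parameter choices are illustrative assumptions.

import numpy as np

def kmeans(X, k, iters=100, seed=0):
    # Alternate (i) assignment of each point to its nearest centre and
    # (ii) recomputation of each centre as the mean of its assigned points.
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new_centres = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centres[j] for j in range(k)])
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    return labels, centres

# Two well-separated synthetic blobs (illustrative data only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
labels, centres = kmeans(X, k=2)
print(centres)                    # centres close to (0, 0) and (3, 3)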
Dimensionality reduction: (i) Feature selection: Different criterion functions, Algo-
rithms, BBA (ii) Feature extraction: PCA (iii) LDA
Decision trees, Random forests
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Machine Learning II
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Mobile Computing
(a) Introduction: Overview of wireless and mobile systems; Basic cellular concepts
and architecture; Design objectives and performance issues; Radio resource man-
agement; Radio interface; Radio propagation and pathloss models; Channel interfer-
ence and frequency reuse; Cell splitting; Channel assignment strategies; Overview
of generations of cellular systems: 1G to 5G.
Location and handoff management: Introduction to location management (HLR and
VLR); Mobility models characterizing individual node movement (Random walk,
Fluid flow, Markovian, Activity-based); Mobility models characterizing the move-
ment of groups of nodes (Reference point based group mobility model, Commu-
nity based group mobility model); Static location management schemes (Always
vs. Never update, Reporting Cells, Location Areas); Dynamic location management
schemes (Time, Movement, Distance, Profile Based); Terminal Paging (Simultane-
ous paging, Sequential paging); Location management and Mobile IP; Introduction
to handoffs; Overview of handoff process; Factors affecting handoffs and perfor-
mance evaluation metrics; Handoff strategies; Different types of handoffs (soft, hard,
horizontal, vertical).
Wireless transmission fundamentals: Introduction to narrowband and wideband sys-
tems; Spread spectrum; Frequency hopping; Introduction to MIMO; MIMO Channel
Capacity and diversity gain; Introduction to OFDM; MIMO-OFDM system; Multi-
ple access control (FDMA, TDMA, CDMA, SDMA); Wireless local area network;
Wireless personal area network (Bluetooth and ZigBee).
Mobile Ad hoc networks: Characteristics and applications; Coverage and connectiv-
ity problems; Routing in MANETs.
Wireless sensor networks: Concepts, basic architecture, design objectives and ap-
plications; Sensing and communication range; Coverage and connectivity; Sensor
placement; Data relaying and aggregation; Energy consumption; Clustering of sen-
sors; Energy efficient Routing (LEACH).
Cognitive radio networks: Fixed and dynamic spectrum access; Direct and indirect
spectrum sensing; Spectrum sharing; Interoperability and co-existence issues; Appli-
cations of cognitive radio networks.
D2D communications in 5G cellular networks: Introduction to D2D communica-
tions; High level requirements for 5G architecture; Introduction to the radio resource management, power control and mode selection problems; Millimeter-wave communication in 5G.
Labs: Development and implementation of different network protocols using net-
work simulators such as NS-3 and OMNET++.
Examination: 80%
Laboratory/assignment: 20%
(e) References:
12. Edgar H. Callaway, Jr.: Wireless Sensor Networks: Architectures and Protocols, CRC Press.
13. https://www.nsnam.org/docs/manual/html/index.html
Natural Language Processing
(a) Introduction to NLP and language engineering, Components of NLP systems; Basics
of probability; language modelling, smoothing; Hidden Markov Model (HMM) and
its use in POS tagging; EM algorithm, IBM models (Model 1 and 2) for machine
translation; probabilistic CFG, constraint Grammar, bracketed corpus, tree banks;
discussion of different NLP tools: chunker, NER tagger, stemmer, lemmatizer, word
sense disambiguation (WSD), anaphora resolution, etc.; neural language processing:
word embedding, use of word embeddings in designing NLP tools, Recurrent Neu-
ral Nets, GRU, LSTM, sequence-to-sequence learning; Social media text analysis,
Noisy text analysis.
(b) Prerequisites: The student should have had a formal course on Programming and
Data Structures, and basic familiarity with Probability, Statistics and Neural Net-
works, as determined by the teacher.
Examination: 75%
Laboratory/assignment: 25%
(e) References:
Neural Networks
(a) Inspiration and lessons from the brain, introduction to biological neural networks,
Models of artificial neurons, threshold logic, binary neurons and pattern dichotomizers, perceptron, its learning rule and convergence.
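The perceptron learning rule can be demonstrated on a small linearly separable problem; the sketch below uses the logical AND function and arbitrary learning parameters.

def train_perceptron(samples, epochs=20, lr=1.0):
    # Perceptron learning rule: update weights only on misclassified examples.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:              # target is 0 or 1
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            if err != 0:
                w = [w[0] + lr * err * x1, w[1] + lr * err * x2]
                b += lr * err
    return w, b

and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
print(w, b)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in and_samples])  # [0, 0, 0, 1]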
Multilayered perceptron, learning algorithms, function approximation, generaliza-
tion, regularization networks, Radial Basis Function (RBF) network and learning.
VC-dimension, Structural Risk minimization, support vector machines (regression
and classification).
Recurrent neural networks, simulated annealing, mean-field annealing, Boltzmann
machine, restricted Boltzmann machine (RBM), and their learning. Temporal learn-
ing, backpropagation through time, temporal backpropagation, real-time recurrent
learning (RTRL).
Self-organizing maps, Hebbian and competitive learning, learning vector quantiza-
tion, principal component analysis networks.
Deep learning, deep neural networks, architectures, autoencoder, stacked autoencoder, denoising autoencoder, activation function, learning, contrastive divergence, deep belief network, Long Short-Term Memory (LSTM), sequence modeling, word2vec.
Convolutional Neural Network, architecture, activation function, learning, popular convolutional networks like AlexNet and GoogLeNet.
(b) Prerequisites: Design and Analysis of Algorithms, Probability and Stochastic Pro-
cesses
Examination: 70%
Laboratory/assignment: 30%
(e) References:
6. M. H. Hassoun, Fundamentals of artificial neural networks. MIT press, 1995
7. J. Hertz, A. Krogh, and R. G. Palmer: Introduction to the Theory of Neural
Computation, Addison- Wesley, California, 1991.
Optimization Techniques
Examination: 80%
Laboratory/assignment: 20%
(e) References:
3. C. H. Papadimitriou and K. Steiglitz: Combinatorial Optimization, Prentice Hall, Englewood Cliffs, 1982.
4. R. Garfinkel and G. Nemhauser: Integer Programming, John Wiley, New York,
1976.
5. G. Nemhauser and L. Wolsey: Integer and Combinatorial Optimization, Wiley, New York, 1988.
6. D. Bertsekas: Non-Linear Programming. Athena Scientific, Belmont, Mass.,
1995.
7. S. Nash and A. Sofer: Linear and Non-Linear Programming, McGraw Hill,
New York, 1996.
8. F. Hillier and G. Liebermann: Introduction to Mathematical Programming, Mc-
Graw Hill, 1995.
9. K. G. Murty: Linear and Combinatorial Programming, John Wiley, New York,
1976.
10. M. Bazaraa, J. Jarvis and H. Sherali: Linear Programming and Network Flows,
Wiley, New York, 1977.
11. W. I. Zangwill: Non-Linear Programming, Prentice Hall, New Jersey, 1969.
12. R. Fletcher: Practical Methods of Constrained Optimization, John Wiley, Chich-
ester, 1981.
13. J. Matoušek and Bernd Gärtner: Understanding and Using Linear Program-
ming, Springer, 2007.
14. S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University
Press, 2009.
15. V. Chvátal: Linear Programming, W. H. Freeman & Co. Ltd., 1983.
16. A. Schrijver: Theory of Linear and Integer Programming, Wiley, 1998.
17. G. M. Ziegler: Lectures on Polytopes, Springer, 2012.
18. A. Schrijver: Combinatorial Optimization (3 volumes: A, B and C), Springer, 2003.
Quantum Computations
(a) Mathematical foundations: Linear space, Scalar product, Hilbert space, Self-adjoint operator, Projection operator, Unitary operator, Eigenvalues and Eigenbasis.
Basics of Quantum Mechanics: (i) Postulates, Uncertainty principle, Complementarity principle, Unitary Dynamics. (ii) Multipartite quantum system, Quantum entangle-
ment, Schmidt decomposition (optional), No-Cloning Theorem, Distinguishing non-
orthogonal quantum states. (iii) General quantum operations, Kraus representation
theorem (optional).
Quantum computing in practice:
Part 1: Introduction to quantum gates and circuits: (i) qubits and physical realisation,
(ii) quantum gates as unitary matrices, (iii) circuits, circuit identities and universality,
(iv) Introduction to IBM Quantum Experience.
Part 2: Quantum Algorithms — pseudo-code, success probability and complexity: (i)
Deutsch’s algorithm, (ii) Deutsch-Jozsa algorithm, (iii) Quantum Fourier Transform
and Quantum phase estimation, (iv) Shor’s algorithm, (v) Simon’s algorithm, (vi)
Grover’s algorithm.
Part 3: Quantum Protocols: (i) Quantum teleportation, (ii) superdense coding, (iii)
remote state preparation, (iv) Basics of Quantum key distribution (BB84 and Ekert’s entanglement-based protocol – optional)
Part 4: Applications: (i) Solving linear systems of equations, (ii) solving combinato-
rial optimization problems using QAOA, (iii) binary classification using VQE.
Part 5: Advanced: Noisy intermediate scale quantum computing (NISQ) era and its
challenges, Quantum Error Correcting Codes (QECC), Quantum Memory.
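As a small worked example from Part 2, Deutsch's algorithm can be simulated classically with 4x4 matrices; the oracle construction and qubit ordering below are implementation choices, not prescriptions of the syllabus.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
I = np.eye(2)

def oracle(f):
    # U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix (basis index = 2*x + y).
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron(np.array([1.0, 0.0]), np.array([0.0, 1.0]))   # |0> tensor |1>
    state = np.kron(H, H) @ state                                 # Hadamard on both qubits
    state = oracle(f) @ state                                     # single oracle query
    state = np.kron(H, I) @ state                                 # Hadamard on the query qubit
    prob_zero = state[0] ** 2 + state[1] ** 2                     # P(first qubit measured as 0)
    return "constant" if prob_zero > 0.5 else "balanced"

print(deutsch(lambda x: 0))    # constant
print(deutsch(lambda x: x))    # balanced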
(b) Prerequisites: (for Non-CS Stream) Co-curricular semester-long courses on Design and Analysis of Algorithms, Data and File Structures, and preferably Computer Organization.
(e) References:
4. John Watrous: Quantum Computation lecture notes, https://cs.uwaterloo.ca/~watrous/QC-notes/.
5. N. David Mermin: Quantum Computer Science, Cambridge University Press,
2007.
Quantum Information Theory
(a) Introduction: A brief history about the emergence of quantum information; a brief
review of fundamental concepts of the quantum theory: axioms of Hilbert space
quantum mechanics; indeterminism; interference; uncertainty; superposition; entan-
glement.
The Noiseless and Noisy Quantum Information: (i) Noiseless Quantum Information:
quantum Bits and qudits; Reversible evolution; Measurement; Schmidt decomposi-
tion; HJW theorem, Kraus representation theorem. (ii) Noisy Quantum Information:
Density operators; General quantum measurement (POVM); Evolution of quantum
states; Quantum channels and their examples; Purification; Isometric evolution.
Basic Quantum Protocols and Resource Inequalities: Entanglement Distribution;
Super dense coding; Teleportation; Optimality of quantum protocols; Extension to
qudits; Quantum nonlocality and contextuality and their applications - Bell theorem
and CHSH game.
Basic Tools and Information Measures in Quantum Information: Distance Measures:
Trace Distance; Fidelity; Purified Distance; Relationship between various distance
measures; Gentle measurement lemma. Information Measures and their Properties:
Shannon entropy; Relative entropy; Von Neumann entropy; Quantum relative en-
tropy; Coherent information; Hypothesis testing divergence; Max relative entropy;
additivity and sub-additivity property of various information measures.
Quantum Shannon Theory: Noiseless Tasks: Schumacher Compression; Distillation
of entanglement; State merging; State splitting; State redistribution. Noisy Tasks:
Classical capacity of quantum channels; Holevo information; Private capacity of
quantum Channels; Entanglement assisted classical capacity of quantum channels;
Quantum capacity of quantum channel.
Quantum Cryptography: Privacy amplification and Information reconciliation; Quan-
tum key distribution; Privacy and Quantum information; Security of quantum key
distribution.
(b) Prerequisites: Basic familiarity with linear algebra and probability as determined
by the teacher.
(c) Hours: Two lectures per week, each of two hours duration
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Randomized and Approximation Algorithms
(a) The course has two parts – Randomized Algorithms and Approximation Algorithms.
The instructor can choose from the broad list given against the two parts. The course
can, if needed, start with a brief introduction to (i) NP-completeness, strong NP-completeness; (ii) Linear programs – strong and weak duality.
Randomized Algorithms: The syllabus consists of several tools from the theory
of randomization and its application to several branches of computer science like
graphs, geometry, discrepancy, metric embedding, streaming, random graphs, etc.
Tools: Linearity of expectation; moments and deviations, tail inequalities – Markov’s inequality, Chebyshev’s inequality, Chernoff and Hoeffding bounds; concentration of measure; Sampling techniques – (Vitter, Knuth, Reif-Vitter, reservoir sampling, D2-sampling); Martingales – tail inequalities, Azuma-Hoeffding inequality, Tala-
grand’s inequality, Kim-Vu theorem; Markov chains; Random walks; Poisson pro-
cess, branching process; Monte Carlo methods; Pairwise independence; Probabilistic
methods;
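As one concrete tool from this list, reservoir sampling (Algorithm R) keeps a uniform sample of size k from a stream of unknown length; the integer stream used below is an arbitrary illustration.

import random

def reservoir_sample(stream, k, seed=None):
    # Algorithm R: after n items have been seen, each item is in the reservoir
    # with probability k/n.
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)          # fill the reservoir with the first k items
        else:
            j = rng.randrange(i + 1)        # uniform index in {0, ..., i}
            if j < k:
                reservoir[j] = item         # replace a random slot with probability k/(i+1)
    return reservoir

print(reservoir_sample(range(1_000_000), k=5, seed=42))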
Topics: Applications (can be chosen from the following list):
(1) Computational Geometry – Randomized incremental construction; backward anal-
ysis; random sampling – VC dimension, epsilon-nets; convex polytopes; geometric
data structures
(2) Streaming algorithms – estimating the number of distinct elements; estimating
frequency moments; geometric streams and core-sets; metric stream and clustering;
graph streams; proving lower bounds from communication complexity;
(3) Metric embedding and dimension reduction – Johnson-Lindenstrauss lemma,
Noga’s lower bound, Bourgain embedding, Bartal’s result
(4) Discrepancy – Combinatorial discrepancy for set systems; VC dimension and
discrepancy;
(5) Probabilistic methods – Linearity of expectation; alteration; second moment; Lo-
vasz local lemma – existential and constructive proofs; derandomization techniques;
expander graphs; random graphs
(6) Miscellaneous topics – Data structures; Hashing and its variants; Primality test-
ing; approximate counting; graph algorithms; randomized rounding; etc.
Approximation Algorithms:
(1) Greedy algorithms and local search – k-center problem; TSP; minimum degree
spanning tree;
(2) Rounding and Dynamic Programming – knapsack; bin-packing; scheduling jobs
on identical parallel machines
(3) Deterministic rounding of linear programs – solving large linear programs in
polynomial time via ellipsoid method; prize collecting Steiner tree; uncapacitated
facility location
(4) Random Sampling and randomized rounding of linear programs – derandomiza-
tion; linear and non-linear randomized rounding; integrality gap; MAX-CUT, MAX-
SAT; prize collecting Steiner tree; uncapacitated facility location; integer multicom-
modity flows
(5) Semidefinite programming – introduction; randomized rounding in semidefinite
programming; finding large cuts; approximating quadratic programs
(6) Primal Dual method – introduction; feedback vertex set; shortest s-t path; La-
grangean relaxation and k-median problem
(7) Cuts and metrics – multiway cut, multicut, balanced cut, probabilistic approximation of metrics by tree metrics; spreading metrics, tree metrics and linear arrangement
(8) Iterative rounding – generalized assignment problem, discrepancy based methods,
etc.
(9) Geometric approximation algorithms – well separated pair decomposition; VC
dimension, epsilon-net, epsilon sampling, discrepancy; random partition via shifting;
Euclidean TSP; approximate nearest neighbor search; core-sets
(10) Hardness of approximation – approximation preserving reduction; use of PCP;
unique games conjecture
Examination: 80%
Laboratory/assignment: 20%
(e) References:
Specification and Verification of Programs
(a) Modeling of Systems: Modeling of concurrent systems, timed systems, hybrid sys-
tems and probabilistic systems.
Specification Languages: Linear time properties, Linear Temporal Logic (LTL),
Computation Tree Logic (CTL), Timed Computation Tree Logic (TCTL), Proba-
bilistic Computational Tree Logic (PCTL) and their variants.
Abstract Interpretation, Weakest Precondition, Floyd-Hoare Logic, Separation Logic;
Shape Analysis
Techniques for verification: Explicit-State Model Checking, Symbolic Model Check-
ing, Bounded Model Checking, Equivalence checking, Partial Order Reduction, Sym-
bolic execution, Counterexample guided abstraction refinement, probabilistic model
checking.
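A minimal illustration of explicit-state model checking is a breadth-first reachability search that returns a counterexample path to a property-violating state; the toy transition system below is an invented example.

from collections import deque

def find_counterexample(initial, successors, is_bad):
    # Breadth-first search over the explicit state graph; returns a shortest path
    # to a violating state, or None if the safety property holds.
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if is_bad(s):
            path = []
            while s is not None:            # reconstruct the counterexample
                path.append(s)
                s = parent[s]
            return list(reversed(path))
        for t in successors(s):
            if t not in parent:
                parent[t] = s
                queue.append(t)
    return None

# Toy system: a counter modulo 8 that may add 1 or 3; safety property: "state 5 is never reached".
succ = lambda s: [(s + 1) % 8, (s + 3) % 8]
print(find_counterexample(0, succ, lambda s: s == 5))   # [0, 1, 2, 5]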
Program Testing: program testing basics, automatic test-case generation, directed
testing.
Decision Diagrams, SAT Solvers, Satisfiability Modulo Theories (SMT) Solvers.
Software Tools: Popular formal methods tools such as Spin, NuSMV, SAL, UPPAAL, SpaceEx, PRISM, Z3 and CUDD.
(c) Hours: Three lectures and one tutorial (hands-on) per week
Examination: 70%
Laboratory/assignment: 30%
(e) References:
1. C. Baier and J.-P. Katoen. Principles of Model Checking. The MIT Press, 2008.
2. E. M. Clarke, Jr., O. Grumberg, and D. A. Peled. Model Checking. MIT Press,
1999.
3. C. Barrett, R. Sebastiani, S. A. Seshia, and C. Tinelli. Satisfiability modulo the-
ories. In Armin Biere, Hans van Maaren, and Toby Walsh, editors, Handbook
of Satisfiability, IOS Press, 2009.
4. Michael Huth and Mark Ryan, Logic in Computer Science: Modelling and
Reasoning about Systems. Cambridge University Press
Statistical Computing
Examination: 70%
Laboratory/assignment: 30%
(e) References:
14. W. R. Gilks, S. Richardson and D. J. Spiegelhalter: Markov Chain Monte Carlo in Practice.
Topics in Privacy
Examination: 80%
Laboratory/assignment: 20%
(e) References: