


ASSIGNMENT COVER SHEET

Student Name & ID: ALI RAHMAN BIN ROSLI (B02190004)
Course Code:
Course Title: COMPUTER SCIENCE THEORY
Assignment No: 1
Assignment Title:
Prepared For: DR. NURHIZAM SAFIE
Due Date: 17/8/2020

DECLARATION

This assignment is my own original work.

No part of this work has been copied from any other source or person except where
due acknowledgement is made, and no part of the work has been previously submitted
for assessment at this or any other institution. I have read the Student Academic
Integrity Policy and understand its implications. For the purposes of assessment and
standards, I give the University permission to retain this assignment; provide a copy to
other assessors; and evaluate its academic integrity through the use of a plagiarism
checking service (which may store a copy of the assignment on its database for future
plagiarism checks).

Student’s Signature
(representative)

Date:

To be filled by Lecturer.

Marks

Comments

Date Received

Summary
As we know, India is a country famous for its mathematical skills and traditions. Over the past 30 years, India has carried out thorough research in computing, specifically in theoretical computer science. Even during the 1980s and 1990s, when hardware was limited, theory offered a unique opportunity to keep up with international research in computing.
There are many things that can be highlighted in this research on theoretical computer science, and one of them is algorithms. The problem of maximizing the flow that can be routed through a network was studied thoroughly, and it has immense practical applicability. In the 1970s, V. M. Malhotra, M. Pramodh Kumar and S. N. Maheshwari developed a max-flow algorithm that matched the best bounds known at the time yet was conceptually simpler, making it ideal for exposition.
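The article gives no code, but the augmenting-path idea underlying max-flow algorithms is easy to sketch. The following minimal Python sketch uses the simpler Edmonds-Karp shortest-augmenting-path method, not the Malhotra-Kumar-Maheshwari algorithm itself, and the dictionary encoding of the network is an assumption made for the example.

    from collections import deque

    def max_flow(capacity, source, sink):
        # capacity: dict mapping directed edges (u, v) to positive capacities.
        residual = dict(capacity)          # residual capacities, updated in place
        nodes = {u for edge in capacity for u in edge}
        total = 0
        while True:
            # Breadth-first search for a shortest augmenting path.
            parent = {source: None}
            queue = deque([source])
            while queue and sink not in parent:
                u = queue.popleft()
                for v in nodes:
                    if v not in parent and residual.get((u, v), 0) > 0:
                        parent[v] = u
                        queue.append(v)
            if sink not in parent:
                return total               # no augmenting path left: flow is maximum
            # Collect the path and find its bottleneck capacity.
            path = []
            v = sink
            while parent[v] is not None:
                path.append((parent[v], v))
                v = parent[v]
            bottleneck = min(residual[(u, v)] for u, v in path)
            # Push the bottleneck along the path; add reverse residual edges.
            for u, v in path:
                residual[(u, v)] -= bottleneck
                residual[(v, u)] = residual.get((v, u), 0) + bottleneck
            total += bottleneck

    # A five-edge example network: the maximum s-t flow here is 5.
    example = {('s', 'a'): 3, ('s', 'b'): 2, ('a', 'b'): 1,
               ('a', 't'): 2, ('b', 't'): 3}
    print(max_flow(example, 's', 't'))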
The Indian Institute of Technology (IIT) Delhi was at the forefront of international research in solving scheduling and facility location problems, which are often cast as multi-commodity flow problems and are NP-hard. To solve such problems, researchers devised efficient approximation algorithms using ideas gained from flows and linear programming.
India hosted the first international event wholly devoted to parameterized algorithms and complexity, a relatively recent field that mainly focuses on multivariate analysis of algorithm performance and on developing algorithms for hard problems in which the combinatorial explosion is confined to specified parameters. The field has seen cutting-edge contributions from India, mainly from the Institute of Mathematical Sciences (IMSc), Chennai, and the Chennai Mathematical Institute (CMI). Indian researchers also made huge contributions by obtaining combinatorial characterizations, developing new algorithms, and helping to understand the parallel complexity of these problems.
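As an illustration of the parameterized style (my sketch, not an algorithm from the article), the classic bounded-search-tree algorithm for Vertex Cover runs in roughly O(2^k * m) time, so the combinatorial explosion is confined to the parameter k rather than the input size:

    def vertex_cover(edges, k):
        # Either endpoint of any edge must be in the cover, so branch on both
        # choices; the recursion depth is at most k.
        if not edges:
            return True                    # all edges covered
        if k == 0:
            return False                   # uncovered edges remain, budget spent
        u, v = edges[0]
        without_u = [e for e in edges if u not in e]
        without_v = [e for e in edges if v not in e]
        return vertex_cover(without_u, k - 1) or vertex_cover(without_v, k - 1)

    # A triangle needs two vertices to cover all three edges.
    triangle = [('a', 'b'), ('b', 'c'), ('a', 'c')]
    print(vertex_cover(triangle, 1))  # False
    print(vertex_cover(triangle, 2))  # True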
Data structures are crucial if many state-of-the-art algorithms are to be efficient. Indian researchers were part of the groups developing data structures for static, succinct representations and for maintaining dynamic data, as well as proving non-trivial lower bounds on query complexity and space requirements.
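To give a flavour of succinct representations, here is a toy sketch (mine, not a structure from the article) of a bit vector answering rank queries from precomputed block counts; a genuinely succinct structure would keep the auxiliary index within o(n) extra bits:

    class RankBitVector:
        # Store the popcount before each block boundary, then scan at most
        # one block per query.
        BLOCK = 8

        def __init__(self, bits):
            self.bits = bits
            self.prefix = [0]              # ones in bits[0 : block * BLOCK]
            count = 0
            for i, b in enumerate(bits):
                count += b
                if (i + 1) % self.BLOCK == 0:
                    self.prefix.append(count)

        def rank(self, i):
            # Number of 1s in bits[0:i].
            block = i // self.BLOCK
            return self.prefix[block] + sum(self.bits[block * self.BLOCK : i])

    bv = RankBitVector([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
    print(bv.rank(10))  # 6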
Secondly, the highlights of this research in theoretical computer science can be seen in complexity theory. Primality has been studied since ancient Greece, at least. Nontrivial methods for testing primality, however, only emerged within the last two centuries. In addition to its academic interest, primality research has gained enormous practical significance due to the need for arithmetic modulo prime and pseudo-prime numbers in various cryptographic implementations, error-correcting codes and other codes of fundamental importance.
While randomized polynomial-time algorithms are adequate for this purpose, the fundamental problem of derandomization remained open until 2002, when Agrawal, Kayal and Saxena at IIT Kanpur proved the breakthrough result that PRIMES is in P. Agrawal was already a well-established complexity theorist, while Kayal and Saxena were graduate students about to begin their Ph.D. theses. The paper eventually appeared in the Annals of Mathematics and was awarded both the EATCS-SIGACT Gödel Prize and the AMS Fulkerson Prize.
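The AKS algorithm itself is intricate, but the kind of randomized polynomial-time test the passage alludes to is short. The following is a minimal sketch of the Miller-Rabin test, the randomized predecessor that the AKS result derandomized:

    import random

    def is_probable_prime(n, rounds=20):
        # A composite n survives all rounds with probability at most 4**(-rounds).
        if n < 2:
            return False
        small_primes = (2, 3, 5, 7, 11, 13)
        if n in small_primes:
            return True
        if any(n % p == 0 for p in small_primes):
            return False
        # Write n - 1 as d * 2**s with d odd.
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False  # a witnesses that n is composite
        return True

    print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime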
The theory of algebraic complexity deals with the computation of formal polynomials in structures such as arithmetic circuits. Mathematical research on such models requires an interaction between computer science and algebra, which enriches all the fields involved. In this technically challenging field, the recent contributions of Indian researchers at CMI, IIT Bombay, IIT Kanpur, IIT Madras, the Indian Institute of Science, Bangalore (IISc), IMSc, and Microsoft Research have been remarkable, with numerous foundational results and proof techniques being produced.
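For concreteness, an arithmetic circuit is a directed acyclic graph whose gates add or multiply previously computed values. The list encoding below is an illustrative assumption of mine, not a representation from the article:

    # Gates in topological order: ('in', name) reads an input variable,
    # ('+', i, j) and ('*', i, j) combine the outputs of earlier gates.
    def eval_circuit(gates, values):
        out = []
        for gate in gates:
            if gate[0] == 'in':
                out.append(values[gate[1]])
            elif gate[0] == '+':
                out.append(out[gate[1]] + out[gate[2]])
            else:  # '*'
                out.append(out[gate[1]] * out[gate[2]])
        return out[-1]

    # The circuit for (x + y) * x, evaluated at x = 2, y = 3: prints 10.
    gates = [('in', 'x'), ('in', 'y'), ('+', 0, 1), ('*', 2, 0)]
    print(eval_circuit(gates, {'x': 2, 'y': 3}))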
Algebraic methods are also used to demonstrate, by proving lower bounds, that certain problems are hopelessly hard. For instance, the infamous P versus NP problem involves proving an algorithmic lower bound. Algebraic circuits have similar lower-bound problems, and in this field the theory research group in India has made steady progress.
Machine learning is a potential area for putting into practice the insights obtained from algebraic complexity. An artificial neural network (ANN) is essentially an algebraic circuit built from threshold gates. Thus, a proper understanding of threshold circuits could lead to better back-propagation algorithms and to stronger lower-bound results. Indian researchers have already begun designing algorithms to reconstruct circuits.
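To make the threshold-circuit view concrete (an illustration of mine, not the article's), a threshold gate fires exactly when a weighted sum of its inputs reaches a bias, and wiring three such gates already computes XOR:

    def threshold_gate(weights, bias, inputs):
        # Output 1 when the weighted sum reaches the bias, else 0.
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= bias else 0

    def xor(x1, x2):
        or_gate = threshold_gate([1, 1], 1, [x1, x2])        # x1 OR x2
        nand_gate = threshold_gate([-1, -1], -1, [x1, x2])   # NOT (x1 AND x2)
        return threshold_gate([1, 1], 2, [or_gate, nand_gate])  # AND of both

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor(a, b))  # last column prints 0, 1, 1, 0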
Isomorphism problems about structures often appear in computer science; examples of such structures are graphs, fields, algebras, and polynomials, and many of the associated problems are computationally hard. Indian scholars have studied these closely, and some of the best-known results are due to them.
Communication complexity studies the interaction needed when the information required to solve a problem is distributed among several parties. Indian researchers have made leading contributions to this area, notably at the Tata Institute of Fundamental Research, Mumbai (TIFR).
Thirdly, logic and automata theory are also highlighted in this research. Büchi was the first to identify the close interplay between automata theory and logic. Pnueli introduced temporal logic as a language for specifying the properties of reactive systems. Emerson, Clarke, and Sifakis developed model checking: algorithmically deciding whether a formal model satisfies a temporal-logic specification.
Commonly, reactive systems are composed of several interacting components. Viewing such a system as a single sequential automaton results in the state-explosion problem and seriously limits the effectiveness of model checking. Moreover, for a set of concurrent actions, temporal logics interpreted over sequences are forced to reason about an enormous number of similar interleavings.
Mazurkiewicz proposed enriching alphabets with an independence relation. Commuting adjacent independent actions produces equivalence classes of words called traces. Traces are labelled partial orders of bounded width and in many respects generalize words smoothly. Zielonka defined asynchronous automata, a distributed model that captures exactly the regular trace languages. This led to the natural problem of model checking asynchronous automata with respect to temporal logics defined over traces.
The first temporal logic over traces, TrPTL, was formulated at CMI. Its model-checking problem was solved using the gossip automaton, which uses a bounded collection of timestamps to keep track, dynamically, of the latest information that communicating processes have about one another.
Temporal logic is expressively equivalent to the first-order theory of sequences. It is unclear whether TrPTL captures the first-order theory of traces. CMI researchers later developed the first expressively complete temporal logic over traces, in collaboration with European colleagues.
Findings from trace theory generalize to finite-state machines communicating over bounded channels. Message sequence charts (MSCs) describe interactions among agents communicating through buffers. At CMI, a robust theory of regular MSC languages was developed.
Synthesis is the converse of model checking: construct an automaton that satisfies a logical specification. This was solved in the sequential setting by Büchi and Landweber. Pnueli and Rosner demonstrated strong undecidability results in the distributed setting, stemming from the need to realize global specifications across loosely coupled agents. The decidability of distributed synthesis with respect to local specifications is still open. CMI and IMSc managed to produce some of the main positive results.
The theory of automata and logic has been extended to include other features. IMSc and TIFR identified a variety of timed extensions of temporal logic. In parallel, studies were also performed on distributed timed automata at CMI and IISc, as well as on timed versions of finite-state communicating machines at CMI and IIT Bombay. At IMSc there was work on automata and logic over data words, which perform computations over infinite datatypes. At CMI and IIT Kanpur, research has also been done to extend model checking from finite-state systems to infinite-state systems such as pushdown automata.
The second article mainly discusses the history and funding sources of four areas of theoretical computer science: state machines, computational complexity, program correctness, and cryptography.
Firstly, state machines. State machines are models, found everywhere in computing, that describe and implement many kinds of systems. Theory and implementation techniques that have improved over the years have helped state machines support the accurate construction and analysis of applications, including compilers, text-search engines, and operating systems. Basically, a state machine is a system characterized by the set of states it may assume. Inputs then produce outputs or moves to different states, depending on the current state. For example, a phone call passes through states such as idle, dial tone, dialling, ringing, and talking.
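That phone-call example translates directly into code. In the following minimal sketch, the state names come from the article while the input events are hypothetical:

    # Transition table: (current state, input event) -> next state.
    TRANSITIONS = {
        ('idle', 'lift handset'): 'dial tone',
        ('dial tone', 'press digit'): 'dialling',
        ('dialling', 'number complete'): 'ringing',
        ('ringing', 'callee answers'): 'talking',
        ('talking', 'hang up'): 'idle',
    }

    def run(state, events):
        for event in events:
            state = TRANSITIONS.get((state, event), state)  # ignore invalid events
            print(f"{event!r} -> {state}")
        return state

    run('idle', ['lift handset', 'press digit', 'number complete',
                 'callee answers', 'hang up'])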
In the 1960s, formal languages were used to implement state machines in software, and they quickly became a focus of research. One of the most notable results was the pushdown automaton, a state machine equipped with an auxiliary memory stack, which formalized the mechanical parsing performed to interpret sentences in high-level languages. Once researchers fully understood parsing, mechanizing a programming language became a routine task. The building of parsers was also automated, facilitating the first of many steps in converting compiler writing from a craft into a science. State machines were added to the everyday toolkit of computing, while the use of state machines to model communication systems became commonplace among electrical and communications engineers.
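A tiny example shows why the auxiliary stack matters (my illustration, not the article's): recognizing arbitrarily nested parentheses is impossible for a plain finite-state machine but immediate once a stack is available:

    def balanced(s):
        # A finite controller plus a stack: push on '(', pop on ')'.
        stack = []
        for ch in s:
            if ch == '(':
                stack.append(ch)
            elif ch == ')':
                if not stack:
                    return False  # a ')' with no matching '('
                stack.pop()
        return not stack          # accept only if every '(' was matched

    print(balanced('(()())'))  # True
    print(balanced('(()'))     # False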
State machines were developed by a community of academic and industrial researchers. The work began as a theoretical construct, but state machines are now naturalized throughout computer science as an organizing principle and specification tool.
Secondly, computational complexity. Computability theory arose from the search for effective procedures for deciding mathematical questions. Algorithms created for the sake of manual computing were usually characterized by operation counts. Only by 1970 had the analysis of algorithms established itself as an aspect of computer science. As time passed, the study of complexity theory evolved from concerns about the space required to perform a computation to issues such as the number of random bits needed to encrypt a message.
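Characterizing algorithms by operation counts is easy to demonstrate. In the sketch below (an illustration of mine), two searches solve the same problem, but their comparison counts grow as n versus log n:

    def linear_search_ops(xs, target):
        ops = 0
        for i, x in enumerate(xs):
            ops += 1                     # one comparison per element: O(n)
            if x == target:
                return i, ops
        return -1, ops

    def binary_search_ops(xs, target):
        ops, lo, hi = 0, 0, len(xs)
        while lo < hi:
            ops += 1                     # one comparison per halving: O(log n)
            mid = (lo + hi) // 2
            if xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid
        found = lo if lo < len(xs) and xs[lo] == target else -1
        return found, ops

    xs = list(range(1024))
    print(linear_search_ops(xs, 1000))   # (1000, 1001)
    print(binary_search_ops(xs, 1000))   # (1000, 10)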
Thirdly, the article also focuses on verifying program correctness. As time progressed, the use of computers moved from solving mathematical problems to having their programs verified for correctness. In the full verification method, a program's specification is described mathematically, and a formal proof shows that the program realizes the specification. To ensure the validity of the proof, the proof is checked mechanically.
In the formal verification process, computer programs become objects of mathematical study. A program is perceived as affecting the state of the data it interacts with. The program's purpose is to transform a state with known properties (the precondition) into a state with properties initially unknown but desired (the postcondition). A program consists of elementary operations, such as adding or comparing quantities. Each elementary operation is known by its transforming effect. Verification requires showing, through logical deduction, that the sequence of program steps beginning with the precondition will inescapably lead to the desired postcondition.
When programs require many repetitions of the same simple ideas, applied to several different data elements or over several transformation stages beginning from some initial data, verification involves demonstrating once and for all that, no matter what the data is or how many steps it takes, the program must ultimately achieve the postcondition. Such an argument takes the form of a mathematical induction which asserts that, after each repetition, the state is an acceptable starting state for the next repetition. The assertion that the state remains fit from repetition to repetition is called an "invariant".
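Here is a minimal sketch of that inductive argument written into code (my example, with assert statements standing in for a mechanical proof checker):

    def sum_first_n(n):
        # Precondition:  n >= 0
        # Postcondition: result == n * (n + 1) // 2
        assert n >= 0                          # the precondition holds
        total, i = 0, 0
        while i < n:
            # Invariant: total == i * (i + 1) // 2 before every iteration.
            i += 1
            total += i
            # The invariant is restored for the new value of i.
        # At exit i == n, so the invariant yields the postcondition by induction.
        assert total == n * (n + 1) // 2
        return total

    print(sum_first_n(10))  # 55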
Last but not least, the study also focuses on cryptography. Cryptography has been used by the military for quite a long time, but only recently has it been implemented in business and personal applications. It secures transactions and maintains the privacy of personal communications. Cryptography is a field in which theoretical work has clear implications for practice and vice versa. Encoding and decoding are the main concerns in discussing cryptography.
In the 1970s, government agencies were the first to pursue research in cryptography. An independent movement to study cryptography then developed, driven by the availability and the needs of computing: existing computing power made experimenting with cryptography feasible. Today we have several kinds of cryptography, such as public-key cryptography.
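Public-key cryptography can be illustrated with textbook RSA. The sketch below uses tiny, insecure numbers chosen purely for demonstration; real systems use enormous primes and padding (and pow(e, -1, phi) requires Python 3.8 or later):

    # Toy textbook RSA: encrypt with the public pair (e, n), decrypt with d.
    p, q = 61, 53
    n = p * q                    # public modulus: 3233
    phi = (p - 1) * (q - 1)      # 3120
    e = 17                       # public exponent, coprime to phi
    d = pow(e, -1, phi)          # private exponent: modular inverse, 2753

    message = 65
    ciphertext = pow(message, e, n)    # anyone can encrypt: 2790
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt: 65
    print(ciphertext, recovered)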
In conclusion, theoretical computer science is very important, and it has been supported by the federal government and by industry. Many advances have emerged from research in theoretical computer science. It has contributed much to computing practice while also being informed by that practice.
