IITDelhi Notes

Introduction

I hold a Bachelor's degree in Economics and a Master's degree in Development Studies, and I have a total of three years of work experience. I interned at an NGO called Saajha, then worked in the ed-tech sector for one and a half years, and I am currently a Data Management Specialist in the Corporate and Investment Banking Division of J.P. Morgan. I am also passionate about environmental issues, which led me to develop a Python library called "gplume".

My library supports two workflows: forward modelling, in which I derive contaminant concentrations at distinct receptor locations from emission rates and wind dynamics, and inverse modelling, which entails deducing emission rates from observed concentrations.

The concentration of pollutants from a source is approximated by a bell-shaped curve, with the highest concentration occurring at the centre of the plume and decreasing symmetrically as distance from the source increases; this is why the model is called Gaussian. A "plume" refers to the visible or invisible trail of pollutants emitted into the atmosphere from a source. The meteorological data used comes from the OpenWeatherMap API.
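As a rough sketch of the forward-modelling step (this is the standard Gaussian plume formula, not the actual gplume API; the function name and parameter values are illustrative):

    import numpy as np

    def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
        """Concentration (g/m^3) at a receptor from a continuous point source.
        Q: emission rate (g/s); u: wind speed (m/s);
        y, z: crosswind and vertical receptor coordinates (m);
        H: effective stack height (m);
        sigma_y, sigma_z: dispersion coefficients at the receptor's downwind distance (m)."""
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        # Ground-reflection term: the plume is mirrored at z = 0.
        vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                    + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Example: 100 g/s source, 5 m/s wind, ground-level receptor on the plume centreline.
    print(plume_concentration(Q=100, u=5, y=0, z=0, H=20, sigma_y=36.0, sigma_z=18.0))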

Academic Background - M.A. Development Studies

Development studies is an interdisciplinary field that focuses on analysing the social, economic and political factors that contribute to the development of societies.

Example of work at J.P. Morgan - The Federal Reserve imposed regulatory fines on J.P. Morgan because the bank was required to block trades with a few firms and failed to do so. One of the major impediments the company faced was that it was unable to recognise a client's identity at the ground level. The application we use to filter all client data is called Party Central, and our complete data is stored in RDM (Reference Data Mart). I first sorted both datasets alphabetically.

I used the Levenshtein distance to recognise the client's identity. The names were broken into single-character tokens, and the two lists were then matched. The Levenshtein distance between two words is the minimum number of single-character edits (insertions, deletions or substitutions) required to change one word into the other; the lower the distance, the better the match. After UAT testing I concluded that a distance of 3 or below should count as a match, and this feature was pushed into production.

Why I want to do a PhD

Pursuing a PhD in Computational Social Choice Theory will give me better professional opportunities, especially at an investment bank. Additionally, a PhD is much more demanding than qualifications such as the FRM or CFA, and because the number of PhDs is likely to remain low, it makes its holder much more competitive than those qualifications do.

How I selected this topic

A transaction refers to an action initiated by an externally-owned account, i.e. an account managed by a human, not a contract. A hash value is a fixed-size alphanumeric string generated by a hash function applied to data of any size. It is essentially a unique identifier for that specific data.
Blocks are batches of transactions with a hash of the previous block in the chain. This links
blocks together (in a chain) because hashes are cryptographically derived from the block
data.
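A toy sketch of that chaining (illustrative only; real blocks carry far more structure than this):

    import hashlib, json

    def block_hash(block: dict) -> str:
        # Hash the block's contents, including the previous block's hash,
        # so any change to earlier data changes every later hash.
        payload = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    genesis = {"prev_hash": None, "transactions": ["alice -> bob: 5"]}
    block_1 = {"prev_hash": block_hash(genesis), "transactions": ["bob -> carol: 2"]}
    print(block_1["prev_hash"], block_hash(block_1))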

A block producer publishes a block based on the submitted transactions.

Blockchain protocols such as Bitcoin and Ethereum have finite processing power, so when
demand for transaction processing exceeds the available supply, a strict subset of the
submitted transactions must be chosen for processing. To encourage the selection of the
“most valuable” transactions, the transactions chosen for processing are typically charged a
transaction fee. The component of a blockchain protocol responsible for choosing the
transactions to process and what to charge for them is called its transaction fee mechanism
(TFM).
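As a hedged illustration of the selection part of a TFM (a naive greedy rule, not how Bitcoin or Ethereum actually choose or price transactions):

    def select_transactions(pending, capacity):
        """Greedily pack the transactions with the highest fee per unit of size.
        pending: list of (tx_id, size, fee); capacity: block size limit."""
        chosen, used = [], 0
        for tx_id, size, fee in sorted(pending, key=lambda t: t[2] / t[1], reverse=True):
            if used + size <= capacity:
                chosen.append(tx_id)
                used += size
        return chosen

    pending = [("tx1", 200, 10), ("tx2", 500, 40), ("tx3", 400, 15)]
    print(select_transactions(pending, capacity=800))  # ['tx2', 'tx1']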

Maximal Extractable Value (MEV): It is the maximum value that can be extracted from block
production in excess of the transaction fee.

I worked on a research project at J.P. Morgan where our team experimented on the Ethereum mainnet. The objective of the study was to understand the constraints of MEV. The team made a few observations: in the presence of MEV extraction, no non-trivial transaction fee mechanism can be simultaneously incentive compatible for both users and active block producers.

Existing investigations predominantly focus on passive block producers motivated solely by net rewards at the consensus layer. In my research, I will study a model incorporating active block producers, who possess private valuations for blocks, reflecting additional value derived from the application layer. This introduces the concept of "maximal extractable value (MEV)" into the framework.

Building upon this understanding, I will work on a model of block production that distinguishes between "searchers" and "proposers". Searchers actively identify opportunities for value extraction from the application layer, acting as an "MEV oracle" for transaction fee mechanisms.

Definitions

Mechanism design is the study of setting the rules of a game given the desired objectives/outcomes.

An indirect mechanism is a collection of message spaces and a decision rule. A direct mechanism is the specific case where the message space is the type space and the decision rule is the outcome function. A mechanism is called indirect because we are not dealing with types directly.

An agent's private preference over the set of possible outcomes is called the agent's type.

Social choice theory is the field of scientific inquiry that studies the aggregation of individual
preferences toward a collective choice.

A mathematical proof of a proposition is a chain of logical deductions leading to the proposition from a base set of axioms. A proposition is a statement (communication) that is either true or false. Axioms are propositions that are accepted as true without proof. Logical deductions are the collections of rules for proving new propositions using previously known ones.
Well Ordering Principle - Every nonempty set of nonnegative integers has a smallest element.

Efficiency means that the outcome maximises the total value across agents in every state.

Passive block producers: block producers who do not extract MEV. Active block producers: block producers who extract MEV.
Private Value Model: each agent's type is their preference over the set of possible outcomes, which is completely known to the agent and is their private information. This is called the private values model in mechanism design.

Non-trivial - Users must at least in some cases pay a nonzero amount for transaction
inclusion.

Incentive Compatible for users: Dominant Strategy Incentive Compatible (DSIC) - A direct
mechanism is dominant strategy incentive compatible (DSIC) if for every agent, the
truth-telling strategy is a dominant strategy.
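In symbols (a standard textbook formulation, not taken from any specific source in these notes): writing f for the outcome function, p_i for agent i's payment, \theta_i for agent i's true type, \theta_i' for a misreport and \theta_{-i} for the others' reports, a direct mechanism is DSIC if for every agent i and all \theta_i, \theta_i', \theta_{-i},

    u_i\big(f(\theta_i, \theta_{-i}), \theta_i\big) - p_i(\theta_i, \theta_{-i}) \;\ge\; u_i\big(f(\theta_i', \theta_{-i}), \theta_i\big) - p_i(\theta_i', \theta_{-i}),

i.e. truthful reporting maximises agent i's utility no matter what the others report.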

Trusted hardware and cryptographic techniques are means of detecting tampering with block data and flagging it, or of flagging MEV extraction.

Revelation principle is a central result in mechanism design. One of its implications is that if
we wish to find out what social choice functions can be implemented in dominant strategies,
we can restrict attention to direct mechanisms. This is because, if some non-direct
mechanism implements a social choice function in dominant strategies, the revelation
principle says that the corresponding direct mechanism is also DSIC.

It is approximately welfare-maximising because the aim is to achieve outcomes as close as possible to welfare maximisation.

An MEV oracle is a system designed to capture the value created by block-specific situations on a blockchain.

Normal Distribution

The normal probability distribution is a continuous, symmetrical, unimodal and asymptotic probability distribution, and it represents the frequency with which a variable occurs.

Continuous means that the random variable can take on any value within a certain range.
Symmetrical means that the left half of the normal curve is a mirror image of the right half.
Unimodal means there is only one peak, or point of maximum frequency, which lies in the middle of the curve.
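For reference, the density of a normal distribution with mean \mu and standard deviation \sigma is

    f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),

which is symmetric about \mu and attains its single maximum at x = \mu.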

The curve is asymptotic: the tails of the distribution never touch the X axis. This is important because it means that even very extreme values can occur by chance, at least in theory. Extreme values become increasingly unlikely as we move away from the mean, but they are never impossible under the normal distribution. This property makes the normal curve particularly useful for modelling phenomena where extreme events are possible but rare.

The Central Limit Theorem states that the sum of independent random variables tends toward a normal distribution, regardless of the distribution of the original variables.
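A small sketch of the idea: sums of independent Uniform(0, 1) draws, which are individually far from normal, already look approximately normal.

    import numpy as np

    rng = np.random.default_rng(0)
    # Sum 30 independent Uniform(0, 1) variables, repeated 10,000 times.
    sums = rng.uniform(0, 1, size=(10_000, 30)).sum(axis=1)
    # The CLT predicts approximately Normal(mean = 15, sd = sqrt(30/12) ~ 1.58).
    print(sums.mean(), sums.std())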
