Risk Management Techniques Have Appeared Since The Middle of The Twentieth Century

The document discusses risk management techniques and the Value at Risk (VAR) approach. It provides three key points: 1) VAR emerged as a global risk indicator used widely by financial institutions since 1997 to measure potential losses over a given time period and confidence level. 2) VAR is defined as the maximum expected loss over a time horizon that will not be exceeded with a given probability, typically 95-99%. 3) VAR calculates potential losses based on historical market data and profit/loss distributions to determine the worst loss for a given confidence level, providing a simple quantitative measure of risk exposure.

Uploaded by

Abbes Marouan
Copyright
© Attribution Non-Commercial (BY-NC)

Introduction:

Risk management techniques have appeared since the middle of the twentieth century. These techniques study the factors that may threaten a financial institution's wealth, in order to devise strategic solutions that reduce, or even eliminate, the corresponding risk.

Financial risk comes in several classes, such as market risk, credit risk, operational risk, etc.

Each of these risks is subject to strict directives which determine how it should be estimated and which protection measures should be applied to guard against it.

These directives were elaborated by the Basel Committee, which operates under the auspices of the Bank for International Settlements.

The Basel II accord requires financial institutions to set up their own internal models to evaluate their regulatory capital.

Numerous methods exist to describe, in a quantitative way, the risk attached to holding a financial asset. Among them we should mention the mean-variance approach of the celebrated Markowitz, together with the Sharpe approach. These methods, however, proved unsatisfactory for evaluating the risk of portfolios containing various kinds of assets such as stocks, bonds, derivatives, etc.

It was when derivatives were introduced and developed, alongside the increase in financial market volatility and the spectacular bankruptcies of the 1990s, that financial institutions elaborated a financial risk indicator that is both global and synthetic. The Value at Risk (VaR) groups together the sought-after characteristics of a good risk indicator, and has been adopted by most financial organisms since 1997.

Chapter 1: The Value at Risk (VaR):

Since 1997, various factors have led the VaR to be considered the reference indicator for risk measurement.

According to Paul Glasserman, three major events were conducive to its adoption:

- The first event was the free publication, in 1994, of the RiskMetrics system by the American bank JPMorgan. Other enterprises and financial organisms have since been able to use the RiskMetrics calculator and exploit it within their own risk management systems.

- The second event dates from 1995, when the Basel Committee proposed its market risk amendment. Adopted in 1996, this new Basel reform incited banks to develop their own internal systems to estimate the VaR that determines their regulatory capital requirement. In the absence of an internal model, the calculation of regulatory capital is standardized and gives values higher than those obtained from internal models, which are much better adapted to each enterprise.

- The third event certainly has less impact nowadays, but it led to the adoption of the VaR by American enterprises. In 1997, the Securities and Exchange Commission (SEC) imposed disclosure rules on American enterprises concerning the risk associated with the use of derivatives. Three solutions were suggested to enterprises for reporting that risk, and the VaR was one of them.

But the VaR has also been adopted for its straightforward approach and its global/synthetic duality. It permits a simple appraisal, comprehensible even to neophytes, by providing a quantitative indicator expressed in monetary units. Moreover, the VaR has several advantages over the classical methods of risk measurement:

- it gives a simple perception of the magnitude of the possible losses;

- it is not tied to a particular "profit and loss" distribution of the portfolio or of any underlying asset, contrary to many models that assume a normal distribution of profits and losses;

- it takes into consideration the asymmetry of the profit and loss distribution.

1.2: VaR principle:

To wrap the reader's head around what the VaR is, we present a concrete (pretty humble!) example. Say we have invested 10,000 Dinars in a stock portfolio. How can we get an idea of the maximum loss that the portfolio may undergo within a month?

The very logical answer would be:

We could lose the whole thing! (but that would be ridiculous, wouldn't it?)

Nevertheless, a total loss is extremely unlikely to occur. A more realistic answer would be, for example: "barring an unusual extreme event, there is a 5% chance of losing more than 1,000 Dinars".

This is exactly the kind of information that the VaR provides.

Well, after this small but concise example, we are going to get a little deeper into the details. (We know it is a bit boring! That's why we will try to flip it, twist it, and bring it back to life in a brand new flavour which you will appreciate!)

There exist many definitions of the VaR; we will only pick two of them:

1) According to Calvet [2000], the VaR of a portfolio of financial assets corresponds to the amount of the maximum loss over a given time horizon, if we exclude a set of unfavourable events having a feeble probability of occurring.

2) According to Esch, Kieffer and Lopez [1997], as well as to Jorion [2000], the VaR of a portfolio or of a single asset, for a given duration T and a probability level α, is the amount of loss that will not be exceeded at any point during the period [0, T], with probability 1 − α.

Particularities:

Two elements are common to both of the previously stated definitions, and it is of crucial importance to choose them carefully: the time horizon and the confidence level.

The time horizon is mainly influenced by three major factors:

- it has to be adapted to the holding period of the asset or portfolio being estimated;

- it has to be sufficiently short to respect the hypothesis that the portfolio's composition is invariant;

- it has to be sufficiently short so that the available quantity of data permits estimating a relevant and accurate VaR over this horizon.

It can also be fixed by regulatory standards. Financial organisms typically use a ten-day VaR, but the horizon can reach several months in the case of bonds.

The confidence level, for its part, is influenced by two factors:

- it must not be too high, otherwise the events of interest would occur too rarely for the indicator to be useful;

- it has to reflect the risk managers' degree of aversion to the occurrence of extreme events.

Like the horizon, the confidence level can be imposed by regulatory standards. In practice, the VaR is estimated with a confidence level going from 90% to 99%.

The regulatory standards of Basel II require banks to calculate the VaR using a 99% confidence level and a ten-day (two trading weeks) horizon.

Classical approaches:

After choosing a horizon and a confidence level, one more task remains before proceeding to the VaR estimation: retrieving the data upon which the VaR will be calculated, and preprocessing them.

The VaR can be calculated from the portfolio's returns as well as from the so-called "profit and loss" distribution, which corresponds to the daily profits and losses.

At this point we introduce the "maximum loss" notion, which in turn will lead us to a clear overview of the VaR.

The maximum loss theory:

Consider some portfolio of risky assets and a fixed time horizon Δ, and denote by F(l) = P(L ≤ l) the distribution function of the corresponding loss L.

We want to define a statistic based on F(l) which measures the severity of the risk of holding our portfolio over the time horizon Δ.

An obvious candidate is the maximum possible loss, given by:

inf { l ∈ IR : F(l) = 1 }

However, in most models of interest the support of F is unbounded, so that the maximum loss is simply ∞. Moreover, by using the maximum loss we neglect all the probability information contained in F.

From "maximum loss" theory to VaR:

Value at Risk is a straightforward extension of the maximum loss which takes these criticisms into account.

You might say: "you are just brawling, I don't really see the big deal!" Well, be patient: the idea is simply to replace "maximum loss" by "maximum loss which is not exceeded with a given high probability", that probability being the so-called confidence level.

The adapted definition of the VaR is thus:

Given some confidence level α ∈ (0, 1), the VaR of our portfolio at the confidence level α is the smallest number l such that the probability that the loss L exceeds l is no larger than 1 − α. Formally:

VaR_α = inf { l ∈ IR : P(L > l) ≤ 1 − α } = inf { l ∈ IR : F(l) ≥ α }

In probabilistic terms, the VaR is simply a quantile of the loss distribution.
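Since the application described later in this report is written in Java, here is a minimal sketch of how this quantile definition translates into code for an empirical sample of losses (class and method names are ours, not taken from the application):

```java
import java.util.Arrays;

public class HistoricalVar {
    // VaR at confidence level alpha: smallest loss l such that the
    // empirical distribution function F(l) >= alpha.
    static double var(double[] losses, double alpha) {
        double[] sorted = losses.clone();
        Arrays.sort(sorted);
        // index of the smallest order statistic with F(l) >= alpha
        int k = (int) Math.ceil(alpha * sorted.length) - 1;
        return sorted[Math.max(k, 0)];
    }

    public static void main(String[] args) {
        // ten daily losses in Dinars (negative values are gains)
        double[] losses = {-120, 300, 50, -80, 700, 10, 150, -30, 420, 90};
        System.out.println(var(losses, 0.90)); // 420.0, the second-largest loss
    }
}
```

With ten observations and α = 0.90, the VaR is the second-largest loss: nine out of ten losses do not exceed it.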

Typical values for α are α = 0.95 and α = 0.99.

In market risk management, the time horizon Δ is usually 1 or 10 days; in credit risk management and operational risk management, Δ is usually one year.

Now that you know what the VaR virtually is, we are going to break it down a little more.

Consider a portfolio composed of various stocks whose daily quotations we know, and denote by Wt the value of the portfolio at time t. The VaR of this portfolio over a one-day horizon at confidence level α corresponds to the loss ΔWt+1 = −(Wt+1 − Wt) that is exceeded with probability 1 − α. In other words, VaRt(α) verifies the following equation:

P[ΔWt+1 > VaRt(α)] = 1 − α

which also means P[ΔWt+1 ≤ VaRt(α)] = α.

By introducing the distribution function Ft we obtain:

Ft(VaRt(α)) = α, which leads to VaRt(α) = Ft−1(α).

Generally, we try to relate the loss to a standard probability distribution that can be assimilated to the law of the yields. To do so, we reduce and center ΔWt+1:

P[ (ΔWt+1 − E(ΔW)) / σ(ΔW) ≤ (VaRt(α) − E(ΔW)) / σ(ΔW) ] = α

Let zα = (VaRt(α) − E(ΔW)) / σ(ΔW), where zα is the α-quantile of the standard normal distribution. Solving for the VaR gives VaRt(α) = E(ΔW) + zα σ(ΔW).
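Under the normality assumption, the relation VaRt(α) = E(ΔW) + zα σ(ΔW) is a one-line computation. A minimal Java sketch (the quantile constants are the usual rounded standard normal values; class and variable names are ours):

```java
public class ParametricVar {
    // Standard normal quantiles for the usual confidence levels
    static final double Z_95 = 1.6449, Z_99 = 2.3263;

    // VaR_t(alpha) = E(dW) + z_alpha * sigma(dW), where dW is the daily loss
    static double var(double meanLoss, double stdLoss, double z) {
        return meanLoss + z * stdLoss;
    }

    public static void main(String[] args) {
        // hypothetical daily loss distribution: mean 0, std 600 Dinars
        System.out.println(var(0.0, 600.0, Z_95)); // approximately 986.94
    }
}
```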

Carrying out these calculations on a daily distribution of yields gives a one-day VaR. To scale the VaR to a horizon of x days, we use the square-root-of-time rule:

VaRx = VaR1 × √x
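The scaling rule is a one-liner, but worth pinning down since the regulatory ten-day VaR is usually obtained this way (a sketch with our own naming):

```java
public class VarScaling {
    // Square-root-of-time rule: VaR over x days from a one-day VaR
    static double scale(double var1Day, int days) {
        return var1Day * Math.sqrt(days);
    }

    public static void main(String[] args) {
        // scale a one-day VaR of 1000 Dinars to the ten-day regulatory horizon
        System.out.println(scale(1000.0, 10)); // about 3162.28
    }
}
```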

Issues of non-normal distributions:

In reality, the yields distribution of a portfolio is rarely normal, and the normality assumption is simplistic. A normal probability law is fully characterized by only two parameters: the central tendency, reflected by the mean, and the variance, respectively the first and second moments of the distribution. In real life, two more parameters are needed to judge how far an empirical distribution is from normal:

the skewness and kurtosis coefficients.
Skewness (asymmetry coefficient):

The skewness coefficient, which we denote by Sk, characterizes the asymmetry of the distribution; it is associated with the third moment of the distribution. Its empirical form is:

Sk = (1/n) Σ ((xi − x̄) / σ)³

where n is the number of available observations, x̄ their mean and σ their standard deviation.

In practice, we compare the empirical skewness to that of the normal distribution, which is 0.

- if Sk = 0, the distribution is symmetric, which suggests that it has a good chance of being normally distributed;

- if Sk > 0, the distribution is spread towards the right side and is said to have positive asymmetry;

- if Sk < 0, the distribution is spread towards the left side and is said to have negative asymmetry.
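The empirical skewness is straightforward to compute. A self-contained Java sketch (our own naming; the population standard deviation is used, matching the formula above):

```java
public class Skewness {
    // Empirical skewness: third standardized moment,
    // Sk = (1/n) * sum(((x_i - mean) / sigma)^3)
    static double sk(double[] x) {
        int n = x.length;
        double mean = 0;
        for (double v : x) mean += v;
        mean /= n;
        double var = 0;
        for (double v : x) var += (v - mean) * (v - mean);
        double sigma = Math.sqrt(var / n);
        double s = 0;
        for (double v : x) s += Math.pow((v - mean) / sigma, 3);
        return s / n;
    }

    public static void main(String[] args) {
        System.out.println(sk(new double[]{-1, 0, 1}));    // 0.0: symmetric sample
        System.out.println(sk(new double[]{0, 0, 0, 10})); // positive: spread to the right
    }
}
```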

Applying the theory: a Java-based GUI application:

In order to apply what we have seen in the previous chapters as pure theory, it is time to bring everything together and build a (cute!) GUI application offering the services we will see later on.

We honestly waffled about which technology to use to build a solid application. We thought of using Microsoft Excel, but we found that it would not really make our legacy, since it is a rather straightforward platform that everyone pretends to master to the fullest, so we dropped the idea. We also thought of trying the well-known C++ "Qt" library; it seemed a great idea at the very beginning, because we had been told how powerful it is, but we swiftly figured out that it is a bit complicated to grasp in a short while, given our time constraints. So we switched to our beloved friend Java, which just saved the day!

One of the reasons we picked Java is its portability (the "write once, run anywhere" principle), which implies that if the application is ever published, it will run on any OS embedding a JVM (Java Virtual Machine).

Another reason is that a large community is working behind the scenes producing rich APIs that really make it easy to build magnificent applications. In our case, we are proud to say that we used three major APIs:

1) Swing (http://en.wikipedia.org/wiki/Swing_(Java))
2) JScience (http://jsci.sourceforge.net/)
3) JFreeChart (http://www.jfree.org/jfreechart/)

Well, after uncovering our magical tools, let's get to the core tasks our application is intended to fulfil.

What our application can do:

Our application is intended to offer services revolving around asset portfolio optimization. To get that done, it has to fulfil the following objectives:

- constitute one's own portfolio from a given selection of assets;
- retrieve the assets' historical quotes over the internet without any user intervention;
- process the raw data:
  - homogenizing dates;
  - calculating the portfolio valuation, taking exchange rates into account;
- display classical indicators for each asset (yields distribution moments, best performance, worst performance, Value at Risk);
- finally, suggest a portfolio that is optimal in Value at Risk terms.

A great effort has been put into making a straightforward, highly usable graphical interface.
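The valuation objective listed above (portfolio value taking exchange rates into account) boils down to a weighted sum. A sketch, under the assumption that each asset's quote is converted into the portfolio currency through its own exchange rate (class and variable names are ours):

```java
public class PortfolioValue {
    // W_t = sum_i quantity_i * price_i(t) * fxRate_i(t), where fxRate_i
    // converts asset i's quote currency into the portfolio currency.
    static double value(double[] quantities, double[] prices, double[] fx) {
        double total = 0;
        for (int i = 0; i < quantities.length; i++)
            total += quantities[i] * prices[i] * fx[i];
        return total;
    }

    public static void main(String[] args) {
        // hypothetical: 10 shares at 50 (rate 1.5 to Dinars), 5 shares at 100 (rate 4.0)
        System.out.println(value(new double[]{10, 5},
                                 new double[]{50, 100},
                                 new double[]{1.5, 4.0})); // 2750.0
    }
}
```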

Portfolio assets selection:

Our application offers the user the possibility to compose his portfolio with assets from the American (Dow Jones) and French (CAC 40) stock markets.

The application's main window displays two four-column tables showing the available assets' information. The user has to indicate, for each selected stock, the corresponding weight.

A button at the bottom of the window is enabled once the composition is finished, allowing the user to move on to the next window.

Portfolio assets preview and selection confirmation:

Clicking on the first button takes the user to the next window, which displays the customized portfolio composition. Two buttons there permit the user either to go back to the previous screen and alter the composition, or to confirm the customized combination. If the confirmation button is clicked, the program notifies the user that it is about to download the selected assets' historical data.
Our original module that talks to Yahoo!:

Well, we have to tell you that the thing we are proud of is that we coded a tiny module that connects to the Yahoo! Finance server, downloads the historical data, processes it, seals it (that is a joke!), and puts it right away into the database. The funny thing is that, before coming up with this idea, we were looking for a web service or an API offering that service, but they all cost big money! It is now a free service, thanks to our get-it-from-yahoo module!
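A minimal sketch of such a download module follows. It is hedged: the actual Yahoo! Finance endpoint format has changed repeatedly over the years, so the CSV URL is taken as a parameter rather than hard-coded, and the class and method names are ours:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

public class QuoteFetcher {
    // Reads CSV lines from any character source; kept separate from the
    // network code so that the reading logic can be tested offline.
    static List<String> readLines(Reader src) throws IOException {
        BufferedReader in = new BufferedReader(src);
        List<String> lines = new ArrayList<>();
        for (String line; (line = in.readLine()) != null; )
            lines.add(line);
        return lines;
    }

    // Opens the quote server's CSV export for one ticker. The URL format
    // is an assumption and must match the server actually used.
    static List<String> fetch(String csvUrl) throws IOException {
        try (Reader r = new InputStreamReader(new URL(csvUrl).openStream())) {
            return readLines(r);
        }
    }
}
```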

The downloading operation can take several minutes, depending on the server's availability and the behaviour of the internet connection (it may play up a little!).

After downloading, a mandatory task awaits: processing the downloaded data. So let's see what it is all about!

From raw data to structured, organised, useful information (the processing machine!):

If there was something that put us on edge, it was processing the (excessively!) raw data. The downloaded data, called raw data, are a little messed up; that's why we coded procedures to rearrange them in a way that makes it easy to extract accurate information from them, such as the yields' central tendency, the yields' standard deviation, etc.

We eventually overcame that big burden and managed to subdue the wild data to our will!
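The indicators extracted from the cleaned data (daily yields, their central tendency and their standard deviation) can be sketched as follows. We assume log-yields here, which is a common but not universal choice; the names are ours:

```java
public class YieldStats {
    // Daily log-yields from a series of closing prices
    static double[] yields(double[] closes) {
        double[] r = new double[closes.length - 1];
        for (int i = 1; i < closes.length; i++)
            r[i - 1] = Math.log(closes[i] / closes[i - 1]);
        return r;
    }

    // Central tendency (mean) of the yields
    static double mean(double[] x) {
        double s = 0;
        for (double v : x) s += v;
        return s / x.length;
    }

    // Standard deviation of the yields around their mean
    static double std(double[] x) {
        double m = mean(x), s = 0;
        for (double v : x) s += (v - m) * (v - m);
        return Math.sqrt(s / x.length);
    }
}
```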

Now that the data have been downloaded and processed, the program switches to another screen: the portfolio analysis.

The big show: the art of analysing!:

After processing the data, the results need to be displayed, and the optimization process is waiting to run too! We are going to break it down so that you can wrap your head around the core results the user is waiting for.

First result: an overview of the whole portfolio's statistics.

The user wants information about his customized portfolio, such as the central tendency of the yields distribution, i.e. the expected yield value around which all the likely yield values revolve. Another important thing is to know, approximately, how far the yield values are spread from the mean of the empirical yields distribution. Normally these two pieces of information would be enough, but in the empirical world (and statistics belongs to it!) two more parameters are needed to characterise an empirical distribution: the skewness and kurtosis parameters.

Skew... what? (Hey, what are you talking about? I have never heard of this, so baby steps please!)

Alright, baby steps it is! The skewness parameter characterises the asymmetry of the empirical distribution, while the kurtosis parameter measures the heaviness of the distribution's tails. Together, the kurtosis and skewness parameters determine how well an empirical distribution fits the standard normal distribution.

After that, the user wants to know the loss value that can be exceeded only with a given probability over a given horizon: this is the VaR. We actually have four different VaR values to calculate: three standard ones, and a fourth calculated using the variance-covariance method (seen previously).

Standard methods of VaR calculation:

1) Classical estimation:
(formulas to be copied from page 19)

2) Cornish-Fisher VaR (2000):

We consider that the assets' yields are normally distributed and that the quantile zq is corrected as follows:

wq = zq + (1/6)(zq² − 1)Sk

The portfolio VaR is then the weighted sum of the individual assets' VaRs.

3) Improved Cornish-Fisher VaR (2001):

The normal-law quantile is corrected to take into consideration both the kurtosis and skewness parameters:

wq = zq + (1/6)(zq² − 1)Sk + (1/24)(zq³ − 3zq)K − (1/36)(2zq³ − 5zq)Sk²
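The corrected quantile above is easy to implement. A sketch with our own naming; K is taken to be the excess kurtosis, an assumption consistent with the standard Cornish-Fisher expansion:

```java
public class CornishFisher {
    // Cornish-Fisher expansion of the normal quantile z_q, corrected for
    // skewness Sk and excess kurtosis K (coefficients 1/6, 1/24, 1/36)
    static double wq(double zq, double sk, double k) {
        return zq
            + (zq * zq - 1) * sk / 6.0
            + (zq * zq * zq - 3 * zq) * k / 24.0
            - (2 * zq * zq * zq - 5 * zq) * sk * sk / 36.0;
    }

    public static void main(String[] args) {
        // with Sk = 0 and K = 0 the correction vanishes and wq equals zq
        System.out.println(wq(1.6449, 0.0, 0.0));
        // a non-zero skewness shifts the quantile away from the normal value
        System.out.println(wq(1.6449, -0.5, 1.0));
    }
}
```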

The analysis section also offers the possibility to display the chart of the portfolio's valuation over a whole year, as well as the contribution of each asset to the overall portfolio.

The application offers another screen to carry out the same analysis on a single investment (a single asset).

Finally, we have the optimization section: a button that, when clicked, triggers the optimization process and reports the right combination of asset weights.
