Introduction to Copulas

Mark S. Tenney
Mathematical Finance Company
July 18, 2003
Copyright © 2003 All rights reserved Mark S. Tenney. 4313 Lawrence Street, Alexandria Virginia, 22309. Phone number 703 799 0518.
http://www.mathematical-finance.com
Contents
1 Distributions
2 Some Copulas
3 The Problem
4 Uniform Distributions
5 Transforming to Uniforms
6 Wish
7 First Try: Defining Copula from Distribution
8 2nd Try: Defining Copula with Measure Theory
9 Special Increasing Functions
10 3rd Try: Defining Copula with Special Increasing Functions
11 Distribution to Copula
12 Copula to Distribution
13 URLs
XIII 1 Embrechts
XIII 2 Nelsen
XIII 3 Venter
Bibliography
1 Distributions
We closely follow Chapter 2 of Nelsen [2] and Chapter 2 of Embrechts,
Lindskog and McNeil [1].
Definition 1.1 (One Variable Distribution Function). The probability that a random variable is less than or equal to z is F(z). F(z) is between 0 and 1. The value z is the random outcome, and the quantity whose outcome z records is called a random variable.
An example is the normal distribution. We note that it need not have
mean 0 or variance 1.
Definition 1.2 (Normal Distribution Function). We define the cumulative normal distribution, or just distribution function, N by

N(z; \mu, \sigma^2) = \int_{-\infty}^{z} \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \, dx    (1)

We can write F(z) = N(z; \mu, \sigma^2) to illustrate our earlier notation F.
Definition 1.3 (Multifactor Distribution Function). The joint probability that the i-th random variable is less than or equal to z_i for i = 1, ..., n is F(z_1, ..., z_n). F(z_1, ..., z_n) is between 0 and 1. The vector (z_1, ..., z_n) is the random outcome and is a random variable.
Definition 1.4 (One Factor Marginal Distribution Function). The probability that the i-th random variable is less than or equal to z_i is F_i(z_i). F_i(z_i) is between 0 and 1. It is obtained from the joint distribution by sending every other argument to its maximum, F_i(z_i) = F(\infty, ..., z_i, ..., \infty); when a joint density f exists, this equals \int f(z_1, ..., z_n) \, dV(i), where dV(i) symbolizes integration over the full range of all variables except the i-th, and from -\infty to z_i in the i-th variable.
In addition to the one factor marginals, there are marginals of dimension
k for any k between 1 and n-1. All of these are distributions as well.
2 Some Copulas
Definition 2.1 (Bivariate Independent Copula). The Bivariate Independent Copula is

C(u_1, u_2) = u_1 u_2    (2)
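As a quick numerical sanity check not in the original (Python with numpy, our choice of tools): for independent uniform draws, the joint probability P(U_1 <= u_1, U_2 <= u_2) should match the product u_1 u_2 given by equation (2).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent uniform draws on [0, 1].
u1_draws = rng.uniform(size=n)
u2_draws = rng.uniform(size=n)

# Empirical joint probability P(U1 <= 0.3, U2 <= 0.7).
empirical = np.mean((u1_draws <= 0.3) & (u2_draws <= 0.7))

# The independent copula predicts exactly u1 * u2.
predicted = 0.3 * 0.7
```

The two numbers agree to Monte Carlo accuracy, which is the sense in which equation (2) is the copula of independent variables.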
The next example is from p. 25 of Embrechts, Lindskog and McNeil [1].
Definition 2.2 (Bivariate Normal Copula). The Bivariate Normal Copula is

C^{Ga}_R(u_1, u_2) = \int_{-\infty}^{N^{-1}(u_1; 0, 1)} \int_{-\infty}^{N^{-1}(u_2; 0, 1)} \frac{1}{2\pi (1 - R_{12}^2)^{1/2}} \, e^{-\frac{s^2 - 2 R_{12} s t + t^2}{2 (1 - R_{12}^2)}} \, ds \, dt    (3)
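Equation (3) is the joint standard bivariate normal CDF evaluated at the normal quantiles of u_1 and u_2. A minimal sketch of evaluating it numerically, using scipy (an assumption on our part; the paper specifies no software):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula(u1, u2, r12):
    """Evaluate equation (3) as the bivariate standard normal CDF
    at the standard normal quantiles of u1 and u2."""
    z1, z2 = norm.ppf(u1), norm.ppf(u2)  # N^{-1}(u; 0, 1)
    cov = [[1.0, r12], [r12, 1.0]]
    return float(multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z1, z2]))

# With R12 = 0 the normal copula reduces to the independent copula u1 * u2.
c_indep_val = gaussian_copula(0.3, 0.7, 0.0)

# Positive correlation raises the joint probability above the independent case.
c_dep_val = gaussian_copula(0.3, 0.7, 0.5)
```

The R12 = 0 case reproducing u_1 u_2 is a useful consistency check between Definitions 2.1 and 2.2.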
The next example is from p. 26 of Embrechts, Lindskog and McNeil [1].
Definition 2.3 (Bivariate t-Copula). The Bivariate t-Copula is

C^t_{\nu, R}(u_1, u_2) = \int_{-\infty}^{t_\nu^{-1}(u_1)} \int_{-\infty}^{t_\nu^{-1}(u_2)} \frac{1}{2\pi (1 - R_{12}^2)^{1/2}} \left(1 + \frac{s^2 - 2 R_{12} s t + t^2}{\nu (1 - R_{12}^2)}\right)^{-(\nu+2)/2} \, ds \, dt    (4)
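Equation (4) has no closed form. One hedged way to evaluate it is Monte Carlo via the standard stochastic representation of the bivariate t (a sketch of ours, numpy/scipy assumed): a bivariate t with \nu degrees of freedom is a bivariate normal divided by \sqrt{W/\nu} with W chi-squared(\nu).

```python
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(1)
nu, r12, n = 4.0, 0.5, 200_000

# Stochastic representation of the bivariate t: X = Z / sqrt(W / nu),
# with Z bivariate normal (correlation r12) and W ~ chi-squared(nu).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, r12], [r12, 1.0]], size=n)
w = rng.chisquare(nu, size=n)
x = z / np.sqrt(w / nu)[:, None]

# Transform each margin by the univariate t CDF to get copula samples.
u = student_t.cdf(x, df=nu)

# Monte Carlo estimate of C^t_{nu,R}(0.3, 0.7).
c_est = np.mean((u[:, 0] <= 0.3) & (u[:, 1] <= 0.7))
```

For positive R_{12} the estimate exceeds the independent value 0.3 × 0.7 = 0.21 and sits below the upper bound min(0.3, 0.7) = 0.3.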
3 The Problem
C 3.1 (Problem). I have one factor distributions for a collection of variables but don't have a multifactor distribution.
C 3.2 (Goal). I want a multifactor distribution but I want to keep my
marginal distributions.
4 Uniform Distributions
Definition 4.1 (Uniform). By this we shall mean a random variable with an equal probability to fall in any subinterval of equal size of the interval 0 to 1.
Theorem 4.1 (Uniform's Distribution). The probability that the outcome of a draw from a uniform is between 0 and u, where u is between 0 and 1, is itself u. So if F is the cumulative distribution function, then F(u) = u for u between 0 and 1.
5 Transforming to Uniforms
C 5.1 (Solution Step 1). Transform each of the one factor distributions to be uniforms. This is done by setting u_i = F_i(z_i). The random variable u_i is between 0 and 1 because F_i is.
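Solution Step 1 is the probability integral transform. A minimal sketch in Python with numpy and scipy (tools assumed here, not named in the paper), using a normal marginal with nonzero mean:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Draws z_i from a normal marginal with mean 1 and standard deviation 2.
z = rng.normal(loc=1.0, scale=2.0, size=100_000)

# Solution Step 1: u_i = F_i(z_i) turns each draw into a uniform on [0, 1].
u = norm.cdf(z, loc=1.0, scale=2.0)
```

The transformed sample behaves like a uniform: mean near 1/2, variance near 1/12, and every value inside [0, 1].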
6 Wish
C 6.1 (Wish). We wish we could take the u_i variables and just stick them into different choices for some acceptable joint distribution function.
C 6.2 (Wish Benefits). If our wish comes true, then we can transform the random outcomes u_i, i = 1, ..., n back to z_i by using the inverse of the marginal cumulative distribution functions, z_i = F_i^{-1}(u_i). Note that F_i^{-1}(u_i) is defined for u_i from 0 to 1, but the output variable z_i can vary from minus infinity to plus infinity if that is the range of F_i^{-1}.
C 6.3 (Technical Nit Pick 1). For the general case, F_i^{-1}(u_i) may have multiple values over certain ranges of u_i. In this case, you can pick any of those choices and get an acceptable joint distribution. Any such choice is called a quasi-inverse function.
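One standard choice of quasi-inverse is the generalized inverse, the smallest z with F(z) >= u. A sketch with a hypothetical CDF `cdf_flat` that is flat on [1, 2] (all names here are illustrative, not from the paper), where the inverse image of u = 0.5 is a whole interval:

```python
import numpy as np

def cdf_flat(z):
    """A continuous CDF that is flat on [1, 2]: half the mass is
    uniform on [0, 1] and half is uniform on [2, 3]."""
    z = np.asarray(z, dtype=float)
    out = np.clip(z, 0.0, 1.0) * 0.5          # first half of the mass on [0, 1]
    out = out + np.clip(z - 2.0, 0.0, 1.0) * 0.5  # second half on [2, 3]
    return out

def quasi_inverse(u, grid=np.linspace(-1.0, 4.0, 5001)):
    """One valid quasi-inverse: the smallest grid point z with F(z) >= u."""
    vals = cdf_flat(grid)
    return grid[np.searchsorted(vals, u, side="left")]

# At u = 0.5 the true inverse image is the whole interval [1, 2];
# this particular quasi-inverse picks the left endpoint.
z_half = quasi_inverse(0.5)
```

Picking the right endpoint (or anything in between) would be an equally acceptable quasi-inverse, as the nitpick says.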
C 6.4 (Wish Comes True). We can take the u_i variables and just stick them into different choices for some acceptable joint distribution function called a copula.
7 First Try: Defining Copula from Distribution
Definition 7.1 (Copula). We can take the u_i variables and form them into a joint distribution function C(u_1, ..., u_n) that varies between 0 and 1. We have the interpretation that the joint probability that each u_i is less than or equal to U_i is C(U_1, ..., U_n).
C 7.1 (How do I get a Copula?). We want a recipe that tells us when a function of n variables, each between 0 and 1, is a Copula. We have defined a Copula as a joint probability distribution. Instead we want to define it in terms of a recipe and then have as a theorem that it is a joint cumulative probability distribution. So let's start over with our definition of Copula. We start with definitions of special increasing functions. Those let us build our Copula definition from scratch.
8 2nd Try: Defining Copula with Measure Theory
C 8.1 (Measure Theory Definition of Distribution Function). If we start with measure theory, we can define a joint probability distribution function as a set function with certain properties. We define a set function as one that maps sets to the non-negative real numbers. Measure theory has a triple of objects: a sample space of outcomes, a set of subsets of the sample space, and a set function from each of these subsets to the real numbers. We require that the probability measure or set function satisfy the following for any set in the collection of subsets:
1. The probability of any set is between 0 and 1 inclusive.
2. The probability of the null set is 0.
3. The probability of the entire sample space is 1.
4. The probability of a countable union of disjoint subsets of the sample space equals the sum of their probabilities.
Using measure theory gets us to cumulative distribution functions that are valid. The cumulative distribution function is derived from the set function defined above. This is the more general approach to increasing set functions. In measure theory, we use the subset relation to define an increasing function, instead of a distribution. Measure theory allows us to work with broader sets and to deal with combinations of discrete and continuous probability, or point mass probabilities in the middle of continuous ranges. We can do this with increasing functions by using the Riemann-Stieltjes integral. Measure theory uses the Lebesgue integral, which is already "Stieltjesified."
C 8.2 (Defining Copula from Measure Theory). We now consider a distribution defined on the unit hypercube in n dimensions. Such a distribution is a Copula.
C 8.3 (Special Increasing Functions instead of Measure Theory). Rather than go through the complications of measure theory, we use these special increasing functions. If we don't have technical problems, we avoid having to use the terminology of measure theory, which is more abstract and general. We also get recipes for making joint distribution functions for the continuous outcome case with no point masses, without the fuss of measure theory.
9 Special Increasing Functions
C 9.1 (Special Increasing Function One Dimension). In one dimension we need an increasing function that is:
1. Non-negative
2. That starts at 0 and goes to 1 as we vary the input variable from its minimum value, possibly minus infinity, to its maximum value, possibly positive infinity.
C 9.2 (Special Increasing Function Many Dimensions). In two or more dimensions we need:
1. An increasing function that is
2. non-negative
3. and that starts at 0 when all the variables are at the minimum of their range and increases as we increase any one of them, holding the others constant.
4. If we get to the maximum of all the variables, we want the function to be 1.
5. If we set all the variables but one to their maximums, we want to get a special one dimensional increasing function.
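This checklist can be verified numerically for the Bivariate Independent Copula of Definition 2.1. A small sketch (the function name `c_indep` is our own):

```python
import numpy as np

def c_indep(u1, u2):
    """The independent copula C(u1, u2) = u1 * u2, used to walk the checklist."""
    return u1 * u2

grid = np.linspace(0.0, 1.0, 11)

# Items 3: the function is 0 when all variables are at the minimum of their range.
starts_at_zero = c_indep(0.0, 0.0) == 0.0

# Item 4: the function is 1 at the maximum of all the variables.
is_one_at_max = c_indep(1.0, 1.0) == 1.0

# Item 3: increasing in each argument with the other held constant.
increasing = bool(np.all(np.diff(c_indep(grid, 0.5)) >= 0.0))

# Item 5: with u2 at its maximum, C(u1, 1) = u1, which is itself a special
# one dimensional increasing function (the uniform distribution).
margin_is_identity = bool(np.allclose(c_indep(grid, 1.0), grid))
```

All four flags come out true, so the product function passes the recipe.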
C 9.3 (Special Increasing Functions are Distribution Functions).
Special increasing functions are distribution functions.
10 3rd Try: Defining Copula with Special Increasing Functions
Definition 10.1 (Copula Redefinition). We take a special multivariate increasing function defined on the range of each input variable from 0 to 1. We don't assume these are distribution functions; instead we prove they have the properties of them, i.e. are acceptable for probability theory. If we are measure theorists, this means proving there exist random variables which are defined in measure theory terms and which have a special increasing function as their cumulative distribution function.
C 10.1 (The marginals of a Copula are distributions). The marginals
of a Copula are distributions.
11 Distribution to Copula
We now assume we are using the increasing function definition of a Copula.
Theorem 11.1 (Sklar's Theorem Part 1). Let F(z_1, ..., z_n) be the joint distribution with margins F_i(z_i), and let F_i^{-1}(u_i) be quasi-inverses. Then there exists a copula C(u_1, ..., u_n) given by

C(u_1, u_2, ..., u_n) = F(F_1^{-1}(u_1), F_2^{-1}(u_2), ..., F_n^{-1}(u_n))    (5)

If the F_i are continuous, then C is unique.
If the F_i are not continuous, there are some technicalities that relate to what are called sub-copulas and the range of the corresponding variables.
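Equation (5) can be tried out on a bivariate normal joint distribution. The sketch below (scipy assumed; `copula_from_joint` is our own name) also illustrates that the copula extracted by (5) does not depend on the marginal means and scales:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def copula_from_joint(u1, u2, mu, sigma, r12):
    """Equation (5): C(u1, u2) = F(F1^{-1}(u1), F2^{-1}(u2)) for a
    bivariate normal joint with the given margins and correlation r12."""
    z1 = norm.ppf(u1, loc=mu[0], scale=sigma[0])  # quasi-inverse of margin 1
    z2 = norm.ppf(u2, loc=mu[1], scale=sigma[1])  # quasi-inverse of margin 2
    cov = [[sigma[0] ** 2, r12 * sigma[0] * sigma[1]],
           [r12 * sigma[0] * sigma[1], sigma[1] ** 2]]
    return float(multivariate_normal(mean=mu, cov=cov).cdf([z1, z2]))

# Two different marginal parameterizations, same correlation:
a = copula_from_joint(0.3, 0.7, mu=[0.0, 0.0], sigma=[1.0, 1.0], r12=0.5)
b = copula_from_joint(0.3, 0.7, mu=[5.0, -2.0], sigma=[3.0, 0.5], r12=0.5)
```

The two values agree: the copula captures the dependence structure alone, which is the point of Sklar's theorem.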
12 Copula to Distribution
We continue to assume we are using the increasing function definition of a Copula.
Theorem 12.1 (Sklar's Theorem Part 2). Let C(u_1, ..., u_n) be a Copula and assume that the F_i(z_i) are distribution functions. Then there exists a joint distribution function F(z_1, ..., z_n) given by

F(z_1, ..., z_n) = C(F_1(z_1), F_2(z_2), ..., F_n(z_n))    (6)

and the F_i(z_i) are the marginal distribution functions.
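Equation (6) lets us pair one copula with margins of a quite different shape. A sketch combining the Bivariate Normal Copula of Definition 2.2 with exponential margins (scipy assumed; both helper names are ours), checking that the first margin is preserved:

```python
import numpy as np
from scipy.stats import norm, expon, multivariate_normal

def gaussian_copula(u1, u2, r12):
    """The Bivariate Normal Copula of equation (3)."""
    cov = [[1.0, r12], [r12, 1.0]]
    return float(multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(
        [norm.ppf(u1), norm.ppf(u2)]))

def joint_cdf(z1, z2, r12):
    """Equation (6): F(z1, z2) = C(F1(z1), F2(z2)) with exponential margins."""
    return gaussian_copula(expon.cdf(z1), expon.cdf(z2), r12)

# Sending z2 far into the tail recovers the first margin:
# F(z1, large) is close to F1(z1) = 1 - exp(-z1).
margin_check = joint_cdf(1.0, 20.0, 0.5)
```

The value agrees with expon.cdf(1.0) ≈ 0.6321, so the F_i really are the margins of the constructed joint distribution.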
13 URLs
XIII 1 Embrechts
Paul Embrechts does research in stochastic nance and insurance.
http://www.math.ethz.ch/~embrechts/
We in part follow Chapter 2 of his paper on copulas, Modelling Dependence with Copulas and Applications to Risk Management [1], which is available at the URL:
http://www.math.ethz.ch/~baltes/ftp/copchapter.pdf
XIII 2 Nelsen
An old URL for Nelsen is:
http://www.cs.bsu.edu/~rnelson/
His new one is:
http://www.lclark.edu/~mathsci/nelsen.html
We in part follow Chapter 2 of Nelsen's book [2].
The errata for his book are at
http://www.lclark.edu/~mathsci/errata.pdf
The book's homepage at Springer-Verlag is
http://www.springer-ny.com/detail.tpl?ISBN=0387986235
XIII 3 Venter
A good introduction to applying copulas to reinsurance is by Gary Venter
[3]. This has many good pictures of copulas. This is available at the URL:
http://www.casact.com/coneduc/reinsure/2003/handouts/venter1.pdf
References
[1] Paul Embrechts, Filip Lindskog, and Alexander McNeil. Correlation and dependence in risk management: properties and pitfalls. In M.A.H. Dempster, editor, Risk Management: Value at Risk and Beyond. Cambridge University Press, Cambridge, 2002.
[2] Roger B. Nelsen. An Introduction to Copulas. Springer-Verlag, New
York, 1998.
[3] Gary Venter. Quantifying correlation with copulas. Guy Carpenter,
2003.