# LDA models

#### liewpw05

##### New Member
Hi David,

As I was going through the chapter on the LDA modelling approach to calculating operational risk regulatory/economic capital, I came across the terms "empirical distribution" and "parametric distribution". I'm not too sure what the difference is between the two.

In the context of the reading on "LDA at work", it was mentioned that, in modelling the severity distribution, the body of the distribution, where lots of historical data exists, will be modelled as an empirical distribution, while for the tails, where there is a shortage of data, parametric distributions like the EVT peaks-over-threshold generalised Pareto distribution are used. Hope you could enlighten me.

Thanks

Regards,
Peggy

#### liewpw05

##### New Member
Hi again David,

I have another question on the measurement of operational risk under the Basel II AMA approach. There are three further approaches under AMA, namely: the internal measurement approach, the LDA approach, and the scorecard approach.
Are these three approaches mutually exclusive? Does a bank have to put aside operational risk capital using all three of the AMA approaches above?

Thanks

Regards,
Peggy

#### David Harper CFA FRM

##### David Harper CFA FRM
Staff member
Subscriber
Hi Peggy,

On the first:

Another nice observation! If I had to point to one weakness in the 2008 AIMs (vs. 2007), it would be the tendency to omit some foundational ideas. Last year, there was a much better setup for this in the Quant.

If you roll a six-sided die twice and get, say, a '2' and a '5', and call it a day, this histogram can be an *empirical* PMF: P[1] = 0, P[2] = 50%, P[3] = 0, P[4] = 0, P[5] = 50%, P[6] = 0. Empirical = based on the raw outcomes. As opposed to the corresponding parametric PMF: P[X] = 1/6 for every face.
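The die example above can be sketched in a few lines of Python (my own illustration, not from the reading):

```python
from collections import Counter
from fractions import Fraction

# Two observed rolls of a six-sided die: the '2' and the '5' from the example.
rolls = [2, 5]

# Empirical PMF: probability mass sits only where outcomes were actually observed.
counts = Counter(rolls)
empirical_pmf = {face: counts[face] / len(rolls) for face in range(1, 7)}

# Parametric (a priori) PMF: every face gets 1/6 by assumption; no data needed.
parametric_pmf = {face: Fraction(1, 6) for face in range(1, 7)}

print(empirical_pmf)  # {1: 0.0, 2: 0.5, 3: 0.0, 4: 0.0, 5: 0.5, 6: 0.0}
```

Note both are valid PMFs (each sums to 1), but the empirical one is only as trustworthy as the sample behind it, which is the point of the two-rolls example.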

Gujarati 2.4 contrasts classical (a priori) with empirical (relative frequency) probability. The former "maps" to a parametric distribution; the latter to an empirical distribution.

"...while the tails where there is a shortage of data, parametric distributions like the EVT peaks over threshold"

So this is a key problem in risk measurement, and maybe the key problem in operational risk measurement: we'd love an accurate empirical distribution (P[loss @ 99%] based on the actual historical distribution) for the extreme loss tail, but that is exactly where we lack data, so we probably must settle for parametric. (It is stunning how much research has been devoted to this: finding a parametric distribution where there is no data to springboard from.) We are comfortable using the empirical data/distribution near the center, but we aren't focused there. Much like the two-rolls example above: you don't trust my two-rolls empirical distribution because it's based on so few rolls. If I rolled 100 times, maybe you'd trust it. So, in the extreme tail, where events by definition are "low frequency, high severity", we are pretty much forced to go with parametric distributions. In some respects, I'd be inclined to characterize the whole EVT approach as: "we don't have piles and piles of good data in the tail, so let us graft onto the tail these sets of parametric distributions."
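To see why the empirical distribution breaks down in the tail, here is a small simulation sketch (my own illustration; the lognormal severity and all parameters are assumptions purely for the example). Of 1,000 simulated losses, only ten observations inform everything beyond the 99th percentile:

```python
import random

random.seed(42)

# Simulate 1,000 operational loss severities from a heavy-ish lognormal
# (an assumed severity model, chosen only for illustration).
losses = sorted(random.lognormvariate(10, 2) for _ in range(1000))

# The body: 950 observations lie below the 95th percentile -- plenty of data,
# so an empirical distribution is credible there.
body = losses[:950]

# The extreme tail: only the 10 largest losses inform the 99%-100% region,
# which is exactly where capital estimates (e.g., 99.9% VaR) are needed.
tail_99 = losses[990:]

print(len(body), len(tail_99))  # 950 10
```

With so few points past the 99th percentile, the empirical tail estimate is hostage to a handful of draws, which is the motivation for grafting a parametric (e.g., GPD) tail instead.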

Finally, another idea at work here is that the full (CDF) distribution can be generated by stitching together more than one sub-distribution. In this case, an empirical distribution in the middle gives way to a parametric one for the tail (you can see, as I've said, we may expect the tail to be "forced" into a parametric). Two days ago I blogged about an interesting, helpful paper that takes this typical MIXTURE MODEL approach: http://www.bionicturtle.com/learn/article/operational_loss_dependencies_academic_paper/

Here is an example of a mixture model, where I graft a coin-toss distribution onto a six-sided-die distribution: P[X = 1 to 6] = 50% * 1/6; P[X = 7 or 8] = 50% * 1/2. And now I have made my own sort of fat tail. You are referring to a mixture model where the start is empirical and an EVT parametric is then "grafted" or "stitched" on.
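That die-plus-coin mixture can be written out explicitly (a minimal sketch of the PMF described above):

```python
from fractions import Fraction

half = Fraction(1, 2)

# Mixture PMF: with probability 1/2, draw from the die component (outcomes 1-6);
# with probability 1/2, draw from the coin component (mapped to outcomes 7-8).
mixture_pmf = {x: half * Fraction(1, 6) for x in range(1, 7)}
mixture_pmf.update({x: half * Fraction(1, 2) for x in (7, 8)})

print(mixture_pmf[3])  # 1/12  (50% * 1/6)
print(mixture_pmf[8])  # 1/4   (50% * 1/2)
print(sum(mixture_pmf.values()))  # 1 -- a valid PMF
```

The coin component puts far more mass per outcome (1/4 each) on the "tail" values 7 and 8 than the die puts on 1 through 6 (1/12 each), which is the homemade fat tail.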

The EVT reading this year is far too scant. (Dowd's book has a nice chapter.) The two EVT distributions that Wilmott briefly touches on (GPD, GEV) are, by definition, part of a mixture model: they characterize the tail, so they are attached/grafted onto something else for the middle/body (e.g., CDF P[X] < 60% or 70% or 80%). These EVT distributions attach to the tail: they start at, say, 99%. Again, for CDF P() < 95% or < 99%, one could use a normal distribution; then start an EVT distribution for the CDF region from 99% to 100%. See how the whole EVT distribution is both parametric (no data, so we need a formula) and "mixed" into the tail at the end?
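The stitching described above can be sketched as a spliced CDF: a normal body up to a threshold, with a GPD grafted on beyond it. This is my own minimal illustration, assuming a standard normal body, a 99% attachment point, and arbitrary GPD parameters (xi = 0.3, beta = 1.0); none of these values come from the reading.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def gpd_cdf(y, xi, beta):
    # Generalized Pareto CDF for an exceedance y >= 0 (xi != 0 case).
    return 1.0 - (1.0 + xi * y / beta) ** (-1.0 / xi)

def spliced_cdf(x, u, xi, beta):
    """Normal body up to threshold u; GPD grafted onto the tail beyond u."""
    if x <= u:
        return normal_cdf(x)
    Fu = normal_cdf(u)  # probability mass already "used up" by the body
    return Fu + (1.0 - Fu) * gpd_cdf(x - u, xi, beta)

u = 2.326  # ~99th percentile of the standard normal: the attachment point
print(round(spliced_cdf(u, u, xi=0.3, beta=1.0), 3))      # 0.99 -- body ends here
print(round(spliced_cdf(u + 5, u, xi=0.3, beta=1.0), 4))  # GPD fills in 99%-100%
```

The key design point is the rescaling by (1 - Fu): the GPD only has to describe the conditional distribution of exceedances over the threshold, so the spliced function remains a valid, continuous CDF at the attachment point.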

David

#### David Harper CFA FRM

##### David Harper CFA FRM
Staff member
Subscriber
Hi Peggy,

On the second:

Throughout Basel, the basic/advanced approaches are *generally* exclusive. In Basel, each risk category (credit, market, operational) has basic/standardized versus advanced approaches.

There will be an AIM that refers to Basel's "evolutionary aspect" (which I could point to, except GARP has omitted the Basel AIMs from the list?!!): this refers to Basel's intention that banks start with a basic approach (BIA in OpRisk) and then progress, as they can meet tougher qualifying criteria, to the advanced approach (AMA). (Technically, I understand there are some exceptions: with supervisor approval, a bank may mix AMA with BIA/TSA, but I'll defer to a specialist on such exceptions.) Also, country-specific implementations may vary; e.g., here in the US, regulators have altered the framework to dictate either the advanced approach or a basic version (Basel IA).

But, oops, sorry, I see you are asking within AMA:

Re: internal measurement approach, LDA approach and scorecard approach.

All three are NOT required. It's even less directive than that. IMA, LDA & scorecard were named in the initial framework, but the latest Basel II (Jun 2006) does not name specific AMA OpRisk approaches. The bank selects its own internal method (which, from what I've read, will often be LDA). Why so loose? Because, like each of the advanced approaches, Basel emphasizes the qualifying criteria and the second-pillar supervisory review. To use its own internal method under AMA, a bank must meet many criteria; e.g., five years of data, backtesting.

David

#### john.yam@gmail.com

##### New Member
Hi

To David's response " ...a bank may mix AMA with BIA/TSA but I’ll defer to a specialist on such exceptions)"

In the U.S. there are a handful of banks (core banks) that are required to implement AMA. "Less sophisticated" banks can "opt in" to the AMA approach. The core banks cannot mix AMA with BIA/TSA; they must implement the AMA approach.

Hope this helps.