Copula functions (Meissner)

[email protected]

Active Member
Hi David,

I am unclear on how deep we need to go to cover the GARP requirements on the Gaussian copula function (e.g. 505.3). Also, in trying to get some depth, I wanted to clarify this narrative. I am sure I have gaps in stringing it together.

In terms of building blocks, I have seen that a single-factor CDO model (i.e., the most basic form, I think) can be built from variables representing the market return ("m"), a factor weight a(i) (i.e., the correlation with m), and idiosyncratic risk z(i). We saw a similar model in Part 1. With assumptions on the risk-free rate, recovery rate, and market correlation, this CDO model converts the single-factor output x(i) into a z-value and uses this to come up with a "market survival probability," from which we can solve, using the position's recovery rate, for a time to default. This was the single-factor model version I saw, but there are different flavors:

x(i) = sqrt(a(i))*m + sqrt(1-a(i))*z(i)

The CDO then goes on to be valued by creating a loss distribution based on the above, with a spread calculated from the simulated, present-valued, loss-adjusted value of the various names/tranches.
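To make sure I understand the mechanics, here is how I would sketch that recipe in Python. All parameter values below are made up for illustration, and this is only my reading of the recipe, not the spreadsheet's actual implementation:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

n_names = 5        # hypothetical number of names in the portfolio
a = 0.30           # hypothetical factor weight a(i), same for every name
hazard = 0.02      # hypothetical flat hazard rate per name
horizon = 5.0      # years
n_sims = 100_000

# x(i) = sqrt(a(i))*m + sqrt(1 - a(i))*z(i): one common factor m,
# plus an idiosyncratic shock z(i) per name
m = rng.standard_normal((n_sims, 1))
z = rng.standard_normal((n_sims, n_names))
x = np.sqrt(a) * m + np.sqrt(1.0 - a) * z

# Convert x(i) to a uniform via the standard normal CDF, then invert the
# (here, exponential) marginal default-time distribution: tau = -ln(1-u)/hazard
u = norm.cdf(x)
tau = -np.log(1.0 - u) / hazard

# Fraction of names defaulting by the horizon (zero recovery for simplicity);
# the distribution of this across simulations is what a pricer would tranche up
loss_fraction = (tau <= horizon).mean(axis=1)
print(loss_fraction.mean())  # close to 1 - exp(-hazard*horizon) ≈ 0.095
```

If I have it right, the factor weight changes the shape (the tails) of the loss distribution but not its mean, since the marginals are fixed, which would explain why tranche spreads are so sensitive to it.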

In honesty, I am struggling to tie this back to what I read in the notes. Where, for example, does the portfolio correlation/dependency come from, and where is the creation of the multivariate distribution from the marginals, i.e., the copula magic? Perhaps I have misunderstood/forgotten how the single-factor model works, and the rho value (x(i) versus m) is actually the glue? And the z-value equivalent of x(i) is the multivariate conversion process...?
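For what it's worth, I ran a quick numeric check of my "rho is the glue" guess: since two names share only the common factor m, their x-values should correlate at sqrt(a(i))*sqrt(a(j)). The loadings here are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
a_i, a_j = 0.4, 0.2   # hypothetical factor weights for two names
n = 1_000_000

m = rng.standard_normal(n)  # shared market factor
x_i = np.sqrt(a_i) * m + np.sqrt(1 - a_i) * rng.standard_normal(n)
x_j = np.sqrt(a_j) * m + np.sqrt(1 - a_j) * rng.standard_normal(n)

implied = np.sqrt(a_i * a_j)             # sqrt(0.4 * 0.2) ≈ 0.283
sampled = np.corrcoef(x_i, x_j)[0, 1]    # should agree closely with implied
print(implied, sampled)
```

If that's right, then each x(i) is standard normal and jointly the x's are multivariate normal, so pushing each through the normal CDF and then through the inverse marginal would be exactly the Gaussian-copula step?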

Apologies, this is a model and it is unfair to ask you to review it. Perhaps when I get to credit risk the picture will become clearer... perhaps not (!) In any case, I am struggling to understand the right pitch of knowledge on this complex topic. Please advise if you can.

Thanks


David Harper CFA FRM

Staff member
Subscriber
Hi @[email protected] I moved this out to general T5, if you don't mind. I hope you are well? I've been out a data science conference for four days, so that today, I need to continue to focus on responding to the substantial forum backlog especially as it pertains to basic questions (ie, I just won't have immediate time to review this model). But I will add this to our project manager and review ASAP depending on my core workload here in the support forum. Question: did you say, who is the source of the model? is this yours? (oops, nevermind, you do include "Source: "The Market Standard Model for Valuing CDOs, the one-facator Gaussian Copula model - Benefits and Limitations" in 'The Definitive Guide to CDOs', chapter 8" THANK YOU!)

Re: I am unclear on how deep we need to go to cover the GARP requirements on the Gaussian copula function
Right, it's a little tough to say. The reality is that there is an upper limit on the depth GARP can query here due to (i) time limitations and (ii) their own capability to test the material from a seasoned, expert, yet more-or-less general-audience perspective. Realistically, they just don't have the seasoning, in terms of stress-tested questions and feedback loops, to go deep here, if only because they've yet to ask a deep question according to any pattern. Consequently, it would be highly surprising to see them test something deep here, because it would have no precedent. The short answer, then, is: in practical terms, you definitely do not need to go any deeper than the learning objectives. And notice that, unless I am mistaken, all current copula-related LOs are qualitative (e.g., describe, explain) and exactly none are quantitative (itself an outcome based on some minor iteration). This hints at a limit on the depth, right?

I did add this thread to my backlog, so I hope to come back when I can. Thank you!


silver7

New Member
Hi, @David Harper CFA FRM
Speaking of the Gaussian copula, could you please explain how to compute the following?

(This is a screenshot from page 48 of the Meissner notes.)

I am trying to figure out how to arrive at 3.44%, and also potentially to get a glimpse into how to solve multi-factor equations.

Thanks

David Harper CFA FRM

Staff member
Subscriber
Hi @silver7 That calculation employs Gunter Meissner's copula model; please see this folder in my library: https://www.dropbox.com/sh/b8hgub1dfmvlcwu/AADDjIs75sRsjuOpfsqRc4HRa?dl=0
... his model is the file "Meissner-Ch4-2-asset_default_time_Copula.xlsm"

I also have in this folder John Hull's approximation (bivariate normal), which I implemented based on his Technical Note 5 (in the folder). You will notice it returns the same answer of 3.44%.
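As a generic sketch of the mechanics only (not a replication of the 3.44%, which depends on Meissner's specific inputs): under a bivariate Gaussian copula, the joint default probability is P(both default by T) = N2(N^-1(Q1), N^-1(Q2); rho), where Q1 and Q2 are the marginal default probabilities to T. A Monte Carlo version with made-up inputs:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

q1, q2 = 0.10, 0.08   # hypothetical marginal default probabilities to T
rho = 0.35            # hypothetical Gaussian copula correlation
n = 2_000_000

# Correlated standard normals (Cholesky-style construction)
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# Name k defaults by T iff z_k <= N^-1(q_k); the joint frequency
# estimates N2(N^-1(q1), N^-1(q2); rho)
joint = np.mean((z1 <= norm.ppf(q1)) & (z2 <= norm.ppf(q2)))
print(joint)  # lies between q1*q2 (independence) and min(q1, q2)
```

With positive rho the joint probability exceeds the independence product q1*q2, which is the whole point of the copula's dependence structure.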

willyong

New Member
Hi David

I wanted to flag an issue with the Meissner spreadsheet (v10, I believe), in particular cell Q17, where I think you may have built a custom function. The file on the study planner is an .xlsx, not an .xlsm like the spreadsheet you've attached in Dropbox, which leads to the formula throwing an error. Please refer to the screenshot below:
