
FRM EXAM 2007—Q 28: expected annual return

Thread starter #1
Q. Consider two stocks, A and B. Assume their annual returns are jointly normally distributed, the marginal distribution of each stock has a mean of 2% and a standard deviation of 10%, and their correlation is 0.9. What is the expected annual return of stock A if the annual return of stock B is 3%?

a. 2%
b. 2.9%
c. 4.7%
d. 1.1%

Answer: The information in this question can be used to construct a regression model of A on B. We have R(A) = 2% + 0.9*(10%/10%)*[R(B) - 2%] + ε
Next, replacing R(B) by 3% gives the expected R(A) = 2% + 0.9*(3% - 2%) = 2.9%
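For reference, here is a minimal numeric check of the quoted answer (a Python sketch of my own; the variable names are assumptions, not part of the original solution):

mean_a, mean_b = 0.02, 0.02   # both stocks have a 2% mean annual return
sd_a, sd_b = 0.10, 0.10       # both have a 10% standard deviation
rho = 0.9                     # correlation between A and B
b = 0.03                      # observed annual return of stock B

# Conditional mean of A given B under the regression / bivariate-normal model
expected_a = mean_a + rho * (sd_a / sd_b) * (b - mean_b)
print(round(expected_a, 4))   # 0.029, i.e., 2.9%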

Please explain which formula is used to arrive at the answer. I couldn't get it. Please.
 

David Harper CFA FRM
#2
Hi snigdha,

This is a classic FRM question, in that it asks you to apply understanding rather than formulas. You can get there another way that might be more intuitive:

What is the slope, or beta (A with respect to B)?
Slope (A regressed on B) = Beta (A with respect to B) = Covariance(A,B)/Variance(B) = Correlation(A,B)*StdDev(A)*StdDev(B)/Variance(B) = Correlation(A,B)*StdDev(A)/StdDev(B)
Slope (A regressed on B) = 0.9*10%/10% = 0.9
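As an illustration of the covariance route above (my own sketch, with assumed variable names):

sd_a, sd_b, rho = 0.10, 0.10, 0.9

cov_ab = rho * sd_a * sd_b   # Covariance(A,B) = 0.009
var_b = sd_b ** 2            # Variance(B) = 0.01
slope = cov_ab / var_b       # beta of A with respect to B
print(round(slope, 4))       # 0.9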

Then, you want to know that every OLS regression line passes through the point of averages (mean B, mean A):
... in more familiar terms, average Y = b1 + b2*average X (always!)

A regressed on B: average A = intercept + slope*average B
(i.e., average A = intercept + beta(A with respect to B)*average B)
In this case,
Intercept = average A - slope*average B = 2% - 0.9*2% = 0.2%

Regression (A on B): A = 0.2% intercept + 0.9 slope * B.
E[A|B=3%] = 0.2% + 0.9*3% = 2.9%
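These two steps, sketched numerically (a hypothetical snippet of mine, not part of the original post):

mean_a, mean_b, slope = 0.02, 0.02, 0.9

intercept = mean_a - slope * mean_b        # 2% - 0.9*2% = 0.2%
e_a_given_b = intercept + slope * 0.03     # condition on B = 3%
print(round(intercept, 4), round(e_a_given_b, 4))   # 0.002 0.029, i.e., 0.2% and 2.9%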

...some juicy fundamentals in this simple question. If this makes sense, the given answer is also instructive, as it uses the line to condition from the mean of B.

David
 
Thread starter #3
Thanks for the reply, David.
But in your explanation:

First,
average A = intercept + beta(A with respect to B)*average B is used to arrive at the calculation for the intercept, and then
the same formula is modified to
Regression (A on B): A = 0.2% intercept + 0.9 slope * B
and
E[A|B] is calculated?
This is confusing to me. Can you please guide me?
 

David Harper CFA FRM
#4
Hi snigdha,

The beta (A with respect to B) is just a reminder that it is the same as the regression slope (A on B); in more familiar terms:
Y = mx + b, where m is the slope, and the key fundamental is:
m (slope) = beta (y with respect to x) = cov(x,y)/var(x) = correlation(x,y)*StdDev(y)/StdDev(x)

So we just have: average A = intercept + slope(A on B)* average B; a function of averages

Then using the regular regression

A(i) = intercept + slope*B(i) = 0.2% + 0.9*3%; i.e., this is the regression

It is the same thing to use:
A(i) = E[A|B] = intercept + slope*B(i) = 0.2% + 0.9*3%
E[A|B] because the dependent/explained variable (i.e., A) is a conditional mean: what is A conditional on a value of B? It is a value on the OLS line.
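To see that E[A|B] is literally a point on the fitted OLS line, here is a simulation sketch (my own, using numpy; the sample size and seed are arbitrary assumptions):

import numpy as np

rng = np.random.default_rng(0)
mean = [0.02, 0.02]
cov = [[0.10**2, 0.9 * 0.10 * 0.10],
       [0.9 * 0.10 * 0.10, 0.10**2]]
a, b = rng.multivariate_normal(mean, cov, size=200_000).T

slope, intercept = np.polyfit(b, a, 1)        # OLS of A on B
print(round(slope, 2), round(intercept, 3))   # approximately 0.9 and 0.002
print(round(intercept + slope * 0.03, 3))     # approximately 0.029: the point on the line at B = 3%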

David
 
#5
Hi David,

I understand the solution can be provided using the formula for the bivariate normal distribution, wherein
E[A|B] = E[A] + Beta(A,B)*(B - E[B])

Please let me know if I'm wrong.

Vaishnevi
 

David Harper CFA FRM
#6
@Vaishnevi Yes, sure, totally! I read yours as a simple application of the univariate regression. I solved it with a (simple) univariate regression, relying on the property that the OLS line must pass through {E(B), E(A)} to solve for the intercept, such that, using the regression, we can say that E(A|B = 3%) = intercept + slope*B = α + β(A,B)*B = 0.2% + 0.9*3% = 2.9%

Yours starts with E(A) and "moves up/down the regression line" by Beta(A,B)*(B - E(B)). Yours is equivalent, thanks!

append: in fact, let's prove it:
  • E(A|B) = E(A) + β(A,B)*(B - E(B)); i.e., yours
  • E(A|B) = E(A) + β(A,B)*B - β(A,B)*E(B)
  • E(A|B) = [E(A) - β(A,B)*E(B)] + β(A,B)*B, but the intercept α = E(A) - β(A,B)*E(B), such that:
  • E(A|B) = α + β(A,B)*B, which is mine. I hope that's interesting!
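A quick symbolic check of this equivalence (my own sketch using sympy, which is not something the thread itself uses):

import sympy as sp

E_A, E_B, B, beta = sp.symbols('E_A E_B B beta')

yours = E_A + beta * (B - E_B)          # E(A|B) = E(A) + beta*(B - E(B))
mine = (E_A - beta * E_B) + beta * B    # E(A|B) = alpha + beta*B, with alpha = E(A) - beta*E(B)
print(sp.simplify(yours - mine))        # 0, so the two expressions are identical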
 
#7
In retrospect, apologies: I should have given a reference to the chapter from which I picked up this formula.

I was referring to John Hull's 'Correlation and Copulas'. Hull gives an example:
If the marginal distributions of V1 and V2 are normal, it is often assumed that the joint distribution of V1 and V2 is bivariate normal. If we know the correlation between V1 and V2, the expected value of V2 conditional on V1 is given by:
E(V2|V1) = E(V2) + Correlation(V1,V2)*[StdDev(V2)/StdDev(V1)]*(V1 - E(V1))
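Wrapped as a small helper (a hypothetical function of my own; the name and signature are not from Hull), the conditional-mean formula applied to the exam question looks like this:

def conditional_mean_v2_given_v1(v1, mean_v1, mean_v2, sd_v1, sd_v2, corr):
    # E(V2 | V1) = E(V2) + corr * [sd(V2) / sd(V1)] * (V1 - E(V1))
    return mean_v2 + corr * (sd_v2 / sd_v1) * (v1 - mean_v1)

# The exam question, with V1 = stock B (observed at 3%) and V2 = stock A:
print(round(conditional_mean_v2_given_v1(0.03, 0.02, 0.02, 0.10, 0.10, 0.9), 4))   # 0.029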

Yes, it is extremely interesting that the simple regression formula matches the above. In this scenario I have a question: can covariance and correlation (or, say, simple regression) be applied only when the variables are normally distributed?

Thanks
Vaishnevi
 

David Harper CFA FRM
#8
@Vaishnevi good question, but no: in order to predict A conditional on B, E(A|B), in the regression we do not need to assume, nor do we require, that either variable is normal. That would be too stringent a requirement. Now, the linear regression model does make several assumptions, but none of them is that either variable is normal. Rather, we are merely relying on the implied linear relationship (which is also a feature of correlation and covariance). So we do assume a linear relationship, but not normality. In this way, the "prediction" of E(A|B) = 2.9% is a conditional expectation assuming the linear relationship. Thanks!
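To illustrate that normality is not required for the conditional prediction, here is a simulation sketch of my own (deliberately non-normal inputs; numpy, with arbitrary seed and parameters):

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
b = rng.uniform(-0.2, 0.2, size=n)                     # non-normal (uniform) returns for B
noise = rng.exponential(scale=0.05, size=n) - 0.05     # mean-zero but skewed (non-normal) noise
a = 0.002 + 0.9 * b + noise                            # linear relationship, as OLS assumes

slope, intercept = np.polyfit(b, a, 1)
print(round(slope, 2), round(intercept, 3))            # approximately 0.9 and 0.002
print(round(a.mean() - (intercept + slope * b.mean()), 6))   # 0.0: the fitted line still passes through the means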
 