# P1.T2. Quantitative Analysis

Practice questions for Quantitative Analysis: Econometrics, Monte Carlo simulation (MCS), Volatility, Probability Distributions and VaR (Intro)

1. ### P1.T2.716. Central limit theorem and mixture distributions (Miller, Ch 4)

Learning objectives: Describe the central limit theorem and the implications it has when combining independent and identically distributed (i.i.d.) random variables. Describe i.i.d. random variables and the implications of the i.i.d. assumption when combining random variables. Describe a mixture distribution and explain the creation and characteristics of mixture...
Replies: 0 · Views: 12
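The CLT claim in the snippet above is easy to demonstrate by simulation: the average of n i.i.d. draws concentrates around the population mean with standard deviation σ/√n. A minimal sketch using Uniform(0,1) draws (sample sizes and seed are illustrative, not from the thread):

```python
import random
import statistics

random.seed(42)

# Average n i.i.d. Uniform(0,1) draws; by the CLT the sample mean is
# approximately Normal(mu, sigma^2/n) with mu = 0.5 and sigma^2 = 1/12.
def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

n, trials = 50, 2000
means = [sample_mean(n) for _ in range(trials)]

avg = statistics.fmean(means)          # close to 0.5
sd = statistics.stdev(means)           # close to sqrt((1/12)/n)
theoretical_sd = (1 / 12 / n) ** 0.5
```

The empirical dispersion of the sample means matches the σ/√n prediction, which is the i.i.d. assumption doing its work.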
2. ### P1.T2.715. Chi-squared distribution, Student’s t, and F-distributions (Miller Ch.4)

Learning Objectives: Distinguish the key properties among the following distributions: ... Chi-squared distribution, Student’s t, and F-distributions. Questions: For the following questions, please rely on the statistical lookup tables provided here . This document contains four lookup tables (note that each also contains an example): cumulative standard normal distribution, student's t...
Replies: 0 · Views: 21
3. ### PQ-T2 P1.T2.321. Univariate linear regression (topic review)

@msun09 There may be some intuitions, but for me, β = σ(F,Q)/σ^2(Q) = ρ(F,Q)*σ(F)/σ(Q) is one of the few fundamental relationships that I would memorize. The actual computation of the regression slope is tedious (I have it shown in our learning XLS), but in almost every case, we begin with the concept that the slope coefficient is cov(.)/var(.), or equivalently, because one volatility cancels,...
Replies: 25 · Views: 383
4. ### P1.T2.213. Sample variance, covariance and correlation (Stock & Watson)

Hi David, thanks!
Replies: 10 · Views: 229
5. ### P1.T2.713. Uniform, binomial, Poisson distributions (Miller Ch.4)

Learning Objectives: Distinguish the key properties among the following distributions: uniform distribution, Binomial distribution, Poisson distribution. Questions: 713.1. Let each A_uniform and B_uniform represent independent random uniform variables, where A_uniform falls on the interval between (0, 3) and B_uniform falls on the interval from (4, 10). Which of the following is nearest to...
Replies: 0 · Views: 66
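For the two independent uniforms in question 713.1, the standard moment formulas apply: E[X] = (a+b)/2 and Var[X] = (b−a)²/12, and the moments of the sum add under independence. A sketch (only the intervals come from the snippet; what the question actually asks for is truncated, so this just computes the generic moments):

```python
# Mean and variance of a continuous Uniform(a, b):
# E[X] = (a + b)/2, Var[X] = (b - a)^2 / 12.
def uniform_mean(a, b):
    return (a + b) / 2

def uniform_var(a, b):
    return (b - a) ** 2 / 12

# A_uniform on (0, 3) and B_uniform on (4, 10) are independent,
# so the mean and variance of A + B are the sums of the parts.
mean_sum = uniform_mean(0, 3) + uniform_mean(4, 10)   # 1.5 + 7.0 = 8.5
var_sum = uniform_var(0, 3) + uniform_var(4, 10)      # 0.75 + 3.0 = 3.75
```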
6. ### Quiz-T2 P1.T2.404. Basic Statistics

Of course! I was taking 40/6 ~= 6.67 as the standard deviation rather than the variance. Thank you for the clarification!!
Replies: 7 · Views: 260
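The slip in that reply is the classic variance-versus-standard-deviation mix-up. Taking 40/6 as the variance (per the thread), the standard deviation is its square root:

```python
import math

# If 40/6 is the *variance*, the standard deviation is its square root,
# not 40/6 itself.
variance = 40 / 6              # ≈ 6.667
std_dev = math.sqrt(variance)  # ≈ 2.582
```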
7. ### Quiz-T2 P1.T2.403. Probabilities

@champ123 @David Harper CFA FRM This has been fixed in both the pdf and the quiz.
Replies: 16 · Views: 411
8. ### P1.T2.712. Skew, kurtosis, coskew and cokurtosis (Miller, Chapter 3)

Learning objectives: Describe the four central moments of a statistical variable or distribution: mean, variance, skewness, and kurtosis. Interpret the skewness and kurtosis of a statistical distribution, and interpret the concepts of coskewness and cokurtosis. Describe and interpret the best linear unbiased estimator. Questions: 712.1. Consider the following discrete probability...
Replies: 0 · Views: 31
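The skewness and kurtosis named in those learning objectives are the standardized third and fourth central moments. A minimal sketch of the population-style (biased) estimators, on a made-up sample (the thread's actual probability table is not reproduced):

```python
import statistics

# skew = E[(X-mu)^3]/sigma^3; kurtosis = E[(X-mu)^4]/sigma^4 (normal = 3).
def skew_kurtosis(xs):
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    n = len(xs)
    skew = sum((x - mu) ** 3 for x in xs) / (n * sigma ** 3)
    kurt = sum((x - mu) ** 4 for x in xs) / (n * sigma ** 4)
    return skew, kurt

# A symmetric sample has zero skew; this one has kurtosis below 3 (platykurtic).
s, k = skew_kurtosis([1, 2, 3, 4, 5])
```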
9. ### P1.T2.711. Covariance and correlation (Miller, Ch.3)

Learning objectives: Calculate and interpret the covariance and correlation between two random variables. Calculate the mean and variance of sums of variables. Questions: 711.1. The following probability matrix displays joint probabilities for an inflation outcome, I = {2, 3, or 4}, and an unemployment outcome, U = {5, 7 or 9}. Also shown are the expected values and variances for each...
Replies: 0 · Views: 44
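The covariance-from-a-probability-matrix mechanics behind 711.1 can be sketched as follows. The joint probabilities below are hypothetical stand-ins (the thread's actual matrix is not reproduced); only the outcome sets I = {2, 3, 4} and U = {5, 7, 9} come from the snippet:

```python
# Rows index inflation I = {2, 3, 4}; columns index unemployment U = {5, 7, 9}.
I_vals = [2, 3, 4]
U_vals = [5, 7, 9]
joint = [
    [0.20, 0.05, 0.05],
    [0.05, 0.20, 0.05],
    [0.05, 0.05, 0.30],
]

# Marginal expectations from row/column sums, then cov = E[IU] - E[I]E[U].
E_I = sum(i * sum(row) for i, row in zip(I_vals, joint))
E_U = sum(u * p for row in joint for u, p in zip(U_vals, row))
E_IU = sum(i * u * p for i, row in zip(I_vals, joint) for u, p in zip(U_vals, row))
cov_IU = E_IU - E_I * E_U
```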
10. ### P1.T2.710. Mean and standard deviation (Miller, Ch.3)

Learning objectives: Interpret and apply the mean, standard deviation, and variance of a random variable. Calculate the mean, standard deviation, and variance of a discrete random variable. Interpret and calculate the expected value of a discrete random variable. Questions: 710.1. The following probability matrix contains the joint probabilities for random variables X = {2, 7, or 12} and Y =...
Replies: 0 · Views: 50
11. ### P1.T2.709. Joint probability matrices (Miller Ch.2)

Learning objectives: Distinguish between independent and mutually exclusive events. Define joint probability, describe a probability matrix, and calculate joint probabilities using probability matrices. Define and calculate a conditional probability, and distinguish between conditional and unconditional probabilities. Questions: 709.1. The following probability matrix gives the joint...
Replies: 0 · Views: 41
12. ### Quiz-T2 P1.T2.405. Distributions I

Hi @otcfin Per this recent thread here is my summary table on the use of normal Z versus student's t, the choice essentially depends on whether we know the population variance (i.e., known variance justifies the normal, but unknown variance estimates the population by assuming the sample variance which consumes a d.f. and warrants the more conservative student's t): Re t-statistic degrees...
Replies: 18 · Views: 495
13. ### P1.T2.221. Joint null hypothesis in multiple OLS regression (Stock & Watson)

I have just realized that it is in the exercise!
Replies: 16 · Views: 366
14. ### P1.T2.704. Forecasting volatility with GARCH (Hull)

Thanks - much appreciate the extra detail, David. I knew that omega = weight * long-run variance. What was confusing me generally was knowing from the question (whether variance or covariance/correlation omega) when to do the omega -> variance-only conversion, i.e., when to leave omega as is or split it up
Replies: 6 · Views: 102
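The omega conversion discussed in that thread is just GARCH(1,1)'s ω = γ·V_L with γ = 1 − α − β, so dividing ω by (1 − α − β) recovers the long-run variance. A sketch with illustrative parameters (α, β, and V_L below are assumptions, not the thread's question values):

```python
# GARCH(1,1): sigma^2_n = omega + alpha * u^2_{n-1} + beta * sigma^2_{n-1},
# where omega = (1 - alpha - beta) * V_L and V_L is the long-run variance.
alpha, beta = 0.10, 0.85
V_L = 0.0002                       # long-run (unconditional) variance, assumed
omega = (1 - alpha - beta) * V_L   # omega is "weight * long-run variance"

def garch_update(u_prev, var_prev):
    return omega + alpha * u_prev ** 2 + beta * var_prev

# Reversing the conversion recovers the long-run variance from omega.
long_run = omega / (1 - alpha - beta)
```

A quick sanity check: if yesterday's return squared and yesterday's variance both equal V_L, the update returns V_L, i.e., the long-run variance is the process's fixed point.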
15. ### P1.T2.703. EWMA versus GARCH volatility (Hull)

Hi [USER=48426]@ That's really just an elaboration on Hull's End of Chapter Question 11.6, and I can't see any problem with it, although it's highly tedious! I entered it into the more generic version (in our learning XLS), see here (and below). It's true that there are two omega (ω) assumptions given, but that looks okay (consistent) with Hull. Please note that GARCH is used twice...
Replies: 6 · Views: 97

Replies: 9 · Views: 141
17. ### P1.T2.502. Covariance updates with EWMA and GARCH(1,1) models (Hull)

Hi @Spinozzi That's a fair observation. I did parrot Hull's language here, such that he does refer to these given assumptions as "current daily volatilities" (see emphasized text below; which is solved above in the XLS snapshot on the column next to BT 502.2). I also cross-checked his usage in OFOD 10th edition and he similarly refers to these assumptions as "current daily volatilities." (e.g.,...
Replies: 23 · Views: 680
18. ### P1.T2.202. Variance of sum of random variables (Stock & Watson)

Hi @Arseniy Semiletenko Good point! In truth, it's a weakness of my question: I wrote this question in 2012 (per the 2xx.x numbering) and, having improved my technique, I would not today write a question that has two valid answers to the self-contained question. It's not a "best practice." It's a corollary of a rule that I've employed in reviewing, and giving feedback on, GARP's own practice...
Replies: 61 · Views: 1,187
19. ### P1.T2.217. Regression coefficients (Stock & Watson)

Hi @Arseniy Semiletenko Good questions, they are related I think. Please note the general form of the test statistic of the regression coefficient is given by (b - β)/se(b) where (b) is the observed regression coefficient and β is the null hypothesis. So in 217.1, where the observed regression coefficient is 1.080 and the null, let's call it β = 1.0, we have the t-stat = (1.080 - 1.0)/SE(b)...
Replies: 16 · Views: 259
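The general test-statistic form quoted in that reply, t = (b − β)/SE(b), is a one-liner. The observed coefficient (b = 1.080) and null (β = 1.0) come from the snippet; the standard error below is a hypothetical stand-in since the original value is truncated:

```python
# t-statistic for a regression coefficient: (observed - null) / standard error.
def t_stat(b, beta_null, se):
    return (b - beta_null) / se

# b = 1.080 and null beta = 1.0 per the thread; SE = 0.05 is assumed.
t = t_stat(1.080, 1.0, 0.05)  # = 1.6 with the assumed SE
```

Note the null is β = 1.0 here, not the usual β = 0, because the question tests whether the coefficient differs from one.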
20. ### PQ-T2 P1.T2.317. Continuous distributions (Topic review)

Hello @Gdb Thank you for pointing this out. I've fixed this error in the study planner. Nicole
Replies: 8 · Views: 411
21. ### P1.T2.706. Bivariate normal distribution (Hull)

@David Harper CFA FRM, makes perfect sense now. thanks for taking the time again.
Replies: 8 · Views: 119
22. ### L1.T2.61 Statistical dependence (Gujarati)

@vsrivast See below. The unconditional (aka, marginal) probability of passing the fundamental test is the probability of arriving at either of the green cells below. From the unconditional perspective of the "beginning" of the tree, there are four possible outcomes (in order from top to bottom): pass tech and pass fund, pass tech and not pass fund, not pass tech and pass fund, not pass tech...
Replies: 7 · Views: 137
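The tree walk in that reply amounts to summing the joint probabilities of the two branches that end in "pass fundamental." A sketch with hypothetical probabilities (the thread's actual numbers are not shown in the snippet):

```python
# P(F) = P(T)*P(F|T) + P(not T)*P(F|not T): sum over both technical outcomes.
p_tech = 0.60             # P(pass technical), assumed
p_fund_given_tech = 0.80  # P(pass fundamental | passed technical), assumed
p_fund_given_not = 0.50   # P(pass fundamental | failed technical), assumed

p_fund = p_tech * p_fund_given_tech + (1 - p_tech) * p_fund_given_not

# The four leaves of the tree (the four outcomes listed above) sum to 1.
leaves = [
    p_tech * p_fund_given_tech,
    p_tech * (1 - p_fund_given_tech),
    (1 - p_tech) * p_fund_given_not,
    (1 - p_tech) * (1 - p_fund_given_not),
]
```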
23. ### PQ-T2 P1.T2.323. Monte Carlo Simulation and GBM (Topic Review)

Oh yah my bad. I must have overlooked something. Thanks for clarification.
Replies: 8 · Views: 310
24. ### L1.T2.103 Weighting schemes to estimate volatility (Hull)

Hi @s3filin Great question and, yes, I am indeed saying that "Beta [in GARCH] is a decay factor and is analogous to lambda in EWMA." Hull actually shows this specifically in Chapter 23.4; I copied it below. In this way, GARCH β is analogous to EWMA λ; and GARCH α is analogous to EWMA's (1-λ) so I would not say--and hopefully did not anywhere say something like "what's lambda for EWMA is...
Replies: 11 · Views: 428
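The analogy in that reply (GARCH β ~ EWMA λ, GARCH α ~ EWMA 1−λ) can be made concrete: EWMA is the special case of GARCH(1,1) with ω = 0. A sketch (the decay factor is the familiar RiskMetrics-style value, chosen for illustration):

```python
# EWMA: sigma^2_n = lam * sigma^2_{n-1} + (1 - lam) * u^2_{n-1}.
# Setting omega = 0, beta = lam, alpha = 1 - lam in GARCH(1,1) reproduces it.
lam = 0.94  # decay factor (illustrative)

def ewma_update(u_prev, var_prev):
    return lam * var_prev + (1 - lam) * u_prev ** 2

def garch_update(u_prev, var_prev, omega=0.0, alpha=1 - lam, beta=lam):
    return omega + alpha * u_prev ** 2 + beta * var_prev
```

With these parameter mappings the two updates coincide, which is exactly the sense in which β plays λ's role as the decay weight on yesterday's variance.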
25. ### PQ-T2 P1.T2.318. Distributional moments (Topic review)

Indeed. Yes, Excel uses the long formula for smaller samples when calculating SKEW. Thank you for this, @David Harper CFA FRM !
Replies: 12 · Views: 224
26. ### PQ-T2 P1.T2.316. Discrete distributions (Topic review)

Hi @FRM Mark It's given in the assumption "316.3. The current price of an asset is S(0) and its future evolution is modeled with a binomial tree. At each node, there is a 62% probability of a (jump-up) increase." Thanks!
Replies: 7 · Views: 251
27. ### P1.T2.311. Probability Distributions III, Miller

Hi @s3filin This is a typical Monte Carlo assumption: that certain risk factors are (at least a little bit) correlated. This would be used any time we want correlated normals in a Monte Carlo Simulation; it's almost not too much to say that independence (i.e., zero correlation) would be the unusual assumption. But it's super-super-easy to generate non-correlated normals, so the point is to...
Replies: 25 · Views: 508
28. ### P1.T2.701. Regression analysis to model seasonality (Diebold)

Many thanks for your lovely comment, Brian. It is nothing special, I guess; I have just read a few textbooks, that's it. The modelling (implementation work) is another kettle of fish. These "goodness of fit" tests are technically quite complex. Again, thanks for your like!
Replies: 11 · Views: 111
29. ### P1.T2.705. Correlation (Hull)

Thank you emilioalzamora and David for such a detailed explanation.
Replies: 13 · Views: 187
30. ### P1.T2.506. Covariance stationary time series (Diebold)

Hi [USER=46018]@ See below I copied Diebold's explanation for partial autocorrelation (which is excellent, in my opinion). If you keep in mind the close relationship between beta and correlation, then you can view this as analogous to the difference between (in a regression) a univariate slope coefficient and a partial multivariate slope coefficient. We can extract correlation by multiplying...
Replies: 6 · Views: 200