# P1.T2. Quantitative Analysis

Practice questions for Quantitative Analysis: Econometrics, MCS, Volatility, Probability Distributions and VaR (Intro)

1. ### P1.T2.713. Uniform, binomial, Poisson distributions (Miller Ch.4)

Learning Objectives: Distinguish the key properties among the following distributions: uniform distribution, Binomial distribution, Poisson distribution. Questions: 713.1. Let A_uniform and B_uniform each represent independent random uniform variables, where A_uniform falls on the interval (0, 3) and B_uniform falls on the interval (4, 10). Which of the following is nearest to...
Replies: 0 · Views: 38
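The teaser above truncates the question, but the uniform-distribution facts it relies on are compact. A minimal Python sketch of the mean and variance of U(a, b), using the two intervals from the excerpt (everything else here is illustrative, not the actual question):

```python
# Mean and variance of a continuous uniform U(a, b):
#   E[X] = (a + b) / 2,  Var[X] = (b - a)^2 / 12
def uniform_mean_var(a, b):
    return (a + b) / 2, (b - a) ** 2 / 12

m_a, v_a = uniform_mean_var(0, 3)    # A_uniform on (0, 3)
m_b, v_b = uniform_mean_var(4, 10)   # B_uniform on (4, 10)

# For independent variables, both means and variances add:
m_sum, v_sum = m_a + m_b, v_a + v_b
print(m_a, v_a)      # 1.5 0.75
print(m_b, v_b)      # 7.0 3.0
print(m_sum, v_sum)  # 8.5 3.75
```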
2. ### P1.T2.712. Skew, kurtosis, coskew and cokurtosis (Miller, Chapter 3)

Learning objectives: Describe the four central moments of a statistical variable or distribution: mean, variance, skewness, and kurtosis. Interpret the skewness and kurtosis of a statistical distribution, and interpret the concepts of coskewness and cokurtosis. Describe and interpret the best linear unbiased estimator. Questions: 712.1. Consider the following discrete probability...
Replies: 0 · Views: 20
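The four central moments named in the learning objective can be computed directly from any discrete distribution. A short sketch with a hypothetical distribution (the outcomes and probabilities are mine, not those of question 712.1):

```python
import numpy as np

# Illustrative discrete distribution: outcomes x with probabilities p.
x = np.array([-2.0, 0.0, 1.0, 3.0])
p = np.array([0.10, 0.40, 0.30, 0.20])

mean = float((x * p).sum())
var = float(((x - mean) ** 2 * p).sum())
sd = var ** 0.5
skew = float(((x - mean) ** 3 * p).sum()) / sd ** 3   # third standardized moment
kurt = float(((x - mean) ** 4 * p).sum()) / sd ** 4   # fourth standardized moment
print(mean, round(var, 4), round(skew, 4), round(kurt, 4))
```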
3. ### P1.T2.711. Covariance and correlation (Miller, Ch.3)

Learning objectives: Calculate and interpret the covariance and correlation between two random variables. Calculate the mean and variance of sums of variables. Questions: 711.1. The following probability matrix displays joint probabilities for an inflation outcome, I = {2, 3, or 4}, and an unemployment outcome, U = {5, 7 or 9}. Also shown are the expected values and variances for each...
Replies: 0 · Views: 34
4. ### P1.T2.710. Mean and standard deviation (Miller, Ch.3)

Learning objectives: Interpret and apply the mean, standard deviation, and variance of a random variable. Calculate the mean, standard deviation, and variance of a discrete random variable. Interpret and calculate the expected value of a discrete random variable. Questions: 710.1. The following probability matrix contains the joint probabilities for random variables X = {2, 7, or 12} and Y =...
Replies: 0 · Views: 35
5. ### P1.T2.709. Joint probability matrices (Miller Ch.2)

Learning objectives: Distinguish between independent and mutually exclusive events. Define joint probability, describe a probability matrix, and calculate joint probabilities using probability matrices. Define and calculate a conditional probability, and distinguish between conditional and unconditional probabilities. Questions: 709.1. The following probability matrix gives the joint...
Replies: 0 · Views: 25
6. ### P1.T2.708. Probability function fundamentals (Miller Ch. 2)

Learning objectives: Describe and distinguish between continuous and discrete random variables. Define and distinguish between the probability density function, the cumulative distribution function, and the inverse cumulative distribution function. Calculate the probability of an event given a discrete probability function. Questions: 708.1. Let f(x) represent a probability function (which...
Replies: 0 · Views: 17
7. ### P1.T2.707. Gaussian Copula (Hull)

Learning objectives: Define copula and describe the key properties of copulas and copula correlation. Explain tail dependence. Describe the Gaussian copula, Student’s t-copula, multivariate copula, and one-factor copula. Questions: 707.1. Below are the joint probabilities for a cumulative bivariate normal distribution with a correlation parameter, ρ, of 0.30. If V(1) and V(2) are each...
Replies: 0 · Views: 55
8. ### P1.T2.706. Bivariate normal distribution (Hull)

@David Harper CFA FRM, makes perfect sense now. thanks for taking the time again.
Replies: 8 · Views: 110
9. ### P1.T2.705. Correlation (Hull)

Thank you emilioalzamora and David for such a detailed explanation.
Replies: 13 · Views: 172
10. ### P1.T2.704. Forecasting volatility with GARCH (Hull)

Thanks - much appreciate the extra detail David. I knew that omega = weight * long-run variance. What was confusing me generally was knowing from the question (whether variance or covariance/ correlation omega), when to do the omega -> variance-only conversion i.e. when to leave omega as is or split it up
Replies: 6 · Views: 87
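The omega-to-variance conversion discussed above can be made concrete. In GARCH(1,1), ω = γ·V_L with γ = 1 − α − β, so the long-run variance is recovered as V_L = ω / (1 − α − β). A sketch with illustrative parameter values (not from the thread):

```python
# GARCH(1,1): var_n = omega + alpha * u_(n-1)^2 + beta * var_(n-1),
# where omega = gamma * V_L and gamma = 1 - alpha - beta.
# Parameter and input values below are illustrative, not from the thread.
omega, alpha, beta = 0.000002, 0.13, 0.86

V_L = omega / (1 - alpha - beta)   # long-run (unconditional) variance
long_run_vol = V_L ** 0.5          # long-run daily volatility, ~1.41%

# One-step variance update, given yesterday's return and variance:
u_prev, var_prev = 0.01, 0.0002
var_next = omega + alpha * u_prev ** 2 + beta * var_prev
print(V_L, long_run_vol, var_next)
```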
11. ### P1.T2.703. EWMA versus GARCH volatility (Hull)

Hi [USER=48426]@ That's really just an elaboration on Hull's End of Chapter Question 11.6, and I can't see any problem with it, although it's highly tedious! I entered it into the more generic version (in our learning XLS), see here (and below) at It's true that there are two omega (ω) assumptions given, but that looks okay (consistent) with Hull. Please note that GARCH is used twice...
Replies: 6 · Views: 81
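One way to see the EWMA-versus-GARCH contrast in this thread: EWMA is the special case of GARCH(1,1) with ω = 0, α = 1 − λ, and β = λ, so it has no long-run variance to revert to. A one-line update with λ = 0.94, RiskMetrics' familiar choice (the return and variance inputs below are illustrative):

```python
# EWMA variance update: var_n = lambda * var_(n-1) + (1 - lambda) * u_(n-1)^2
# Equivalent to GARCH(1,1) with omega = 0, so no mean-reversion term.
lam = 0.94
u_prev, var_prev = 0.01, 0.0002

var_ewma = lam * var_prev + (1 - lam) * u_prev ** 2
print(var_ewma)   # ~0.000194
```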
12. ### P1.T2.699. Linear and nonlinear trends (Diebold)

@David Harper CFA FRM in question 699.2 it is reported "where y(0) is 1990:Q2", which made me consider 1990:Q2 as the intercept, so I set TIME(t) = 79. Shouldn't it be "where y(0) is 1990:Q1", so that we have 3 Q in 1990, 4 Q per year from 1991 to 2009, and 1 Q in 2010: 3 + 4*19 + 1 = 80?
Replies: 10 · Views: 144
13. ### P1.T2.702. Simple (equally weighted) historical volatility (Hull)

Learning objectives: Define and distinguish between volatility, variance rate, and implied volatility. Describe the power law. Explain how various weighting schemes can be used in estimating volatility. Questions: 702.1. Consider the following series of closing stock prices over the ten most recent trading days (this is similar to Hull's Table 10.3) along with daily log returns, squared...
Replies: 0 · Views: 32
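The equally weighted estimator described in question 702.1 can be sketched as follows. The prices below are made up (not Hull's Table 10.3), and the zero-mean, divide-by-m convention follows Hull's simplified estimator:

```python
import math

# Hypothetical closing prices over ten trading days (illustrative only).
prices = [100.0, 101.5, 100.8, 102.2, 101.0, 103.1, 102.5, 104.0, 103.2, 105.0]

# Daily log returns u_i = ln(S_i / S_(i-1)); ten prices give nine returns.
u = [math.log(b / a) for a, b in zip(prices, prices[1:])]
m = len(u)

# Hull's simplified historical estimator assumes a zero mean return and
# divides the sum of squared returns by m (rather than m - 1).
var_simple = sum(x * x for x in u) / m
vol_simple = math.sqrt(var_simple)
print(vol_simple)
```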
14. ### P1.T2.701. Regression analysis to model seasonality (Diebold)

Many thanks for your lovely comment, Brian. It is nothing special, I guess; I have just read a few textbooks, that's it. The modelling (implementation work) is another kettle of fish. These "goodness of fit" tests are technically quite complex. Again, thanks for your like!
Replies: 11 · Views: 100
15. ### P1.T2.700. Seasonality in time series analysis (Diebold)

Learning objective: Describe the sources of seasonality and how to deal with it in time series analysis. Questions: 700.1. Which of the following time series is MOST LIKELY to contain a seasonal pattern? a. Price of solar panels b. Employment participation rate c. Climate data recorded from a weather station once per year d. Return on average assets (ROA) for a large commercial bank...
Replies: 0 · Views: 54
16. ### P1.T2.602. Bootstrapping (Brooks)

a GARCH process is covered in the readings.... Simulations are used to produce samples from distributions that are not parametric or not in "closed form" or, perhaps better, simulations can be used to generate samples from parametric distributions when actual samples are difficult to obtain! Imagine a simulation of earthquakes or flood levels or survival in space.....
Replies: 4 · Views: 128
17. ### P1.T2.601. Variance reduction techniques (Brooks)

Learning objectives: Explain how to use antithetic variate technique to reduce Monte Carlo sampling error. Explain how to use control variates to reduce Monte Carlo sampling error and when it is effective. Describe the benefits of reusing sets of random number draws across Monte Carlo experiments and how to reuse them. Questions: 601.1. Betty is an analyst using Monte Carlo simulation to...
Replies: 0 · Views: 90
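The antithetic variate technique named in the learning objective pairs each standard normal draw z with its mirror −z and averages f(z) and f(−z); the negative covariance between the pair reduces sampling variance. A toy sketch (the target E[exp(Z)], Z ~ N(0,1), is my illustrative choice, not from question 601.1; its true value is exp(0.5) ≈ 1.6487):

```python
import math
import random
import statistics

def f(z):
    return math.exp(z)

random.seed(42)
n = 10_000
plain, antithetic = [], []
for _ in range(n):
    z = random.gauss(0.0, 1.0)
    plain.append(f(z))
    antithetic.append(0.5 * (f(z) + f(-z)))   # average over the antithetic pair

# Same expected value, lower sampling variance for the antithetic estimator.
print(statistics.mean(plain), statistics.mean(antithetic))
print(statistics.variance(plain) > statistics.variance(antithetic))
```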
18. ### P1.T2.600. Monte Carlo simulation, sampling error (Brooks)

Thank you @QuantMan2318 , nice reasoning! @ (cc [USER=27903]@Nicole Manley ) The answer is given correctly as (C), which is false. But there was a typo: consistent with the text given, it should read "In regard to true (A), (B), and (D), ..." You might notice that the explanation itemizes each of the TRUE (A), (B), and (D), specifically:
Replies: 4 · Views: 144
19. ### P1.T2.512. Autoregressive moving average (ARMA) processes (Diebold)

Learning outcomes: Define and describe the properties of the autoregressive moving average (ARMA) process. Describe the application of AR and ARMA processes. Questions: 512.1. Each of the following is a motivation for an autoregressive moving average (ARMA) process EXCEPT which is not? a. AR processes observed subject to measurement error also turn out to be ARMA processes b. When we need...
Replies: 0 · Views: 82
20. ### P1.T2.511. First-order autoregressive, AR(1), process (Diebold)

[USER=38486]@ Yes, if you look at the GARP curriculum for this year, you will see that these learning objectives are still under Topic 2, Reading 16, Diebold, Chapter 8. Thank you, Nicole
Replies: 8 · Views: 163
21. ### P1.T2.510. First-order and general finite-order moving average process, MA(1) and MA(q) (Diebold)

If the roots are real and not complex, I believe.
Replies: 2 · Views: 224
22. ### P1.T2.509. Box-Pierce and Ljung-Box Q-statistics (Diebold)

Hi Joyce, Wonder how I made a mistake - yes, you are right, I was looking at Chi-square 95%, 24 instead of Chi-square 5%, 24 = 36.415! Thanks a tonne Jayanthi
Replies: 3 · Views: 171
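The slip discussed above was reading the chi-square table from the wrong tail: a Ljung-Box (or Box-Pierce) test at the 5% significance level with m = 24 lags rejects when Q exceeds the 95th percentile of chi-square(24). A one-liner, assuming SciPy is available:

```python
from scipy.stats import chi2

# Critical value for a 5% test with 24 degrees of freedom: the 95th
# percentile of the chi-square distribution, not the 5th.
crit = chi2.ppf(0.95, df=24)
print(round(crit, 3))   # 36.415
```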
23. ### P1.T2.508. Wold's theorem (Diebold)

[USER=42750]@ Okay great. No worries, honestly I learn something new almost every time that I take a fresh look at something! Good luck with your studies ...
Replies: 4 · Views: 264
24. ### P1.T2.507. White noise (Diebold)

Learning outcomes: Define white noise, describe independent white noise and normal (Gaussian) white noise. Explain the characteristics of the dynamic structure of white noise. Explain how a lag operator works. Questions: 507.1. In regard to white noise, each of the following statements is true EXCEPT which is false? a. If a process is zero-mean white noise, then it must be Gaussian white...
Replies: 0 · Views: 138
25. ### P1.T2.506. Covariance stationary time series (Diebold)

Hi [USER=46018]@ See below, I copied Diebold's explanation for partial autocorrelation (which is excellent, in my opinion). If you keep in mind the close relationship between beta and correlation, then you can view this as analogous to the difference between (in a regression) a univariate slope coefficient and a partial multivariate slope coefficient. We can extract correlation by multiplying...
Replies: 6 · Views: 189
26. ### P1.T2.505. Model selection criteria (Diebold)

Hi [USER=48426]@ Yes, that would be a nice addition to our learning XLS library so I just quickly started this XLS (image below) It's only 20 datapoints (illustrative) You can change the "actual" Y values (in yellow) and the rest should ripple; I used Excel's array function =TREND() It shows linear (p = 1) up to polynomial p = 4. Notice how the SSR reduces; it must reduce if the fit is...
Replies: 4 · Views: 294
27. ### P1.T2.504. Copulas (Hull)

Hello The practice questions that David writes are focused around the learning objectives in the GARP curriculum, but many times, his questions are more difficult. He writes them at a higher level to ensure that our members understand the concepts in depth. So while this question may be more difficult than the questions that you will see on the exam, the concepts are still testable, as they...
Replies: 25 · Views: 963
28. ### P1.T2.503. One-factor model (Hull)

@hellohi, This is how I have solved: e1=z1= -0.88 e2= pz1 + z2*sqrt(1-p^2) e2= [0.70*(-0.88)] + [0.63*sqrt(1-(0.7)^2) e2= -0.16609 U= Mean + (SD*e1) U= 5 + [3*(-0.88)] U= 2.36 V= Mean + (SD*e2) V= 10 + [6*(-0.16609)] V= 9.00346 Thanks, Rajiv
Replies: 20 · Views: 909
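Rajiv's arithmetic in the excerpt above can be replayed directly: the second error is constructed as e2 = ρ·z1 + z2·√(1 − ρ²) so that it has correlation ρ with e1, and each variable is then rescaled to its own mean and standard deviation. A sketch using exactly the numbers from the post:

```python
import math

# z1 and z2 are independent standard normal samples (from the post);
# e2 is built to have correlation rho with e1 = z1.
rho = 0.70
z1, z2 = -0.88, 0.63

e1 = z1
e2 = rho * z1 + z2 * math.sqrt(1 - rho ** 2)

U = 5 + 3 * e1     # U has mean 5, standard deviation 3
V = 10 + 6 * e2    # V has mean 10, standard deviation 6
print(round(U, 2), round(V, 5))   # 2.36 9.00346
```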
29. ### P1.T2.502. Covariance updates with EWMA and GARCH(1,1) models (Hull)

Hi @Spinozzi That's a fair observation. I did parrot Hull's language here, such that he does refer to these given assumptions as "current daily volatilities" (see emphasized text below; which is solved above in the XLS snapshot in the column next to BT 502.2). I also cross-checked his usage in OFOD 10th edition and he similarly refers to these assumptions as "current daily volatilities." (e.g.,...
Replies: 23 · Views: 665
30. ### P1.T2.501. More Bayes Theorem (Miller)

great - thanks again
Replies: 16 · Views: 407