What's new


  1. Nicole Seaman

    P1.T2.20.20 Regression diagnostics: outliers (Cook's distance), m-fold cross-validation, and residual diagnostics

    Learning objectives: Explain two model selection procedures and how these relate to the bias-variance trade-off. Describe the various methods of visualizing residuals and their relative strengths. Describe methods for identifying outliers and their impact. Determine the conditions under which...
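One of the named techniques, Cook's distance, can be sketched from first principles. The setup below is illustrative (simulated data, not from the thread); it plants one outlier and checks that Cook's distance flags it:

```python
import numpy as np

# Hedged sketch of Cook's distance for an OLS fit (illustrative data).
# D_i = [e_i^2 / (p * s^2)] * h_ii / (1 - h_ii)^2, where h_ii is the
# leverage of observation i, p the number of parameters, s^2 = SSR/(n-p).
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=n)
y[0] += 10.0                                # plant one gross outlier

X = np.column_stack([np.ones(n), x])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
p = X.shape[1]                              # number of estimated parameters
s2 = resid @ resid / (n - p)                # residual variance estimate
H = X @ np.linalg.inv(X.T @ X) @ X.T        # hat (projection) matrix
h = np.diag(H)                              # leverages h_ii
cooks_d = (resid**2 / (p * s2)) * h / (1 - h)**2

print(int(np.argmax(cooks_d)))              # index of the planted outlier
```

A common rule of thumb flags observations with D_i above roughly 4/n for closer inspection.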
  2. Nicole Seaman

    P1.T2.20.19. Regression diagnostics: omitted variables, heteroskedasticity, and multicollinearity

    Learning objectives: Explain how to test whether a regression is affected by heteroskedasticity. Describe approaches to using heteroskedastic data. Characterize multicollinearity and its consequences; distinguish between multicollinearity and perfect collinearity. Describe the consequences of...
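One way to test for heteroskedasticity, White's test, can be sketched in a few lines (simulated data below, not from the thread): regress the squared OLS residuals on the regressor and its square; under homoskedasticity, n·R² of that auxiliary regression is approximately chi-squared with 2 degrees of freedom:

```python
import numpy as np

# Hedged sketch of White's test (illustrative data with error variance
# that grows with x, so the test should reject homoskedasticity).
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 5, size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n) * x   # heteroskedastic errors

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e2 = (y - X @ beta) ** 2                     # squared residuals

Z = np.column_stack([np.ones(n), x, x**2])   # auxiliary regressors
g, *_ = np.linalg.lstsq(Z, e2, rcond=None)
fitted = Z @ g
r2 = 1 - np.sum((e2 - fitted)**2) / np.sum((e2 - e2.mean())**2)
lm = n * r2                                  # White's LM statistic

print(lm > 5.99)  # 5.99 is the 5% critical value of chi^2(2)
```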
  3. Nicole Seaman

    P1.T2.20.17. Hypothesis tests of univariate linear regression model

    Learning objectives: Construct, apply, and interpret hypothesis tests and confidence intervals for a single regression coefficient in a regression. Explain the steps needed to perform a hypothesis test in a linear regression. Describe the relationship between a t-statistic, its p-value, and a...
  4. S

    Regression with one regressor

    Hey David! I had a doubt while reading the chapter "Regression with a Single Regressor" from Schweser. There was a statement that the variance of the slope (beta) decreases with the variance of the explanatory variable. The explanation given was that higher variance of the explanatory (X) variable...
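The statement in the question follows from Var(b1) = σ²/Σ(xᵢ − x̄)²: a more dispersed X enlarges the denominator, so the slope estimate is more precise. A minimal sketch (names and setup are illustrative, not from the thread):

```python
import numpy as np

# Hedged sketch: theoretical OLS slope variance for a known error sd.
# Var(b1) = sigma^2 / sum((x - xbar)^2), so spreading X out shrinks it.
def slope_variance(x, sigma=1.0):
    return sigma**2 / np.sum((x - x.mean())**2)

x_narrow = np.linspace(-1, 1, 100)   # low-dispersion regressor
x_wide = np.linspace(-5, 5, 100)     # same n, five times the spread

print(slope_variance(x_narrow) > slope_variance(x_wide))  # True
```

With five times the spread, the sum of squared deviations is 25 times larger, so the slope variance is 25 times smaller.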
  5. P

    Portfolio Systematic Risk, Breaking it down into factor % contributions

    I have a portfolio (p) of N equities with, let's say, a weights vector (m) at the start of the calculation period. Each equity has its own set of factors (like corresponding country, industry index, etc.), and some of the equities have the same factors. I am trying to break down the systematic risk into...
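One standard way to do this kind of breakdown (a sketch under the assumption of a linear factor model, with made-up loadings, not the poster's data): with Σ = B·F·B′ + D, the portfolio's systematic variance is b′Fb where b = B′w are its factor exposures, and the Euler decomposition attributes bₖ·(Fb)ₖ to factor k:

```python
import numpy as np

# Hedged sketch of a factor % contribution to systematic variance.
# B = asset-by-factor loadings, F = factor covariance, w = weights.
B = np.array([[1.0, 0.2],      # equity 1 loadings on two factors
              [0.8, 0.5],      # equity 2
              [0.3, 1.1]])     # equity 3
F = np.array([[0.04, 0.01],    # factor covariance matrix
              [0.01, 0.09]])
w = np.array([0.5, 0.3, 0.2])  # portfolio weights

b = B.T @ w                    # portfolio factor exposures
sys_var = b @ F @ b            # total systematic variance
contrib = b * (F @ b)          # per-factor Euler contributions
pct = contrib / sys_var        # percentage contributions

print(np.isclose(pct.sum(), 1.0))  # contributions sum to 100%
```

The Euler pieces are additive by construction, which is what makes them usable as percentage contributions.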
  6. Nicole Seaman

    YouTube T2-19b: Regression: Excel's linest array function and its goodness-of-fit measures

    Excel's LINEST array function returns the R^2, SER, and F-ratio, which are interrelated goodness-of-fit measures. David's XLS is here: https://trtl.bz/2vmGHd2
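The interrelation can be replicated outside Excel (a sketch on simulated data, not David's XLS): for a univariate regression, SER = sqrt(SSR/(n−2)) and F = [R²/(1−R²)]·(n−2), which is the same as (ESS/1)/(SSR/(n−2)):

```python
import numpy as np

# Hedged sketch reproducing the LINEST goodness-of-fit trio from scratch.
rng = np.random.default_rng(2)
n = 40
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
ssr = resid @ resid                        # sum of squared residuals
tss = np.sum((y - y.mean())**2)            # total sum of squares
r2 = 1 - ssr / tss                         # R-squared
ser = np.sqrt(ssr / (n - 2))               # standard error of regression
f_ratio = (r2 / (1 - r2)) * (n - 2)        # F with 1 and n-2 d.f.

f_alt = ((tss - ssr) / 1) / (ssr / (n - 2))  # ESS- and SSR-based form
print(np.isclose(f_ratio, f_alt))            # the two forms agree
```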
  7. Nicole Seaman

    YouTube T2-18 Regression: Significance Test of Slope Coefficient

    The test statistic of the slope is given by (b1 - β)/SE(b1), although typically the null hypothesis is H(0): β = 0, such that the test statistic simply divides the regression coefficient by its own standard error (i.e., the standard deviation of the estimate). This is compared to the Student's t...
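The computation described above can be sketched as follows (simulated data, not David's XLS; the true slope of 2.0 should be comfortably significant):

```python
import numpy as np

# Hedged sketch of the slope significance test: t = (b1 - beta_null)/SE(b1),
# with SE(b1) = s / sqrt(sum((x - xbar)^2)) and s the SER.
rng = np.random.default_rng(4)
n = 40
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s = np.sqrt(resid @ resid / (n - 2))             # standard error of regression
se_b1 = s / np.sqrt(np.sum((x - x.mean())**2))   # standard error of the slope
t_stat = (beta[1] - 0.0) / se_b1                 # null H(0): beta = 0

print(abs(t_stat) > 2.02)  # ~5% two-sided critical t with 38 d.f.
```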
  8. Nicole Seaman

    YouTube T2-17 Regression: R-squared

    The R-squared (aka, coefficient of determination) is a goodness-of-fit measure. It gives the percentage of TOTAL variation that is explained by the regression line. Here is David's XLS: https://trtl.bz/2Exyu5c
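A quick numerical check of this definition (simulated data, not David's XLS): R² = 1 − SSR/TSS, and in the univariate case it equals the squared correlation between x and y:

```python
import numpy as np

# Hedged sketch: R^2 as the fraction of total variation explained.
rng = np.random.default_rng(5)
n = 60
x = rng.normal(size=n)
y = 0.5 + 1.5 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
ssr = np.sum((y - X @ beta)**2)   # unexplained (residual) variation
tss = np.sum((y - y.mean())**2)   # TOTAL variation
r2 = 1 - ssr / tss

print(np.isclose(r2, np.corrcoef(x, y)[0, 1]**2))  # True (univariate case)
```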
  9. Nicole Seaman

    YouTube T2-16 Regression: standard error of regression

    The standard error of the regression (SER) is a key measure of the OLS regression line's "goodness of fit." The SER equals the square root of [sum of squared residuals (SSR) divided by the degrees of freedom (d.f.)], where d.f. is the number of observations minus the number of regression...
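The degrees-of-freedom detail matters: a sketch (simulated data, not David's video) showing SER = sqrt(SSR/(n−2)) versus the naive version that divides by n:

```python
import numpy as np

# Hedged sketch: SER uses d.f. = n - 2 (observations minus the two
# estimated coefficients); dividing by n instead understates the error.
rng = np.random.default_rng(6)
n = 30
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
ssr = np.sum((y - X @ beta)**2)    # sum of squared residuals

ser = np.sqrt(ssr / (n - 2))       # degrees-of-freedom corrected
naive = np.sqrt(ssr / n)           # biased: divides by n

print(ser > naive)  # True: the d.f. correction always inflates the estimate
```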
  10. A

    Multiple regression

    Hi, this might be a really simple question :) but I am trying to understand the following statement: The multiple regression model permits estimating the effect on a dependent variable of changing one regressor while holding the other regressors constant. What does it mean to "hold the other...
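One way to make "holding the other regressors constant" concrete (a sketch on simulated data, not from the thread) is the Frisch-Waugh-Lovell result: the multiple-regression coefficient on x1 equals the slope obtained after first stripping out of x1 everything explained by x2:

```python
import numpy as np

# Hedged sketch of Frisch-Waugh-Lovell: the coefficient on x1 in a
# multiple regression uses only the variation in x1 not shared with x2.
rng = np.random.default_rng(3)
n = 300
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)         # correlated regressors
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])  # full multiple regression
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Partial x2 (and the constant) out of x1, then regress y on what remains.
Z = np.column_stack([np.ones(n), x2])
g, *_ = np.linalg.lstsq(Z, x1, rcond=None)
x1_resid = x1 - Z @ g                      # x1 with x2's influence removed
b_fwl = (x1_resid @ y) / (x1_resid @ x1_resid)

print(np.isclose(beta[1], b_fwl))          # True: same coefficient
```

So "holding x2 constant" is not an experiment on the data; it is what the algebra of multiple regression does automatically by using only x1's independent variation.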