I think, if I am not wrong, the condition Cov(x_i, e_i) = 0 (where x_i is the independent variable and e_i is the error term for the ith observation) is the consistency condition of covariance. It is also one of the most important assumptions of regression, so it's nothing but the covariance assumption of regression.
The consistency condition for covariances is covered in pretty great detail on pages 72 and 73 of Reading 6 - Correlations and Copulas (Chapter 11 of Risk Management and Financial Institutions, third edition, Hull). The condition for an N x N variance-covariance matrix Omega to be internally consistent is
w^T * Omega * w >= 0 for all N x 1 vectors w, where w^T is the transpose of w. A matrix that satisfies this property is known as positive-semidefinite.
To ensure that a positive-semidefinite matrix is produced, variances and covariances should be calculated consistently. For example, if variance rates are calculated by giving equal weight to the last m items, the same should be done for covariance rates. If variance rates are updated using an EWMA model with lambda = 0.94, the same should be done for covariance rates.
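If it helps, here is a minimal sketch of how one could test the internal-consistency (positive-semidefinite) condition numerically. The 3x3 matrix below is made up purely for illustration; it is not from Hull.

```python
import numpy as np

# Hypothetical 3x3 variance-covariance matrix (illustrative numbers only).
cov = np.array([
    [0.04, 0.01, 0.02],
    [0.01, 0.09, 0.03],
    [0.02, 0.03, 0.16],
])

# A symmetric matrix is positive-semidefinite iff all of its eigenvalues
# are >= 0, which is equivalent to w^T * cov * w >= 0 for every vector w.
eigenvalues = np.linalg.eigvalsh(cov)
is_psd = bool(np.all(eigenvalues >= -1e-12))  # tolerance for round-off
print(is_psd)  # True for this matrix
```

Checking eigenvalues is just a practical shortcut: testing w^T * cov * w >= 0 for every possible w directly is impossible, but the eigenvalue condition is equivalent for symmetric matrices.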
I beg your pardon, the consistency condition for covariances establishes the formula
Cov(n) = lambda * Cov(n-1) + (1 - lambda) * x(n-1) * y(n-1), i.e. it extends the EWMA formula for variances to covariances as above. @CK2015 I just gave a wrong answer, please ignore my earlier post. I was just not able to recall what it meant; also, the study notes do not cover this in detail.
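To make the update concrete, here is a tiny numerical sketch of that EWMA covariance formula with the RiskMetrics lambda = 0.94 mentioned above. The previous covariance and the daily returns x, y are invented values, just for illustration.

```python
# EWMA covariance update: Cov(n) = lambda * Cov(n-1) + (1 - lambda) * x(n-1) * y(n-1)
lam = 0.94              # RiskMetrics decay factor
prev_cov = 0.000050     # hypothetical covariance estimate from day n-1
x, y = 0.01, -0.005     # hypothetical most recent daily returns of the two assets

new_cov = lam * prev_cov + (1 - lam) * x * y
print(new_cov)
```

Note this is exactly the variance-updating EWMA with the squared return x^2 replaced by the cross-product x * y, which is why using the same lambda for both keeps the matrix consistent.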
Yes, you are right about the covariance updating formula above, which applies to variance updating as well. Good point....was planning to ask David if he is going to have study notes and instructional videos for 'Correlation and copulas' - Chapter 11 of Hull.