1. Suppose we observe a sample of n outcomes yᵢ and covariates xᵢ, and assume the usual simple linear regression model: Yᵢ = β₀ + β₁xᵢ + εᵢ, with εᵢ iid ~ N(0, σ²), for i = 1, 2, ..., n, and we want to compute the least squares (LS) estimators (β̂₀, β̂₁) along with corresponding 95% confidence intervals as we did in class.
(a) If the equal variance assumption (i.e., homoskedasticity) does not hold, are our LS estimators still unbiased? Explain.
(b) If the equal variance assumption does not hold, are our confidence intervals still valid? Explain.
(c) If the independence assumption does not hold, are our LS estimators still unbiased? Explain.
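One way to build intuition for parts (a) and (b) is a quick Monte Carlo sketch: simulate the model with heteroskedastic errors (all settings below, such as n = 50, β₀ = 1, β₁ = 2, and the variance pattern sd = 0.1 + 3x², are arbitrary choices, not part of the problem), then check whether the LS slope estimate is still centered at the true β₁ and whether the usual 95% confidence interval still covers it 95% of the time.

```python
import numpy as np

rng = np.random.default_rng(0)
n, b0, b1, reps = 50, 1.0, 2.0, 5000
x = np.linspace(0, 1, n)
sd = 0.1 + 3.0 * x**2            # error SD grows with x: heteroskedastic
t48 = 2.0106                     # t quantile t_{0.975, n-2} for n = 50

est, covered = [], 0
for _ in range(reps):
    y = b0 + b1 * x + rng.normal(0, sd)   # unequal error variances
    xbar = x.mean()
    sxx = np.sum((x - xbar) ** 2)
    b1_hat = np.sum((x - xbar) * (y - y.mean())) / sxx   # LS slope
    b0_hat = y.mean() - b1_hat * xbar                    # LS intercept
    resid = y - b0_hat - b1_hat * x
    s2 = np.sum(resid**2) / (n - 2)       # usual (homoskedastic) variance estimate
    se = np.sqrt(s2 / sxx)                # usual standard error of the slope
    if abs(b1_hat - b1) <= t48 * se:      # did the 95% CI cover the true slope?
        covered += 1
    est.append(b1_hat)

print(f"mean slope estimate: {np.mean(est):.3f} (true β₁ = {b1})")
print(f"CI coverage: {covered / reps:.3f} (nominal 0.95)")
```

In this simulation the average slope estimate stays close to the true β₁ even though the equal-variance assumption fails, while the empirical coverage of the nominal 95% interval falls noticeably below 0.95, since the usual standard error formula no longer estimates the true variance of β̂₁.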