The linear regression model with and without the classical assumptions (normality, constant variance, uncorrelated errors), simultaneous inference, residual analysis, and regression diagnostics.

Tentative syllabus:

1. Linear model: projection and least squares estimates (LSE), Gauss-Markov theorem, estimable parameters.
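A minimal numpy sketch of the first topic, on hypothetical data: the LSE solves the normal equations, and the fitted values are the orthogonal projection of y onto the column space of X.

```python
import numpy as np

# Hypothetical small dataset: n = 5 observations, intercept + one regressor.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# LSE: beta_hat solves the normal equations X'X beta = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values are the projection of y onto the column space of X:
# y_hat = H y with hat matrix H = X (X'X)^{-1} X'.
H = X @ np.linalg.solve(X.T @ X, X.T)
y_hat = H @ y

# The projection is idempotent and residuals are orthogonal to columns of X.
assert np.allclose(H @ H, H)
assert np.allclose(X.T @ (y - y_hat), 0.0)
```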

2. Normal linear model: properties of the LSE under normality, tests of linear hypotheses, confidence intervals and regions, prediction.
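As an illustration of the second topic, a sketch (on simulated, hypothetical data) of the t-test and confidence interval for a single coefficient in the normal linear model:

```python
import numpy as np
from scipy import stats

# Simulated hypothetical data; test H0: beta_1 = 0.
rng = np.random.default_rng(42)
n = 30
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)

p = X.shape[1]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)              # unbiased estimate of sigma^2
cov = s2 * np.linalg.inv(X.T @ X)         # estimated Var(beta_hat)

# t-statistic and two-sided p-value for H0: beta_1 = 0.
se1 = np.sqrt(cov[1, 1])
t_stat = beta_hat[1] / se1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - p)

# 95% confidence interval for beta_1.
q = stats.t.ppf(0.975, df=n - p)
ci = (beta_hat[1] - q * se1, beta_hat[1] + q * se1)
```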

3. Submodels, tests of a submodel against the full model, coefficient of determination.
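The submodel F-test and the coefficient of determination can be sketched as follows (hypothetical simulated data; the submodel here is intercept-only):

```python
import numpy as np
from scipy import stats

# Hypothetical example: full model with two regressors vs. intercept-only submodel.
rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)   # x2 is in fact irrelevant

def rss(X, y):
    """Residual sum of squares of the least squares fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta
    return r @ r

X_full = np.column_stack([np.ones(n), x1, x2])
X_sub = np.ones((n, 1))

rss_full, rss_sub = rss(X_full, y), rss(X_sub, y)

# F-test of the submodel against the full model.
df1 = X_full.shape[1] - X_sub.shape[1]
df2 = n - X_full.shape[1]
F = ((rss_sub - rss_full) / df1) / (rss_full / df2)
p_value = stats.f.sf(F, df1, df2)

# Coefficient of determination of the full model.
R2 = 1 - rss_full / rss_sub
```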

4. General linear model and generalized least squares (GLS).
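A sketch of GLS with a known diagonal error covariance (hypothetical weights): the GLS estimate equals OLS applied to the model transformed to homoscedasticity.

```python
import numpy as np

# Hypothetical heteroscedastic setup: Var(e_i) = sigma^2 * w_i with known w_i.
rng = np.random.default_rng(7)
n = 40
x = rng.uniform(1, 5, n)
w = x ** 2                                   # known variance weights
X = np.column_stack([np.ones(n), x])
y = 1.0 + 3.0 * x + rng.normal(0, np.sqrt(w))

# GLS estimate: beta = (X' W^{-1} X)^{-1} X' W^{-1} y, here W = diag(w).
Winv = np.diag(1.0 / w)
beta_gls = np.linalg.solve(X.T @ Winv @ X, X.T @ Winv @ y)

# Equivalently: divide each row by sqrt(w_i) to get a homoscedastic model
# and apply ordinary least squares.
Xs = X / np.sqrt(w)[:, None]
ys = y / np.sqrt(w)
beta_ols_transformed = np.linalg.lstsq(Xs, ys, rcond=None)[0]
```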

5. Parameterizations of numeric and categorical regressors, interpretation of a linear regression model.
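A tiny hypothetical example of one common parameterization of a categorical regressor, reference-level (treatment) contrasts: the intercept estimates the mean of the reference level and the remaining coefficients estimate differences from it.

```python
import numpy as np

# Hypothetical three-level categorical regressor, two observations per level.
levels = np.array(["a", "b", "c", "a", "b", "c"])
y = np.array([1.0, 2.0, 4.0, 1.2, 2.2, 3.8])

# Reference-level (treatment) coding with level "a" as the reference.
d_b = (levels == "b").astype(float)
d_c = (levels == "c").astype(float)
X = np.column_stack([np.ones(len(levels)), d_b, d_c])

beta = np.linalg.lstsq(X, y, rcond=None)[0]
# Interpretation: beta[0] = mean of level "a",
# beta[1] = mean("b") - mean("a"), beta[2] = mean("c") - mean("a").
```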

6. Residual analysis and regression diagnostics: residual plots, standardized, studentized and partial residuals, leverage, outlying and influential observations, selected tests on assumptions of a linear model.
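Leverages and standardized residuals from topic 6 can be computed directly from the hat matrix; a sketch on hypothetical data, including the common screening rule h_ii > 2p/n for high leverage:

```python
import numpy as np

# Hypothetical fit; leverage h_ii is the diagonal of the hat matrix, and the
# standardized (internally studentized) residual divides e_i by s*sqrt(1 - h_ii).
rng = np.random.default_rng(3)
n = 25
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + x + rng.normal(0, 0.2, n)

p = X.shape[1]
H = X @ np.linalg.solve(X.T @ X, X.T)
h = np.diag(H)                       # leverages; they sum to p
resid = y - H @ y
s2 = resid @ resid / (n - p)
standardized = resid / np.sqrt(s2 * (1 - h))

# Common screening rule: flag observations with h_ii > 2p/n as high leverage.
high_leverage = np.where(h > 2 * p / n)[0]
```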

7. Consequences of a problematic regression space, multicollinearity, effect of model misspecification.
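Multicollinearity can be quantified by variance inflation factors and the condition number of the regressor matrix; a sketch with a deliberately near-collinear hypothetical pair of regressors:

```python
import numpy as np

# Hypothetical near-collinear regressors: x2 is almost identical to x1.
rng = np.random.default_rng(9)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(0, 0.01, n)
X = np.column_stack([np.ones(n), x1, x2])

# Variance inflation factor of x1: 1 / (1 - R^2), where R^2 comes from
# regressing x1 on the remaining regressors.
Z = np.column_stack([np.ones(n), x2])
b = np.linalg.lstsq(Z, x1, rcond=None)[0]
r = x1 - Z @ b
R2 = 1 - (r @ r) / ((x1 - x1.mean()) ** 2).sum()
vif_x1 = 1 / (1 - R2)

# Condition number of the regressor matrix as another collinearity measure.
cond = np.linalg.cond(X)
```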

8. Strategies of model building.

9. Selected models of analysis of variance.

10. Simultaneous inference: multiple comparison procedures, methods of Tukey, Hothorn-Bretz-Westfall, confidence bands for the regression function.
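Tukey's all-pairwise comparisons from topic 10 can be sketched by hand for a balanced one-way layout using the studentized range distribution (hypothetical data, three groups of four):

```python
import numpy as np
from scipy import stats

# Hypothetical balanced one-way layout: k = 3 groups, n = 4 per group.
groups = [np.array([5.1, 4.9, 5.3, 5.0]),
          np.array([5.8, 6.1, 5.9, 6.2]),
          np.array([5.2, 5.4, 5.1, 5.3])]
k = len(groups)
n = len(groups[0])
df = k * (n - 1)

means = np.array([g.mean() for g in groups])
s2 = sum(((g - g.mean()) ** 2).sum() for g in groups) / df   # pooled variance

# Simultaneous 95% Tukey intervals: mean_i - mean_j +- q * sqrt(s2 / n)
# in the balanced case, q being the 0.95 quantile of the studentized range.
q = stats.studentized_range.ppf(0.95, k, df)
half_width = q * np.sqrt(s2 / n)
significant = [(i, j) for i in range(k) for j in range(i + 1, k)
               if abs(means[i] - means[j]) > half_width]
```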

11. Maximum likelihood estimates (MLE) in the normal linear model: properties of MLE, relationship to LSE.
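The relationship between the MLE and the LSE in the normal linear model, sketched on hypothetical data: the MLE of beta coincides with the LSE, while the MLE of sigma^2 divides the residual sum of squares by n (and is therefore biased) instead of n - p.

```python
import numpy as np

# Hypothetical normal linear model.
rng = np.random.default_rng(5)
n = 60
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 2.0 - x + rng.normal(size=n)

p = X.shape[1]
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]   # LSE = MLE of beta
rss = np.sum((y - X @ beta_hat) ** 2)

sigma2_mle = rss / n          # maximizes the normal likelihood, biased
s2_unbiased = rss / (n - p)   # usual unbiased estimator
```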

12. Least squares when the classical assumptions are not satisfied: asymptotic properties of the LSE without normality and without homoscedasticity, sandwich (White) estimate of the variance of the LSE, robustness of classical confidence intervals and tests.
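The sandwich (White, HC0) variance estimate from topic 12 can be sketched on hypothetical heteroscedastic data: the classical s^2 (X'X)^{-1} is no longer valid, while (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1} remains consistent.

```python
import numpy as np

# Hypothetical heteroscedastic errors: variance grows with x.
rng = np.random.default_rng(11)
n = 200
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + x + rng.normal(0, 0.5 + x, n)

p = X.shape[1]
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
e = y - X @ beta_hat

# Classical (homoscedastic) estimate of Var(beta_hat).
s2 = e @ e / (n - p)
V_classical = s2 * XtX_inv

# White (HC0) sandwich estimate: bread * meat * bread.
meat = X.T @ (e[:, None] ** 2 * X)
V_sandwich = XtX_inv @ meat @ XtX_inv
```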