: N. H. Bingham, John M. Fry
: Regression: Linear Models in Statistics
: Springer Verlag London Limited
: 9781848829695
: 1
: CHF 31.70
:
: Other
: English
: 293
: DRM
: PC/MAC/eReader/Tablet
: PDF
Regression is the branch of Statistics in which a dependent variable of interest is modelled as a linear combination of one or more predictor variables, together with a random error. The subject is inherently two- or higher-dimensional, so an understanding of Statistics in one dimension is essential.
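The model described in the blurb can be written compactly; as a sketch (the symbols below are standard notation, not taken from the listing):

```latex
% Linear regression model: response y as a linear combination of
% predictors x_1, ..., x_p, plus a random error term epsilon.
y = \beta_0 + \beta_1 x_1 + \cdots + \beta_p x_p + \varepsilon,
\qquad \varepsilon \sim N(0, \sigma^2).
```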


Regression: Linear Models in Statistics fills the gap between introductory statistical theory and more specialist sources of information. In doing so, it provides the reader with numerous worked examples, and exercises with full solutions.


The book begins with simple linear regression (one predictor variable) and analysis of variance (ANOVA), and then broadens its scope to multiple linear regression (several predictor variables) and analysis of covariance (ANCOVA). The book concludes with special topics such as non-parametric regression and mixed models, time series, spatial processes and design of experiments.
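As a rough illustration of the simplest case mentioned above, simple linear regression with one predictor, a least-squares fit via the normal equations can be sketched as follows (the data and true coefficients here are invented for the example; the book itself uses R for its computations):

```python
import numpy as np

# Invented data: one predictor x, response y = 2 + 3x plus random error.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=x.size)

# Design matrix with a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations (X'X) beta = X'y for the estimates.
beta = np.linalg.solve(X.T @ X, X.T @ y)
b0, b1 = beta
print(f"intercept ~ {b0:.2f}, slope ~ {b1:.2f}")
```

With 50 observations and unit error variance, the estimates land close to the true values 2 and 3.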


Aimed at second- and third-year undergraduates studying Statistics, Regression: Linear Models in Statistics requires a basic knowledge of (one-dimensional) Statistics, as well as Probability and standard Linear Algebra. Possible companions include John Haigh's Probability Models, and T. S. Blyth and E. F. Robertson's Basic Linear Algebra and Further Linear Algebra.
Preface
Contents .... 12
1. Linear Regression .... 16
1.1 Introduction .... 16
1.2 The Method of Least Squares .... 18
1.2.1 Correlation version .... 22
1.2.2 Large-sample limit .... 23
1.3 The origins of regression .... 24
1.4 Applications of regression .... 26
1.5 The Bivariate Normal Distribution .... 29
1.6 Maximum Likelihood and Least Squares .... 36
1.7 Sums of Squares .... 38
1.8 Two regressors .... 41
Exercises .... 43
2. The Analysis of Variance (ANOVA) .... 48
2.1 The Chi-Square Distribution .... 48
2.2 Change of variable formula and Jacobians .... 51
2.3 The Fisher F-distribution .... 52
2.4 Orthogonality .... 53
2.5 Normal sample mean and sample variance .... 54
2.6 One-Way Analysis of Variance .... 57
2.7 Two-Way ANOVA: No Replications
2.8 Two-Way ANOVA: Replications and Interaction .... 67
Exercises .... 71
3. Multiple Regression .... 75
3.1 The Normal Equations .... 75
3.2 Solution of the Normal Equations .... 78
3.3 Properties of Least-Squares Estimators .... 84
3.4 Sum-of-Squares Decompositions .... 87
3.4.1 Coefficient of determination .... 93
3.5 Chi-Square Decomposition .... 94
3.5.1 Idempotence, Trace and Rank .... 95
3.5.2 Quadratic forms in normal variates .... 96
3.5.3 Sums of Projections .... 96
3.6 Orthogonal Projections and Pythagoras's Theorem .... 99
3.7 Worked examples .... 103
Exercises .... 108
4. Further Multilinear Regression .... 112
4.1 Polynomial Regression .... 112
4.1.1 The Principle of Parsimony .... 115
4.1.2 Orthogonal polynomials .... 116
4.1.3 Packages .... 116
4.2 Analysis of Variance .... 117
4.3 The Multivariate Normal Distribution .... 118
4.4 The Multinormal Density .... 124
4.4.1 Estimation for the multivariate normal .... 126
4.5 Conditioning and Regression .... 128
4.6 Mean-square prediction .... 134
4.7 Generalised least squares and weighted regression .... 136
Exercises .... 138
5. Adding additional covariates and the Analysis of Covariance .... 141
5.1 Introducing further explanatory variables .... 141
5.1.1 Orthogonal parameters .... 145
5.2 ANCOVA .... 147
Interactions .... 148
5.2.1 Nested Models .... 151
Update .... 151
Akaike Information Criterion (AIC) .... 152
Step .... 152
5.3 Examples .... 152
Exercises .... 157
6. Linear Hypotheses .... 161
6.1 Minimisation Under Constraints .... 161
6.2 Sum-of-Squares Decomposition and F-Test .... 164
6.3 Applications: Sequential Methods .... 169
6.3.1 Forward selection .... 169
6.3.2 Backward selection .... 170
6.3.3 Stepwise regression .... 171
Exercises .... 172
7. Model Checking and Transformation of Data .... 175
7.1 Deviations from Standard Assumptions .... 175
Residual Plots .... 175
Scatter Plots .... 175
Non-constant Variance .... 176
Unaccounted-for Structure .... 176
Outliers .... 176
Detecting outliers via residual analysis .... 177
Influential Data Points .... 178
Cook's distance .... 179
Non-additive or non-Gaussian errors .... 180
Correlated Errors .... 180
7.2 Transformation of Data .... 180
Dimensional Analysis .... 183
7.3 Variance-Stabilising Transformations .... 183
Taylor's Power Law .... 184
Delta Method .... 185
7.4 Multicollinearity .... 186
Regression Diagnostics .... 189
Exercises .... 189
8. Generalised Linear Models .... 193
8.1 Introduction .... 193
8.2 Definitions and examples .... 195
8.2.1 Statistical testing and model comparisons .... 197
8.2.2 Analysis of residuals .... 199
8.2.3 Athletics times .... 200
8.3 Binary models .... 202
8.4 Count data, contingency tables and log-linear models .... 205
8.5 Over-dispersion and the Negative Binomial Distribution .... 209
8.5.1 Practical applications: Analysis of over-dispersed models in R .... 209
Exercises .... 212
9. Other topics .... 214
9.1 Mixed models .... 214
9.1.1 Mixed models and Generalised Least Squares .... 217
9.2 Non-parametric regression .... 222
9.2.1 Kriging .... 224
9.3 Experimental Design .... 226
9.3.1 Optimality criteria .... 226
9.3.2 Incomplete designs .... 227
9.4 Time series .... 230
9.4.1 Cointegration and spurious regression .... 231
9.5 Survival analysis .... 233
9.5.1 Proportional hazards .... 235
9.6 p .... 235
Solutions .... 237
Dramatis Personae: Who did what when .... 279
Bibliography .... 281
Index .... 288