    Online Resource
    Berlin, Heidelberg: Springer Berlin Heidelberg
    UID: almahu_9947363273302882
    Format: XV, 348 p. 5 illus., online resource.
    ISBN: 9783662250921
    Series Statement: Springer Texts in Statistics
    Content: Any method of fitting equations to data may be called regression. Such equations are valuable for at least two purposes: making predictions and judging the strength of relationships. Because they provide a way of empirically identifying how a variable is affected by other variables, regression methods have become essential in a wide range of fields, including the social sciences, engineering, medical research and business. Of the various methods of performing regression, least squares is the most widely used. In fact, linear least squares regression is by far the most widely used of any statistical technique. Although nonlinear least squares is covered in an appendix, this book is mainly about linear least squares applied to fit a single equation (as opposed to a system of equations). The writing of this book started in 1982. Since then, various drafts have been used at the University of Toronto for teaching a semester-long course to juniors, seniors and graduate students in a number of fields, including statistics, pharmacology, engineering, economics, forestry and the behavioral sciences. Parts of the book have also been used in a quarter-long course given to Master's and Ph.D. students in public administration, urban planning and engineering at the University of Illinois at Chicago (UIC). This experience and the comments and criticisms from students helped forge the final version.
    Note: 1 Introduction -- 2 Multiple Regression -- 3 Tests and Confidence Regions -- 4 Indicator Variables -- 5 The Normality Assumption -- 6 Unequal Variances -- 7 *Correlated Errors -- 8 Outliers and Influential Observations -- 9 Transformations -- 10 Multicollinearity -- 11 Variable Selection -- 12 *Biased Estimation -- A Matrices -- A.1 Addition and Multiplication -- A.2 The Transpose of a Matrix -- A.3 Null and Identity Matrices -- A.4 Vectors -- A.5 Rank of a Matrix -- A.6 Trace of a Matrix -- A.7 Partitioned Matrices -- A.8 Determinants -- A.9 Inverses -- A.10 Characteristic Roots and Vectors -- A.11 Idempotent Matrices -- A.12 The Generalized Inverse -- A.13 Quadratic Forms -- A.14 Vector Spaces -- Problems -- B Random Variables and Random Vectors -- B.1 Random Variables -- B.1.1 Independent Random Variables -- B.1.2 Correlated Random Variables -- B.1.3 Sample Statistics -- B.1.4 Linear Combinations of Random Variables -- B.2 Random Vectors -- B.3 The Multivariate Normal Distribution -- B.4 The Chi-Square Distributions -- B.5 The F and t Distributions -- B.6 Jacobian of Transformations -- B.7 Multiple Correlation -- Problems -- C Nonlinear Least Squares -- C.1 Gauss-Newton Type Algorithms -- C.1.1 The Gauss-Newton Procedure -- C.1.2 Step Halving -- C.1.3 Starting Values and Derivatives -- C.1.4 Marquardt Procedure -- C.2 Some Other Algorithms -- C.2.1 Steepest Descent Method -- C.2.2 Quasi-Newton Algorithms -- C.2.3 The Simplex Method -- C.2.4 Weighting -- C.3 Pitfalls -- C.4 Bias, Confidence Regions and Measures of Fit -- C.5 Examples -- Problems -- Tables -- References -- Author Index.
    In: Springer eBooks
    Additional Edition: Printed edition: ISBN 9783540972112
    Language: English