
Export
  • 1
    Online Resource
    Berlin ; New York : W. de Gruyter
    UID:
    almahu_9947360000702882
    Format: Online resource (xiii, 374 p.), ill.
    Edition: Online ed. Palo Alto, Calif.: ebrary, 2011. Electronic reproduction; available via World Wide Web.
    ISBN: 3110138638 (pbk. : acid-free paper), 3110140306 (acid-free paper), 9783110140309 (acid-free paper), 9783110889765
    Series Statement: De Gruyter textbook
    Content: Parametric Statistical Theory
    Note: Includes bibliographical references (p. [345]-359) and indexes. Contents: Preface; Introduction; Chapter 1 Sufficiency and completeness; 1.1 Introduction; 1.2 Sufficiency and factorization of densities; 1.3 Sufficiency and exhaustivity; 1.4 Minimal sufficiency; 1.5 Completeness; 1.6 Exponential families; 1.7 Auxiliary results on families with monotone likelihood ratios; 1.8 Ancillary statistics; 1.9 Equivariance and invariance; 1.10 Appendix: Conditional expectations, conditional distributions; Chapter 2 The evaluation of estimators; 2.1 Introduction; 2.2 Unbiasedness of estimators; 2.3 The concentration of real valued estimators; 2.4 Concentration of multivariate estimators; 2.5 Evaluating estimators by loss functions; 2.6 The relative efficiency of estimators; 2.7 Examples on the evaluation of estimators; Chapter 3 Mean unbiased estimators and convex loss functions; 3.1 Introduction; 3.2 The Rao-Blackwell-Lehmann-Scheffé Theorem; 3.3 Examples of mean unbiased estimators with minimal convex risk; 3.4 Mean unbiased estimation of probabilities; 3.5 A result on bounded mean unbiased estimators; Chapter 4 Testing hypotheses; 4.1 Basic concepts; 4.2 Critical functions, critical regions; 4.3 The Neyman-Pearson Lemma; 4.4 Optimal tests for composite hypotheses; 4.5 Optimal tests for families with monotone likelihood ratios; 4.6 Tests of Neyman structure; 4.7 Most powerful similar tests for a real parameter in the presence of a nuisance parameter; Chapter 5 Confidence procedures; 5.1 Basic concepts; 5.2 The evaluation of confidence procedures; 5.3 The construction of one-sided confidence bounds and median unbiased estimators; 5.4 Optimal one-sided confidence bounds and median unbiased estimators; 5.5 Optimal one-sided confidence bounds and median unbiased estimators in the presence of a nuisance parameter; 5.6 Examples of maximally concentrated confidence bounds; Chapter 6 Consistent estimators; 6.1 Introduction; 6.2 A general consistency theorem; 6.3 Consistency of M-estimators; 6.4 Consistent solutions of estimating equations; 6.5 Consistency of maximum likelihood estimators; 6.6 Examples of ML estimators; 6.7 Appendix: Uniform integrability, stochastic convergence and measurable selection; Chapter 7 Asymptotic distributions of estimator sequences; 7.1 Limit distributions; 7.2 How to deal with limit distributions; 7.3 Asymptotic confidence bounds; 7.4 Solutions to estimating equations; 7.5 The limit distribution of ML estimator sequences; 7.6 Stochastic approximations to estimator sequences; 7.7 Appendix: Weak convergence; Chapter 8 Asymptotic bounds for the concentration of estimators and confidence bounds; 8.1 Introduction; 8.2 Regular sequences of confidence bounds and median unbiased estimators; 8.3 Sequences of confidence bounds and median unbiased estimators with limit distributions; 8.4 The convolution theorem; 8.5 Maximally concentrated limit distributions; 8.6 Superefficiency; Chapter 9 Miscellaneous results on asymptotic distributions; 9.1 Examples of ML estimators; 9.2 Tolerance bounds.
    Language: English