Description
This volume provides in-depth, step-by-step coverage of the most common matrix methods now used in statistical applications, including eigenvalues and eigenvectors, the Moore-Penrose inverse, matrix differentiation, the distribution of quadratic forms, and more. The subject matter is presented in a theorem/proof format, and every effort has been made to ease the transition from one topic to another. Proofs are easy to follow, and the author carefully justifies every step. Accessible even to readers with only a cursory background in statistics, the text uses examples that are familiar and easy to understand. Key features that make this the ideal introduction to matrix analysis theory and practice include self-contained chapters for flexibility in topic choice, extensive examples, chapter-end practice exercises, and optional sections for mathematically advanced readers. The third edition adds a new chapter on inequalities. Several inequalities (e.g., the Cauchy-Schwarz, Hadamard, and Jensen inequalities) already appear in the text, but many important ones were missing, and some of these are presented in the new chapter. A highlight of this chapter is a fairly substantial section on majorization and some of the inequalities that can be developed from this concept. This edition also includes a new section in Chapter 2 on oblique projections and their projection matrices, and a new section in Chapter 3 on antieigenvalues and antieigenvectors. Finally, new theorems, proofs, examples, and problems appear throughout the text.
About the Author
James R. Schott, PhD, is Professor in the Department of Statistics at the University of Central Florida. He has published numerous journal articles in the area of multivariate analysis. Dr. Schott's research interests include multivariate analysis, analysis of covariance and correlation matrices, and dimensionality reduction techniques.
Reading Sample