Decomposition of the Random Error Vector of a General Linear Model
Keywords:
coefficient matrix; decomposition; least squares; orthogonal complement; projection; vector space
Abstract
This paper deals with the decomposition of the random error vector of a general linear model in order to identify how the error vector is related to the expected value of the observation vector, since the error vector is defined as the deviation of the observation vector from its expected value. The main idea of the paper is that a random error vector can be decomposed into two orthogonal component vectors: one lying in the vector space generated by the coefficient matrix of the unknown parameter vector, and the other lying in its orthogonal complement. As topics related to the decomposition, two things are discussed: partitioning the observation vector and constructing its covariance structure. The paper also shows why a projection method would be preferred over a least squares method. A minimal numerical sketch of this decomposition is given below.
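The following is a minimal numerical sketch of the decomposition described in the abstract, not the authors' own derivation. It assumes a full-column-rank coefficient (design) matrix; the names X, beta, and e are illustrative. The error vector is split into a component in the column space of X and a component in its orthogonal complement using the standard orthogonal projector.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): a general linear model
# y = X @ beta + e with a full-column-rank coefficient matrix X.
rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.standard_normal((n, p))      # coefficient matrix of the parameter vector
beta = np.array([1.0, -2.0, 0.5])    # unknown parameter vector (chosen here)
e = rng.standard_normal(n)           # random error vector
y = X @ beta + e                     # observation vector

# Orthogonal projector onto C(X), the vector space generated by X
P = X @ np.linalg.inv(X.T @ X) @ X.T

# Decompose the error vector into two orthogonal components:
# e1 lies in C(X); e2 lies in the orthogonal complement of C(X).
e1 = P @ e
e2 = (np.eye(n) - P) @ e

print(np.allclose(e, e1 + e2))    # True: the components sum to e
print(np.isclose(e1 @ e2, 0.0))   # True: the components are orthogonal
```

Because the projector is idempotent and symmetric, the two components are orthogonal by construction, which is the property the abstract's decomposition relies on.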
Downloads
- Article PDF
Published
2023-04-13
License
Copyright (c) 2023 Authors and Global Journals Private Limited
This work is licensed under a Creative Commons Attribution 4.0 International License.