Abstract: In the teaching and research of linear regression analysis, it is interesting and enlightening to explore how the dependent variable vector can be inner-transformed into the regression coefficient estimator vector from an explicit geometric point of view. As an example, the roadmap of this inner transformation is presented for a simple multiple linear regression model in this work. By applying matrix tools such as the singular value decomposition (SVD) and the Moore-Penrose generalized matrix inverse, the dependent variable vector lands in the space spanned by the right singular vectors of the independent variable matrix and is transformed into the regression coefficient estimator vector through a three-step inner transformation. This work explores the geometric relationship between the dependent variable vector and the regression coefficient estimator vector, and presents a new approach to vector rotation.
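A minimal sketch of the idea the abstract describes, assuming the standard identity that the least-squares estimator equals the Moore-Penrose pseudoinverse applied to the response, β̂ = X⁺y. The synthetic data, the NumPy calls, and the reading of the "three-step" route as project, rescale, and rotate through the SVD factors are illustrative assumptions, not the paper's exact construction.

```python
# Sketch (assumed example): recovering the OLS coefficient estimator from the
# dependent variable vector via the SVD and the Moore-Penrose pseudoinverse.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design matrix X (n observations, p regressors) and response y.
n, p = 20, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Thin SVD of the independent variable matrix: X = U diag(s) V^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Step 1: project y onto the column space of X (coordinates in the U basis).
z = U.T @ y
# Step 2: rescale by the inverse singular values.
w = z / s
# Step 3: rotate into the basis of right singular vectors, yielding the
# regression coefficient estimator.
beta_hat = Vt.T @ w

# The same result follows directly from the Moore-Penrose pseudoinverse.
beta_pinv = np.linalg.pinv(X) @ y
assert np.allclose(beta_hat, beta_pinv)

print(beta_hat)
```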