Finding Eigenvectors: Fast and Nontraditional Approach

arXiv (2020)

Abstract
Diagonalizing a matrix $A$, that is, finding matrices $P$ and $D$ such that $A = PDP^{-1}$ with $D$ diagonal, requires two steps: first find the eigenvalues and then find the corresponding eigenvectors. We show that the second step is unnecessary when diagonalizing matrices with spectrum $\left|\sigma(A)\right|\leq 2$, since those vectors already appear as nonzero columns of the $\textit{eigenmatrices}$, a term defined in this work. We further generalize this to matrices with $\left|\sigma(A)\right|> 2$ and show that eigenvectors lie in the column spaces of the eigenmatrices of the complementary eigenvalues, an approach that avoids the classical Gauss-Jordan row elimination. We introduce two major results, namely, the $\textit{2-Spectrum Lemma}$ and the $\textit{Eigenmatrix Theorem}$. As a conjecture, we further generalize the Jordan canonical forms for a new class of generalized eigenvectors that are produced by repeated multiples of certain eigenmatrices. We also provide several shortcut formulas for finding eigenvectors that do not use echelon forms. The method discussed in this work may be summarized with the mnemonic "Find your puppy at your neighbors'!", where the puppy is the eigenvector and the neighbors are the complementary eigenmatrices.
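The following is a minimal numerical sketch of the two-eigenvalue idea summarized above, assuming the usual Cayley-Hamilton argument that $(A-\lambda_1 I)(A-\lambda_2 I)=0$ for a diagonalizable matrix with two eigenvalues; the example matrix and variable names are illustrative and not taken from the paper.

```python
import numpy as np

# Sketch of the "2-Spectrum Lemma" claim from the abstract:
# for a diagonalizable A with exactly two eigenvalues lam1 and lam2,
# the nonzero columns of the "eigenmatrix" (A - lam2*I) are already
# eigenvectors for the complementary eigenvalue lam1 (and vice versa),
# since (A - lam1*I)(A - lam2*I) = 0 by Cayley-Hamilton.

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # eigenvalues 5 and 2

lam1, lam2 = 5.0, 2.0
I = np.eye(2)

E2 = A - lam2 * I                 # eigenmatrix for lam2
v = E2[:, 0]                      # any nonzero column of E2

# Check: A v = lam1 v, so v is an eigenvector for the complementary eigenvalue,
# obtained without any row reduction of (A - lam1*I).
print(np.allclose(A @ v, lam1 * v))   # True
```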