
Schur decomposition

In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex square matrix as unitarily equivalent to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix.

Statement

The Schur decomposition reads as follows: if A is an n × n square matrix with complex entries, then A can be expressed as[1][2][3]

A = QUQ−1

for some unitary matrix Q (so that the inverse Q−1 is also the conjugate transpose Q* of Q), and some upper triangular matrix U. This is called a Schur form of A. Since U is similar to A, it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of U.
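The statement can be checked numerically. A minimal sketch using SciPy's scipy.linalg.schur, which computes the complex Schur form (variable names are ours):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Complex Schur form: A = Q U Q*, Q unitary, U upper triangular.
U, Q = schur(A, output="complex")

assert np.allclose(Q @ U @ Q.conj().T, A)        # A = Q U Q*
assert np.allclose(Q.conj().T @ Q, np.eye(4))    # Q is unitary
assert np.allclose(np.tril(U, -1), 0)            # U is upper triangular
# The diagonal entries of U are the eigenvalues of A (in some order).
assert np.allclose(np.sort_complex(np.diag(U)),
                   np.sort_complex(np.linalg.eigvals(A)))
```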

The Schur decomposition implies that there exists a nested sequence of A-invariant subspaces {0} = V0 ⊂ V1 ⊂ ⋯ ⊂ Vn = Cn, and that there exists an ordered orthonormal basis (for the standard Hermitian form of Cn) such that the first i basis vectors span Vi for each i occurring in the nested sequence. Phrased somewhat differently, the first part says that a linear operator J on a complex finite-dimensional vector space stabilizes a complete flag (V1, ..., Vn).
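The invariant-subspace statement can also be observed numerically: with A = QUQ*, the first i columns of Q span Vi. A sketch, assuming SciPy is available (names are ours):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, Q = schur(A, output="complex")

# With A = Q U Q*, the first i columns of Q span an A-invariant
# subspace V_i: since U is upper triangular, A maps span(Q[:, :i])
# into itself.  Check via the orthogonal projector onto V_i.
for i in range(1, 6):
    P = Q[:, :i] @ Q[:, :i].conj().T
    assert np.allclose(P @ (A @ Q[:, :i]), A @ Q[:, :i])
```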

Proof

A constructive proof for the Schur decomposition is as follows: every operator A on a complex finite-dimensional vector space has an eigenvalue λ, corresponding to some eigenspace Vλ. Let Vλ⊥ be its orthogonal complement. It is clear that, with respect to this orthogonal decomposition, A has the matrix representation (one can pick here any orthonormal bases Z1 and Z2 spanning Vλ and Vλ⊥, respectively)

[Z1 Z2]* A [Z1 Z2] = [ λIλ  A12 ; 0  A22 ] : Vλ ⊕ Vλ⊥ → Vλ ⊕ Vλ⊥

where Iλ is the identity operator on Vλ. The above matrix would be upper triangular except for the A22 block. But exactly the same procedure can be applied to the sub-matrix A22, viewed as an operator on Vλ⊥, and its submatrices. Continue this way until the resulting matrix is upper triangular. Since each conjugation increases the dimension of the upper-triangular block by at least one, this process takes at most n steps. Thus the space Cn will be exhausted and the procedure has yielded the desired result.[4]
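The constructive proof translates directly into code. The following is an illustrative NumPy implementation of the deflation argument, not a production algorithm; the function and variable names are ours:

```python
import numpy as np

def schur_by_deflation(A):
    """Schur decomposition by the deflation argument of the proof:
    rotate an eigenvector into the first coordinate, which leaves a
    block form [[lam, A12], [0, A22]], then recurse on A22.
    Returns (Q, U) with A = Q @ U @ Q.conj().T."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex), A.copy()
    # Any eigenvector v spans a one-dimensional invariant subspace.
    _, eigvecs = np.linalg.eig(A)
    v = eigvecs[:, 0]
    # Extend v to an orthonormal basis: QR of [v | e_1 ... e_{n-1}]
    # yields a unitary Q1 whose first column is a multiple of v.
    M = np.column_stack([v, np.eye(n, dtype=complex)[:, : n - 1]])
    Q1, _ = np.linalg.qr(M)
    B = Q1.conj().T @ A @ Q1       # first column is (lam, 0, ..., 0)
    # Recurse on the trailing block, exactly as the proof continues.
    Q2, _ = schur_by_deflation(B[1:, 1:])
    Qn = np.eye(n, dtype=complex)
    Qn[1:, 1:] = Q2
    Q = Q1 @ Qn
    return Q, Q.conj().T @ A @ Q

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
Q, U = schur_by_deflation(A)
assert np.allclose(Q @ U @ Q.conj().T, A)
assert np.allclose(np.tril(U, -1), 0, atol=1e-8)
```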

The above argument can be slightly restated as follows: let λ be an eigenvalue of A, corresponding to some eigenspace Vλ. A induces an operator T on the quotient space Cn/Vλ. This operator is precisely the A22 submatrix from above. As before, T would have an eigenspace, say Wμ ⊂ Cn modulo Vλ. Notice the preimage of Wμ under the quotient map is an invariant subspace of A that contains Vλ. Continue this way until the resulting quotient space has dimension 0. Then the successive preimages of the eigenspaces found at each step form a flag that A stabilizes.

Notes

Although every square matrix has a Schur decomposition, in general this decomposition is not unique. For example, the eigenspace Vλ can have dimension > 1, in which case any orthonormal basis for Vλ would lead to the desired result.

Write the triangular matrix U as U = D + N, where D is diagonal and N is strictly upper triangular (and thus a nilpotent matrix). The diagonal matrix D contains the eigenvalues of A in arbitrary order (hence its Frobenius norm, squared, is the sum of the squared moduli of the eigenvalues of A, while the Frobenius norm of A, squared, is the sum of the squared singular values of A). The nilpotent part N is generally not unique either, but its Frobenius norm is uniquely determined by A (just because the Frobenius norm of A is equal to the Frobenius norm of U = D + N).[5]
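These norm identities are easy to check numerically; a sketch using SciPy's Schur routine (names are ours):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))
U, Q = schur(A, output="complex")

D = np.diag(np.diag(U))    # eigenvalues of A on the diagonal
N = U - D                  # strictly upper triangular, nilpotent

# ||D||_F^2 = sum of squared moduli of the eigenvalues of A.
assert np.isclose(np.linalg.norm(D, "fro") ** 2,
                  np.sum(np.abs(np.linalg.eigvals(A)) ** 2))
# ||A||_F^2 = sum of squared singular values of A.
assert np.isclose(np.linalg.norm(A, "fro") ** 2,
                  np.sum(np.linalg.svd(A, compute_uv=False) ** 2))
# ||N||_F^2 = ||A||_F^2 - ||D||_F^2 is determined by A alone,
# even though N itself is not unique.
assert np.isclose(np.linalg.norm(N, "fro") ** 2,
                  np.linalg.norm(A, "fro") ** 2
                  - np.linalg.norm(D, "fro") ** 2)
```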

It is clear that if A is a normal matrix, then U from its Schur decomposition must be a diagonal matrix and the column vectors of Q are the eigenvectors of A. Therefore, the Schur decomposition extends the spectral decomposition. In particular, if A is positive definite, the Schur decomposition of A, its spectral decomposition, and its singular value decomposition coincide.
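A quick numerical illustration for a Hermitian (hence normal) matrix, assuming SciPy (names are ours):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(4)
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = X + X.conj().T                   # Hermitian, hence normal
U, Q = schur(A, output="complex")

# For a normal matrix the Schur form U is diagonal ...
assert np.allclose(U, np.diag(np.diag(U)))
# ... so A Q = Q U reads column-by-column as A q_k = U_kk q_k:
# the columns of Q are eigenvectors of A.
for k in range(4):
    assert np.allclose(A @ Q[:, k], U[k, k] * Q[:, k])
```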

A commuting family {Ai} of matrices can be simultaneously triangularized, i.e. there exists a unitary matrix Q such that, for every Ai in the given family, Q Ai Q* is upper triangular. This can be readily deduced from the above proof. Take element A from {Ai} and again consider an eigenspace VA. Then VA is invariant under all matrices in {Ai}. Therefore, all matrices in {Ai} must share one common eigenvector in VA. Induction then proves the claim. As a corollary, we have that every commuting family of normal matrices can be simultaneously diagonalized.
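A full implementation of simultaneous triangularization needs the inductive argument above, but the phenomenon is easy to exhibit for a family generated by polynomials in a single matrix, which always commute: the Schur unitary of A triangularizes every polynomial in A. A sketch (names are ours):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = A @ A + 2.0 * A + np.eye(4)      # a polynomial in A, so A B = B A
U, Q = schur(A, output="complex")

# Q* A Q = U is triangular, hence Q* B Q = U^2 + 2 U + I is too:
# one unitary triangularizes every member of this commuting family.
for M in (A, B):
    T = Q.conj().T @ M @ Q
    assert np.allclose(np.tril(T, -1), 0, atol=1e-8)
```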

In the infinite-dimensional setting, not every bounded operator on a Banach space has a nontrivial invariant subspace. However, the upper-triangularization of an arbitrary square matrix does generalize to compact operators. Every compact operator on a complex Banach space has a nest of closed invariant subspaces.

Computation

The Schur decomposition of a given matrix is numerically computed by the QR algorithm or its variants. In other words, the roots of the characteristic polynomial corresponding to the matrix need not be computed beforehand in order to obtain its Schur decomposition. Conversely, the QR algorithm can be used to compute the roots of any given characteristic polynomial by finding the Schur decomposition of its companion matrix. Similarly, the QR algorithm is used to compute the eigenvalues of any given matrix, which are the diagonal entries of the upper triangular matrix of the Schur decomposition. Although the QR algorithm is formally an infinite sequence of operations, convergence to machine precision is practically achieved in O(n³) operations.[6] See the Nonsymmetric Eigenproblems section in the LAPACK Users' Guide.[7]
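A minimal sketch of the unshifted QR iteration (practical implementations first reduce to Hessenberg form and use shifts; this bare version converges for matrices whose eigenvalues have distinct moduli, such as the symmetric example below):

```python
import numpy as np

def qr_iteration(A, steps=500):
    """Unshifted QR iteration: factor A_k = Q_k R_k, then set
    A_{k+1} = R_k Q_k = Q_k* A_k Q_k, so every iterate is unitarily
    similar to A and (when the iteration converges) tends to the
    triangular Schur form."""
    Ak = np.array(A, dtype=float)
    for _ in range(steps):
        Qk, Rk = np.linalg.qr(Ak)
        Ak = Rk @ Qk
    return Ak

# Symmetric test matrix with distinct eigenvalues 3 - sqrt(3), 3, 3 + sqrt(3).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
T = qr_iteration(A)
assert np.allclose(np.tril(T, -1), 0, atol=1e-6)
assert np.allclose(np.sort(np.diag(T)), np.sort(np.linalg.eigvalsh(A)), atol=1e-6)
```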

Applications

Lie theory applications include:

  - Every invertible operator is contained in a Borel group.
  - Every operator fixes a point of the flag manifold.

Generalized Schur decomposition

Given square matrices A and B, the generalized Schur decomposition factorizes both matrices as A = QSZ* and B = QTZ*, where Q and Z are unitary, and S and T are upper triangular. The generalized Schur decomposition is also sometimes called the QZ decomposition.[2]: 375 [8]

The generalized eigenvalues λ that solve the generalized eigenvalue problem Ax = λBx (where x is an unknown nonzero vector) can be calculated as the ratio of the diagonal elements of S to those of T. That is, using subscripts to denote matrix elements, the ith generalized eigenvalue λi satisfies λi = Sii / Tii.
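SciPy exposes the QZ decomposition as scipy.linalg.qz; a sketch verifying the factorization and the diagonal-ratio formula (names are ours):

```python
import numpy as np
from scipy.linalg import qz, eigvals

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Complex QZ: A = Q S Z*, B = Q T Z*, with S and T upper triangular.
S, T, Q, Z = qz(A, B, output="complex")
assert np.allclose(Q @ S @ Z.conj().T, A)
assert np.allclose(Q @ T @ Z.conj().T, B)

# Generalized eigenvalues of A x = lambda B x as diagonal ratios.
lam = np.diag(S) / np.diag(T)
assert np.allclose(np.sort_complex(lam), np.sort_complex(eigvals(A, B)))
```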

References

  1. ^ Horn, R.A. & Johnson, C.R. (1985). Matrix Analysis. Cambridge University Press. ISBN 0-521-38632-2. (Section 2.3 and further at p. 79)
  2. ^ a b Golub, G.H. & Van Loan, C.F. (1996). Matrix Computations (3rd ed.). Johns Hopkins University Press. ISBN 0-8018-5414-8. (Section 7.7 at p. 313)
  3. ^ Schott, James R. (2016). Matrix Analysis for Statistics (3rd ed.). New York: John Wiley & Sons. pp. 175–178. ISBN 978-1-119-09247-6.
  4. ^ Wagner, David. "Proof of Schur's Theorem" (PDF). Notes on Linear Algebra.
  5. ^ Higham, Nick. "What Is a Schur Decomposition?".
  6. ^ Trefethen, Lloyd N.; Bau, David (1997). Numerical Linear Algebra. Philadelphia: Society for Industrial and Applied Mathematics. pp. 193–194. ISBN 0-89871-361-7. OCLC 36084666.
  7. ^ Anderson, E; Bai, Z; Bischof, C; Blackford, S; Demmel, J; Dongarra, J; Du Croz, J; Greenbaum, A; Hammarling, S; McKenny, A; Sorensen, D (1995). LAPACK Users' Guide. Philadelphia, PA: Society for Industrial and Applied Mathematics. ISBN 0-89871-447-8.
  8. ^ Kressner, Daniel (2005). Numerical Methods for General and Structured Eigenvalue Problems. Chapter 2. Springer, LNCSE-46.
