Tridiagonal matrix

In linear algebra, a tridiagonal matrix is a matrix that is "almost" a diagonal matrix: its nonzero elements appear only on the main diagonal, the first diagonal below it, and the first diagonal above it.

For example, the following matrix is tridiagonal:

\begin{pmatrix}
1 & 4 & 0 & 0 \\
3 & 4 & 1 & 0 \\
0 & 2 & 3 & 4 \\
0 & 0 & 1 & 3
\end{pmatrix}

A determinant formed from a tridiagonal matrix is known as a continuant (Thomas Muir, "A treatise on the theory of determinants," Dover Publications, 1960, pp. 516–525).

Properties

A tridiagonal matrix is of Hessenberg type. Although a general tridiagonal matrix is not necessarily symmetric or Hermitian, many of those that arise when solving linear algebra problems have one of these properties. Furthermore, if a real tridiagonal matrix "A" satisfies a_{k,k+1} a_{k+1,k} > 0 for all "k", so that the signs of its entries are symmetric, then it is similar to a Hermitian matrix and hence has real eigenvalues. The latter conclusion continues to hold if the condition a_{k,k+1} a_{k+1,k} > 0 is replaced by a_{k,k+1} a_{k+1,k} ≥ 0.
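
A minimal sketch of the underlying similarity transformation, assuming NumPy (the helper name symmetrize_tridiagonal is an illustrative choice; the matrix is the example from above, whose off-diagonal products are all positive):

```python
import numpy as np

def symmetrize_tridiagonal(A):
    """Diagonal similarity D^{-1} A D that makes a real tridiagonal A symmetric,
    valid when A[k, k+1] * A[k+1, k] > 0 for every k."""
    n = A.shape[0]
    d = np.ones(n)
    for k in range(n - 1):
        # Choosing (d[k+1]/d[k])^2 = A[k+1, k] / A[k, k+1] makes both
        # off-diagonal entries of D^{-1} A D equal to sqrt(A[k, k+1] * A[k+1, k]).
        d[k + 1] = d[k] * np.sqrt(A[k + 1, k] / A[k, k + 1])
    D = np.diag(d)
    return np.linalg.inv(D) @ A @ D

A = np.array([[1., 4., 0., 0.],
              [3., 4., 1., 0.],
              [0., 2., 3., 4.],
              [0., 0., 1., 3.]])
S = symmetrize_tridiagonal(A)
print(np.allclose(S, S.T))                               # True: S is symmetric
print(np.allclose(np.sort(np.linalg.eigvals(A).real),
                  np.sort(np.linalg.eigvalsh(S))))       # True: same real spectrum
```

Because "A" and S are similar, they share the same eigenvalues, and the symmetry of S forces those eigenvalues to be real.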

The set of all "n × n" tridiagonal matrices forms a (3n − 2)-dimensional vector space.

Many linear algebra algorithms require significantly less computational effort when applied to diagonal matrices, and this improvement often carries over to tridiagonal matrices as well. For instance, the determinant of a tridiagonal matrix "A" of order "n" can be computed by the recursive formula for a continuant:

\det A = a_{n,n} \det [A]_{1,\ldots,n-1} - a_{n,n-1}\, a_{n-1,n} \det [A]_{1,\ldots,n-2},

where \det [A]_{1,\ldots,k} denotes the "k"th principal minor, that is, [A]_{1,\ldots,k} is the submatrix formed by the first "k" rows and columns of "A". The cost of computing the determinant of a tridiagonal matrix using this formula is linear in "n", while the cost is cubic for a general matrix.
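
A minimal sketch of this recurrence in Python (the function name tridiag_det is illustrative and not from the source; NumPy is assumed only for the cross-check):

```python
import numpy as np

def tridiag_det(A):
    """Determinant of a tridiagonal matrix via the continuant recurrence
    f_k = a[k,k] * f_{k-1} - a[k,k-1] * a[k-1,k] * f_{k-2}, with f_0 = 1."""
    n = A.shape[0]
    f_prev2, f_prev1 = 1.0, A[0, 0]                      # f_0 and f_1
    for k in range(1, n):
        f_k = A[k, k] * f_prev1 - A[k, k - 1] * A[k - 1, k] * f_prev2
        f_prev2, f_prev1 = f_prev1, f_k
    return f_prev1

A = np.array([[1., 4., 0., 0.],
              [3., 4., 1., 0.],
              [0., 2., 3., 4.],
              [0., 0., 1., 3.]])
print(tridiag_det(A), np.linalg.det(A))                  # the two values agree
```

Each iteration touches only three entries, so the whole computation is O(n), in contrast with the O(n^3) cost of a general dense determinant.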

Computer programming

A transformation that reduces a general matrix to Hessenberg form will reduce a Hermitian matrix to "tridiagonal" form. Thus, many eigenvalue algorithms, when applied to a Hermitian matrix, reduce the input to tridiagonal form as a first step.
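
As an illustration, here is a sketch using SciPy's general Hessenberg reduction (scipy.linalg.hessenberg); this is one possible way to carry out the step, not a claim about what any particular eigenvalue routine does internally, and the symmetric test matrix is made up:

```python
import numpy as np
from scipy.linalg import hessenberg

# A random real symmetric (hence Hermitian) test matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
H = B + B.T

# For a symmetric input, the Hessenberg form is tridiagonal (up to rounding).
T = hessenberg(H)
off_band = np.abs(np.subtract.outer(np.arange(5), np.arange(5))) > 1
print(np.allclose(T[off_band], 0.0))    # True: only three diagonals remain
```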

A "tridiagonal matrix" can also be stored more efficiently than a general matrix by using a special storage scheme. For instance, the LAPACK Fortran package stores an unsymmetric tridiagonal matrix of order "n" in three one-dimensional arrays, one of length "n" containing the diagonal elements, and two of length "n" − 1 containing the subdiagonal and superdiagonal elements.
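
A small illustration of this kind of compact layout (this mimics the three-array storage scheme, not LAPACK's actual interface):

```python
import numpy as np

A = np.array([[1., 4., 0., 0.],
              [3., 4., 1., 0.],
              [0., 2., 3., 4.],
              [0., 0., 1., 3.]])

d  = np.diag(A)        # length n:     main diagonal  -> [1., 4., 3., 3.]
dl = np.diag(A, -1)    # length n - 1: subdiagonal    -> [3., 2., 1.]
du = np.diag(A, 1)     # length n - 1: superdiagonal  -> [4., 1., 4.]
# 3n - 2 numbers in total instead of n^2, matching the dimension count above.
```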

A linear system "A"x = b with a tridiagonal coefficient matrix "A" and b ∈ ℝ^n can be solved by a specialized algorithm requiring only "O(n)" operations (Golub and Van Loan 1996).
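
A minimal sketch of such an O(n) solver (a simplified Thomas-style elimination without pivoting; the function name is illustrative and the right-hand side is made up):

```python
import numpy as np

def solve_tridiagonal(dl, d, du, b):
    """Solve A x = b for a tridiagonal A given as three arrays:
    dl (subdiagonal, length n-1), d (main diagonal, length n),
    du (superdiagonal, length n-1). Forward elimination followed by
    back substitution, O(n) operations; zero pivots are not handled."""
    n = len(d)
    d = np.asarray(d, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    for i in range(1, n):
        w = dl[i - 1] / d[i - 1]                         # elimination multiplier
        d[i] -= w * du[i - 1]
        b[i] -= w * b[i - 1]
    x = np.empty(n)
    x[-1] = b[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - du[i] * x[i + 1]) / d[i]
    return x

# The example matrix from above in three-array form, with a made-up right-hand side.
dl = np.array([3., 2., 1.])
d  = np.array([1., 4., 3., 3.])
du = np.array([4., 1., 4.])
b  = np.array([1., 2., 3., 4.])

x = solve_tridiagonal(dl, d, du, b)
A = np.diag(d) + np.diag(dl, -1) + np.diag(du, 1)
print(np.allclose(A @ x, b))                             # True
```

In practice one would normally call a library routine for this (LAPACK provides tridiagonal solvers, for example) rather than hand-rolling the elimination.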

References

* Roger A. Horn and Charles R. Johnson, "Matrix Analysis," Cambridge University Press, 1985. ISBN 0-521-38632-2.
* Gene H. Golub and Charles F. Van Loan, "Matrix Computations," 3rd ed., Johns Hopkins University Press, 1996. ISBN 0-8018-5414-8.

External links

* [http://www.netlib.org/lapack/lug/node125.html Tridiagonal and Bidiagonal Matrices] in the LAPACK manual.
* [http://math.fullerton.edu/mathews/n2003/Tri-DiagonalMod.html Module for Tri-Diagonal Linear Systems]

