Symmetric matrix

In linear algebra, a symmetric matrix is a square matrix A that is equal to its transpose:

:A = A^{\mathrm{T}}.

The entries of a symmetric matrix are symmetric with respect to the main diagonal (top left to bottom right). So if the entries are written as A = (a_{ij}), then a_{ij} = a_{ji} for all indices i and j. The following 3×3 matrix is symmetric:

:\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & -5 \\ 3 & -5 & 6 \end{bmatrix}.
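Symmetry is easy to test numerically; as a minimal sketch assuming NumPy, a matrix is symmetric exactly when it equals its transpose:

```python
import numpy as np

# The symmetric example matrix above.
A = np.array([[1, 2, 3],
              [2, 4, -5],
              [3, -5, 6]])

print(np.array_equal(A, A.T))   # True: A equals its transpose
```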

A matrix is called skew-symmetric or antisymmetric if its transpose is the same as its negative. The following 3×3 matrix is skew-symmetric:

:\begin{bmatrix} 0 & -3 & 4 \\ 3 & 0 & -5 \\ -4 & 5 & 0 \end{bmatrix}.

Every diagonal matrix is symmetric, since all off-diagonal entries are zero. Similarly, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative. The following matrix is neither symmetric nor skew-symmetric:

:\begin{bmatrix} 1 & -4 & 2 \\ 5 & 1 & -4 \\ -3 & 5 & 1 \end{bmatrix}.

In linear algebra, a symmetric matrix represents a self-adjoint operator over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, it is generally assumed that a symmetric matrix has real-valued entries.

Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.

Properties

One of the basic theorems concerning such matrices is the finite-dimensional spectral theorem, which says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix A there exists a real orthogonal matrix Q such that D = Q^{\mathrm{T}}AQ is a diagonal matrix. Every symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.

Another way of stating the real spectral theorem is that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal. More precisely, a real matrix is symmetric if and only if \mathbb{R}^n has an orthonormal basis consisting of its eigenvectors.
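A minimal numerical sketch of the spectral theorem, assuming NumPy (`numpy.linalg.eigh` is the standard routine for symmetric/Hermitian matrices):

```python
import numpy as np

# The symmetric example matrix from the introduction.
A = np.array([[1., 2., 3.],
              [2., 4., -5.],
              [3., -5., 6.]])

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

D = Q.T @ A @ Q                               # diagonalization D = Q^T A Q
print(np.allclose(D, np.diag(eigenvalues)))   # True: D is diagonal
print(np.allclose(Q.T @ Q, np.eye(3)))        # True: Q is orthogonal
```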

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the above diagonal matrix "D", and therefore "D" is uniquely determined by "A" up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.

Every square real matrix X can be written in a unique way as the sum of a symmetric and a skew-symmetric matrix. This is done in the following way:

:X = \frac{1}{2}\left(X + X^{\mathrm{T}}\right) + \frac{1}{2}\left(X - X^{\mathrm{T}}\right).

(This is true more generally for every square matrix X with entries from any field whose characteristic is different from 2.)
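This split is straightforward to compute; here is a minimal sketch assuming NumPy (the helper name `split_symmetric_skew` is illustrative, and the example reuses the matrix above that is neither symmetric nor skew-symmetric):

```python
import numpy as np

def split_symmetric_skew(X):
    """Split a square matrix into its symmetric and skew-symmetric parts."""
    return (X + X.T) / 2, (X - X.T) / 2

X = np.array([[1., -4., 2.],
              [5., 1., -4.],
              [-3., 5., 1.]])
S, K = split_symmetric_skew(X)
print(np.allclose(S, S.T))     # True: S is symmetric
print(np.allclose(K, -K.T))    # True: K is skew-symmetric
print(np.allclose(S + K, X))   # True: the two parts sum back to X
```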

The sum and difference of two symmetric matrices are again symmetric, but this is not always true for the product: given symmetric matrices A and B, the product AB is symmetric if and only if A and B commute, i.e., if AB = BA. So A^n is symmetric for any positive integer n if A is symmetric. Two real symmetric matrices commute if and only if they have the same eigenspaces.
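A small numerical illustration of the commuting criterion, again assuming NumPy:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
B = np.array([[1., 0.],
              [0., 3.]])               # A and B are both symmetric

AB = A @ B
print(np.allclose(AB, AB.T))           # False: AB is not symmetric...
print(np.allclose(A @ B, B @ A))       # False: ...because A and B do not commute

C = A @ A                              # C = A^2 commutes with A
print(np.allclose(A @ C, (A @ C).T))   # True: the product is symmetric
```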

If "A"−1 exists, it is symmetric if "A" is symmetric.

Any matrix congruent to a symmetric matrix is again symmetric: if X is a symmetric matrix, then so is AXA^{\mathrm{T}} for any matrix A.

Denote by \langle\cdot,\cdot\rangle the standard inner product on \mathbb{R}^n. The real n-by-n matrix A is symmetric if and only if

:\langle Ax, y \rangle = \langle x, Ay \rangle \quad \text{for all } x, y \in \mathbb{R}^n.
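This identity is easy to spot-check numerically (a sketch assuming NumPy; sampling random vectors is a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1., 2., 3.],
              [2., 4., -5.],
              [3., -5., 6.]])

# <Ax, y> == <x, Ay> should hold for every pair x, y when A is symmetric.
for _ in range(5):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    assert np.isclose(np.dot(A @ x, y), np.dot(x, A @ y))
print("inner-product characterization holds on all sampled vectors")
```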

This definition is independent of the choice of basis, and thus symmetry is a property that depends only on the linear operator A and a choice of inner product. In finite dimensions the relationship between linear maps and matrices is so close that one often speaks of them almost interchangeably, but this basis-independent definition of symmetry is often important. For example, in differential geometry each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. It may be convenient to work with explicit coordinates, but often it is not; if we do not wish to do so, we may require that tangent spaces be endowed with a non-degenerate symmetric form, and when a basis is fixed, this reduces to the familiar case of a symmetric matrix. Another area where this formulation is important is infinite-dimensional spaces called Hilbert spaces, where it is simply not possible to write down a matrix representation.

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices. (Bosch, 1986)

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix; this factorization is called the polar decomposition. Singular matrices can also be factored, but not uniquely.
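A minimal sketch of the polar decomposition via the singular value decomposition, assuming NumPy (the helper name `polar_decomposition` is illustrative; SciPy also ships a ready-made `scipy.linalg.polar`):

```python
import numpy as np

def polar_decomposition(A):
    """Return (Q, P) with A = Q @ P, Q orthogonal, P symmetric positive definite."""
    U, s, Vt = np.linalg.svd(A)
    Q = U @ Vt                       # orthogonal factor
    P = Vt.T @ np.diag(s) @ Vt       # symmetric positive definite factor
    return Q, P

A = np.array([[4., 1.],
              [2., 3.]])             # non-singular: det(A) = 10
Q, P = polar_decomposition(A)
print(np.allclose(Q @ P, A))             # True: A = QP
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q is orthogonal
print(np.allclose(P, P.T))               # True: P is symmetric
```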

A symmetric n \times n matrix is determined by \frac{n(n+1)}{2} scalars (the entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by \frac{n(n-1)}{2} scalars (the entries above the main diagonal).
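This count is why numerical libraries offer packed storage for symmetric matrices (LAPACK's packed formats, for example). A sketch of the idea in NumPy:

```python
import numpy as np

n = 3
A = np.array([[1, 2, 3],
              [2, 4, -5],
              [3, -5, 6]])

# Keep only the n(n+1)/2 entries on and above the diagonal.
iu = np.triu_indices(n)
packed = A[iu]
print(len(packed) == n * (n + 1) // 2)   # True

# Rebuild the full symmetric matrix from the packed entries.
B = np.zeros((n, n), dtype=A.dtype)
B[iu] = packed
B = B + B.T - np.diag(np.diag(B))        # undo the doubled diagonal
print(np.array_equal(A, B))              # True
```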

Occurrence

Symmetric real n-by-n matrices appear as the Hessians of twice continuously differentiable functions of n real variables.

Every quadratic form q on \mathbb{R}^n can be uniquely written in the form q(x) = x^{\mathrm{T}}Ax with a symmetric n-by-n matrix A. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of \mathbb{R}^n, "looks like"

:q(x_1,\ldots,x_n) = \sum_{i=1}^n \lambda_i x_i^2

with real numbers \lambda_i. This considerably simplifies the study of quadratic forms, as well as the study of the level sets \{x : q(x) = 1\}, which are generalizations of conic sections.
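A sketch of this change of variables, assuming NumPy and the same `eigh`-based diagonalization as above:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])           # q(x) = 2*x1^2 + 2*x1*x2 + 2*x2^2
lam, Q = np.linalg.eigh(A)

x = np.array([1.0, -2.0])
y = Q.T @ x                        # coordinates in the orthonormal eigenbasis

q_original = x @ A @ x
q_diagonal = np.sum(lam * y**2)    # sum of lambda_i * y_i^2
print(np.isclose(q_original, q_diagonal))  # True
```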

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

See also

* Hermitian matrix
* Normal matrix
* Hilbert matrix
* Self-adjoint operator

Other types of symmetry or pattern in square matrices have special names; see for example:

* Circulant matrix
* Hankel matrix
* Toeplitz matrix
* Centrosymmetric matrix

See also symmetry in mathematics.

References

* Bosch, A. J. (1986). "The factorization of a square matrix into two symmetric matrices". The American Mathematical Monthly, 93(6), 462–464.

External links

* [http://www.ocolon.org/editor/template.php?.matrix_split_symm_skew Template for splitting a matrix online into a symmetric and a skew-symmetric addend]

