# Linear algebra

*Figure: R3 is a vector (linear) space, and lines and planes passing through the origin are vector subspaces of R3. Subspaces are a common object of study in linear algebra.*

Linear algebra is a branch of mathematics that studies vector spaces, also called linear spaces, along with linear functions that input one vector and output another. Such functions are called linear maps (or linear transformations or linear operators) and can be represented by matrices if a basis is given. Thus matrix theory is often considered as a part of linear algebra. Linear algebra is commonly restricted to the case of finite dimensional vector spaces, while the peculiarities of the infinite dimensional case are traditionally covered in linear functional analysis.

Linear algebra is central to modern mathematics and its applications. An elementary application of linear algebra is to find the solution of a system of linear equations in several unknowns. More advanced applications are ubiquitous in areas as diverse as abstract algebra and functional analysis. Linear algebra has a concrete representation in analytic geometry and is generalized in operator theory and in module theory. It has extensive applications in engineering, physics, natural sciences, computer science, and the social sciences (particularly in economics). Nonlinear mathematical models can often be approximated by linear ones.
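The elementary application mentioned above, solving a system of linear equations, can be sketched for the smallest interesting case. The following is a minimal illustration using Cramer's rule for a hypothetical 2 × 2 system; the function name and the particular system are chosen for this example only, and practical code would call a library routine instead.

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve the system a*x + b*y = e, c*x + d*y = f by Cramer's rule."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular system: no unique solution")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# Example system: 2x + 3y = 8 and x - y = 1.
x, y = solve_2x2(2, 3, 1, -1, 8, 1)
```

For larger systems, Gaussian elimination (discussed under History below) is the standard method; Cramer's rule is shown here only because it is compact for two unknowns.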

## History

The subject first took its modern form in the first half of the twentieth century. At this time, many ideas and methods of previous centuries were generalized as abstract algebra. Matrices and tensors were introduced in the latter part of the 19th century. The use of these objects in quantum mechanics, special relativity, and statistics did much to spread the subject of linear algebra beyond pure mathematics.

The origin of many of these ideas is discussed in the articles on determinants and Gaussian elimination.

## Main structures

The main structures of linear algebra are vector spaces and linear maps between them. A vector space is a set whose elements can be added together and multiplied by scalars, or numbers. In many physical applications, the scalars are real numbers, R. More generally, the scalars may form any field F; thus one can consider vector spaces over the field Q of rational numbers, the field C of complex numbers, or a finite field Fq.

In a vector space, the operations of addition and scalar multiplication must behave similarly to the usual addition and multiplication of numbers: addition is commutative and associative, multiplication distributes over addition, and so on. More precisely, the two operations must satisfy a list of axioms chosen to emulate the properties of addition and scalar multiplication of Euclidean vectors in the coordinate n-space Rn. One of the axioms stipulates the existence of a zero vector, which behaves analogously to the number zero with respect to addition.
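As a concrete sketch of these operations, the coordinate space Rn can be modelled with tuples, where addition and scalar multiplication act component-wise. The helper names below are chosen for this illustration only.

```python
def vec_add(u, v):
    """Component-wise addition of two vectors in R^n."""
    return tuple(a + b for a, b in zip(u, v))

def scal_mul(r, v):
    """Multiply every component of v by the scalar r."""
    return tuple(r * a for a in v)

zero = (0.0, 0.0, 0.0)            # the zero vector of R^3
u, v = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)

# Spot-check two of the axioms on sample vectors (not a proof):
assert vec_add(u, v) == vec_add(v, u)   # commutativity of addition
assert vec_add(u, zero) == u            # the zero vector is neutral
</assertion>```

Such checks on sample vectors illustrate the axioms but do not prove them; the axioms are requirements imposed on the operations, verified once and for all when a vector space is constructed.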

Elements of a general vector space V may be objects of any nature, for example, functions or polynomials, but when viewed as elements of V, they are frequently called vectors.

Given two vector spaces V and W over a field F, a linear transformation (or "linear map") is a map $T:V\to W$

that is compatible with addition and scalar multiplication: $T(u+v)=T(u)+T(v), \quad T(rv)=rT(v)$

for any vectors $u, v \in V$ and any scalar $r \in F$.
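The two compatibility conditions can be checked numerically on a small example. Below, a hypothetical map T from R2 to R2 is given by a 2 × 2 matrix (chosen arbitrarily for illustration), and the conditions are verified on sample vectors; this is a sanity check, not a proof of linearity.

```python
A = ((1, 2), (3, 4))  # an arbitrary 2x2 matrix defining the map T

def T(v):
    """Apply the matrix A to the vector v: the map T(v) = A v."""
    return tuple(sum(A[i][j] * v[j] for j in range(2)) for i in range(2))

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def mul(r, v):
    return tuple(r * a for a in v)

u, v, r = (1, 0), (2, 5), 3
assert T(add(u, v)) == add(T(u), T(v))  # T is compatible with addition
assert T(mul(r, v)) == mul(r, T(v))     # T is compatible with scaling
```

Any map defined by a matrix in this way satisfies both conditions for all vectors and scalars, which is exactly why matrices represent linear maps.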

Other fundamental notions in linear algebra include: linear combination, span, linear independence of vectors, a basis of a vector space, and the dimension of a vector space.

Given a vector space V over a field F, an expression of the form $r_1 v_1 + r_2 v_2 + \cdots + r_k v_k, \,$

where v1, v2, …, vk are vectors and r1, r2, …, rk are scalars, is called a linear combination of the vectors v1, v2, …, vk with coefficients r1, r2, …, rk. The set of all linear combinations of vectors v1, v2, …, vk is called their span. A linear combination of any system of vectors with all zero coefficients is the zero vector of V. If this is the only way to express the zero vector as a linear combination of v1, v2, …, vk, then these vectors are linearly independent. A linearly independent set of vectors that spans a vector space V is a basis of V. If a vector space admits a finite basis, then any two bases have the same number of elements (called the dimension of V) and V is a finite-dimensional vector space. This theory can be extended to infinite-dimensional spaces.
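For two vectors in R2, linear independence admits a simple numerical test: the vectors are independent exactly when the 2 × 2 determinant they form is nonzero. The following is a minimal sketch of that special case; the function name is chosen for this example.

```python
def independent_2d(v1, v2):
    """True if v1, v2 in R^2 are linearly independent (nonzero determinant)."""
    return v1[0] * v2[1] - v1[1] * v2[0] != 0

assert independent_2d((1, 0), (0, 1))       # the standard basis e1, e2
assert not independent_2d((1, 2), (2, 4))   # (2, 4) = 2*(1, 2): dependent
```

In general dimension the determinant test applies to n vectors in Rn; for fewer vectors, or in an abstract space, independence is checked by row reduction or directly from the definition.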

There is an important distinction between the coordinate n-space Rn and a general finite-dimensional vector space V. While Rn has a standard basis {e1, e2, …, en}, a vector space V typically does not come equipped with a basis and many different bases exist (although they all consist of the same number of elements equal to the dimension of V). Having a particular basis {v1, v2, …, vn} of V allows one to construct a coordinate system in V: the vector with coordinates (r1, r2, …, rn) is the linear combination $r_1 v_1 + r_2 v_2 + \cdots + r_n v_n.$

The condition that v1, v2, …, vn span V guarantees that each vector v can be assigned coordinates, whereas the linear independence of v1, v2, …, vn further assures that these coordinates are determined in a unique way (i.e. there is only one linear combination of the basis vectors that is equal to v). In this way, once a basis of a vector space V over F has been chosen, V may be identified with the coordinate n-space Fn. Under this identification, addition and scalar multiplication of vectors in V correspond to addition and scalar multiplication of their coordinate vectors in Fn. Furthermore, if V and W are n-dimensional and m-dimensional vector spaces over F, and a basis of V and a basis of W have been fixed, then any linear transformation T: V → W may be encoded by an m × n matrix A with entries in the field F, called the matrix of T with respect to these bases. Therefore, by and large, the study of linear transformations, which were defined axiomatically, may be replaced by the study of matrices, which are concrete objects. This is a major technique in linear algebra.
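The encoding works by recording, for each basis vector of V, the coordinates of its image in W: the j-th column of the matrix holds the coordinates of T(vj). The sketch below uses the standard bases of R2 and R3 and a hypothetical map T(x, y) = (x + y, 2x, 3y) invented for this illustration.

```python
def T(v):
    """A sample linear map T: R^2 -> R^3, T(x, y) = (x + y, 2x, 3y)."""
    x, y = v
    return (x + y, 2 * x, 3 * y)

e = [(1, 0), (0, 1)]              # standard basis of R^2
columns = [T(ej) for ej in e]     # images of the basis vectors

# The matrix of T: its j-th column is the coordinate vector of T(e_j).
A = [[columns[j][i] for j in range(2)] for i in range(3)]  # 3x2 matrix

def apply(A, v):
    """Matrix-vector product A v."""
    return tuple(sum(A[i][j] * v[j] for j in range(len(v)))
                 for i in range(len(A)))

assert apply(A, (1, 2)) == T((1, 2))  # the matrix reproduces T
```

With a different choice of bases the same map T would be encoded by a different matrix, which is why the matrix is said to be taken "with respect to these bases".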

## Some useful theorems

• Every vector space has a basis. (This requires the axiom of choice.)
• Any two bases of the same vector space have the same cardinality; equivalently, the dimension of a vector space is well-defined. (This also requires the axiom of choice.)
• A matrix is invertible, or non-singular, if and only if the linear map represented by the matrix is an isomorphism.
• Any vector space over a field F of dimension n is isomorphic to Fn as a vector space over F.
• Corollary: Any two vector spaces over F of the same finite dimension are isomorphic to each other.
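The invertibility criterion in the list above can be illustrated for 2 × 2 matrices, where the adjugate formula makes the inverse explicit: a nonzero determinant yields an inverse (the map is an isomorphism), while a zero determinant means none exists. The function below is a sketch written for this example.

```python
def inverse_2x2(A):
    """Return the inverse of a 2x2 matrix, or None if it is singular."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        return None               # singular: the map is not an isomorphism
    return ((d / det, -b / det), (-c / det, a / det))

assert inverse_2x2(((1, 2), (2, 4))) is None   # rows are linearly dependent
Ainv = inverse_2x2(((1, 2), (3, 4)))           # invertible: det = -2
```

Note that the singular matrix here is exactly the one whose rows failed the independence test earlier: a square matrix is invertible precisely when its rows (or columns) form a basis.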

## Generalizations and related topics

Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory, one replaces the field of scalars by a ring. In multilinear algebra, one considers multivariable linear transformations, that is, mappings that are linear in each of a number of different variables. This line of inquiry naturally leads to the idea of the tensor product. Functional analysis mixes the methods of linear algebra with those of mathematical analysis.

Wikimedia Foundation. 2010.
