Dimension theorem for vector spaces

In mathematics, the dimension theorem for vector spaces states that all bases of a vector space have equally many elements. This number of elements may be finite, or given by an infinite cardinal number, and defines the dimension of the space.

Formally, the dimension theorem for vector spaces states that

Given a vector space V, any two linearly independent generating sets (in other words, any two bases) have the same cardinality.

If V is finitely generated, then it has a finite basis, and the result says that any two bases have the same number of elements.
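For example, in the plane \mathbb{R}^2 the standard basis \{(1,0), (0,1)\} and the basis \{(1,1), (1,-1)\} are different generating sets, but each contains exactly two vectors; the theorem guarantees that every basis of \mathbb{R}^2 has exactly two elements, which is why the dimension of \mathbb{R}^2 is well defined and equal to 2.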

While the proof of the existence of a basis for any vector space in the general case requires Zorn's lemma and is in fact equivalent to the axiom of choice, the uniqueness of the cardinality of the basis requires only the ultrafilter lemma [1], which is strictly weaker (the proof given below, however, assumes trichotomy, i.e., that all cardinal numbers are comparable, a statement which is also equivalent to the axiom of choice). The theorem can be generalized to arbitrary R-modules for rings R having invariant basis number.

The theorem for the finitely generated case can be proved with elementary arguments of linear algebra and requires no form of the axiom of choice.

Proof

Assume that \{ a_i : i \in I \} and \{ b_j : j \in J \} are both bases, with the cardinality of I bigger than the cardinality of J. From this assumption we will derive a contradiction.

Case 1

Assume that I is infinite.

Every b_j can be written as a finite sum

b_j = \sum_{i\in E_j} \lambda_{i,j} a_i, where E_j is a finite subset of I.

Since the cardinality of I is greater than that of J and the E_j's are finite subsets of I, the cardinality of I is also bigger than the cardinality of \bigcup_{j\in J} E_j: if J is finite the union is finite, and if J is infinite the union has cardinality at most |J|, so in either case its cardinality is strictly less than that of the infinite set I. (Note that this argument works only for infinite I.) So there is some i_0\in I which does not appear in any E_j. The corresponding a_{i_0} can be expressed as a finite linear combination of b_j's, which in turn can be expressed as a finite linear combination of a_i's not involving a_{i_0}. Hence a_{i_0} is linearly dependent on the other a_i's, contradicting the linear independence of the basis \{ a_i : i \in I \}.
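To make the counting step concrete, suppose for instance that J is countably infinite and I is uncountable. Then \bigcup_{j\in J} E_j is a countable union of finite sets and hence countable, so it cannot exhaust the uncountable index set I, and an index i_0 outside the union exists.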

Case 2

Now assume that I is finite and of cardinality bigger than the cardinality of J. Write m and n for the cardinalities of I and J, respectively. Every a_i can be written as a sum

a_i = \sum_{j\in J} \mu_{i,j} b_j

The matrix (\mu_{i,j} : i\in I, j\in J) has n columns (the j-th column is the m-tuple (\mu_{i,j} : i\in I)), so it has rank at most n. Since m > n, its m rows cannot be linearly independent. Writing r_i = (\mu_{i,j} : j\in J) for the i-th row, there is therefore a nontrivial linear combination

\sum_{i\in I} \nu_i r_i = 0.

But then also \sum_{i\in I} \nu_i a_i = \sum_{i\in I} \nu_i \sum_{j\in J} \mu_{i,j} b_j = \sum_{j\in J} \biggl(\sum_{i\in I} \nu_i\mu_{i,j} \biggr) b_j = 0, so the a_i are linearly dependent, which contradicts their linear independence as a basis.
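As a concrete instance of this argument, take m = 3 and n = 2 with a_1 = b_1, a_2 = b_2, and a_3 = b_1 + b_2. The coefficient matrix is

\begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{pmatrix},

whose rows satisfy r_1 + r_2 - r_3 = 0; the same coefficients give a_1 + a_2 - a_3 = b_1 + b_2 - (b_1 + b_2) = 0, exhibiting the linear dependence.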

Alternative proof

The proof above uses several non-trivial results. If these results are not carefully established in advance, the proof may give rise to circular reasoning. Here is a proof of the finite case which requires less prior development.

Theorem 1: If A = (a_1,\dots,a_n) \subseteq V is a linearly independent tuple in a vector space V, and B_0 = (b_1,\dots,b_r) is a tuple that spans V, then n \leq r.[2] The argument is as follows:

Since B_0 spans V, the tuple (a_1,b_1,\dots,b_r) also spans V. Since a_1 \neq 0 (because A is linearly independent) and a_1 is a linear combination of the b's, at least one coefficient is nonzero, so there is some t \in \{1,\ldots,r\} such that b_t can be written as a linear combination of B_1 = (a_1,b_1,\dots,b_{t-1},b_{t+1},\dots,b_r). Thus B_1 is a spanning tuple, and its length is the same as that of B_0.

Repeat this process: at each step, prepend the next a_{i+1} to B_i and remove one vector. Because A is linearly independent, when a_{i+1} is expressed as a linear combination of B_i, at least one of the remaining b's must carry a nonzero coefficient (otherwise a_{i+1} would be a linear combination of a_1,\dots,a_i alone), so we can always remove one of the b's rather than one of the a_j's prepended in a prior step. Thus, after n iterations, the result is a tuple B_n = (a_1, \ldots, a_n, b_{m_1}, \ldots, b_{m_k}) (possibly with k = 0) of length r. In particular, A \subseteq B_n, so |A| \leq |B_n|, i.e., n \leq r.
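As a small worked instance of the exchange process, take V = \mathbb{R}^2, A = (e_1, e_2) with e_1 = (1,0) and e_2 = (0,1), and B_0 = ((1,1), (1,-1)). Prepending e_1 gives the spanning tuple (e_1, (1,1), (1,-1)); since (1,1) = 2e_1 - (1,-1), the vector (1,1) may be removed, giving B_1 = (e_1, (1,-1)). Prepending e_2 and using (1,-1) = e_1 - e_2 allows (1,-1) to be removed, giving B_2 = (e_1, e_2) of length r = 2, consistent with n \leq r.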

To prove the finite case of the dimension theorem from this, suppose that V is a vector space and S = \{v_1, \ldots, v_n\} and T = \{w_1, \ldots, w_m\} are both bases of V. Since S is linearly independent and T spans, we can apply Theorem 1 to get m \geq n. And since T is linearly independent and S spans, we get n \geq m. From these, we get m = n.

Kernel extension theorem for vector spaces

This application of the dimension theorem is sometimes itself called the dimension theorem. Let

T : U → V

be a linear transformation. Then

dim(range(T)) + dim(kernel(T)) = dim(U),

that is, the dimension of U is equal to the dimension of the transformation's range plus the dimension of the kernel. See rank-nullity theorem for a fuller discussion.
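For example, if T : \mathbb{R}^3 \to \mathbb{R}^2 is the projection T(x, y, z) = (x, y), then the range is all of \mathbb{R}^2 (dimension 2) and the kernel is the z-axis \{(0, 0, z)\} (dimension 1), and indeed 2 + 1 = 3 = dim(\mathbb{R}^3).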

References

  1. ^ Howard, P.; Rubin, J.: Consequences of the Axiom of Choice. Mathematical Surveys and Monographs, vol. 59 (1998). ISSN 0076-5376.
  2. ^ Axler, S.: Linear Algebra Done Right. Springer, 2000.
