From Euclidean to Hilbert Spaces - Edoardo Provenzi
1.5. Orthogonality and linear independence
The orthogonality condition is more restrictive than that of linear independence: every orthogonal family of non-zero vectors is free.
THEOREM 1.10.– Let F = {v1, . . . , vn} be an orthogonal family in (V, 〈, 〉) with vi ≠ 0 ∀i. Then F is free.
PROOF.– We need to prove the linear independence of the elements vi, that is, that α1v1 + · · · + αnvn = 0V implies α1 = · · · = αn = 0. To this end, we calculate the inner product of the linear combination and an arbitrary vector vj with j ∈ {1, . . . , n}:

〈α1v1 + · · · + αnvn, vj〉 = α1〈v1, vj〉 + · · · + αn〈vn, vj〉 = αj〈vj, vj〉 = αj‖vj‖²,

since 〈vi, vj〉 = 0 whenever i ≠ j. By hypothesis, none of the vectors in F are zero, so ‖vj‖² > 0; the hypothesis that α1v1 + · · · + αnvn = 0V therefore implies that:

αj‖vj‖² = 0, i.e. αj = 0.

This holds for any j ∈ {1, . . . , n}, so the orthogonal family F is free.□
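The computation at the heart of this proof can be checked numerically. The following is a minimal sketch in Python (the family F and the coefficients below are arbitrary illustrative choices, not taken from the text): for an orthogonal family of non-zero vectors in ℝ3, each coefficient of a linear combination w is recovered as 〈w, vj〉/‖vj‖², so w = 0 would force every coefficient to vanish.

```python
# Numerical sketch of the proof of Theorem 1.10 in R^3.
# The family F and the coefficients a are arbitrary illustrative choices.

def dot(u, v):
    """Euclidean inner product on R^n."""
    return sum(x * y for x, y in zip(u, v))

# An orthogonal family of non-zero vectors in R^3.
F = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 2.0)]

# Pairwise orthogonality: <v_i, v_j> = 0 for i != j.
assert all(dot(F[i], F[j]) == 0.0
           for i in range(3) for j in range(3) if i != j)

# The proof's computation: for w = a_1 v_1 + a_2 v_2 + a_3 v_3,
# <w, v_j> = a_j * ||v_j||^2, so each a_j is recovered from w alone;
# in particular w = 0 would force a_1 = a_2 = a_3 = 0.
a = (1.0, -2.0, 3.0)
w = tuple(sum(a[i] * F[i][k] for i in range(3)) for k in range(3))
recovered = [dot(w, v) / dot(v, v) for v in F]
print(recovered)  # [1.0, -2.0, 3.0]
```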
Using the general theory of vector spaces in finite dimensions, an immediate corollary can be derived from theorem 1.10.
COROLLARY 1.1.– An orthogonal family of n non-null vectors in a space (V, 〈, 〉) of dimension n is a basis of V .
DEFINITION 1.6.– A family of n non-null orthogonal vectors in a vector space (V, 〈, 〉) of dimension n is said to be an orthogonal basis of V . If this family is also orthonormal, it is said to be an orthonormal basis of V .
The extension of the orthogonal basis concept to inner product spaces of infinite dimension will be discussed in Chapter 5. For the moment, it is important to note that an orthogonal basis is made up of the maximum number of mutually orthogonal vectors in a vector space. Taking n to represent the dimension of the space V and proceeding by reductio ad absurdum, imagine the existence of another vector u ∈ V , u ≠ 0, orthogonal to all of the vectors of an orthogonal basis {u1, . . . , un}; in this case, the set {u1, . . . , un, u} would be free, as orthogonal vectors are linearly independent, and the dimension of V would be n + 1 instead of n, a contradiction. This property is usually expressed by saying that an orthogonal family is a basis if and only if it is maximal, that is, if it is not a proper subset of another orthogonal family of vectors in V .
Note that in order to determine the components of a vector in relation to an arbitrary basis, we must solve a linear system of n equations with n unknown variables. In fact, if v ∈ V is any vector and (ui), i = 1, . . . , n, is a basis of V , then the components of v in (ui) are the scalars α1, . . . , αn such that:

v = α1u1 + · · · + αnun, that is, component by component, vj = α1u1,j + · · · + αnun,j , j = 1, . . . , n,

where ui,j is the j-th component of vector ui.
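For a non-orthogonal basis, this system genuinely has to be solved. As a minimal sketch (the basis u1, u2 and the vector v below are arbitrary choices for illustration, not from the text), here is the 2 × 2 case solved by Cramer's rule in Python:

```python
# Components of v in an arbitrary (non-orthogonal) basis of R^2
# require solving a linear system; here, a 2x2 system via Cramer's
# rule. The basis and the vector are arbitrary illustrative choices.

u1 = (1.0, 0.0)
u2 = (2.0, 1.0)   # not orthogonal to u1: <u1, u2> = 2 != 0
v = (4.0, 3.0)    # vector whose components we want

# Solve a1*u1 + a2*u2 = v, i.e. the system
#   a1*u1[0] + a2*u2[0] = v[0]
#   a1*u1[1] + a2*u2[1] = v[1]
det = u1[0] * u2[1] - u2[0] * u1[1]
a1 = (v[0] * u2[1] - u2[0] * v[1]) / det
a2 = (u1[0] * v[1] - v[0] * u1[1]) / det
print(a1, a2)  # -2.0 3.0, since -2*u1 + 3*u2 = (4, 3) = v
```

For n > 2, Cramer's rule is replaced in practice by Gaussian elimination, but the point stands: a general basis requires solving a system, with no closed formula per component.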
However, in the presence of an orthogonal or orthonormal basis, components are determined by inner products, as seen in Theorem 1.11.
Note, too, that solving a linear system of n equations with n unknown variables generally involves far more operations than the calculation of inner products; this highlights one advantage of having an orthogonal basis for a vector space.
THEOREM 1.11.– Let B = {u1, . . . , un} be an orthogonal basis of (V, 〈, 〉). Then, for every v ∈ V :

v = (〈v, u1〉/‖u1‖²) u1 + · · · + (〈v, un〉/‖un‖²) un.

Notably, if B is an orthonormal basis, then:

v = 〈v, u1〉u1 + · · · + 〈v, un〉un.
PROOF.– B is a basis, so there exists a set of scalars α1, . . . , αn such that v = α1u1 + · · · + αnun. Consider the inner product of this expression of v with a fixed vector ui, i ∈ {1, . . . , n}:

〈v, ui〉 = α1〈u1, ui〉 + · · · + αn〈un, ui〉 = αi〈ui, ui〉 = αi‖ui‖²,

since 〈uj, ui〉 = 0 for j ≠ i. So αi‖ui‖² = 〈v, ui〉, and thus αi = 〈v, ui〉/‖ui‖². If B is an orthonormal basis, ‖ui‖ = 1, giving the second formula in the theorem.□
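The two formulas of the theorem can be tried out numerically. Below is a minimal sketch in ℝ3 (the orthogonal basis B and the vector v are arbitrary illustrative choices, not from the text): the components come directly from inner products, with no linear system to solve, and normalizing the basis removes the division by ‖ui‖².

```python
# Sketch of Theorem 1.11 in R^3: components in an orthogonal basis
# come from inner products alone. B and v are arbitrary choices.
import math

def dot(u, v):
    """Euclidean inner product on R^n."""
    return sum(x * y for x, y in zip(u, v))

B = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 1.0)]  # orthogonal basis
v = (3.0, 5.0, -2.0)

# Orthogonal case: alpha_i = <v, u_i> / ||u_i||^2.
alphas = [dot(v, u) / dot(u, u) for u in B]
recon = tuple(sum(a * u[k] for a, u in zip(alphas, B)) for k in range(3))
print(recon)  # (3.0, 5.0, -2.0), i.e. v is reconstructed

# Orthonormal case: after normalizing, alpha_i = <v, e_i>.
E = [tuple(x / math.sqrt(dot(u, u)) for x in u) for u in B]
coords = [dot(v, e) for e in E]
recon2 = tuple(sum(c * e[k] for c, e in zip(coords, E)) for k in range(3))
# recon2 equals v up to floating-point rounding
```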
Geometric interpretation of the theorem: the theorem just proven is the generalization of the decomposition theorem for a vector in the plane ℝ2 or in space ℝ3 on a canonical basis of unit vectors along the axes. To simplify, consider the case of ℝ2.
If î and ĵ are, respectively, the unit vectors of axes x and y, then the decomposition theorem says that:

v = 〈v, î〉 î + 〈v, ĵ〉 ĵ, for every v ∈ ℝ2,

which is a particular case of the theorem above.
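As a concrete worked instance (the vector is an arbitrary choice, not from the text), writing î and ĵ for the unit vectors of the x and y axes, the vector v = (3, 2) decomposes as:

```latex
v = \langle v, \hat{\imath} \rangle\, \hat{\imath}
  + \langle v, \hat{\jmath} \rangle\, \hat{\jmath}
  = 3\,\hat{\imath} + 2\,\hat{\jmath},
```

since 〈(3, 2), (1, 0)〉 = 3 and 〈(3, 2), (0, 1)〉 = 2: the inner products simply read off the Cartesian coordinates.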
We will see that the Fourier series can be viewed as a further generalization of the decomposition theorem on an orthogonal or orthonormal basis.