From Euclidean to Hilbert Spaces, Edoardo Provenzi
1.8. Fundamental properties of orthonormal and orthogonal bases
The most important properties of an orthonormal basis are listed in theorem 1.14.
THEOREM 1.14.– Let (u1, . . . , un) be an orthonormal basis of (V, 〈, 〉), dim(V ) = n. Then, ∀v, w ∈ V :
1) Decomposition theorem on an orthonormal basis:
v = ∑_{i=1}^{n} 〈v, ui〉 ui [1.7]
2) Parseval’s identity7:
〈v, w〉 = ∑_{i=1}^{n} 〈v, ui〉 〈ui, w〉 [1.8]
3) Plancherel’s theorem8:
‖v‖² = ∑_{i=1}^{n} |〈v, ui〉|² [1.9]
Proof of 1: an immediate consequence of theorem 1.12. Given that (u1, . . . , un) is a basis, v ∈ span(u1, . . . , un); furthermore, (u1, . . . , un) is orthonormal, so v = ∑_{i=1}^{n} 〈v, ui〉 ui. It is not necessary to divide by ‖ui‖² in the sum since ‖ui‖ = 1 ∀i.
Proof of 2: using point 1, we can write v = ∑_{i=1}^{n} 〈v, ui〉 ui, and computing the inner product of v, written in this form, with w using equation [1.1], we obtain: 〈v, w〉 = ∑_{i=1}^{n} 〈v, ui〉 〈ui, w〉.
Proof of 3: writing w = v on the left-hand side of Parseval’s identity gives us 〈v, v〉 = ‖v‖². On the right-hand side, we have:
∑_{i=1}^{n} 〈v, ui〉 〈ui, v〉 = ∑_{i=1}^{n} 〈v, ui〉 \overline{〈v, ui〉} = ∑_{i=1}^{n} |〈v, ui〉|²
hence ‖v‖² = ∑_{i=1}^{n} |〈v, ui〉|².
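These identities are easy to verify numerically. Below is a minimal Python sketch (the basis and the vector are our own illustrative choices, not from the book): it decomposes a vector of ℂ² on the orthonormal basis ((1, 1)/√2, (1, −1)/√2), as in [1.7], and checks Plancherel’s theorem [1.9].

```python
import math

def inner(x, y):
    """Hermitian inner product on C^n, linear in the first argument."""
    return sum(a * complex(b).conjugate() for a, b in zip(x, y))

s = 1 / math.sqrt(2)
u1, u2 = (s, s), (s, -s)          # orthonormal basis of C^2
v = (2 + 1j, 3 - 2j)              # arbitrary vector

# [1.7] decomposition: v = sum_i <v, u_i> u_i
c = [inner(v, u) for u in (u1, u2)]
v_rec = tuple(c[0] * a + c[1] * b for a, b in zip(u1, u2))
assert all(abs(x - y) < 1e-12 for x, y in zip(v, v_rec))

# [1.9] Plancherel: ||v||^2 = sum_i |<v, u_i>|^2
norm2 = inner(v, v).real
assert abs(norm2 - sum(abs(ci) ** 2 for ci in c)) < 1e-12
```

Here ‖v‖² = |2 + i|² + |3 − 2i|² = 18, and the two coefficients carry exactly that energy between them.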
NOTE.–
1) The physical interpretation of Plancherel’s theorem is as follows: the energy of v, measured as the square of the norm, can be decomposed using the sum of the squared moduli of each projection of v on the n directions of the orthonormal basis (u1, ..., un).
In Fourier theory, the directions of the orthonormal basis are fundamental harmonics (sines and cosines with defined frequencies): this is why Fourier analysis may be referred to as harmonic analysis.
2) If (u1, . . . , un) is an orthogonal, rather than an orthonormal, basis, then using the projector formula and theorem 1.12, the results of Theorem 1.14 can be written as:
a) decomposition of v ∈ V on an orthogonal basis:
v = ∑_{i=1}^{n} (〈v, ui〉 / ‖ui‖²) ui
b) Parseval’s identity for an orthogonal basis:
〈v, w〉 = ∑_{i=1}^{n} 〈v, ui〉 〈ui, w〉 / ‖ui‖²
c) Plancherel’s theorem for an orthogonal basis:
‖v‖² = ∑_{i=1}^{n} |〈v, ui〉|² / ‖ui‖²
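For a merely orthogonal (non-normalized) basis, the 1/‖ui‖² weights can be checked the same way; a short Python sketch with the orthogonal pair (1, 1), (1, −1) of ℝ² (our own example, with ‖ui‖² = 2):

```python
def inner(x, y):
    """Real Euclidean inner product."""
    return sum(a * b for a, b in zip(x, y))

u = [(1.0, 1.0), (1.0, -1.0)]     # orthogonal but NOT orthonormal
v, w = (3.0, 1.0), (2.0, 5.0)

# (a) decomposition with coefficients <v, u_i> / ||u_i||^2
cv = [inner(v, ui) / inner(ui, ui) for ui in u]
v_rec = tuple(sum(c * ui[k] for c, ui in zip(cv, u)) for k in range(2))
assert v_rec == v

# (b) Parseval and (c) Plancherel with the 1/||u_i||^2 weights
parseval = sum(inner(v, ui) * inner(w, ui) / inner(ui, ui) for ui in u)
assert abs(parseval - inner(v, w)) < 1e-12
plancherel = sum(inner(v, ui) ** 2 / inner(ui, ui) for ui in u)
assert abs(plancherel - inner(v, v)) < 1e-12
```

Dropping the division by ‖ui‖² makes all three assertions fail, which is exactly the point of the remark above.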
The following exercise is designed to test the reader’s knowledge of the theory of finite-dimensional inner product spaces. The two subsequent exercises explicitly include inner products which are non-Euclidean.
Exercise 1.1
Consider the complex Euclidean inner product space ℂ³ and the following three vectors:
1) Determine the orthogonality relationships between vectors u, v, w.
2) Calculate the norm of u, v, w and the Euclidean distances between them.
3) Verify that (u, v, w) is a (non-orthogonal) basis of ℂ³.
4) Let S be the vector subspace of ℂ³ generated by u and w. Calculate PSv, the orthogonal projection of v onto S. Calculate d(v, PSv), that is, the Euclidean distance between v and its projection onto S, and verify that this minimizes the distance between v and the vectors of S (hint: look at the square of the distance).
5) Using the results of the previous questions, determine an orthogonal basis and an orthonormal basis for ℂ³ without using the Gram-Schmidt orthonormalization process (hint: remember the geometric relationship between the residual vector r and the subspace S).
6) Given a vector a = (2i, −1, 0), write the decomposition of a and Plancherel’s theorem in relation to the orthonormal basis identified in point 5. Use these results to identify the vector from the orthonormal basis which has the heaviest weight in the decomposition of a (and which gives the best “rough approximation” of a). Use a graphics program to draw the progressive vector sum of a, beginning with the rough approximation and adding finer details supplied by the other vectors.
Solution to Exercise 1.1
1) By directly calculating the inner products: 〈u, v〉 = −2, 〈u, w〉 = 0 and 〈v, w〉 ≠ 0.
2) By direct calculation we obtain the norms ‖u‖, ‖v‖ and ‖w‖; after calculating the difference vectors u − v, u − w and v − w, we obtain the corresponding Euclidean distances.
3) The three vectors u, v, w are linearly independent, so they form a basis of ℂ³. This basis is not orthogonal since only the vectors u and w are orthogonal.
4) S = span(u, w). Since (u, w) is an orthogonal basis of S, we can write:
PSv = (〈v, u〉/‖u‖²) u + (〈v, w〉/‖w‖²) w
The residual vector of the projection of v onto S is r = v − PSv = (2i, 0, 0), and thus d(v, PSv)² = ‖r‖² = 4. The most general vector of S is s = αu + βw, with α, β ∈ ℂ, and since r ⊥ S while PSv − s ∈ S, Pythagoras’ theorem gives d(v, s)² = ‖r‖² + ‖PSv − s‖², which is minimal exactly for s = PSv. This confirms that PSv is the vector of S with the minimum distance from v with respect to the Euclidean norm.
5) r is orthogonal to S, which is generated by u and w, hence (u, w, r) is a set of orthogonal vectors in ℂ³, that is, an orthogonal basis of ℂ³. To obtain an orthonormal basis, we then simply divide each vector by its norm:
6) Decomposition on the orthonormal basis (û, ŵ, r̂): a = 〈a, û〉û + 〈a, ŵ〉ŵ + 〈a, r̂〉r̂.
Plancherel’s theorem: ‖a‖² = |〈a, û〉|² + |〈a, ŵ〉|² + |〈a, r̂〉|².
The vector with the heaviest weight in the reconstruction of a is thus r̂: this vector gives the best rough approximation of a. By calculating the vector sum of this rough representation and the other two vectors, we can reconstruct the “fine details” of a, first with ŵ and then with û.
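The projection-plus-residual construction used in points 4 and 5 can be sketched in Python. Note that u, w and v below are hypothetical stand-ins (the exercise’s actual vectors are not reproduced here), chosen only so that 〈u, w〉 = 0:

```python
def inner(x, y):
    """Hermitian inner product on C^n, linear in the first argument."""
    return sum(a * complex(b).conjugate() for a, b in zip(x, y))

def project(v, basis):
    """Orthogonal projection of v onto span(basis); basis must be pairwise orthogonal."""
    coeffs = [inner(v, u) / inner(u, u) for u in basis]
    return tuple(sum(c * u[k] for c, u in zip(coeffs, basis))
                 for k in range(len(v)))

# Hypothetical orthogonal pair in C^3: <u, w> = 1 - 1 + 0 = 0
u = (1, 1, 0)
w = (1, -1, 1j)
assert abs(inner(u, w)) < 1e-12

v = (2j, 1, 1)
p = project(v, [u, w])                 # P_S v
r = tuple(a - b for a, b in zip(v, p)) # residual r = v - P_S v

# r is orthogonal to S = span(u, w): (u, w, r) is an orthogonal basis of C^3,
# exactly the Gram-Schmidt-free construction of point 5.
assert abs(inner(r, u)) < 1e-9 and abs(inner(r, w)) < 1e-9
```

Normalizing u, w and r then yields an orthonormal basis, as in the solution above.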
Exercise 1.2
Let M(n, ℂ) be the space of n × n complex matrices. The map ϕ : M(n, ℂ) × M(n, ℂ) → ℂ is defined by:
ϕ(A, B) = tr(AB†)
where B† denotes the adjoint (conjugate transpose) matrix of B and tr is the matrix trace. Prove that ϕ is an inner product.
Solution to Exercise 1.2
The distributivity of matrix multiplication over addition and the linearity of the trace establish the linearity of ϕ in the first variable.
Now, let us prove that ϕ is Hermitian. Let A = (a_{i,j})_{1≤i,j≤n} and B = (b_{i,j})_{1≤i,j≤n} be two matrices in M(n, ℂ). Let b†_{i,j} = \overline{b_{j,i}} be the coefficients of the matrix B† and a†_{i,j} = \overline{a_{j,i}} those of A†.
This gives us:
ϕ(A, B) = tr(AB†) = ∑_{i=1}^{n} ∑_{k=1}^{n} a_{i,k} \overline{b_{i,k}} = \overline{∑_{i=1}^{n} ∑_{k=1}^{n} b_{i,k} \overline{a_{i,k}}} = \overline{ϕ(B, A)}
Thus, ϕ is a sesquilinear Hermitian form. Furthermore, ϕ is positive:
ϕ(A, A) = tr(AA†) = ∑_{i,j=1}^{n} |a_{i,j}|² ≥ 0
It is also definite: ϕ(A, A) = 0 implies ∑_{i,j} |a_{i,j}|² = 0, hence a_{i,j} = 0 for all i, j, that is, A = 0.
Thus, ϕ is an inner product.
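A quick numerical sanity check of the Hermitian symmetry and positivity of ϕ(A, B) = tr(AB†), written in plain Python on 2 × 2 matrices (the matrix values are illustrative, not from the book):

```python
def adj(M):
    """Adjoint (conjugate transpose) of a square matrix given as nested tuples."""
    n = len(M)
    return tuple(tuple(complex(M[j][i]).conjugate() for j in range(n))
                 for i in range(n))

def matmul(A, B):
    n = len(A)
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(n))
                       for j in range(n)) for i in range(n))

def tr(M):
    return sum(M[i][i] for i in range(len(M)))

def phi(A, B):
    """Frobenius-type form phi(A, B) = tr(A B^dagger)."""
    return tr(matmul(A, adj(B)))

A = ((1 + 1j, 2), (0, -1j))
B = ((3, 1j), (1 - 1j, 2))

# Hermitian symmetry: phi(A, B) = conjugate of phi(B, A)
assert abs(phi(A, B) - phi(B, A).conjugate()) < 1e-12
# Positivity: phi(A, A) = sum of |a_ij|^2 >= 0
assert abs(phi(A, A) - sum(abs(x) ** 2 for row in A for x in row)) < 1e-12
```

The last assertion makes the proof's key computation concrete: tr(AA†) is exactly the sum of the squared moduli of the entries of A.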
Exercise 1.3
Let E = ℝ[X] be the vector space of polynomials in one variable with real coefficients. For P, Q ∈ E, take:
Φ(P, Q) = ∫₋₁¹ P(t)Q(t)/√(1 − t²) dt
1) Recall that f(t) = O_{t→t₀}(g(t)) means that ∃ a, C > 0 such that |t − t₀| < a ⟹ |f(t)| ≤ C |g(t)|. Prove that for all P, Q ∈ E:
P(t)Q(t)/√(1 − t²) = O_{t→1}(1/√(1 − t))
and:
P(t)Q(t)/√(1 − t²) = O_{t→−1}(1/√(1 + t))
Use this result to deduce that Φ is well defined over E × E.
2) Prove that Φ is an inner product over E, which we shall denote 〈 , 〉.
3) For n ∈ ℕ, let Tn be the n-th Chebyshev polynomial, that is, the only polynomial such that ∀θ ∈ ℝ, Tn(cos θ) = cos(nθ). Applying the substitution t = cos θ, show that (Tn)n∈ℕ is an orthogonal family in E. Hint: use the trigonometric formula [1.13]:
cos(nθ) cos(mθ) = ½ [cos((n + m)θ) + cos((n − m)θ)] [1.13]
4) Prove that for all n ∈ ℕ, (T0, . . . , Tn) is an orthogonal basis of ℝn[X], the vector space of polynomials in ℝ[X] of degree less than or equal to n. Deduce that (Tn)n∈ℕ is an orthogonal basis in the algebraic sense: every element in E is a finite linear combination of elements in the basis of E.
5) Calculate the norm of Tn for all n and deduce an orthonormal basis (in the algebraic sense) of E using this result.
Solution to Exercise 1.3
1) We write f(t) = P(t)Q(t)/√(1 − t²) = P(t)Q(t)/(√(1 − t)√(1 + t)). Since P and Q are polynomials, the function t ↦ P(t)Q(t)/√(1 + t) is continuous in a neighborhood V₁(1) of 1 and thus, according to the Weierstrass (extreme value) theorem, it is bounded in this neighborhood, that is, ∃ C₁ > 0 such that |P(t)Q(t)/√(1 + t)| ≤ C₁ on V₁(1). Similarly, the function t ↦ P(t)Q(t)/√(1 − t) is continuous in a neighborhood V₂(−1) of −1, thus ∃ C₂ > 0 such that |P(t)Q(t)/√(1 − t)| ≤ C₂ on V₂(−1). This gives us:
|f(t)| ≤ C₁/√(1 − t) for t ∈ V₁(1)
and:
|f(t)| ≤ C₂/√(1 + t) for t ∈ V₂(−1)
This implies that the integral defining Φ is well defined: f is continuous over (−1, 1) and therefore locally integrable, and the bounds just proved show that f is integrable in a right neighborhood of −1 and a left neighborhood of 1, since in both cases |f| is dominated by an integrable function (the integrals of 1/√(1 − t) and 1/√(1 + t) converge there).
2) The bilinearity of Φ is obtained from the linearity of the integral by direct calculation. Its symmetry is a consequence of the commutativity of the product of functions: P(t)Q(t) = Q(t)P(t). The only property which is not immediately evident is positive definiteness. Let us start by proving positiveness:
Φ(P, P) = ∫₋₁¹ P(t)²/√(1 − t²) dt ≥ 0
and⁹, since the integrand is continuous and non-negative on (−1, 1):
Φ(P, P) = 0 ⟹ P(t)²/√(1 − t²) = 0 ∀t ∈ (−1, 1) ⟹ P(t) = 0 ∀t ∈ (−1, 1)
but the only polynomial with an infinite number of roots is the null polynomial 0(t) ≡ 0, so P = 0. Φ is therefore an inner product on E.
3) For all n, m ∈ ℕ, the substitution t = cos θ (dt = −sin θ dθ, √(1 − t²) = sin θ for θ ∈ (0, π)) gives:
〈Tn, Tm〉 = ∫₋₁¹ Tn(t)Tm(t)/√(1 − t²) dt = ∫₀^π cos(nθ) cos(mθ) dθ
So, for all n ≠ m, using formula [1.13], we have:
〈Tn, Tm〉 = ½ ∫₀^π [cos((n + m)θ) + cos((n − m)θ)] dθ = 0
that is, Chebyshev polynomials form an orthogonal family of polynomials in relation to the inner product defined above.
4) The family (T0, T1, . . . , Tn) is an orthogonal (and thus free) family of n + 1 elements of ℝn[X], which is of dimension n + 1, meaning that it is an orthogonal basis of ℝn[X]. To show that (Tn)n∈ℕ is a basis in the algebraic sense of E, consider a polynomial P ∈ E of arbitrary degree d ∈ ℕ, i.e. P ∈ ℝd[X], and note that (T0, T1, . . . , Td) is an orthogonal (free) family of generators of ℝd[X], that is, a basis in the algebraic sense of the term.
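As an illustration of the expansion on the orthogonal basis (T0, . . . , Td): the coefficient of P on Tn is 〈P, Tn〉/‖Tn‖². The Python sketch below (the helper name cheb_coeff is ours) recovers the classical identity t³ = (3T1 + T3)/4, which follows from cos³θ = (3 cos θ + cos 3θ)/4, using the standard norm values ‖T0‖² = π and ‖Tn‖² = π/2 for n ≥ 1:

```python
import math

def cheb_coeff(f, n, N=1000):
    """Coefficient <f, T_n> / ||T_n||^2, computed via t = cos(theta),
    so that <f, T_n> = integral_0^pi f(cos t) cos(n t) dt (midpoint rule)."""
    h = math.pi / N
    s = h * sum(f(math.cos((k + 0.5) * h)) * math.cos(n * (k + 0.5) * h)
                for k in range(N))
    return s / (math.pi if n == 0 else math.pi / 2)

cube = lambda t: t ** 3
# t^3 = (3 T_1 + T_3) / 4 in the Chebyshev basis
assert abs(cheb_coeff(cube, 1) - 0.75) < 1e-6
assert abs(cheb_coeff(cube, 3) - 0.25) < 1e-6
assert abs(cheb_coeff(cube, 0)) < 1e-6 and abs(cheb_coeff(cube, 2)) < 1e-6
```

Every polynomial of degree d is recovered exactly from its first d + 1 Chebyshev coefficients, which is precisely the "finite linear combination" statement of point 4.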
5) The norm of Tn is calculated using the equality 〈Tn, Tm〉 = ∫₀^π cos(nθ) cos(mθ) dθ, which was demonstrated in point 3. Taking n = m, we have:
‖T0‖² = ∫₀^π dθ = π and, for n ≥ 1, ‖Tn‖² = ∫₀^π cos²(nθ) dθ = π/2
hence ‖T0‖ = √π and ‖Tn‖ = √(π/2) for n ≥ 1. Finally, the family:
(T0/√π) ∪ (√(2/π) Tn)n≥1
is an orthonormal basis (in the algebraic sense) of E, the vector space of polynomials with real coefficients. □
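The orthogonality relations and norm values of points 3 and 5 can also be checked numerically; a small Python sketch (our helper cheb_inner approximates the integral after the substitution t = cos θ with the midpoint rule):

```python
import math

def cheb_inner(n, m, N=1000):
    """Approximate <T_n, T_m> = integral_0^pi cos(n t) cos(m t) dt (midpoint rule)."""
    h = math.pi / N
    return h * sum(math.cos(n * (k + 0.5) * h) * math.cos(m * (k + 0.5) * h)
                   for k in range(N))

assert abs(cheb_inner(2, 3)) < 1e-9                 # orthogonality for n != m
assert abs(cheb_inner(0, 0) - math.pi) < 1e-9       # ||T_0||^2 = pi
assert abs(cheb_inner(4, 4) - math.pi / 2) < 1e-9   # ||T_n||^2 = pi/2, n >= 1
```

The midpoint rule is exact (up to rounding) here because the integrands reduce to sums of cos(kθ) with integer k, whose equispaced-midpoint sums over [0, π] vanish.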