
Inner Product Spaces and Orthogonality

Week 13-14, Fall 2006


The inner product ⟨ , ⟩ satisfies the following properties:

(1) Linearity: ⟨au + bv, w⟩ = a⟨u, w⟩ + b⟨v, w⟩.

(2) Symmetric Property: ⟨u, v⟩ = ⟨v, u⟩.

(3) Positive Definite Property: For any u ∈ V, ⟨u, u⟩ ≥ 0; and ⟨u, u⟩ = 0 if and only if u = 0.

The vector space V with an inner product is called a (real) inner product space.
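These three axioms can be checked mechanically for the standard dot product on Rn; a minimal sketch in Python (the helper name dot is illustrative):

```python
# Sketch: checking the three inner-product axioms for the standard
# dot product on R^2 (vectors as plain lists; names are illustrative).

def dot(x, y):
    """Standard inner product <x, y> = x1*y1 + x2*y2 + ... on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

u, v, w = [1.0, 2.0], [3.0, -1.0], [0.5, 4.0]
a, b = 2.0, -3.0

# (1) Linearity: <au + bv, w> = a<u, w> + b<v, w>
lhs = dot([a * ui + b * vi for ui, vi in zip(u, v)], w)
assert abs(lhs - (a * dot(u, w) + b * dot(v, w))) < 1e-12

# (2) Symmetry: <u, v> = <v, u>
assert dot(u, v) == dot(v, u)

# (3) Positive definiteness: <u, u> >= 0, with equality only at u = 0
assert dot(u, u) > 0 and dot([0.0, 0.0], [0.0, 0.0]) == 0
```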

Example 2.1. For x = [x1, x2]T, y = [y1, y2]T ∈ R2, define

For each vector u ∈ V, the norm (also called the length) of u is defined as the number ∥u∥ := √⟨u, u⟩.

If ∥u∥ = 1, we call u a unit vector and u is said to be normalized. For any nonzero vector v ∈ V, we have the unit vector
v̂ = (1/∥v∥) v.

The linearity implies
⟨u, v⟩ = ⟨Σi xiui, Σj yjuj⟩ = Σi,j xiyj⟨ui, uj⟩.

The matrix A = [⟨ui, uj⟩], whose (i, j) entry is ⟨ui, uj⟩, is called the matrix of the inner product ⟨ , ⟩ relative to the basis B. Thus, using coordinate vectors

[u]B = [x1, x2, . . . , xn]T,

[v]B = [y1, y2, . . . , yn]T,


Example 3.2. The vector space C[a, b] of all real-valued continuous functions on a closed interval [a, b] is an inner product space, whose inner product is defined by

⟨f, g⟩ = ∫_a^b f(t)g(t) dt, f, g ∈ C[a, b].

Let A be the matrix of ⟨ , ⟩ relative to a basis B. Then for any vectors u, v ∈ V,

⟨u, v⟩ = xTAy,

where x = [u]B and y = [v]B.

For x = [x1, x2]T, y = [y1, y2]T ∈ R2, the matrix of the inner product relative to the standard basis E = {e1, e2} has entries ⟨ei, ej⟩.

We may change variables so that the inner product takes a simple form. For instance, let

x1 = (2/3)x′1 + (1/3)x′2,  x2 = (1/3)x′1 + (1/3)x′2,
y1 = (2/3)y′1 + (1/3)y′2,  y2 = (1/3)y′1 + (1/3)y′2.

In the new variables the inner product reduces to ⟨x, y⟩ = x′1y′1 + x′2y′2.

In fact, the matrix of the inner product relative to the basis
B = {u1 = [2/3, 1/3]T, u2 = [1/3, 1/3]T}
is the identity matrix, i.e., ⟨u1, u1⟩ = ⟨u2, u2⟩ = 1 and ⟨u1, u2⟩ = ⟨u2, u1⟩ = 0.
Let P be the transition matrix from the standard basis {e1, e2} to the basis {u1, u2}, i.e., P = [u1, u2]. It follows that x = Px′ and y = Py′.

Note that xT = x′TPT. Thus, by the theorem above,

⟨x, y⟩ = xTAy = x′TPTAPy′ = x′TI2y′ = x′Ty′.
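The identity ⟨u, v⟩ = xTAy is easy to evaluate programmatically; a sketch, assuming a sample symmetric Gram matrix that is not from the notes:

```python
# Sketch (names and the sample Gram matrix are illustrative, not from
# the notes): evaluating <u, v> = x^T A y, where x, y are coordinate
# vectors relative to a basis B and A = [<u_i, u_j>] is the Gram matrix.

def inner_via_gram(x, A, y):
    """<u, v> = x^T A y = sum_{i,j} x_i * A_ij * y_j."""
    return sum(x[i] * A[i][j] * y[j]
               for i in range(len(x)) for j in range(len(y)))

# A symmetric positive definite Gram matrix on R^2 (assumed example).
A = [[2.0, 1.0],
     [1.0, 2.0]]
x, y = [1.0, 0.0], [0.0, 1.0]

# With coordinate unit vectors, <u1, u2> is just the (1, 2) entry of A.
assert inner_via_gram(x, A, y) == A[0][1]
```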

⟨u, v⟩² ≤ ⟨u, u⟩⟨v, v⟩. Equivalently,

|⟨u, v⟩| ≤ ∥u∥ ∥v∥.

Proof. Consider the function

y(t) = ⟨tu + v, tu + v⟩ = ⟨u, u⟩t² + 2⟨u, v⟩t + ⟨v, v⟩ ≥ 0.

Since this quadratic in t is nonnegative for all t, its discriminant is nonpositive:

(2⟨u, v⟩)² − 4⟨u, u⟩⟨v, v⟩ ≤ 0.

The Cauchy-Schwarz inequality follows.


For nonzero vectors u, v ∈ V, the Cauchy-Schwarz inequality implies −1 ≤ ⟨u, v⟩/(∥u∥ ∥v∥) ≤ 1, so the angle θ between u and v is defined by

cos θ = ⟨u, v⟩ / (∥u∥ ∥v∥).
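A sketch of the angle formula, with the quotient clamped to [−1, 1] to guard against floating-point roundoff (the function names are illustrative):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def angle(u, v):
    """Angle between nonzero u, v via cos(theta) = <u,v>/(|u||v|)."""
    c = dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    return math.acos(max(-1.0, min(1.0, c)))  # clamp for roundoff

# Cauchy-Schwarz guarantees the quotient lies in [-1, 1].
u, v = [1.0, 0.0], [1.0, 1.0]
assert abs(angle(u, v) - math.pi / 4) < 1e-12        # 45 degrees
assert abs(angle(u, [0.0, 2.0]) - math.pi / 2) < 1e-12  # orthogonal
```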

Example 6.1. For the inner product space C[−π, π], the functions sin t and cos t are orthogonal, as

⟨sin t, cos t⟩ = ∫_{−π}^{π} sin t cos t dt = 0.
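The orthogonality of sin t and cos t can be confirmed numerically; a sketch using a midpoint-rule approximation of the integral (the rule and step count are arbitrary choices):

```python
import math

def inner_C(f, g, a, b, n=20000):
    """Midpoint-rule approximation of <f, g> = integral_a^b f(t) g(t) dt."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

val = inner_C(math.sin, math.cos, -math.pi, math.pi)
assert abs(val) < 1e-9   # sin t and cos t are orthogonal on [-pi, pi]
```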

Example. The solution space of the single linear equation

x1 + 2x2 + 3x3 + 3x4 + 2x5 = 0

is the set of vectors in R5 orthogonal to [1, 2, 3, 3, 2]T.

Let S be a nonempty subset of V. The set of all vectors of V that are orthogonal to every vector of S is called the orthogonal complement of S in V. In notation,

S⊥ = {v ∈ V : ⟨v, w⟩ = 0 for all w ∈ S}.

Proof. To show that S⊥ is a subspace, we need to show that S⊥ is closed under addition and scalar multiplication. Let u, v ∈ S⊥ and c ∈ R. Since ⟨u, w⟩ = 0 and ⟨v, w⟩ = 0 for all w ∈ S, we have

⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩ = 0,
⟨cu, w⟩ = c⟨u, w⟩ = 0

for all w ∈ S. So u + v, cu ∈ S⊥. Hence S⊥ is a subspace of V.

Example 6.4. Let A be an m × n real matrix. Then Nul A and Row A are orthogonal complements of each other in Rn, i.e.,

(Row A)⊥ = Nul A, (Nul A)⊥ = Row A.
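A small numeric illustration, with an assumed 2 × 3 matrix: any solution of Ax = 0 is orthogonal to every row of A, hence to all of Row A.

```python
# Sketch with an assumed 2x3 matrix: every vector in Nul A is
# orthogonal to every row of A (hence to all of Row A).

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

A = [[1.0, 2.0, 3.0],
     [0.0, 1.0, 1.0]]

# x solves Ax = 0 (back substitution: x3 = 1 => x2 = -1, x1 = -1).
x = [-1.0, -1.0, 1.0]
assert all(dot(row, x) == 0 for row in A)   # x in Nul A = (Row A)^perp
```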

7
Let V be an inner product space. A subset S = {u1, u2, . . . , uk} of nonzero vectors of V is called an orthogonal set if every pair of vectors is orthogonal, i.e., ⟨ui, uj⟩ = 0 for i ≠ j. An orthogonal set S = {u1, u2, . . . , uk} is called an orthonormal set if we further have ∥ui∥ = 1 for 1 ≤ i ≤ k.

An orthonormal basis of V is a basis which is also an orthonormal set.

Example 7.1. Given an orthogonal basis v1, v2, v3 and a vector v, the coefficients ci in v = c1v1 + c2v2 + c3v3 can be found in two ways.

Method 1: Solve the linear system by performing row operations on its augmented matrix

[v1, v2, v3 | v].

Method 2: Use orthogonality: ci = ⟨vi, v⟩/⟨vi, vi⟩, where i = 1, 2, 3.

Theorem 7.2. Let v1, v2, . . . , vk be an orthogonal basis of a subspace W. Then for any w ∈ W,

w = (⟨v1, w⟩/⟨v1, v1⟩)v1 + (⟨v2, w⟩/⟨v2, v2⟩)v2 + · · · + (⟨vk, w⟩/⟨vk, vk⟩)vk.

Write the vector v as v = [a1, a2, . . . , an]T. Then for any scalar c,

cv = [ca1, ca2, . . . , can]T = v[c],

where [c] is the 1 × 1 matrix with the only entry c. Note that

[v · y] = vTy.

Proj(y) = (v · y / v · v) v = (1/(v · v)) v [v · y] = (1/(v · v)) v (vTy) = (1/(v · v)) (v vT) y.

Thus the matrix of the projection onto the line spanned by v is (1/(v · v)) v vT, and the matrix of the complementary projection is

I − (1/(v · v)) v vT.
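A sketch of the rank-one projection matrix (1/(v · v)) v vT (the helper names are illustrative):

```python
# Sketch: the projection onto the line spanned by v as the rank-one
# matrix (1/(v.v)) v v^T.

def proj_matrix(v):
    vv = sum(a * a for a in v)
    return [[vi * vj / vv for vj in v] for vi in v]

def mat_vec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

v = [1.0, 2.0]
P = proj_matrix(v)

# P v = v, and P annihilates any vector orthogonal to v, e.g. [-2, 1].
assert all(abs(a - b) < 1e-12 for a, b in zip(mat_vec(P, v), v))
assert all(abs(a) < 1e-12 for a in mat_vec(P, [-2.0, 1.0]))
```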

Example 8.1. Find the linear mapping from R3 to R3 that is the orthogonal projection of R3 onto the plane x1 + x2 + x3 = 0.

Then

αi = ⟨vi, y⟩/⟨vi, vi⟩, 1 ≤ i ≤ k.

We thus define

ProjW(y) = (⟨v1, y⟩/⟨v1, v1⟩)v1 + (⟨v2, y⟩/⟨v2, v2⟩)v2 + · · · + (⟨vk, y⟩/⟨vk, vk⟩)vk,

called the orthogonal projection of y onto W. The linear transformation

In particular, if B is an orthonormal basis of W, then

ProjW(y) = ⟨v1, y⟩v1 + ⟨v2, y⟩v2 + · · · + ⟨vk, y⟩vk.

Proposition 8.3. Let W be a subspace of Rn. Let U = [u1, u2, . . . , uk] be an n × k matrix whose columns form an orthonormal basis of W. Then the orthogonal projection ProjW : Rn → Rn is given by

ProjW(y) = UUTy = (u1 · y)u1 + (u2 · y)u2 + · · · + (uk · y)uk.

Proof. Note that

UTy = [u1T; u2T; . . . ; ukT] y = [u1Ty, u2Ty, . . . , ukTy]T = [u1 · y, u2 · y, . . . , uk · y]T.

Hence

UUTy = [u1, u2, . . . , uk][u1 · y, u2 · y, . . . , uk · y]T = (u1 · y)u1 + (u2 · y)u2 + · · · + (uk · y)uk = ProjW(y).
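Proposition 8.3 can be sketched directly from the formula; the subspace and basis below are an assumed example, not the one in the notes:

```python
import math

# Sketch: Proj_W(y) = (u1.y) u1 + ... + (uk.y) uk for an orthonormal
# basis u1, ..., uk of W -- equivalently U U^T y.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(y, basis):
    out = [0.0] * len(y)
    for u in basis:
        c = dot(u, y)                      # coefficient u . y
        out = [o + c * ui for o, ui in zip(out, u)]
    return out

s2 = 1 / math.sqrt(2)
u1 = [s2, s2, 0.0]          # orthonormal basis of a plane W in R^3
u2 = [0.0, 0.0, 1.0]

y = [1.0, 3.0, 5.0]
p = proj(y, [u1, u2])
# p lies in W and y - p is orthogonal to W.
r = [a - b for a, b in zip(y, p)]
assert all(abs(dot(u, r)) < 1e-12 for u in (u1, u2))
```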

Example. Compute ProjW(y), where W is the plane x1 + x2 + x3 = 0. The vectors

v1 = [1, 0, −1]T, v2 = [1, −2, 1]T

form an orthogonal basis of W. Then

ProjW(y) = (v1 · y / v1 · v1)v1 + (v2 · y / v2 · v2)v2 = ((y1 − y3)/2)v1 + ((y1 − 2y2 + y3)/6)v2,

where y = [y1, y2, y3]T.

The following two vectors form an orthonormal basis of W = Span{u1, u2}:

u1 = [1/√3, 1/√3, 1/√3]T, u2 = [1/√2, 0, −1/√2]T.

Alternatively, the matrix can be found by computing the orthogonal projection:

ProjW(y) = UUTy =
[  5/6  1/3  −1/6 ]
[  1/3  1/3   1/3 ] [y1, y2, y3]T.
[ −1/6  1/3   5/6 ]


Example 9.1. Let W be the subspace of R4 spanned by

v1 = [1, 1, 1, 1]T, v2 = [1, 1, 1, 0]T, v3 = [1, 1, 0, 0]T.

Construct an orthogonal basis for W.
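A sketch of the Gram-Schmidt process; the spanning vectors below follow the reading of Example 9.1 above, which should be treated as an assumption:

```python
# Sketch of the Gram-Schmidt process on three independent vectors in R^4.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vs):
    """Turn a list of independent vectors into an orthogonal list."""
    ws = []
    for v in vs:
        w = list(v)
        for u in ws:                       # subtract the projections
            c = dot(v, u) / dot(u, u)      # onto the earlier w's
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ws.append(w)
    return ws

v1, v2, v3 = [1, 1, 1, 1], [1, 1, 1, 0], [1, 1, 0, 0]
w1, w2, w3 = gram_schmidt([v1, v2, v3])

# The output vectors are pairwise orthogonal.
assert abs(dot(w1, w2)) < 1e-12
assert abs(dot(w1, w3)) < 1e-12 and abs(dot(w2, w3)) < 1e-12
```

Normalizing each wi then gives an orthonormal basis of W, which is also how the Q factor in the QR decomposition below is built.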

Theorem 9.2. Any m × n real matrix A can be written as

A = QR,

where the columns of Q are orthonormal and R is an upper triangular matrix.

Let v1, v2, v3, v4 be the column vectors of A. Set

w1 = v1.

Then v1 = w1. Set

Theorem 10.1. A linear transformation T : V → V is an isometry if and only if T preserves the inner product, i.e., for u, v ∈ V,

⟨T(u), T(v)⟩ = ⟨u, v⟩.

Proof. Note that for vectors u, v ∈ V ,

Theorem. Let Q be an n × n real matrix. Then the following statements are equivalent:

(a) Q is orthogonal, i.e., QTQ = I;

(b) QT is orthogonal;

(c) the column vectors of Q are orthonormal.

v1 = p11u1 + p21u2 + p31u3,
v2 = p12u1 + p22u2 + p32u3,
v3 = p13u1 + p23u2 + p33u3.

Expanding ⟨vi, vj⟩ by linearity and using the orthonormality of u1, u2, u3, the proof is finished.

Theorem 10.4. Let V be an n-dimensional inner product space with an orthonormal basis B = {u1, u2, . . . , un}. Let T : V → V be a linear transformation. Then T is an isometry if and only if the matrix of T relative to B is an orthogonal matrix.
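A quick numeric check that an orthogonal matrix (here an assumed 2 × 2 rotation) preserves inner products, and hence lengths and angles:

```python
import math

# Sketch: a rotation matrix Q (orthogonal: Q^T Q = I) is an isometry.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def mat_vec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

t = 0.7                                   # arbitrary rotation angle
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

u, v = [1.0, 2.0], [3.0, -1.0]
assert abs(dot(mat_vec(Q, u), mat_vec(Q, v)) - dot(u, v)) < 1e-12
```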

The set B = {[1, 1, 1]T, [1, −1, 0]T, [1, 1, −2]T} of the column vectors of A is an orthogonal basis of R3. However, the set of the row vectors of A is not an orthogonal set.

The matrix U has orthonormal columns. Computing Uv directly (the entries of Uv involve 1/√3 and 4/√6) and comparing lengths shows that ∥Uv∥ = ∥v∥; an orthogonal matrix preserves length.

11 Symmetric mappings
Let V be an n-dimensional real inner product space. A linear mapping T : V → V is said to be symmetric if

⟨T(u), v⟩ = ⟨u, T(v)⟩ for all u, v ∈ V.

Example 11.1. Let A be a real symmetric n × n matrix. Let T : Rn → Rn be defined by T(x) = Ax. Then T is symmetric for the Euclidean n-space. In fact, for u, v ∈ Rn, we have

T(u) · v = (Au) · v = (Au)Tv = uTATv = uTAv = u · (Av) = u · T(v).

Proposition 11.1. Let V be an n-dimensional real inner product space with an orthonormal basis B = {u1, u2, . . . , un}. Let T : V → V be a linear mapping whose matrix relative to B is A. Then T is symmetric if and only if the matrix A is symmetric.

Proof. Note that

Theorem 11.2. The roots of the characteristic polynomial of a real symmetric matrix A are all real numbers.

Proof. Let λ be a (possibly complex) root of the characteristic polynomial of A, and let v be a (possibly complex) eigenvector for the eigenvalue λ. Then

Av = λv.

Note that

Proof. Let u be an eigenvector for λ, and let v be an eigenvector for µ, i.e., T(u) = λu, T(v) = µv. Then

λ⟨u, v⟩ = ⟨λu, v⟩ = ⟨T(u), v⟩ = ⟨u, T(v)⟩ = ⟨u, µv⟩ = µ⟨u, v⟩.

Thus (λ − µ)⟨u, v⟩ = 0. Since λ − µ ≠ 0, it follows that ⟨u, v⟩ = 0.
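The orthogonality of eigenvectors for distinct eigenvalues can be verified on a small assumed symmetric matrix:

```python
# Sketch with an assumed symmetric matrix: eigenvectors of a symmetric
# matrix for distinct eigenvalues are orthogonal.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def mat_vec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

A = [[2.0, 1.0],
     [1.0, 2.0]]                 # eigenvalues 1 and 3

u = [1.0, -1.0]                  # A u = 1 * u
v = [1.0,  1.0]                  # A v = 3 * v
assert mat_vec(A, u) == [1.0 * x for x in u]
assert mat_vec(A, v) == [3.0 * x for x in v]
assert dot(u, v) == 0.0          # eigenvectors are orthogonal
```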
⟨T(w), u1⟩ = ⟨w, T(u1)⟩ = ⟨w, λ1u1⟩ = λ1⟨w, u1⟩ = 0.

This means that T(w) ∈ W. Thus the restriction T|W : W → W is a symmetric linear transformation. Since dim W = n − 1, by induction hypothesis, W has an orthonormal basis {u2, . . . , un} of eigenvectors of T|W. Clearly, B = {u1, u2, . . . , un} is an orthonormal basis of V , and u1, u2, . . . , un are eigenvectors of T.

T(u) · v = (Au) · v = (Au)Tv = uTATv = uTAv = u · (Av) = u · T(v).

∆(t) = (t + 2)2(t − 7).

Set w1 = v1 = [1, 1, 0]T,

w2 = v2 − (v2 · w1 / w1 · w1) w1.


Normalizing the resulting orthogonal eigenvectors yields an orthonormal basis, and the orthogonal matrix Q whose columns are these normalized eigenvectors (its entries involve 1/√3 and 1/√6) diagonalizes the symmetric matrix A.

An n × n real symmetric matrix A is called positive definite if, for any nonzero vector u ∈ Rn,

⟨u, Au⟩ = uTAu > 0.
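A sketch evaluating uTAu for a sample matrix (the matrix and test vectors are assumptions; a handful of positive values only illustrates the condition, it does not prove definiteness):

```python
# Sketch: checking u^T A u > 0 directly for an assumed symmetric matrix.

def quad_form(A, u):
    """u^T A u = sum_{i,j} u_i * A_ij * u_j."""
    return sum(u[i] * A[i][j] * u[j]
               for i in range(len(u)) for j in range(len(u)))

A = [[2.0, -1.0],
     [-1.0, 2.0]]                # positive definite (eigenvalues 1, 3)

for u in ([1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0], [3.0, 2.0]):
    assert quad_form(A, u) > 0
```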

12 Complex inner product spaces

Definition 12.1. Let V be a complex vector space. An inner product of V is a function ⟨ , ⟩ : V × V → C satisfying the following properties:

⟨u, av + bw⟩ = ā⟨u, v⟩ + b̄⟨u, w⟩.

The notions of orthogonal set, orthonormal basis, orthogonal projection, the Gram-Schmidt process, etc., carry over to complex inner product spaces.

Two vectors u, v in a complex inner product space V are called orthogonal if ⟨u, v⟩ = 0.

A set {v1, v2, . . . , vk} of nonzero vectors of V is called an orthogonal set if the vectors v1, v2, . . . , vk are mutually orthogonal. A basis of V is called an orthogonal basis if its vectors are mutually orthogonal.

Theorem 12.3. Let B = {v1, v2, . . . , vn} be an orthogonal basis of an inner product space V. Then for any v ∈ V,

v = (⟨v, v1⟩/⟨v1, v1⟩)v1 + (⟨v, v2⟩/⟨v2, v2⟩)v2 + · · · + (⟨v, vn⟩/⟨vn, vn⟩)vn.

where U = [u1, u2, . . . , uk] is an n × k complex matrix.
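The complex inner product convention above (conjugate-linear in the second slot) matches Python's built-in complex arithmetic; a minimal sketch with the standard inner product on C2:

```python
# Sketch: the standard complex inner product <u, v> = sum u_i conj(v_i),
# linear in the first slot and conjugate-linear in the second.

def cdot(u, v):
    return sum(ui * vj.conjugate() for ui, vj in zip(u, v))

u = [1 + 1j, 2 - 1j]
v = [0 + 1j, 1 + 0j]

# Conjugate symmetry: <u, v> = conj(<v, u>)
assert cdot(u, v) == cdot(v, u).conjugate()
# <u, u> is real and positive for u != 0
assert cdot(u, u).imag == 0 and cdot(u, u).real > 0
```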

Theorem 12.4. Let B = {v1, v2, . . . , vn} be a basis of an inner product space V. Let A be the matrix of the inner product of V, i.e., A = [aij], where aij = ⟨vi, vj⟩. Then for u, v ∈ V,

⟨u, v⟩ = xTAȳ,

where x = [u]B and y = [v]B; in particular, xTAx̄ > 0 for all x ≠ 0.


or equivalently, AA∗= A∗A = I.

Theorem 12.6. Let A be a complex square matrix. Then the following statements are equivalent:

Then B′ = {v1, v2, v3} is given by

[v1, v2, v3] = [u1, u2, u3]A,

i.e., vj = a1ju1 + a2ju2 + a3ju3 for j = 1, 2, 3. Then

⟨vi, vj⟩ = ⟨a1iu1 + a2iu2 + a3iu3, a1ju1 + a2ju2 + a3ju3⟩ = a1iā1j + a2iā2j + a3iā3j.

Note that B′ being an orthonormal basis is equivalent to ⟨vi, vj⟩ = δij for all i, j, i.e., to A∗A = I. The proof is finished.

A linear transformation T : V → V is called an isometry of V if T preserves lengths of vectors, i.e., for any v ∈ V,

∥T(v)∥ = ∥v∥.

PageId: DOC0CE4F78