3F Duality


Topics

  • Dual Spaces and the Dual Map
  • Kernel and Range of the Dual
  • Matrix of the Dual



Dual Spaces and the Dual Map


Content

  • Definition: Linear Functional
  • Definition: Dual Space, $V'$
  • Lemma: $\dim(V') = \dim(V)$
  • Definition: Dual Basis
  • Lemma: Dual Basis and Linear Combination Scalars
  • Lemma: Dual Basis is a Basis for the Dual Space
  • Definition: Dual Map, $T'$
  • Lemma: Algebraic Properties of Dual Maps

Definition: Linear Functional

A linear functional on $V$ is a linear map from $V$ to $\mathbb{F}$. In other words, a linear functional is an element of $\mathscr{L}(V, \mathbb{F})$.

Definition: Dual Space, $V'$

The dual space of $V$, denoted $V'$, is the vector space of all linear functionals on $V$. In other words, $V' = \mathscr{L}(V, \mathbb{F})$.

Lemma: $\dim(V') = \dim(V)$

Suppose $V$ is a finite-dimensional vector space. Then $V'$ is also finite-dimensional, and $\dim(V') = \dim(V)$.

Proof:

Let $n = \dim(V)$. Then because $V' = \mathscr{L}(V, \mathbb{F})$,

$$\dim(V') = \dim\left( \mathscr{L}(V, \mathbb{F}) \right) = \dim(V) \cdot \dim(\mathbb{F}) = n \cdot 1 = n.$$

Definition: Dual Basis

Let $V$ be a finite-dimensional vector space with basis $\beta = (v_1, \ldots, v_n)$. The dual basis of $\beta$ is the list $(\varphi_1, \ldots, \varphi_n)$ of elements of $V'$, where each $\varphi_i$ sends $v_i$ to $1$ and every other $v_j$ to $0$.
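
As a numeric sketch of this definition (an assumed example, not from the text): if $V = \mathbb{R}^3$ and the basis vectors are the columns of an invertible matrix $B$, then the dual basis functionals are represented by the rows of $B^{-1}$, since row $i$ of $B^{-1}$ applied to column $j$ of $B$ is $\delta_{ij}$.

```python
import numpy as np

# Basis of R^3 taken as the columns of an invertible matrix B (assumed example)
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# The dual basis: row i of inv(B) acts as the functional phi_i
dual = np.linalg.inv(B)

# phi_i(v_j) = delta_ij, so dual @ B is the identity matrix
print(np.allclose(dual @ B, np.eye(3)))  # True
```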

Lemma: Dual Basis and Linear Combination Scalars

Suppose $\beta = (v_1, \ldots, v_n)$ is a basis for $V$ and $(\varphi_1, \ldots, \varphi_n)$ is its dual basis. Then for every $v \in V$, $v = \varphi_1(v) v_1 + \cdots + \varphi_n(v) v_n$.

Proof:

Let $v \in V$. Because $\beta$ is a basis for $V$, there exist unique scalars $c_1, \ldots, c_n$ such that $v = c_1 v_1 + \cdots + c_n v_n$. Now apply any $\varphi_i$ to this equation. Because $\varphi_i$ is linear, we get

$$\varphi_i(v) = \varphi_i(c_1 v_1 + \cdots + c_n v_n) = c_1 \varphi_i(v_1) + \cdots + c_i \varphi_i(v_i) + \cdots + c_n \varphi_i(v_n)$$

$$= 0 + \cdots + c_i \cdot 1 + \cdots + 0 = c_i.$$
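
The lemma can be checked numerically (a sketch under the assumption, as above, that the basis vectors are the columns of an invertible matrix $B$ and the dual basis functionals are the rows of $B^{-1}$):

```python
import numpy as np

# Assumed encoding: basis vectors are the columns of B, and the dual
# basis functionals phi_i are the rows of inv(B).
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
phis = np.linalg.inv(B)

v = np.array([2., -1., 3.])
coords = phis @ v                  # (phi_1(v), ..., phi_n(v))

# The lemma: v = phi_1(v) v_1 + ... + phi_n(v) v_n
print(np.allclose(B @ coords, v))  # True
```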

Lemma: Dual Basis is a Basis for the Dual Space

Suppose $V$ is a finite-dimensional vector space and $\beta = (v_1, \ldots, v_n)$ is a basis for $V$. Then the dual basis $(\varphi_1, \ldots, \varphi_n)$ is a basis for $V'$.

Proof:

Because the list has length $n = \dim(V')$, it suffices to show linear independence. Suppose $c_1 \varphi_1 + \cdots + c_n \varphi_n = 0$. Applying this map to any $v \in V$ must give $0$, and applying it to each basis vector $v_i$ gives $c_i$. So every $c_i$ must be $0$.

Definition: Dual Map, $T'$

Let $T \in \mathscr{L}(V, W)$. The dual map of $T$ is the linear map $T' \in \mathscr{L}(W', V')$ defined for each $\varphi \in W'$ by

$$T'(\varphi) = \varphi \circ T.$$
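
A hypothetical matrix sketch of this definition: if functionals on $\mathbb{R}^p$ are encoded as row vectors and $T$ is multiplication by a matrix $A$, then $T'(\varphi) = \varphi \circ T$ is the row vector $\varphi A$.

```python
import numpy as np

# T: R^3 -> R^2 is multiplication by A (assumed example)
A = np.array([[1., 2., 0.],
              [0., 1., 1.]])

phi = np.array([3., -1.])    # a functional on R^2, as a row vector

v = np.array([1., 1., 2.])
lhs = (phi @ A) @ v          # T'(phi) applied to v
rhs = phi @ (A @ v)          # phi applied to T(v)
print(np.isclose(lhs, rhs))  # True
```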

Lemma: Algebraic Properties of Dual Maps

Suppose TL(V,W)T \in \mathscr{L}(V, W). Then a) (S+T)=S+T(S + T)' = S' + T' for every SL(V,W)S \in \mathscr{L}(V, W). b) (λT)=λT(\lambda T)' = \lambda T' for all λF\lambda \in \mathbb{F}. c) (TS)=ST(TS)' = S' T' for all SL(U,V)S \in \mathscr{L}(U,V).

Proof:

The proofs of (a) and (b) are left to the reader. To prove (c), we use the definition:

$$(TS)'(\varphi) = \varphi \circ T \circ S = S'(\varphi \circ T) = S'(T'(\varphi)) = (S' \circ T')(\varphi).$$
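
Property (c) can also be sketched numerically with functionals encoded as row vectors (an assumed setup): if $S$ has matrix $B$ and $T$ has matrix $A$, then $TS$ has matrix $AB$, and $(TS)'(\varphi) = \varphi(AB) = (\varphi A)B = S'(T'(\varphi))$.

```python
import numpy as np

# S: R^2 -> R^3 with matrix B, T: R^3 -> R^2 with matrix A (assumed example)
B = np.array([[1., 1.],
              [0., 1.],
              [2., 0.]])
A = np.array([[1., 0., 2.],
              [0., 1., 1.]])

phi = np.array([1., -1.])    # a functional on R^2, as a row vector

# (TS)'(phi) = phi @ (A @ B) and S'(T'(phi)) = (phi @ A) @ B agree
print(np.allclose(phi @ (A @ B), (phi @ A) @ B))  # True
```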


Kernel and Range of a Dual


Content

  • Definition: Span (for infinite sets)
  • Lemma: For any subset $U \subseteq V$, $\text{span}(U)$ is the smallest subspace containing $U$.
  • Definition: Annihilator, $U^0$
  • Theorem: Annihilators are the same for generating sets of subspaces
  • Lemma: The annihilator is a subspace
  • Lemma: Basis for an Annihilator
  • Theorem: Dimension of an Annihilator
  • Corollary: Condition for the annihilator to equal $\{0\}$ or the whole space
  • Lemma: The Kernel of $T'$
  • Theorem: $T$ is surjective $\iff T'$ is injective.
  • Lemma: The Range of $T'$
  • Theorem: $T$ is injective $\iff T'$ is surjective.

We start with a slight deviation from the book.

Definition: Span (for infinite sets)

Let UVU \subseteq V. Then

span(U)={c1u1++cnun  u1,,unU and c1,,cnF}.span(U) = \left\{ c_1 u_1 + \cdots + c_n u_n \ | \ u_1, \ldots, u_n \in U \text{ and } c_1, \ldots, c_n \in \mathbb{F} \right\}.

Lemma: Smallest Subspace

For any subset $U \subseteq V$, $\text{span}(U)$ is the smallest subspace containing $U$.

Proof:

(Sketch) a) To show that $\text{span}(U)$ contains the zero vector and is closed under vector addition and scalar multiplication, note that all three statements amount to producing linear combinations of elements of $U$; the latter two involve adding and scaling linear combinations. b) For the smallest-subspace claim, $\text{span}(U)$ clearly contains $U$ because $1 \cdot u = u$ for every $u \in U$, so we just need to show that every subspace containing $U$ contains everything in $\text{span}(U)$. This follows because subspaces contain all linear combinations of their elements, by closure under vector addition and scalar multiplication.

Definition: Annihilator, $U^0$

Let $U$ be a subset of $V$. The annihilator of $U$ is the set of linear functionals that "annihilate" everything in $U$:

$$U^0 = \left\{ \varphi \in V' \ | \ \varphi(u) = 0 \ \forall u \in U \right\}.$$
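
In coordinates, the annihilator becomes a null-space computation. A hypothetical example: if $U$ is spanned by the columns of a matrix $M$ in $\mathbb{R}^4$ and functionals are encoded as row vectors, then $\varphi \in U^0$ exactly when $\varphi M = 0$, so $U^0$ is the null space of $M^T$.

```python
import numpy as np

# U = span of the columns of M inside R^4 (assumed example)
M = np.array([[1., 0.],
              [0., 1.],
              [1., 1.],
              [0., 0.]])

# U^0 = null space of M^T, read off from the SVD of M^T
_, s, vt = np.linalg.svd(M.T)
rank = int(np.sum(s > 1e-10))
annihilator_basis = vt[rank:]     # rows spanning U^0

# Every functional in U^0 kills every generator of U
print(np.allclose(annihilator_basis @ M, 0))  # True
```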

Theorem: Annihilators are the same for generating sets of subspaces.

Let $S \subseteq V$ and let $U = \text{span}(S)$. Then $U^0 = S^0$.

Proof:

The inclusion $U^0 \subseteq S^0$ holds because any $\varphi \in U^0$ sends everything in $U$ to $0$, so in particular it sends everything in $S \subseteq U$ to $0$. The other inclusion $S^0 \subseteq U^0$ follows from the linearity of the functionals. Suppose $\varphi \in S^0$ and let $u \in U$. Then $u = c_1 v_1 + \cdots + c_n v_n$ for some $v_1, \ldots, v_n \in S$ and $c_1, \ldots, c_n \in \mathbb{F}$, and therefore

$$\varphi(u) = \varphi(c_1 v_1 + \cdots + c_n v_n) = c_1 \varphi(v_1) + \cdots + c_n \varphi(v_n)$$

$$= c_1 \cdot 0 + \cdots + c_n \cdot 0 = 0.$$

Lemma: The annihilator is a subspace

Let $U \subseteq V$. Then $U^0$ is a subspace of $V'$.

Proof:

We use the usual subspace test. Zero vector: the zero functional sends everything to $0$, so it is in the annihilator of every subset of $V$. Vector addition: suppose $\varphi_1, \varphi_2 \in U^0$ and let $u \in U$. Then

$$(\varphi_1 + \varphi_2)(u) = \varphi_1(u) + \varphi_2(u) = 0 + 0 = 0.$$

Scalar multiplication: suppose $\lambda \in \mathbb{F}$ and $\varphi \in U^0$. Let $u \in U$. Then

$$(\lambda \varphi)(u) = \lambda\left( \varphi(u) \right) = \lambda(0) = 0.$$

We now deviate more from the book, starting with a claim about a basis for the annihilator.

Claim 1: Basis of the Annihilator

Let $V$ be a finite-dimensional vector space, $U$ a subspace of $V$, $(u_1, \ldots, u_k)$ a basis for $U$, and $(u_1, \ldots, u_k, v_1, \ldots, v_\ell)$ an extension to a basis for $V$. Let $(\mu_1, \ldots, \mu_k, \varphi_1, \ldots, \varphi_\ell)$ be the corresponding dual basis, a basis for $V'$. Then the list of dual extension vectors, $(\varphi_1, \ldots, \varphi_\ell)$, is a basis for $U^0$.

Proof:

This should remind you of the proof of the rank-nullity theorem. $(\varphi_1, \ldots, \varphi_\ell)$ is linearly independent because it is part of a basis. Each $\varphi_i$ annihilates $U$ because it sends all of the basis vectors $u_1, \ldots, u_k$ to $0$. Finally, $(\varphi_1, \ldots, \varphi_\ell)$ spans $U^0$ for the following reason: for any $\varphi \in U^0$, $\varphi \in V'$ means there exist scalars $c_1, \ldots, c_k, d_1, \ldots, d_\ell \in \mathbb{F}$ such that $\varphi = \sum_{i=1}^k c_i \mu_i + \sum_{j=1}^\ell d_j \varphi_j$. But each $c_i$ must be $0$, because $c_i = \varphi(u_i)$ and $\varphi$ sends every $u_i$ to $0$. So $\varphi = \sum_{j=1}^\ell d_j \varphi_j$.


Picture of the Setup

Theorem: Dimension of an Annihilator

Suppose $V$ is finite-dimensional and $U$ is a subspace of $V$. Then

$$\dim(U^0) = \dim(V) - \dim(U).$$

Proof:

In Claim 1, $k = \dim(U)$, $\ell = \dim(U^0)$, and $k + \ell = \dim(V)$.
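
The dimension formula can be spot-checked numerically (a hypothetical example in $\mathbb{R}^5$, with $U^0$ computed as the null space of $M^T$ for a matrix $M$ whose columns span $U$):

```python
import numpy as np

# U = span of two generic columns in V = R^5 (assumed example)
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 2))

dim_U = np.linalg.matrix_rank(M)

# dim(U^0) = dimension of the null space of M^T = 5 - rank(M^T)
dim_U0 = 5 - np.linalg.matrix_rank(M.T)

# The theorem: dim(U^0) = dim(V) - dim(U)
print(dim_U0 == 5 - dim_U)  # True
```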

Corollary: Condition for the annihilator to equal $\{0\}$ or the whole space

Suppose $V$ is a finite-dimensional vector space and $U$ is a subspace of $V$. Then a) $U^0 = \{0\} \iff U = V$. b) $U^0 = V' \iff U = \{0\}$.

Proof:

Both conditions in part (a) hold if and only if $k = \dim(V)$ and $\ell = 0$ in Claim 1. Both conditions in part (b) hold if and only if $k = 0$ and $\ell = \dim(V)$.

We now have two more claims related to the book's results, followed by a setup that will take care of all the remaining proofs.

Claim 2: $\ker(T') = T(V)^0$

Let $T \in \mathscr{L}(V, W)$. Then $\ker(T') = T(V)^0$.

Proof:

On the one hand, $\ker(T')$ is the set of dual vectors $\psi \in W'$ such that $(\psi \circ T)(v) = 0$ for every $v \in V$. On the other hand, $T(V)^0$ is the set of dual vectors $\psi \in W'$ such that $\psi(T(v)) = 0$ for every $v \in V$. Since $(\psi \circ T)(v) = \psi(T(v))$, these are the same set.

Claim 3: $T'(W') \subseteq \ker(T)^0$

Proof:

Let $\psi \in W'$, so that $T'(\psi)$ represents an arbitrary element of $T'(W')$. Let $u \in \ker(T)$. Then $(T'(\psi))(u) = (\psi \circ T)(u) = \psi(0) = 0$. Thus $T'(\psi)$ annihilates every $u \in \ker(T)$, i.e. $T'(\psi) \in \ker(T)^0$.

Setup for the rest of this section

  • Let $T \in \mathscr{L}(V, W)$, where $V$ and $W$ are finite-dimensional.
  • Let $(u_1, \ldots, u_k)$ be a basis for $\ker(T)$.
  • Let $(u_1, \ldots, u_k, v_1, \ldots, v_\ell)$ be an extension to a basis for $V$. It follows from the proof of the rank-nullity theorem that $(w_1, \ldots, w_\ell) = \left( T(v_1), \ldots, T(v_\ell) \right)$ is a basis for $T(V)$.
  • Let $(w_1, \ldots, w_\ell, y_1, \ldots, y_m)$ be an extension to a basis for $W$.
  • Let $(\omega_1, \ldots, \omega_\ell, \psi_1, \ldots, \psi_m)$ be the corresponding dual basis for $W'$.
  • Observe that $(\psi_1, \ldots, \psi_m)$ is a basis for $T(V)^0$ by Claim 1, and hence for $\ker(T')$ by Claim 2.
  • Observe that the dual vectors $(\varphi_1, \ldots, \varphi_\ell) = \left( T'(\omega_1), \ldots, T'(\omega_\ell) \right)$ in $V'$ are a basis for $T'(W')$ by the proof of the rank-nullity theorem.

Picture of the Setup


Lemma: The Kernel of $T'$

Suppose $V$ and $W$ are finite-dimensional vector spaces and $T \in \mathscr{L}(V, W)$. Then a) $\ker(T') = T(V)^0$. b) $\dim(\ker(T')) = \dim(\ker(T)) + \dim(W) - \dim(V)$.

Proof:

a) This is Claim 2. b) In the setup above, the left-hand side is $m$, and the right-hand side is $k + (\ell + m) - (k + \ell) = m$.
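
Under the row-vector encoding of functionals (an assumed sketch, not from the text), $\ker(T')$ corresponds to the null space of $A^T$ when $T$ is multiplication by $A$, so part (b) becomes a statement about the nullities of $A$ and $A^T$:

```python
import numpy as np

# T: R^4 -> R^3 is multiplication by A (assumed example), so
# dim V = 4 and dim W = 3; note row 3 of A = row 1 + row 2
A = np.array([[1., 0., 1., 0.],
              [0., 1., 1., 0.],
              [1., 1., 2., 0.]])
n, p = 4, 3
r = np.linalg.matrix_rank(A)   # rank(A) = rank(A^T)

dim_ker_T = n - r              # nullity of A
dim_ker_Tp = p - r             # nullity of A^T

# Part (b): dim ker(T') = dim ker(T) + dim(W) - dim(V)
print(dim_ker_Tp == dim_ker_T + p - n)  # True
```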

Theorem: $T$ is surjective $\iff T'$ is injective

Suppose $V$ and $W$ are finite-dimensional and $T \in \mathscr{L}(V, W)$. Then $T$ is surjective $\iff T'$ is injective.

Proof:

Both conditions hold if and only if $m = 0$ in the setup above: $T$ is surjective exactly when $T(V) = W$, and $T'$ is injective exactly when $\ker(T') = \{0\}$.

Lemma: The Range of $T'$

Suppose $V$ and $W$ are finite-dimensional and $T \in \mathscr{L}(V, W)$. Then a) $\dim(T'(W')) = \dim(T(V))$. b) $T'(W') = \ker(T)^0$.

Proof:

a) Both dimensions equal $\ell$ in the setup above. b) $T'(W') \subseteq \ker(T)^0$ by Claim 3, and the two subspaces are equal because both have dimension $\ell$ (note $\dim(\ker(T)^0) = \dim(V) - \dim(\ker(T)) = (k + \ell) - k = \ell$).

Theorem: $T$ is injective $\iff T'$ is surjective

Suppose $V$ and $W$ are finite-dimensional and $T \in \mathscr{L}(V, W)$. Then $T$ is injective $\iff T'$ is surjective.

Proof:

Both conditions hold if and only if $k = 0$ in the setup above.

Matrix of a Dual


Content

  • Theorem: Matrix of $T'$ is the transpose of the matrix of $T$
  • Theorem: Column Rank = Row Rank

Theorem: Matrix of $T'$ is the transpose of the matrix of $T$

Suppose $V$ and $W$ are finite-dimensional vector spaces and $T \in \mathscr{L}(V, W)$. Then

$$\mathcal{M}(T') = \left( \mathcal{M}(T) \right)^T.$$

Proof:

(Summary) There are isomorphisms between $V$ and $V'$ and between $W$ and $W'$ that extend linearly from the bijection between a given basis and its dual basis. With respect to a basis and its dual basis, a functional whose coordinate column is $c$ acts on a vector with coordinate column $x$ as the product $c^T x$; that is, functionals act as transposed (row) vectors. Since $T'$ is pre-composition with $T$, it sends the functional with row vector $c^T$ to the functional with row vector $c^T \mathcal{M}(T)$, whose coordinate column is $\mathcal{M}(T)^T c$. To verify this, apply $T'$ to each dual basis vector of $W'$ and read off the corresponding column of $\mathcal{M}(T')$.
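
The key step is easy to sanity-check in coordinates (an assumed example): with functionals as row vectors, $T'(\varphi)$ is the row $cA$ when $\varphi$ has coordinate column $c$, and the coordinate column of the row $cA$ is exactly $A^T c$.

```python
import numpy as np

# T: R^2 -> R^3 with matrix A (assumed example)
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

c = np.array([1., -2., 1.])   # coordinates of phi in the dual basis of W'

# Row-vector action of T'(phi) vs. coordinate column A^T c
print(np.allclose(c @ A, A.T @ c))  # True
```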

Theorem: Column Rank = Row Rank

Suppose $A \in \mathbb{F}^{p,n}$. Then the column rank of $A$ equals the row rank of $A$.

Proof:

Let $V = \mathbb{F}^n$, $W = \mathbb{F}^p$, and let $T \in \mathscr{L}(V, W)$ be the map given by multiplying by $A$. Then the column rank of $A$ is the dimension of $T(V)$. Recall the setup from the previous section, titled "Setup for the rest of this section". In that setup, $\ell$ is shown to be the dimension of both $T(V)$ and $T'(W')$. The row rank of $A$ is the column rank of $A^T = \mathcal{M}(T')$, which is $\dim(T'(W')) = \ell$. So the column rank and the row rank of $A$ both equal $\ell$.
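
The theorem itself reduces to a one-line numeric check (a hypothetical example matrix):

```python
import numpy as np

# An assumed example matrix with a dependent row (row 2 = 2 * row 1)
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])

col_rank = np.linalg.matrix_rank(A)     # column rank of A
row_rank = np.linalg.matrix_rank(A.T)   # column rank of A^T = row rank of A
print(col_rank == row_rank)  # True
```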