hilbert_space [2021/04/02 19:03] (current) – admin
is a basis for it.
  
This subspace is //effectively// the same as $\mathbb{C}^2$.  If we do not bother to write down the third component then we have a vector in $\mathbb{C}^2$, and we can always reconstruct the vector in $\mathbb{C}^3$ that it came from by putting the zero back in the third component.  Mathematicians would say that $\mathbb{C}^2$ is //**isomorphic**// to this subspace of $\mathbb{C}^3$, which means that there exists a one-to-one map between vectors in the subspace and vectors in $\mathbb{C}^2$.
  
The distinction is somewhat important because $\mathbb{C}^2$ can be embedded in $\mathbb{C}^3$ in a variety of different ways.  For example, the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ a \\ b\end{array}\right ),\]
is also a two-dimensional subspace of $\mathbb{C}^3$ that is isomorphic to $\mathbb{C}^2$.  One possible basis for this subspace is
\[\left ( \begin{array}{c} 1 \\ 1 \\ 0\end{array}\right ),\qquad \left ( \begin{array}{c} 0 \\ 0 \\ 1\end{array}\right ).\]
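To make the embedding concrete, here is a small numerical sketch (using numpy; the particular values $a = 2$, $b = 5$ are arbitrary choices) checking that a vector of the form $(a, a, b)$ is a linear combination of the two basis vectors above:

```python
import numpy as np

# Basis for the subspace of C^3 consisting of vectors (a, a, b)
e1 = np.array([1, 1, 0], dtype=complex)
e2 = np.array([0, 0, 1], dtype=complex)

# An arbitrary vector of the form (a, a, b), here with a = 2, b = 5
v = np.array([2, 2, 5], dtype=complex)

# Its components in this basis are just a and b
w = 2 * e1 + 5 * e2
print(np.allclose(v, w))  # True
```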
====== 2.i.4 Dual Vectors and Inner Products ======
  
===== Dual Vector Spaces =====
where $\boldsymbol{f}_{\vec{s}}\vec{r}$ is just matrix multiplication of the row vector $\boldsymbol{f}_{\vec{s}}$ with the column vector $\vec{r}$.
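This action can be sketched numerically (the specific vectors are arbitrary choices, and representing the dual vector of $\vec{s}$ as its conjugate-transpose row vector is an assumption consistent with an inner product that is conjugate-linear in its first argument):

```python
import numpy as np

s = np.array([1 + 1j, 2], dtype=complex)
r = np.array([3, 1 - 2j], dtype=complex)

# Dual vector of s as a row vector: the conjugate transpose of s
# (an assumption matching a conjugate-linear first argument)
f_s = s.conj().reshape(1, -1)

# Acting on r is just matrix multiplication of a row with a column
result = (f_s @ r.reshape(-1, 1))[0, 0]
print(np.isclose(result, np.vdot(s, r)))  # True (vdot conjugates its first argument)
```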
  
====== 2.i.5 Orthonormal Bases ======

A basis $\phi_1,\phi_2,\cdots,\phi_d$ for a $d$-dimensional inner product space is called **//orthonormal//** if
\[(\phi_j,\phi_k) = \delta_{jk} = \begin{cases} 0, & j\neq k \\ 1, & j=k.\end{cases}\]

For any basis, we can write any vector as $\psi = \sum_{j=1}^d b_j \phi_j$, and if the basis is also orthonormal then
\[(\phi_k,\psi) = \sum_{j=1}^d b_j (\phi_k,\phi_j) = \sum_{j=1}^d b_j \delta_{jk} = b_k,\]
so there is an easy way of finding the components of a vector in an orthonormal basis: just take the inner products
\[b_k = (\phi_k,\psi).\]
Note: this only works in an //orthonormal// basis.

As an example, in $\mathbb{R}^2$ and $\mathbb{C}^2$, the basis $\left ( \begin{array}{c} 1 \\ 0 \end{array} \right ), \left ( \begin{array}{c} 0 \\ 1 \end{array} \right )$ is orthonormal, but the basis $\left ( \begin{array}{c} 1 \\ 0 \end{array} \right ), \left ( \begin{array}{c} 1 \\ 1 \end{array} \right )$ is not.
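The component formula $b_k = (\phi_k,\psi)$ can be spot-checked numerically (the vector $\psi$ is an arbitrary choice; `np.vdot` conjugates its first argument, matching an inner product that is conjugate-linear in the first slot). The same recipe visibly fails for the non-orthonormal basis:

```python
import numpy as np

# Orthonormal basis of C^2 (the standard basis)
phi = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]

psi = np.array([2 - 1j, 3j])

# Components via inner products: b_k = (phi_k, psi)
b = [np.vdot(p, psi) for p in phi]  # vdot conjugates its first argument

# Reconstruct psi from its components
recon = sum(bk * pk for bk, pk in zip(b, phi))
print(np.allclose(recon, psi))  # True

# With the non-orthonormal basis (1,0), (1,1), the same recipe fails
chi = [np.array([1, 0], dtype=complex), np.array([1, 1], dtype=complex)]
c = [np.vdot(p, psi) for p in chi]
print(np.allclose(sum(ck * pk for ck, pk in zip(c, chi)), psi))  # False
```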

====== 2.i.6 Orthogonal Subspaces ======
  
In an inner product space, subspaces have more structure.  Suppose that $V_1 \subset V$ and $V_2 \subset V$.  $V_1$ and $V_2$ are //**orthogonal subspaces**// of $V$ if $(\psi,\phi) = 0$ for all vectors $\psi \in V_1$ and $\phi \in V_2$.  As an example, the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ a \\ b \end{array}\right ),\]
and the set of vectors of the form
\[\left ( \begin{array}{c} a \\ -a \\ 0 \end{array} \right ),\]
are orthogonal subspaces of $\mathbb{C}^3$.
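A quick numerical check that representative vectors from these two subspaces are orthogonal (the coefficient values are arbitrary choices; the $a$ and $-a$ terms cancel after conjugation):

```python
import numpy as np

# psi = (a, a, b) with a = 1+2j, b = 3; phi = (c, -c, 0) with c = 2-1j
psi = np.array([1 + 2j, 1 + 2j, 3])
phi = np.array([2 - 1j, -(2 - 1j), 0])

# (psi, phi) = conj(a)c - conj(a)c + conj(b)*0 = 0
print(np.isclose(np.vdot(psi, phi), 0))  # True
```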
  
Suppose $V' \subset V$.  Then, we can construct another subspace $V'^{\perp}$ called the //**orthogonal complement**// of $V'$ in $V$.  It consists of //all// vectors $\phi$ such that, for all vectors $\psi \in V'$, $(\psi,\phi) = 0$.
  
It is easy to see that this is indeed a subspace.  If $\phi, \chi \in V'^{\perp}$ and $\psi \in V'$ then, for any scalars $a$ and $b$,
\[(\psi,a\phi + b\chi) = a(\psi,\phi) + b(\psi,\chi) = a\times 0 + b\times 0 = 0,\]
i.e. the orthogonality property is preserved under taking linear combinations, due to the linearity of the inner product in its second argument.
  
A set of orthogonal subspaces $V_1,V_2,\cdots \subset V$ is said to //**span**// the inner product space $V$ if all vectors $\psi \in V$ can be written as
\[\psi = \sum_j \psi_j,\]
where $\psi_j \in V_j$.  We sometimes write this as $V = \oplus_j V_j$.
  
As an example, let $\phi_1,\phi_2,\cdots$ be an orthonormal basis for $V$ and let $V_j$ be the one-dimensional subspace consisting of all vectors of the form $a\phi_j$.  Then, $V = \oplus_j V_j$ just by the definition of a basis, i.e. all vectors $\psi \in V$ can be written as
\[\psi = \sum_j a_j \phi_j.\]
  
As a less trivial example, for any subspace $V' \subset V$, we have $V = V' \oplus V'^{\perp}$.  To see this, note that if $\phi_1,\phi_2,\cdots$ is an orthonormal basis for $V'$ and $\chi_1,\chi_2,\cdots$ is an orthonormal basis for $V'^{\perp}$ then $\phi_1,\phi_2,\cdots,\chi_1,\chi_2,\cdots$ is a basis for $V$.  Any vector $\psi$ can be written in this basis as
\[\psi = \sum_j a_j \phi_j + \sum_k b_k \chi_k,\]
and then if we define
\begin{align*}
\psi' & = \sum_j a_j \phi_j, & \psi'^{\perp} & = \sum_k b_k \chi_k,
\end{align*}
we have
\[\psi = \psi' + \psi'^{\perp},\]
where $\psi' \in V'$ and $\psi'^{\perp} \in V'^{\perp}$.
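This decomposition can be sketched numerically in $\mathbb{C}^3$, taking $V'$ to be the subspace spanned by $(1,1,0)/\sqrt{2}$ (an arbitrary choice) and using the component formula $a_j = (\phi_j,\psi)$ from the previous section:

```python
import numpy as np

# Orthonormal basis for V' (one vector) and for its orthogonal complement
phi1 = np.array([1, 1, 0], dtype=complex) / np.sqrt(2)
chi1 = np.array([1, -1, 0], dtype=complex) / np.sqrt(2)
chi2 = np.array([0, 0, 1], dtype=complex)

psi = np.array([2 + 1j, -3, 5j])

# psi' = sum_j (phi_j, psi) phi_j, and similarly for psi_perp
psi_par = np.vdot(phi1, psi) * phi1
psi_perp = np.vdot(chi1, psi) * chi1 + np.vdot(chi2, psi) * chi2

print(np.allclose(psi_par + psi_perp, psi))       # True: psi = psi' + psi'^perp
print(np.isclose(np.vdot(psi_par, psi_perp), 0))  # True: the two pieces are orthogonal
```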

====== 2.i.7 Norms ======
  
On an inner product space, we define the //**norm**// of a vector as
from which the triangle inequality follows by taking the square root.
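With the norm $\|\psi\| = \sqrt{(\psi,\psi)}$, the triangle inequality $\|\psi + \phi\| \leq \|\psi\| + \|\phi\|$ can be spot-checked numerically (the two vectors are arbitrary choices):

```python
import numpy as np

psi = np.array([1 + 2j, -1, 3j])
phi = np.array([2, 1 - 1j, -2j])

# ||v|| = sqrt((v, v)); (v, v) is real and non-negative
norm = lambda v: np.sqrt(np.vdot(v, v).real)

print(norm(psi + phi) <= norm(psi) + norm(phi))  # True
```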
  
  
====== 2.i.8 Hilbert Spaces ======
  
From the point of view of this course, a Hilbert space is an inner product space that might be finite- or infinite-dimensional, but if it is infinite-dimensional then it is well-behaved enough to have all the properties of a finite-dimensional space that we need to get things to work nicely.  In other words, in this course, we will often only prove things for the finite-dimensional case and then just assume that they are true in infinite dimensions as well.  Although this is not the case for a general infinite-dimensional inner product space, it is true if we invoke the magic words of mathematical gobbledygook "Hilbert Space".