  * The space of complex functions of a real variable has dimension (uncountable) infinity.
  
====== 2.i.3 Subspaces ======

A //**subspace**// $V'$ of a vector space $V$ is a subset of the vectors that is itself a vector space, i.e. it is closed under taking linear combinations of vectors.  The fact that $V'$ is a subspace of $V$ is denoted $V' \subset V$.  Clearly, the dimension of a subspace of $V$ is less than or equal to the dimension of $V$.

For example, $\mathbb{R}^n \subset \mathbb{C}^n$, consisting of those vectors that have only real components.  If we take a linear combination $a\psi + b\phi$ of such vectors with real numbers $a$ and $b$ (which are the scalars of $\mathbb{R}^n$) then it will also be a vector in $\mathbb{R}^n$.

This is an example where the scalars of the subspace are different from the scalars of the original space.  However, the most relevant examples for our purposes are those where the scalars of the subspace and the original space are the same.  One example is that $\mathbb{C}^m \subset \mathbb{C}^n$ for $m\leq n$.  For a concrete example, consider $\mathbb{C}^3$.  The set of vectors of the form
\[\left ( \begin{array}{c} a \\ b \\ 0 \end{array} \right ),\]
where $a$ and $b$ are complex numbers, is a two-dimensional subspace of $\mathbb{C}^3$, as taking linear combinations of such vectors will not change the zero in the third component.  You can see that it is two-dimensional by noting that
\[\left ( \begin{array}{c} 1 \\ 0 \\ 0\end{array}\right ),\qquad \left ( \begin{array}{c} 0 \\ 1 \\ 0 \end{array}\right ),\]
is a basis for it.
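
To make this concrete, here is a minimal numerical sketch (Python with numpy; the particular vectors and scalars are arbitrary choices, not anything from the notes) checking both claims: linear combinations stay in the subspace, and the two vectors above span it.

<code python>
import numpy as np

# Two arbitrary vectors in the subspace: third component zero
psi = np.array([1 + 2j, 3j, 0])
phi = np.array([-2, 1 + 1j, 0])

# An arbitrary complex linear combination stays in the subspace
a, b = 2 - 1j, 0.5j
combo = a * psi + b * phi
print(combo[2] == 0)                       # True: third component still zero

# The two claimed basis vectors span the subspace
basis = np.array([[1, 0, 0], [0, 1, 0]], dtype=complex).T  # columns = basis
coeffs, *_ = np.linalg.lstsq(basis, combo, rcond=None)
print(np.allclose(basis @ coeffs, combo))  # True: combo lies in their span
</code>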

This subspace is //effectively// the same as $\mathbb{C}^2$.  If we do not bother to write down the third component then we do have a vector in $\mathbb{C}^2$ and we can always reconstruct the vector in $\mathbb{C}^3$ that it came from by just putting the zero back in the third component.  Mathematicians would say that $\mathbb{C}^2$ is //**isomorphic**// to this subspace of $\mathbb{C}^3$, which means that there exists a one-to-one linear map between vectors in $\mathbb{C}^2$ and vectors in the subspace.

The distinction is somewhat important because $\mathbb{C}^2$ can be embedded in $\mathbb{C}^3$ in a variety of different ways.  For example, the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ b \\ 0 \end{array} \right ),\]
the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ 0 \\ b \end{array} \right ),\]
and the set of all vectors of the form
\[\left ( \begin{array}{c} 0 \\ a \\ b \end{array} \right ),\]
are all two-dimensional subspaces of $\mathbb{C}^3$ that are isomorphic to $\mathbb{C}^2$, but they are //different// subspaces of $\mathbb{C}^3$.

These examples are pretty trivial.  For a less trivial example, note that the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ a \\ b\end{array}\right ),\]
is also a two-dimensional subspace of $\mathbb{C}^3$ that is isomorphic to $\mathbb{C}^2$.  One possible basis for this subspace is
\[\left ( \begin{array}{c} 1 \\ 1 \\ 0\end{array}\right ),\qquad \left ( \begin{array}{c} 0 \\ 0 \\ 1\end{array}\right ).\]
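
As a quick numerical check (numpy again; the values of $a$ and $b$ are arbitrary), a vector of the form $(a,a,b)$ expands in this basis with coefficients exactly $a$ and $b$.

<code python>
import numpy as np

a, b = 2 + 1j, -3j                       # arbitrary complex components
v = np.array([a, a, b])                  # a vector of the form (a, a, b)

e1 = np.array([1, 1, 0], dtype=complex)  # the two claimed basis vectors
e2 = np.array([0, 0, 1], dtype=complex)

# The expansion v = a*e1 + b*e2 holds, so (a, b) are the coefficients
print(np.allclose(a * e1 + b * e2, v))   # True
</code>
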
====== 2.i.4 Dual Vectors and Inner Products ======
  
===== Dual Vector Spaces =====
  
An inner product induces a one-to-one map between vectors and dual vectors, constructed as follows:
  * Given a vector $\phi$, the map $f_{\phi}$ where $f_{\phi}(\psi) = (\phi,\psi)$ is a linear map from vectors to scalars, so it is a dual vector.
  * Further, //any// linear map from vectors to scalars can be written as $f_{\phi}$ for some vector $\phi$.  (Proving this for $\mathbb{R}^n$ is an in-class activity.)
  
For $\vec{s} \in \mathbb{C}^n$, the corresponding dual vector is the row vector
\[\boldsymbol{f}_{\vec{s}} = ( s_1^*,s_2^*,\cdots,s_n^*),\]
and then, for any other vector $\vec{z}$, we have
\[f_{\vec{s}}(\vec{z}) = \vec{s} \cdot \vec{z} = \boldsymbol{f}_{\vec{s}}\vec{z},\]
where $\boldsymbol{f}_{\vec{s}}\vec{z}$ is just matrix multiplication of the row vector $\boldsymbol{f}_{\vec{s}}$ with the column vector $\vec{z}$.
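
In concrete terms, the vector-to-dual map is just "conjugate the components and lay them out as a row".  A minimal numpy sketch (the vectors are arbitrary):

<code python>
import numpy as np

s = np.array([1 + 1j, 2, -1j])   # an arbitrary vector
z = np.array([0.5, 1j, 3])       # another arbitrary vector

f_s = s.conj()                   # the dual (row) vector: conjugated components
# Matrix product of the row with the column equals the inner product (s, z)
print(np.allclose(f_s @ z, np.vdot(s, z)))   # True
</code>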
  
====== 2.i.5 Orthonormal Bases ======

A basis $\phi_1,\phi_2,\cdots,\phi_d$ for a $d$-dimensional inner product space is called **//orthonormal//** if
\[(\phi_j,\phi_k) = \delta_{jk} = \begin{cases} 0, & j\neq k \\ 1, & j=k.\end{cases}\]

For any basis, we can write any vector as $\psi = \sum_{j=1}^d b_j \phi_j$, and if the basis is also orthonormal then
\[(\phi_k,\psi) = \sum_{j=1}^d b_j (\phi_k,\phi_j) = \sum_{j=1}^d b_j \delta_{jk} = b_k,\]
so there is an easy way of finding the components of a vector in an orthonormal basis by just taking the inner products
\[b_k = (\phi_k,\psi).\]
Note: this only works in an //orthonormal// basis.

As an example, in $\mathbb{R}^2$ and $\mathbb{C}^2$, the basis $\left ( \begin{array}{c} 1 \\ 0 \end{array} \right ), \left ( \begin{array}{c} 0 \\ 1 \end{array} \right )$ is orthonormal but the basis $\left ( \begin{array}{c} 1 \\ 0 \end{array} \right ), \left ( \begin{array}{c} 1 \\ 1 \end{array} \right )$ is not.
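
A short numpy sketch of this recipe (the vector is arbitrary; the second, non-orthonormal basis shows where the recipe breaks down):

<code python>
import numpy as np

psi = np.array([2 - 1j, 4j])                  # arbitrary vector in C^2

# Orthonormal basis: components come from inner products b_k = (phi_k, psi)
e1 = np.array([1, 0], dtype=complex)
e2 = np.array([0, 1], dtype=complex)
b1, b2 = np.vdot(e1, psi), np.vdot(e2, psi)
print(np.allclose(b1 * e1 + b2 * e2, psi))    # True: recovers psi

# Non-orthonormal basis: the same recipe fails
f1 = np.array([1, 0], dtype=complex)
f2 = np.array([1, 1], dtype=complex)
c1, c2 = np.vdot(f1, psi), np.vdot(f2, psi)
print(np.allclose(c1 * f1 + c2 * f2, psi))    # False
</code>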

====== 2.i.6 Orthogonal Subspaces ======

In an inner product space, subspaces have more structure.  Suppose that $V_1 \subset V$ and $V_2 \subset V$.  $V_1$ and $V_2$ are //**orthogonal subspaces**// of $V$ if $(\psi,\phi) = 0$ for all vectors $\psi \in V_1$ and $\phi \in V_2$.  As an example, the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ 0 \\ 0 \end{array}\right ),\]
and the set of all vectors of the form
\[\left ( \begin{array}{c} 0 \\ a \\ 0 \end{array}\right ),\]
are orthogonal subspaces of $\mathbb{C}^3$.  For a less trivial example, the set of all vectors of the form
\[\left ( \begin{array}{c} a \\ a \\ b \end{array}\right ),\]
and the set of vectors of the form
\[\left ( \begin{array}{c} a \\ -a \\ 0 \end{array} \right ),\]
are orthogonal subspaces of $\mathbb{C}^3$.
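
A quick numerical check of the less trivial pair (numpy; $a$, $b$, $c$ are arbitrary complex parameters):

<code python>
import numpy as np

a, b, c = 1 + 2j, -0.7j, 3 - 1j      # arbitrary complex parameters

u = np.array([a, a, b])              # a member of the first subspace
v = np.array([c, -c, 0])             # a member of the second

# (u, v) = conj(a)*c - conj(a)*c + conj(b)*0 = 0 for every choice of a, b, c
print(np.isclose(np.vdot(u, v), 0))  # True
</code>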

Suppose $V' \subset V$.  Then, we can construct another subspace $V'^{\perp}$ called the //**orthogonal complement**// of $V'$ in $V$.  It consists of //all// vectors $\phi$ such that, for all vectors $\psi \in V'$, $(\psi,\phi) = 0$.

It is easy to see that this is indeed a subspace.  If $\phi, \chi \in V'^{\perp}$ and $\psi \in V'$ then, for any scalars $a$ and $b$,
\[(\psi,a\phi + b\chi) = a(\psi,\phi) + b(\psi,\chi) = a\times 0 + b\times 0 = 0,\]
i.e. the orthogonality property is preserved under taking linear combinations, due to the linearity of the inner product in its second argument.
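
Numerically, one way to get an orthonormal basis for $V'^{\perp}$ is as the null space of the conjugate-transposed matrix whose columns span $V'$, read off from a singular value decomposition.  A minimal numpy sketch, using the $(a,a,b)$ subspace from above as $V'$:

<code python>
import numpy as np

# Columns of A span V', the (a, a, b) subspace of C^3 from earlier
A = np.array([[1, 0],
              [1, 0],
              [0, 1]], dtype=complex)

# phi is in V'^perp  iff  (psi, phi) = 0 for all psi in V'  iff  A^H phi = 0,
# so V'^perp is the null space of A^H, which the SVD gives directly
_, sing, Vh = np.linalg.svd(A.conj().T)
rank = int(np.sum(sing > 1e-12))
perp = Vh[rank:].conj().T            # columns: orthonormal basis of V'^perp

print(np.allclose(A.conj().T @ perp, 0))   # True: orthogonal to all of V'
</code>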

A set of orthogonal subspaces $V_1,V_2,\cdots \subset V$ is said to //**span**// the inner product space $V$ if all vectors $\psi \in V$ can be written as
\[\psi = \sum_j \psi_j,\]
where $\psi_j \in V_j$.  We sometimes write this as $V = \oplus_j V_j$.

As an example, let $\phi_1,\phi_2,\cdots$ be an orthonormal basis for $V$ and let $V_j$ be the one-dimensional subspace consisting of all vectors of the form $a\phi_j$.  Then, $V = \oplus_j V_j$ just by the definition of a basis, i.e. all vectors $\psi \in V$ can be written as
\[\psi = \sum_j a_j \phi_j.\]

As a less trivial example, for any subspace $V' \subset V$, we have $V = V' \oplus V'^{\perp}$.  To see this, note that if $\phi_1,\phi_2,\cdots$ is an orthonormal basis for $V'$ and $\chi_1,\chi_2,\cdots$ is an orthonormal basis for $V'^{\perp}$ then $\phi_1,\phi_2,\cdots,\chi_1,\chi_2,\cdots$ is a basis for $V$.  Any vector $\psi$ can be written in this basis as
\[\psi = \sum_j a_j \phi_j + \sum_k b_k \chi_k,\]
and then if we define
\begin{align*}
\psi' & = \sum_j a_j \phi_j, & \psi'^{\perp} & = \sum_k b_k \chi_k,
\end{align*}
we have
\[\psi = \psi' + \psi'^{\perp},\]
where $\psi' \in V'$ and $\psi'^{\perp} \in V'^{\perp}$.
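
Here is a numpy sketch of this decomposition for the $(a,a,b)$ subspace of $\mathbb{C}^3$ and its orthogonal complement (the vector $\psi$ is an arbitrary choice):

<code python>
import numpy as np

# Orthonormal bases for V' (the (a, a, b) subspace of C^3) and for V'^perp
phis = [np.array([1, 1, 0], dtype=complex) / np.sqrt(2),
        np.array([0, 0, 1], dtype=complex)]
chis = [np.array([1, -1, 0], dtype=complex) / np.sqrt(2)]

psi = np.array([2 + 1j, -3, 1j])     # an arbitrary vector in C^3

# Components in each basis via a_j = (phi_j, psi), b_k = (chi_k, psi)
psi_par  = sum(np.vdot(p, psi) * p for p in phis)
psi_perp = sum(np.vdot(c, psi) * c for c in chis)

print(np.allclose(psi_par + psi_perp, psi))   # True: psi = psi' + psi'^perp
</code>
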
====== 2.i.7 Norms ======
  
On an inner product space, we define the //**norm**// of a vector as
\[||\psi|| = \sqrt{(\psi,\psi)}.\]
  
In $\mathbb{R}^2$, the dot product of two vectors can be written as $\vec{r}\cdot\vec{r}' = ||\vec{r}||\,||\vec{r}'||\cos\theta$, where $\theta$ is the angle between them.  Since $-1 \leq \cos\theta \leq 1$, we obviously have
\[|\vec{r}\cdot \vec{r}'| \leq ||\vec{r}|| ||\vec{r}'||,\]
which is the special case of the Cauchy-Schwarz inequality for $\mathbb{R}^2$.
  
Expanding $||\psi + \phi||^2$ and applying the Cauchy-Schwarz inequality gives $||\psi + \phi||^2 \leq \left ( ||\psi|| + ||\phi|| \right )^2$, from which the triangle inequality follows by taking the square root.
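
Both inequalities are easy to spot-check numerically; a minimal numpy sketch with randomly chosen complex vectors (the dimension and seed are arbitrary):

<code python>
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)   # random complex vectors
phi = rng.normal(size=4) + 1j * rng.normal(size=4)

def norm(v):
    return np.sqrt(np.vdot(v, v).real)

# Cauchy-Schwarz: |(psi, phi)| <= ||psi|| ||phi||
print(abs(np.vdot(psi, phi)) <= norm(psi) * norm(phi))   # True

# Triangle inequality: ||psi + phi|| <= ||psi|| + ||phi||
print(norm(psi + phi) <= norm(psi) + norm(phi))          # True
</code>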
  
====== 2.i.8 Hilbert Spaces ======
  
From the point of view of this course, a Hilbert space is an inner product space that might be finite or infinite-dimensional, but if it is infinite dimensional then it is well-behaved enough to have all the properties of a finite-dimensional space that we need to get things to work nicely.  In other words, in this course, we will often only prove things for the finite dimensional case and then just assume that they are true in infinite dimensions as well.  Although this is not the case for a general infinite dimensional inner product space, it is true if we invoke the magic words of mathematical gobbledygook "Hilbert Space".
  
Consider the space of complex functions $\psi(x)$ of a real variable, with inner product $(\psi,\phi) = \int_{-\infty}^{+\infty} \psi^*(x)\phi(x) \,\mathrm{d}x$.  In this inner product space, the inner products and norms can be infinite, e.g. consider $||\psi|| = \sqrt{(\psi,\psi)}$ when $\psi(x) = c$ for some constant $c$.  Then
\[(\psi,\psi) = \int_{-\infty}^{+\infty}|c|^2 \,\mathrm{d}x = \infty.\]
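
A numerical sketch of this divergence (numpy; the cutoffs, and the Gaussian used for contrast, are arbitrary choices): the truncated integral of a constant grows without bound as the cutoff increases, while that of a square-integrable function settles down.

<code python>
import numpy as np

def truncated_norm_sq(psi, L, n=200001):
    """Approximate the integral of |psi(x)|^2 from -L to L by a Riemann sum."""
    x, dx = np.linspace(-L, L, n, retstep=True)
    return np.sum(np.abs(psi(x)) ** 2) * dx

const = lambda x: np.ones_like(x)    # psi(x) = 1: not square integrable
gauss = lambda x: np.exp(-x ** 2)    # square integrable, for contrast

for L in (10, 100, 1000):
    print(L, truncated_norm_sq(const, L), truncated_norm_sq(gauss, L))
# The constant's integral grows like 2L; the Gaussian's stays near sqrt(pi/2)
</code>
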
Since the Born rule in quantum mechanics tells us that integrals of the form $\int |\psi(x)|^2 \,\mathrm{d}x$ have to do with probabilities, we want to ensure that $(\psi,\psi) = \int_{-\infty}^{+\infty} |\psi(x)|^2 \,\mathrm{d}x$ is always finite.  Therefore, the Hilbert space of //**square integrable functions**// is defined to be the set of functions $\psi(x)$ such that
\[||\psi||^2 = (\psi,\psi) = \int_{-\infty}^{+\infty} |\psi(x)|^2 \,\mathrm{d}x,\]