\input{template}
\input{macros}

\begin{document}
\lecture{4} {Linear algebra : Basis, Dimension}{Bhaskara Aditya}

\section{Basis (contd.)}

We first give the proof of the result stated in the previous lecture -- if two sets of vectors $S$ and $T$ each form a basis for a vector space $V$, then their sizes are equal (i.e., $|S| = |T|$). We recall what is meant by saying a set $X$ of vectors is a basis for a space $V$:
\begin{enumerate}
\item The vectors in $X$ are linearly independent (i.e., there's no nontrivial combination of them which is zero).
\item Any vector in $V$ can be expressed as a linear combination of vectors in $X$.
\end{enumerate}
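Both conditions can be checked numerically via matrix rank. The following sketch (a hypothetical example using NumPy, not part of the lecture) verifies that three vectors form a basis of $\mathbb{R}^3$: independence means the rank equals the number of vectors, and spanning means the rank equals the dimension of the ambient space.

```python
# A numerical sketch (assumed example): checking the two basis conditions
# for a set X of vectors in R^3 via NumPy's rank computation.
import numpy as np

X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])  # each row is one vector of X

# Condition 1: linear independence <=> rank equals the number of vectors.
independent = np.linalg.matrix_rank(X) == X.shape[0]

# Condition 2: spanning R^3 <=> rank equals the ambient dimension.
spans = np.linalg.matrix_rank(X) == 3

print(independent, spans)  # True True
```

For a square set of vectors in $\mathbb{R}^n$ the two conditions coincide, which is why a single rank computation settles both here.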

We will prove the result by contradiction: assume $|S|$ is strictly smaller than $|T|$, and obtain a set $S' \subseteq T$ with $|S'| = |S|$ which spans $V$. This is clearly a contradiction, because $|S'| = |S| < |T|$ and $S' \subseteq T$, so the vectors in $T \setminus S'$ can be expressed in terms of those in $S'$, contradicting the linear independence of $T$ (in particular, (1) above).

We start with a lemma.

\newtheorem{lem1}{Lemma}
\begin{lem1}
Suppose $S = \{v_1, \dots, v_n\}$ and let $x = \sum_{i=1}^n \alpha_i v_i$ with $\alpha_1 \neq 0$. Let $S' = \{x, v_2, \dots, v_n\}$. Then, $span(S) = span(S')$.
\end{lem1}
\begin{proof}
The proof in both directions is easy. The crucial observation is that since $\alpha_1 \neq 0$, we can write $v_1$ as a combination of $x, v_2, \dots, v_n$; explicitly, $v_1 = \frac{1}{\alpha_1}\left(x - \sum_{i=2}^n \alpha_i v_i\right)$. Thus, any vector in $span(S)$ is also in $span(S')$. For the other direction, since $x$ is a combination of the $v_i$'s, every vector in $span(S')$ is also in $span(S)$. This proves the equality.
\end{proof}
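The lemma can be illustrated numerically. In the sketch below (a hypothetical example, not part of the lecture), we replace $v_1$ by $x = 2v_1 + 3v_2$ (so $\alpha_1 = 2 \neq 0$) and confirm the two spans agree: two sets span the same subspace exactly when stacking them together does not increase the rank of either.

```python
# A numerical illustration of the exchange lemma (assumed example):
# replacing v1 by x = 2*v1 + 3*v2, with alpha_1 = 2 nonzero, keeps the span.
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
x = 2.0 * v1 + 3.0 * v2        # alpha_1 = 2 is nonzero

S  = np.vstack([v1, v2])       # original set
Sp = np.vstack([x,  v2])       # set after the exchange

# span(S) == span(S') iff stacking the two sets does not raise the rank
# above that of either set alone.
same_span = (np.linalg.matrix_rank(np.vstack([S, Sp]))
             == np.linalg.matrix_rank(S)
             == np.linalg.matrix_rank(Sp))
print(same_span)  # True
```

Had we instead replaced $v_1$ by a vector with $\alpha_1 = 0$ (say $x = 3v_2$), the same rank test would report that the spans differ.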

We now proceed with the proof of the theorem.

\newtheorem{thm1}{Theorem}
\begin{thm1}
Let $S = \{u_1, u_2, \dots, u_m\}$ and $T = \{v_1, v_2, \dots, v_n\}$ be two sets of vectors such that each is a basis for the vector space $V$. Then $|S| = |T|$.
\end{thm1}
\begin{proof}
The proof will follow the outline above. Suppose, without loss of generality, that $m < n$. Starting with the set $S_0 = S$, we do the following: replace one of the vectors of $S_0$ by a vector from $T$ such that the new set, say $S_1$, still spans all of $V$. Also, if $v_i$ was the vector of $T$ which was added to $S_0$, we set $T_1 = T \setminus \{v_i\}$.

We now repeat this step $m$ times. Further, we ensure that at each step, the element removed from $S_i$ is one of the $u_j$'s (not the $v_j$'s that have been added). We now show that such an operation is indeed always possible. More precisely, assume that we have two sets $S_i$ and $T_i$ (with $i<m$). We show that it is always possible to obtain a set $S_{i+1}$ such that:
\begin{enumerate}
\item $S_{i+1}$ is obtained from $S_i$ by removing one of the $u_j$'s from $S_i$ and adding one of the $v_j$'s (which is from $T_i$) (call it $x$) to it.
\item The span of $S_{i+1}$ is the same as the span of $S_i$.
\end{enumerate}
Further, we set $T_{i+1} = T_i \setminus \{x\}$. We prove this as follows.

Note that since initially we have that $S_0 = S$ is a basis, and at every step so far the span was preserved, we can assume $S_i$ spans the entire $V$. We may also assume that we have re-numbered the $u_i$'s and $v_i$'s such that $S_i = \{u_{i+1}, \dots, u_m, v_1, \dots, v_i\}$ and $T_i = \{v_{i+1}, \dots, v_n\}$.\footnote{If $i=0$, $S_0 = S$, $T_0 = T$, and we interpret summations of the form $(\sum_{j=1}^i \dots)$ as zero.} Since $S_i$ spans the whole of $V$, we have
\begin{equation}
v_{i+1} = \sum_{j=i+1}^m \alpha_j u_j + \sum_{j=1}^i \beta_j v_j
\end{equation}
Now, at least one of the $\alpha_j$'s must be non-zero; otherwise $v_{i+1}$ would be a linear combination of $v_1, \dots, v_i$, giving a non-trivial combination of the $v_j$'s equal to zero, which is impossible since $T$ is a basis. So assume without loss of generality that $\alpha_{i+1}$ is non-zero. Then, by the lemma above, replacing $u_{i+1}$ by $v_{i+1}$ preserves the span, so the new set still spans all of $V$. This is the required $S_{i+1}$.

Thus, if we repeat this process $m$ times, the resulting set $S_m = \{v_1, \dots, v_m\}$ consists only of $v_j$'s and spans all of $V$. Since $m < n$, the remaining vectors $v_{m+1}, \dots, v_n$ can then be expressed in terms of $v_1, \dots, v_m$, contradicting the linear independence of $T$, as in the outline above.
\end{proof}
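The exchange process in the proof can be sketched in code. Below (a hypothetical toy example in $\mathbb{R}^3$, not part of the lecture) we start from $S = \{u_1, u_2, u_3\}$ and, one step at a time, swap in the next vector of $T$ for some remaining $u_j$ while checking that the candidate set still spans $\mathbb{R}^3$. Since here $|S| = |T|$, the process simply ends with $S_m = T$; in the proof, $m < n$ is what produces the contradiction.

```python
# A sketch of the exchange process from the proof (assumed toy example):
# repeatedly swap some u_j out of S for the next v in T, keeping the span.
import numpy as np

def spans_R(vectors, d):
    # a list of vectors spans R^d iff the matrix they form has rank d
    return np.linalg.matrix_rank(np.vstack(vectors)) == d

U = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([0.0, 0.0, 1.0])]   # S = {u_1, u_2, u_3}, one basis of R^3
V = [np.array([1.0, 1.0, 0.0]),
     np.array([0.0, 1.0, 1.0]),
     np.array([1.0, 0.0, 1.0])]   # T = {v_1, v_2, v_3}, another basis

S = list(U)
remaining_u = set(range(len(U)))  # positions in S still holding a u_j
for v in V:
    for j in sorted(remaining_u):
        # try swapping v in for the u_j at position j
        candidate = S[:j] + [v] + S[j + 1:]
        if spans_R(candidate, 3):
            S[j] = v
            remaining_u.discard(j)
            break

# after m = 3 steps, S consists only of vectors from T and still spans R^3
all_from_T = all(any(np.array_equal(s, v) for v in V) for s in S)
print(all_from_T, spans_R(S, 3))  # True True
```

The inner loop mirrors the argument that some $\alpha_j \neq 0$: at each step, at least one remaining $u_j$ can be swapped out without shrinking the span.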

So we have proved that given a vector space $V$, any two bases for it have the same size (assuming there exists a finite basis). This common size is therefore a property of $V$ alone, and it is called the {\bf dimension} of $V$. The above result shows that it is well-defined.

\section{Solutions to $Ax = {\bf 0}$}
We now want to look at the set of all solutions to the system $Ax = {\bf 0}$, where $A$ is an $m \times n$ matrix and $x$ is a vector in $\mathbb{R}^n$. From now on, denote $\mathcal{S} = \{x~:~Ax = {\bf 0}\}$. Note that $\mathcal{S}$ is a subspace of $\mathbb{R}^n$. This is because if $x \in \mathcal{S}$ then $A (\alpha x) = \alpha (Ax) = {\bf 0}$, so $\alpha x \in \mathcal{S}$. Similarly, we can verify that $x_1, x_2 \in \mathcal{S} \implies x_1+x_2 \in \mathcal{S}$. 
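To make $\mathcal{S}$ concrete, here is a small numerical sketch (a hypothetical example, not part of the lecture) that extracts a basis for $\mathcal{S}$ from the singular value decomposition of $A$ -- the rows of $V^T$ whose singular values vanish span the null space -- and then checks the closure properties just argued.

```python
# A numerical sketch (assumed example): computing a basis of
# S = {x : Ax = 0} from the SVD of A and checking closure properties.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank 1: row 2 = 2 * row 1

_, sing, Vt = np.linalg.svd(A)
tol = 1e-10
k = np.sum(sing > tol)                   # rank of A
N = Vt[k:].T                             # columns of N form a basis of S

# closure: scalar multiples and sums of solutions are again solutions
x1, x2 = N[:, 0], N[:, 1]
print(np.allclose(A @ (2.0 * x1 + x2), 0.0))  # True
```

Here $A$ has rank $1$ and $n = 3$, so the null space is $2$-dimensional -- consistent with the theorem stated next.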

Having proved that $\mathcal{S}$ is a subspace, we ask the natural question -- what is its dimension? We answer this question in terms of the matrix $A$. We will, in fact, prove the following in the coming lectures.

\begin{thm1}
Suppose $k$ is the maximum number of linearly independent columns in the matrix $A$. Then, $\dim(\mathcal{S}) = n-k$.
\end{thm1}
We will also prove a `row version' of this theorem.
\newtheorem{thm3}{Theorem 3}
\begin{thm3}
Suppose $k$ is the number of linearly independent rows in the matrix $A$. Then, $dim(\mathcal{S}) = n-k$.
\end{thm3}
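Both statements can be checked numerically before we prove them. The sketch below (an assumed random example, not part of the lecture) builds a $4 \times 6$ matrix of rank $2$ from thin factors and confirms that the row rank agrees with the column rank, and that the dimension of the null space equals $n - k$.

```python
# A numerical check of the two theorems (assumed random example):
# dim(S) = n - k, where k counts independent columns (equivalently, rows).
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 4, 6, 2
# a 4 x 6 matrix of rank 2, built as a product of thin random factors
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

k = np.linalg.matrix_rank(A)             # independent columns of A
row_k = np.linalg.matrix_rank(A.T)       # independent rows of A

_, sing, _ = np.linalg.svd(A)
dim_S = n - np.sum(sing > 1e-10)         # dimension of the null space of A

print(k == row_k, dim_S == n - k)  # True True
```

The agreement of `k` and `row_k` is exactly the corollary discussed next; it holds for every matrix, not just this example.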

This gives, as an interesting and non-trivial corollary, that the maximum number of linearly independent rows in a matrix equals the maximum number of linearly independent columns -- a fact that is not at all obvious at first sight. This common number is defined to be the {\bf rank} of the matrix $A$.

We will end with some ideas relating to the proof of Theorem 2. One observation is that $x = (x_1~x_2~\dots~x_n)^T$ is in $\mathcal{S}$ if and only if $\sum_{i=1}^n x_i A^{(i)} = {\bf 0}$, where $A^{(i)}$ denotes the $i$th column of the matrix $A$. This is easy to see by writing out the above summation.
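The observation above says that $Ax$ is exactly the linear combination of the columns of $A$ with coefficients $x_i$. A quick numerical check (an assumed example, not part of the lecture):

```python
# Checking that A @ x equals the combination sum_i x_i * A^(i)
# of the columns of A (assumed small example).
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# build the column combination sum_i x_i * A^(i) explicitly
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))
print(np.allclose(A @ x, combo))  # True
```

In particular, $x \in \mathcal{S}$ exactly when this column combination is the zero vector, which is the form of the condition used in the proof.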

\end{document}
