\input{template}
\input{macros}
\begin{document}
\lecture{11}{Simplex Algorithm: Finding a neighbour of larger cost}{Sanchit Garg}

\section{Overview}

\begin{enumerate}

\item 
Basic approach of the algorithm
\begin{enumerate}
\item Start with an extreme point.
\item Move to a neighbour of larger cost if one exists.
\end{enumerate}

\item
Issues with this approach
\begin{enumerate}
\item How to find a starting extreme point
\item How to move from an extreme point to one of its neighbours ({\tt Pivoting})
\item When the algorithm stops, how to show that its output is the point that maximizes $c^Tx$
\end{enumerate}

\item
We have already established the following results in previous lectures:
\begin{enumerate}
\item A linear function on the polytope $\{x : Ax \leq b\}$ is maximized at an extreme point.
\item The intersection of the $n$ hyperplanes obtained by turning $n$ linearly independent rows of $A$ into equalities gives an extreme point, provided the point satisfies the remaining inequalities.
\end{enumerate}
\end{enumerate}
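As a concrete illustration of result (b) above (a hypothetical $2$-dimensional instance, not from the lecture), consider the unit square
\begin{gather*}
x_1 \leq 1, \qquad x_2 \leq 1, \qquad -x_1 \leq 0, \qquad -x_2 \leq 0.
\end{gather*}
The rows of $x_1 \leq 1$ and $x_2 \leq 1$ are linearly independent, and the corresponding hyperplanes $x_1 = 1$ and $x_2 = 1$ intersect at $(1,1)$, which satisfies the remaining two inequalities and is therefore an extreme point. By contrast, the rows of $x_1 \leq 1$ and $-x_1 \leq 0$ are linearly dependent, and their hyperplanes do not intersect at all.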

\section{Some statements/results}

\begin{enumerate}
\item 
The intersection of $n$ linearly independent rows/hyperplanes yields an extreme point, while the intersection of $n-1$ linearly independent hyperplanes gives a line.
\item
Each extreme point has $n$ neighbouring extreme points, one for each of the $n$ lines obtained by dropping one of the $n$ hyperplanes defining the extreme point and intersecting the remaining $n-1$.
\item
The directions of the vectors from an extreme point $x_0$ to its neighbours are given by the columns of $-A'^{-1}$, where $A'$ is the $n \times n$ matrix formed by the $n$ tight rows, so that $A'x_0 = b$.
\\

Proof:
	$A'$ consists of $n$ linearly independent rows, so $A'x = b$ has the unique solution $x_0$. Let $y_i$ be a neighbour of $x_0$. Every point on the line segment joining $x_0$ and $y_i$ satisfies $n-1$ rows of $A'$ with equality and the remaining row with strict inequality; say this remaining row is the $i^{th}$ row of $A'$. In particular, $y_i$ satisfies the following system.
	\begin{align}
		A'_{i}y_i & < b_i \notag\\
		A'_{1}y_i & = b_1 \notag\\
		A'_{2}y_i & = b_2 \notag\\
		      & \vdots \notag\\
		A'_{i-1}y_i & = b_{i-1} \notag\\
		A'_{i+1}y_i & = b_{i+1} \notag\\
		      & \vdots \notag\\
		A'_{n}y_i & = b_n \notag
	\end{align}

Clearly, the directions of the vectors from $x_0$ to its neighbours are given by
\begin{equation}
y_1-x_0, y_2-x_0, y_3-x_0, \hdots ,y_n-x_0
\end{equation}
where $y_i$ is a neighbouring extreme point of $x_0$.

Consider a matrix $P$ whose columns are given by $y_i-x_0$.
\begin{equation}
P = (y_1-x_0, y_2-x_0, y_3-x_0, \hdots, y_n-x_0)
\end{equation}
\begin{align}
\text{Since } \quad &{A'}_jy_i = b_j = {A'}_jx_0, \quad \forall j \neq i \ ( i,j = 1 \hdots n) \notag \\
\implies &{A'}_j(y_i - x_0) = 0, \quad \forall j \neq i \\
\notag \\
\text{Also } \quad &{A'}_iy_i < b_i \notag \\
\text{and } \quad &{A'}_ix_0 = b_i \notag \\
\implies &{A'}_i(y_i - x_0) < 0
\end{align}

This gives $A'P$ as a diagonal matrix with negative diagonal entries:
\begin{gather*}
\begin{bmatrix}
(-)\text{ve} & 0 & \hdots & 0 \\
0 & (-)\text{ve} & \hdots & 0 \\
\vdots & & \ddots & \vdots \\
0 & 0 & \hdots & (-)\text{ve}
\end{bmatrix}
\end{gather*}

Now $A'(-A'^{-1})$ would be
\begin{gather*}
\begin{bmatrix}
-1 & 0 & \hdots & 0 \\
0 & -1 & \hdots & 0 \\
\vdots & & \ddots & \vdots \\
0 & 0 & \hdots & -1
\end{bmatrix}
\end{gather*}

Since both $A'P$ and $A'(-A'^{-1})$ are diagonal with negative diagonal entries, $y_i-x_0$ is a positively scaled version of the $i^{th}$ column of $-A'^{-1}$. Hence the directions to the neighbours of $x_0$ are given by the columns of $-A'^{-1}$.
\end{enumerate}
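The result above can be checked numerically. The following is a minimal sketch in pure Python on a hypothetical $2$-dimensional instance (the unit square, not from the lecture): at the extreme point $(1,1)$ the tight rows form $A' = I$, and the columns of $-A'^{-1}$ indeed point towards the two neighbouring extreme points.

```python
# Sketch (hypothetical 2-D instance): verify that the columns of -A'^{-1}
# point from an extreme point towards its neighbours.
# Polytope: x1 <= 1, x2 <= 1, -x1 <= 0, -x2 <= 0 (the unit square).
# The extreme point x0 = (1, 1) is defined by the tight rows A' = I.

A_tight = [[1.0, 0.0],
           [0.0, 1.0]]          # the n linearly independent tight rows A'
b_tight = [1.0, 1.0]

def inv2(M):
    # Inverse of a 2x2 matrix from the cofactor formula.
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

Ainv = inv2(A_tight)
# Columns of -A'^{-1}, read off as direction vectors.
directions = [[-Ainv[0][i], -Ainv[1][i]] for i in range(2)]

x0 = (1.0, 1.0)
neighbours = [(0.0, 1.0), (1.0, 0.0)]   # the two adjacent extreme points
# Each neighbour lies along x0 + eps * v for the matching column v (eps = 1 here):
for v, y in zip(directions, neighbours):
    print(v, "->", y)
```

Here $-A'^{-1} = -I$, so the directions are $(-1,0)$ and $(0,-1)$, each leading from $(1,1)$ to one of its neighbours along an edge of the square.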

\section{Simplex Algorithm}
Recall that we start at an extreme point and, if there exists a neighbour of larger cost, move to that neighbour. To check whether $x_0$ has a neighbour $y_i$ of larger cost, we determine whether the cost increases in the direction $y_i - x_0$, which is given by the $i^{th}$ column of $-A'^{-1}$; writing $v_i$ for that column, this is the test $c^Tv_i > 0$. If the cost increases for some $y_i$, then $y_i$ is our next extreme point and we repeat the procedure. The algorithm stops when no such neighbour exists, and the output is the final extreme point.\\
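The cost test can be sketched in pure Python (with a hypothetical cost vector $c$ and the unit-square instance, neither from the lecture): since the cost is $c^Tx$, a direction $v_i$ leads to a neighbour of larger cost exactly when $c^Tv_i > 0$.

```python
# Sketch (hypothetical data): test whether the cost c^T x increases along
# each candidate direction v_i by checking the sign of c^T v_i.
c = [2.0, 3.0]                            # assumed cost vector, for illustration
directions = [(-1.0, 0.0), (0.0, -1.0)]   # columns of -A'^{-1} at x0 = (1, 1)

improving = [v for v in directions
             if sum(ci * vi for ci, vi in zip(c, v)) > 0]
print(improving)   # empty: no neighbour of (1, 1) has larger cost, so stop
```

With this $c$, both inner products are negative, so $(1,1)$ has no improving neighbour and the algorithm would stop there.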
Once we know that the cost increases in the direction $y_i-x_0$, we must find $y_i$ itself. Since $x_0$ and $y_i$ lie on the same line, they have $n-1$ hyperplanes in common; to find $y_i$ we need the $n^{th}$ hyperplane that is tight at $y_i$. Consider
\begin{equation}
	A(x_0 + \epsilon v_i) \leq b
\end{equation}
where $v_i$ is the direction vector $y_i - x_0$. At $\epsilon = 0$ all the inequalities hold, with the $n$ rows of $A'$ tight. As we gradually increase $\epsilon$, the $i^{th}$ row of $A'$ becomes slack, and at some value of $\epsilon$ one of the remaining inequalities becomes an equality. This yields the point $y_i$.\\
Note: many hyperplanes intersect these $n-1$ hyperplanes in a point, but such a point need not be an extreme point, since it may violate some of the other inequalities. Only the point on the line closest to $x_0$ satisfies all the constraints, and that point is the extreme point $y_i$.
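The search for this critical $\epsilon$ can be sketched as follows (pure Python, hypothetical unit-square instance, not from the lecture): a row $j$ limits the step only if $A_jv > 0$, and the first such row to become tight gives $\epsilon = \min_j (b_j - A_jx_0)/(A_jv)$.

```python
# Sketch (hypothetical instance): increase epsilon along v until the first
# slack inequality of Ax <= b becomes tight; that point is the neighbour y_i.

A = [[ 1.0,  0.0],   # x1 <= 1   (tight at x0)
     [ 0.0,  1.0],   # x2 <= 1   (tight at x0)
     [-1.0,  0.0],   # -x1 <= 0
     [ 0.0, -1.0]]   # -x2 <= 0
b = [1.0, 1.0, 0.0, 0.0]

x0 = [1.0, 1.0]      # current extreme point
v = [-1.0, 0.0]      # chosen edge direction (a column of -A'^{-1})

def dot(u, w):
    return sum(a * c for a, c in zip(u, w))

# Only rows with A_j v > 0 can become tight as epsilon grows; take the
# smallest ratio (b_j - A_j x0) / (A_j v) over those rows.
ratios = [(b[j] - dot(A[j], x0)) / dot(A[j], v)
          for j in range(len(A)) if dot(A[j], v) > 1e-12]
eps = min(ratios)
y = [x0[k] + eps * v[k] for k in range(2)]
print(eps, y)   # the step length and the neighbouring extreme point
```

Here only the row $-x_1 \leq 0$ has $A_jv > 0$; it becomes tight at $\epsilon = 1$, giving the neighbour $(0,1)$.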

What remains is to prove that if all the neighbours of $x_0$ have cost at most that of $x_0$, then $x_0$ is a point of maximum cost. This will be taken up in the next lecture.

\end{document}