What does it actually mean to *solve the system* **Mx** = **b**?

For some students, it's entirely likely that, after doing all the math in
your linear algebra course, you have forgotten the original reason why these
matrices were defined, and why you are learning Gaussian elimination.

# One Dimension

For the one-dimensional case, one solves *mx* = *b* for
*x*. In this case, the solution is simple: *x* = *b*/*m*, provided *m* ≠ 0.
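In code, the one-dimensional case is a single division (a minimal sketch; the values of `m` and `b` are just illustrative):

```python
# Solve m * x = b in one dimension: x = b / m (valid only when m != 0).
m, b = 4.0, 3.0
x = b / m
print(x)  # 0.75
```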

# Two Dimensions

Let's write out the matrix-vector equation **Mx** = **b**. Given the matrix

**M** = [*m*_{1, 1} *m*_{1, 2}; *m*_{2, 1} *m*_{2, 2}]

and the vectors **x** = (*x*_{1}, *x*_{2})^{T} and **b** = (*b*_{1}, *b*_{2})^{T}, then expanding we get:

*m*_{1, 1}*x*_{1} + *m*_{1, 2}*x*_{2} = *b*_{1}

*m*_{2, 1}*x*_{1} + *m*_{2, 2}*x*_{2} = *b*_{2}

It's important here to remember that all the values *m*_{i, j} and
*b*_{i} are given; a concrete example is useful.

### Example of a 2×2 System of Linear Equations

What does **Mx** = **b** mean when

**M** = [4 2; 3 5] and **b** = (3, 4)^{T}?

In disguise, this is nothing more than the pair of equations

4*x*_{1} + 2*x*_{2} = 3

3*x*_{1} + 5*x*_{2} = 4

which is equivalent to

4*x*_{1} + 2*x*_{2} − 3 = 0

3*x*_{1} + 5*x*_{2} − 4 = 0

However, note that each left-hand side, viewed as a function of
(*x*_{1}, *x*_{2}), describes a plane. In Figure
1, we show f_{1}(**x**) = 4*x*_{1} + 2*x*_{2} − 3,
and in Figure 2, we show
f_{2}(**x**) = 3*x*_{1} + 5*x*_{2} − 4.

Figure 1. The function f_{1}(**x**) = 4*x*_{1} + 2*x*_{2} − 3.

Figure 2. The function f_{2}(**x**) = 3*x*_{1} + 5*x*_{2} − 4.

The first function, f_{1}(**x**), is zero on the line
*x*_{1} = 3/4 − (1/2)*x*_{2}, while the second
function, f_{2}(**x**), is zero on the line
*x*_{1} = 4/3 − (5/3)*x*_{2}. To solve a linear
system is to find a value of **x** = (*x*_{1}, *x*_{2})^{T} such
that both f_{1}(**x**) = 0 and f_{2}(**x**) = 0.

In this example, there is a unique such point, which may be seen in Figure 3.

Figure 3. Both functions f_{1}(**x**) and f_{2}(**x**).

These two functions are simultaneously zero at only one point, seen in Figure 3
as the point where the grey (*z* = 0), red, and blue planes intersect. From
the linear algebra you learned, you could solve this system to find that the
solution is **x** = (0.5, 0.5)^{T}.
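We can verify this with a standard linear solver; here is a minimal sketch using NumPy's `numpy.linalg.solve`:

```python
import numpy as np

# The 2x2 system from the example: 4*x1 + 2*x2 = 3 and 3*x1 + 5*x2 = 4.
M = np.array([[4.0, 2.0],
              [3.0, 5.0]])
b = np.array([3.0, 4.0])

x = np.linalg.solve(M, b)  # Gaussian elimination (LU factorization) under the hood
print(x)                   # approximately [0.5 0.5]

# Both residuals f1(x) and f2(x) are (numerically) zero at the solution.
print(M @ x - b)
```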

Thus, solving a system of *n* linear equations is no more than finding
simultaneous roots of *n* different linear functions. This is why such
techniques appear when we perform Newton's method in *n* dimensions: we
convert a system of *n* nonlinear functions into a simplified problem with
*n* linear functions, find the root of the linear functions, and show that, under
the appropriate conditions, the root of the linear functions is a good approximation
to a root of the nonlinear problem.
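As a sketch of that idea, each Newton step solves a linear system **J** Δ**x** = −**f**(**x**); the particular nonlinear system below is made up for illustration:

```python
import numpy as np

def newton(f, jac, x0, tol=1e-12, max_iter=50):
    """Newton's method in n dimensions: each step solves a linear system."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # The linearized problem at the current iterate: J dx = -f(x).
        step = np.linalg.solve(jac(x), -f(x))
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Illustrative nonlinear system: x1^2 + x2^2 = 2 and x1 - x2 = 0,
# which has a root at (1, 1).
f = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
jac = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]],
                          [1.0, -1.0]])

root = newton(f, jac, [2.0, 0.5])
print(root)  # approximately [1. 1.]
```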

# Why are we Solving Systems of Linear Equations?

The most obvious example comes in problems such as trying to solve the circuit in
Figure 4. Kirchhoff's voltage law tells us that the sum of the voltages around each
loop is zero.

Figure 4. A simple circuit.

In the first loop, we get 7 V − 1 (*i*_{1} + *i*_{2}) − 2 *i*_{1} = 0, and in the
second, we get 2 *i*_{1} − 4 *i*_{2} = 0.
Solving these yields the values *i*_{1} = 2 A and *i*_{2} = 1 A.
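The two loop equations form exactly the kind of linear system discussed above; a minimal check, rearranging them into **Mi** = **b** and solving with NumPy:

```python
import numpy as np

# Loop equations rearranged into M i = b:
#   7 - 1*(i1 + i2) - 2*i1 = 0  ->  3*i1 + 1*i2 = 7
#   2*i1 - 4*i2 = 0             ->  2*i1 - 4*i2 = 0
M = np.array([[3.0, 1.0],
              [2.0, -4.0]])
b = np.array([7.0, 0.0])

i1, i2 = np.linalg.solve(M, b)
print(i1, i2)  # i1 is approximately 2 A, i2 approximately 1 A
```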