Least squares is a standard approach to problems with more equations than unknowns, also known as overdetermined systems.

Consider the four equations:

```
x0 + 2 * x1 + x2 = 4
x0 + x1 + 2 * x2 = 3
2 * x0 + x1 + x2 = 5
x0 + x1 + x2 = 4
```
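To see why no exact solution exists, note that the first three equations alone pin down `x` exactly, and that solution then fails the fourth equation. A quick sketch:

```
import numpy as np

# The first three equations form a square, invertible system
A3 = np.array([[1, 2, 1],
               [1, 1, 2],
               [2, 1, 1]])
b3 = np.array([4, 3, 5])
x = np.linalg.solve(A3, b3)  # unique solution: approximately [2, 1, 0]

# The fourth equation, x0 + x1 + x2 = 4, is not satisfied by it:
print(x.sum())  # sums to 3, not 4 -- the full system is inconsistent
```

Least squares handles exactly this situation by minimizing the total squared error instead of demanding an exact fit.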

We can express this as a matrix equation `A * x = b`:

```
import numpy as np

A = np.array([[1, 2, 1],
              [1, 1, 2],
              [2, 1, 1],
              [1, 1, 1]])
b = np.array([4, 3, 5, 4])
```

Then solve with `np.linalg.lstsq`:

```
x, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)
```
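As a sanity check, the least-squares solution satisfies the normal equations `A.T @ A @ x = A.T @ b`. A self-contained sketch (repeating the setup above so it runs on its own):

```
import numpy as np

A = np.array([[1, 2, 1],
              [1, 1, 2],
              [2, 1, 1],
              [1, 1, 1]])
b = np.array([4, 3, 5, 4])
x, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)

# the least-squares solution satisfies the normal equations A^T A x = A^T b
assert np.allclose(A.T @ A @ x, A.T @ b)
```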

`x` is the least-squares solution, `residuals` the sum of squared residuals, `rank` the matrix rank of the input `A`, and `s` the singular values of `A`. If `b` has more than one dimension, `lstsq` will solve the system corresponding to each column of `b`:

```
import numpy as np

A = np.array([[1, 2, 1],
              [1, 1, 2],
              [2, 1, 1],
              [1, 1, 1]])
b = np.array([[4, 3, 5, 4], [1, 2, 3, 4]]).T  # transpose so each column is a right-hand side
x, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)
print(x)  # columns of x are solutions corresponding to columns of b
# [[ 2.05263158  1.63157895]
#  [ 1.05263158 -0.36842105]
#  [ 0.05263158  0.63157895]]
print(residuals)  # one sum of squared residuals per column of b
# [0.84210526 5.26315789]
```
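Each entry of `residuals` can be checked against its column directly: it is the sum of squared errors `||A @ x[:, i] - b[:, i]||**2`. A quick sketch:

```
import numpy as np

A = np.array([[1, 2, 1],
              [1, 1, 2],
              [2, 1, 1],
              [1, 1, 1]])
b = np.array([[4, 3, 5, 4], [1, 2, 3, 4]]).T
x, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)

# each entry of residuals is the sum of squared errors for that column of b
assert np.allclose(residuals, np.sum((A @ x - b) ** 2, axis=0))
```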

`rank` and `s` depend only on `A`, and are thus the same as above.
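Both values can be recomputed from `A` alone, which confirms they do not depend on `b`. A sketch using `np.linalg.matrix_rank` and `np.linalg.svd`:

```
import numpy as np

A = np.array([[1, 2, 1],
              [1, 1, 2],
              [2, 1, 1],
              [1, 1, 1]])
b = np.array([4, 3, 5, 4])
x, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)

# rank and s come from the SVD of A alone, independent of b
assert rank == np.linalg.matrix_rank(A)
assert np.allclose(s, np.linalg.svd(A, compute_uv=False))
```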