## Example

Softmax regression (or multinomial logistic regression) generalizes logistic regression to the case where we want to handle multiple classes. It is particularly useful in neural networks for non-binary classification, where simple logistic regression is not sufficient: we need a probability distribution across all labels, which is what softmax gives us.

Softmax is computed with the formula below.
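For an input vector $x$ with components $x_i$, the $i$-th softmax output is:

$$\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_j e^{x_j}}$$

Each output lies in $(0, 1)$ and the outputs sum to $1$, so the result is a valid probability distribution over the classes.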

### Where does it fit in?

To normalise a vector by applying the softmax function to it with `numpy`, use:

```
np.exp(x - np.max(x)) / np.sum(np.exp(x - np.max(x)))  # shifting by the max avoids overflow
```

where `x` is the vector of activations from the final layer of the ANN.
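As a sketch, the one-liner can be wrapped in a small reusable function (the name `softmax` here is just illustrative, not part of any library API):

```
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of activations."""
    shifted = x - np.max(x)   # subtract the max so np.exp cannot overflow
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Example: equal activations give a uniform distribution
probs = softmax(np.array([0.0, 0.0]))  # → [0.5, 0.5]
```

Because the shift by `np.max(x)` cancels in the ratio, this returns the same distribution as the unshifted formula while remaining stable for large activations.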