machine-learning Neural Networks Softmax Function


Example

Softmax regression (or multinomial logistic regression) is a generalization of logistic regression to the case where we want to handle multiple classes. It is particularly useful for neural networks where we want to apply non-binary classification. In this case, simple logistic regression is not sufficient. We'd need a probability distribution across all labels, which is what softmax gives us.

Softmax is computed with the following formula:

$$\mathrm{softmax}(\mathbf{x})_j = \frac{e^{x_j}}{\sum_{k=1}^{K} e^{x_k}}, \qquad j = 1, \dots, K$$
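
As a quick sanity check (the score values below are made up purely for illustration), applying this formula to a three-class activation vector yields a valid probability distribution:

import numpy as np

x = np.array([2.0, 1.0, 0.1])          # hypothetical final-layer activations
probs = np.exp(x) / np.sum(np.exp(x))  # softmax as defined above

print(probs)        # [0.65900114 0.24243297 0.09856589]
print(probs.sum())  # ~1.0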

Where does it fit in?

To normalise a vector by applying the softmax function to it with NumPy, use:

np.exp(x) / np.sum(np.exp(x))

Where x is the activation from the final layer of the ANN.
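
Note that np.exp(x) can overflow for large activations. A common remedy, sketched below (this helper is not part of the original snippet), is to subtract the maximum activation before exponentiating; the output is unchanged because the constant cancels in the ratio:

import numpy as np

def softmax(x):
    # Shift by the maximum for numerical stability; softmax(x) == softmax(x - c)
    # for any constant c, so the result is identical.
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)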


