Softmax Slider

A visual aid for playing with the softmax function.

Softmax is used in categorisation problems in neural networks to map the final layer of neurons from scalar input values to a probability distribution over the categories.
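The mapping described above can be sketched in a few lines. This is a minimal illustration (not the code behind this page), assuming NumPy; the max-subtraction trick is a standard way to keep the exponentials numerically stable without changing the result.

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability;
    # this cancels out in the ratio, so the output is unchanged.
    z = np.exp(x - np.max(x))
    return z / z.sum()

# Three hypothetical category scores on the 0-10 slider scale
scores = np.array([2.0, 1.0, 0.5])
probs = softmax(scores)
print(probs)        # one probability per category
print(probs.sum())  # sums to 1
```

Note that softmax preserves the ordering of the inputs: the category with the largest score always gets the largest probability.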

In this example the input value for each category can vary on a scale from 0 to 10; move the sliders to change the values.

Category Values

Category Probability Scores (Softmax Output)

Category Values (Softmax Input)

Categorical distribution: note that the category probability scores always sum to 1, since together they form a probability distribution.
Softmax vs input: note that the softmax output is quite sensitive to changes in its input; a relatively small change to one category's input value can cause a large swing in its probability.
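The sensitivity point is easy to demonstrate numerically. In this sketch (again assuming NumPy, with made-up scores), bumping one input from 5 to 7 on a 0-10 scale moves its probability from roughly 0.67 to roughly 0.94, because softmax works on exponentials of the inputs, so differences between inputs are amplified multiplicatively.

```python
import numpy as np

def softmax(x):
    z = np.exp(x - np.max(x))
    return z / z.sum()

before = softmax(np.array([5.0, 4.0, 3.0]))
after = softmax(np.array([7.0, 4.0, 3.0]))  # only the first input changed

print(before[0])  # ~0.665
print(after[0])   # ~0.936
```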

The source code is available here:

© Will Robertson