The Log Sum Exp Trick

In machine learning, arithmetic underflow can become a problem when multiplying together many small probabilities. To avoid this, many models work in log space, where it becomes useful to calculate the log of a sum of exponentials:

$$\log \sum_{i=1}^{n} e^{x_i}$$

If any $x_{i}$ is sufficiently large or small, evaluating $e^{x_i}$ directly will result in an arithmetic overflow or underflow. To avoid this we can use a common identity known as the log-sum-exp trick:

$$\log \sum_{i=1}^{n} e^{x_i} = b + \log \sum_{i=1}^{n} e^{x_i - b}$$

Here $b = \max_i x_i$. Choosing the maximum ensures that every exponent $x_i - b \le 0$, so none of the $e^{x_i - b}$ terms can overflow.
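The identity holds for any constant $b$, since we can factor $e^{b}$ out of the sum:

$$\log \sum_{i=1}^{n} e^{x_i} = \log \left( e^{b} \sum_{i=1}^{n} e^{x_i - b} \right) = b + \log \sum_{i=1}^{n} e^{x_i - b}$$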

We can calculate this in Python with:
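A minimal NumPy sketch (the function name `log_sum_exp` and the example array are illustrative):

```python
import numpy as np

def log_sum_exp(x):
    # Shift by the maximum so every exponent is <= 0, avoiding overflow.
    b = np.max(x)
    return b + np.log(np.sum(np.exp(x - b)))

# Naively, np.exp(1000.0) overflows to inf, but the shifted version is stable:
x = np.array([1000.0, 1000.0, 1000.0])
log_sum_exp(x)  # ≈ 1000 + log(3)
```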

or using SciPy (note that `logsumexp` now lives in `scipy.special`; it has been removed from `scipy.misc`):

from scipy.special import logsumexp
logsumexp(x)

Jupyter notebook here.