If you’re interested in experimenting with neural networks and you know Python, my top recommendation is to use Theano.

Theano isn’t strictly about neural networks – it’s more of a mathematical engine, somewhere between Matlab and Mathematica:

  • It’s similar to Matlab because it uses vectorization: the resulting functions operate on whole vectors and matrices rather than on individual elements.
  • It’s akin to Mathematica because you first define expressions (functions as analytical formulas), and Theano can then compute their derivatives analytically, as the sketch after this list shows. This is critical for neural networks: you need gradients for training, but no one wants to derive them by hand when a library can do it for you.
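
For instance, here is a minimal sketch of that workflow (the toy formula and the variable names are mine, purely for illustration): you define a symbolic expression, and Theano derives its gradient analytically.

```python
import theano
import theano.tensor as T

x = T.dscalar('x')            # a symbolic scalar, not a concrete number
y = x ** 3 + 2 * x            # an analytical formula built from x
dy = T.grad(y, x)             # Theano symbolically derives 3*x**2 + 2

f = theano.function([x], dy)  # compile the derivative into a callable
print(f(2.0))                 # prints 14.0
```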

Once you’ve defined the necessary expressions (such as the activation function and the gradient of the loss for a neural network), you compile them with Theano. The result is an efficient compiled function that evaluates quickly for whatever arguments you pass in. These compiled functions are vectorized and heavily optimized (although the compilation step itself can take a while), and they can also run on a GPU for even better performance.
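
As a concrete example (a sketch, not tied to any particular network), compiling a sigmoid activation yields a function that applies it elementwise to entire matrices at once:

```python
import theano
import theano.tensor as T

x = T.dmatrix('x')                     # symbolic matrix of doubles
sigmoid = 1 / (1 + T.exp(-x))          # the activation as an expression
act = theano.function([x], sigmoid)    # compilation happens here (can be slow)

print(act([[0.0, 1.0], [-1.0, -2.0]]))  # evaluates the whole matrix in one call
```

Depending on your Theano version and setup, the same compiled function can be moved to the GPU (for example via the THEANO_FLAGS device setting, typically with float32 data) without changing the expression code at all.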

This lets you define a new neural network in just a few lines of code, essentially by specifying the activation function, as the sketch below shows. Impressive, isn’t it?
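
To make that concrete, here is a toy sketch of such a network: a single hidden layer trained by gradient descent. The shapes, learning rate, and names are made up for illustration; this is not a complete training script.

```python
import numpy as np
import theano
import theano.tensor as T

rng = np.random.RandomState(0)
X = T.dmatrix('X')
y = T.dvector('y')

# shared variables hold the weights between calls
W1 = theano.shared(rng.randn(3, 4), name='W1')
W2 = theano.shared(rng.randn(4), name='W2')

hidden = T.tanh(T.dot(X, W1))            # the activation choice is one line
out = T.nnet.sigmoid(T.dot(hidden, W2))  # network output
loss = T.mean((out - y) ** 2)            # squared-error loss

# analytical gradients and one gradient-descent step, all symbolic
gW1, gW2 = T.grad(loss, [W1, W2])
train = theano.function(
    [X, y], loss,
    updates=[(W1, W1 - 0.1 * gW1), (W2, W2 - 0.1 * gW2)])

print(train(rng.randn(5, 3), rng.rand(5)))  # one training step on toy data
```

Note that swapping T.tanh for another nonlinearity redefines the network, and the gradients follow automatically.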

Even more examples (with less explanation) can be found here.