smallnn: Neural Network in 99 lines of C++
by Benjamin Kenwright
smallnn is a flexible neural network implementation with back propagation. It is 99 lines of C++, free, simple and open source. The test case example
uses a 1-3-1 network configuration to emulate a sine wave function. The illustration below shows the ideal output and the trained output after 10,000 iterations.
* Neural network with back propagation training
* Supports multiple layers - i.e., a multilayer perceptron (MLP)
* Customizable (easy to configure for different topologies/layers/connections)
* Educational version (sometimes using libraries and packages masks the beauty of just how simple a neural network is at its heart)
* Not dependent on any libraries (vanilla C++)
* Importantly, it's a fun piece of code to play around with and learn about neural networks
Figure: the result of training the neural network to emulate a sine wave (i.e., y = sin(x)).
Link to code (.cpp file, 99 lines)
Show me the code!
You can see the implementation below. It might look a bit complex if you've not written a neural network before, but remember, this little
baby is customizable! It's easy to modify and add new layers/nodes/topologies. I could probably have got it into 50 lines if I'd hardcoded the
beast, but I wanted to show a workable example that you can start with and expand.
#include <iostream>   // console output for training progress and results
#include <vector>     // storage for layer weights/activations
#include <cmath>      // sin() for the target signal and exp() for the activation
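To give a flavour of the structure before you open the full listing, here is a minimal, self-contained sketch of the same idea: a 1-3-1 MLP trained with back propagation to approximate y = sin(x). It assumes a sigmoid hidden layer feeding a linear output unit, and all names and constants (learning rate, sample count, random seed) are illustrative rather than taken from the original 99-line file.

// Minimal 1-3-1 MLP trained with back propagation on y = sin(x).
// Sigmoid hidden layer, linear output, online gradient descent.
// Illustrative sketch - not the original smallnn listing.
#include <iostream>
#include <vector>
#include <cmath>
#include <cstdlib>

double sigmoid(double x)      { return 1.0 / (1.0 + std::exp(-x)); }
double sigmoidDeriv(double y) { return y * (1.0 - y); }   // y is sigmoid(x)

int main()
{
    const int    hidden  = 3;       // 1-3-1 topology
    const double lr      = 0.1;     // learning rate
    const int    epochs  = 10000;
    const int    samples = 50;      // training inputs in [0, pi]

    // weights/biases with a small random initialisation
    std::vector<double> w1(hidden), b1(hidden), w2(hidden);
    double b2 = 0.0;
    std::srand(42);
    for (int i = 0; i < hidden; ++i) {
        w1[i] = (std::rand() / (double)RAND_MAX) - 0.5;
        b1[i] = (std::rand() / (double)RAND_MAX) - 0.5;
        w2[i] = (std::rand() / (double)RAND_MAX) - 0.5;
    }

    for (int e = 0; e < epochs; ++e) {
        double sumErr = 0.0;
        for (int s = 0; s < samples; ++s) {
            double x      = 3.14159265 * s / (samples - 1);
            double target = std::sin(x);

            // forward pass
            std::vector<double> h(hidden);
            double out = b2;
            for (int i = 0; i < hidden; ++i) {
                h[i] = sigmoid(w1[i] * x + b1[i]);
                out += w2[i] * h[i];
            }

            // backward pass (squared error, linear output)
            double err = out - target;
            sumErr += err * err;
            for (int i = 0; i < hidden; ++i) {
                double gradH = err * w2[i] * sigmoidDeriv(h[i]);
                w2[i] -= lr * err * h[i];
                w1[i] -= lr * gradH * x;
                b1[i] -= lr * gradH;
            }
            b2 -= lr * err;
        }
        if (e % 1000 == 0)
            std::cout << "epoch " << e << "  mse " << sumErr / samples << "\n";
    }

    // compare the trained output against the ideal sine wave
    for (int s = 0; s <= 10; ++s) {
        double x   = 3.14159265 * s / 10.0;
        double out = b2;
        for (int i = 0; i < hidden; ++i)
            out += w2[i] * sigmoid(w1[i] * x + b1[i]);
        std::cout << "x=" << x << "  sin=" << std::sin(x) << "  nn=" << out << "\n";
    }
    return 0;
}

Generalising the three weight vectors into per-layer containers is what turns this hard-coded 1-3-1 into the richer topologies mentioned below (e.g., 1-4-5-1).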
While the code seems compact, it's a great starting point! Some interesting things to try out:
1. Modify the activation function (change it to ReLU - see the ReLU sketch after this list)
2. Try other test conditions, use other trigonometry functions (like cos and tan or even mixing them together to create complex signals)
3. The configuration is a 1-3-1, but try more complex ones (1-4-5-1 or 1-10-1)
4. Modify the epoch updates (compare the accuracy and time vs number of epochs) - plot the error vs number of epochs
5. Test out different learning rates
6. Optimize the learning rate so it's dynamic (changes based on the error/speed of convergence - see the learning-rate sketch after this list)
7. Try and 'optimize' the code so it runs faster, for instance, using OpenCL or C++ tricks
8. Modify the implementation so the topology and training data can be set up from a script (e.g., a JSON file)
9. Add more visualization information to the program, so you can see things changing and updating in real time (adding a GUI interface)
10. Try porting the network across to various simple NN projects (e.g., image processing, recognition, text analysis, signals and animation) [link]
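For point 1, the change is small. The snippet below is a drop-in ReLU replacement written against the sketch above (illustrative names; the derivative is expressed in terms of the activated output, matching how sigmoidDeriv is used there):

double relu(double x)      { return x > 0.0 ? x : 0.0; }    // ReLU activation
double reluDeriv(double y) { return y > 0.0 ? 1.0 : 0.0; }  // derivative, given the ReLU output y

With ReLU you will usually want a smaller learning rate and/or a different weight initialisation, otherwise hidden units can get stuck outputting zero.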
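For point 6, one common heuristic (sometimes called the "bold driver" approach - not something smallnn itself prescribes) is to grow the learning rate slightly while the error keeps falling and to cut it sharply when it rises. A sketch, using illustrative names, assuming lr is no longer const and that you track the mean squared error once per epoch:

// Call once per epoch with the latest mean squared error.
void adaptLearningRate(double& lr, double mse, double& prevMse)
{
    if (mse < prevMse) lr *= 1.05;   // still improving: nudge the rate up
    else               lr *= 0.5;    // error rose: back off hard
    prevMse = mse;
}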