## Purpose

This project aims to note, test, and visualize the various activation functions used in deep learning neural networks. The following activation functions are implemented:

- Sigmoid
- Tanh
- ReLU
- Leaky ReLU
- ReLU6
- Softmax
- Identity
- Swish
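As a rough sketch of what these implementations might look like (the actual function names and signatures in this project are assumptions), each activation can be written as a small NumPy function:

```python
import numpy as np

# Hypothetical NumPy sketches of the listed activation functions;
# names and default parameters are illustrative, not this project's API.

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zero for negative inputs, identity for positive
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope instead of a hard zero
    return np.where(x > 0, x, alpha * x)

def relu6(x):
    # ReLU capped at 6
    return np.minimum(np.maximum(0.0, x), 6.0)

def softmax(x):
    # Normalizes a vector into a probability distribution;
    # subtracting the max improves numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()

def identity(x):
    # Passes the input through unchanged
    return x

def swish(x, beta=1.0):
    # x * sigmoid(beta * x)
    return x * sigmoid(beta * x)
```

Plotting each function over an input range such as `np.linspace(-6, 6, 200)` is a straightforward way to visualize their shapes side by side.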