```python
NN = NeuralNetwork(input_size=train_data.shape[1])
NN.add_layer(32, "relu")       # first hidden layer, ReLU activation
NN.add_layer(16, "relu")       # second hidden layer, ReLU activation
NN.add_layer(10, "softmax")    # 10-way softmax output layer
NN.compile(loss="categorical_crossentropy")
NN.summary()
```
An educational "from scratch" implementation of classic feed-forward neural networks for binary and multi-class classification, using ReLU activations, cross-entropy loss, and a sigmoid/softmax output layer. Implemented in Python using NumPy.
Can be run online with Google Colab -> here.
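To make the description concrete, here is a minimal NumPy sketch of the forward pass and loss described above: ReLU hidden layers, a softmax output, and categorical cross-entropy. It is an illustrative approximation, not the repository's actual `NeuralNetwork` class; the layer sizes mirror the usage example above (32 -> 16 -> 10), while the 64-feature input and the random initialisation scheme are assumptions made only for this example.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x, weights, biases):
    # Hidden layers use ReLU; the final layer uses softmax to get class probabilities.
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(a @ W + b)
    return softmax(a @ weights[-1] + biases[-1])

def categorical_cross_entropy(probs, y_onehot, eps=1e-12):
    # Mean negative log-likelihood of the true classes.
    return -np.mean(np.sum(y_onehot * np.log(probs + eps), axis=1))

# Example: a network matching the summary above (hidden 32 -> 16, 10-way softmax output).
rng = np.random.default_rng(0)
sizes = [64, 32, 16, 10]                       # 64 input features is an assumption
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

x = rng.normal(size=(8, sizes[0]))             # a batch of 8 dummy samples
y = np.eye(sizes[-1])[rng.integers(0, sizes[-1], 8)]   # one-hot dummy labels
probs = forward(x, weights, biases)
print(probs.shape, categorical_cross_entropy(probs, y))
```

Training in the real implementation would then backpropagate the cross-entropy gradient through these same layers; the sketch only covers the forward computation.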