Transforming the Features (Box-Cox Transformation and Log Transformation) and the Labels.
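A minimal sketch of what these transformations might look like, assuming pandas DataFrames and SciPy/scikit-learn; the function names and column arguments are hypothetical, and the label step is shown as simple integer encoding (an assumption, since the exact label transformation is not specified):

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.preprocessing import LabelEncoder

def transform_features(X, boxcox_cols, log_cols):
    """Apply Box-Cox to strictly positive skewed columns and log1p to others."""
    X = X.copy()
    for col in boxcox_cols:
        # Box-Cox requires strictly positive values; lambda is fitted automatically
        X[col], _ = boxcox(X[col])
    for col in log_cols:
        # log1p handles zero values gracefully
        X[col] = np.log1p(X[col])
    return X

def transform_labels(y):
    """Encode class labels as integers (hypothetical label handling)."""
    return LabelEncoder().fit_transform(y)
```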
Cross-Validation function that first applies the previously defined transformations and normalizes the data. It then performs Cross-Validation for each of the given models and returns a dictionary with the train and test accuracies and their standard deviations for each model.
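A sketch of such a function, assuming scikit-learn's `cross_validate` and features that have already been transformed as above; the dictionary keys are illustrative:

```python
from sklearn.model_selection import cross_validate
from sklearn.preprocessing import StandardScaler

def cross_validate_models(models, X, y, cv=5):
    """Normalize the (already transformed) features, then run k-fold
    cross-validation for each model in `models` (a name -> estimator dict)."""
    X_scaled = StandardScaler().fit_transform(X)
    results = {}
    for name, model in models.items():
        scores = cross_validate(model, X_scaled, y, cv=cv,
                                scoring="accuracy", return_train_score=True)
        results[name] = {
            "train_acc": scores["train_score"].mean(),
            "train_std": scores["train_score"].std(),
            "test_acc": scores["test_score"].mean(),
            "test_std": scores["test_score"].std(),
        }
    return results
```

In practice, wrapping the scaler and estimator in a scikit-learn `Pipeline` would avoid fitting the scaler on validation folds; the sketch above mirrors the simpler scale-then-validate procedure described here.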
Plotting the best models' accuracies and Standard Deviations against each other, as well as their Confusion Matrices.
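One possible shape for the comparison plot, assuming Matplotlib and the results dictionary sketched above; the Confusion Matrices could be drawn alongside with `sklearn.metrics.ConfusionMatrixDisplay`:

```python
import matplotlib.pyplot as plt

def plot_model_comparison(results):
    """Bar chart of cross-validated test accuracy with std-dev error bars;
    `results` follows the dict layout sketched above."""
    names = list(results)
    accs = [results[n]["test_acc"] for n in names]
    stds = [results[n]["test_std"] for n in names]
    plt.bar(names, accs, yerr=stds, capsize=4)
    plt.ylabel("Test accuracy")
    plt.title("Cross-validated accuracy per model")
    plt.show()
```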
The same steps were applied as in Part 1.
Raw NumPy implementation of a Neural Network. It accepts ReLU, Leaky ReLU, Softmax, Sigmoid, and Tanh as hidden-layer activations; the output layer accepts only Softmax. Forward propagation caches intermediate values for reuse in back propagation, which updates the weights with the Adam optimizer.
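A condensed sketch of the core mechanics for a single hidden layer, not the full implementation: elementwise hidden activations with their derivatives (the Softmax-as-hidden-activation case needs a full Jacobian and is omitted for brevity), a cached forward pass, the matching backward pass for Softmax plus cross-entropy, and an Adam update. All names and the initialization scheme are illustrative:

```python
import numpy as np

# Elementwise hidden activations and their derivatives w.r.t. the pre-activation z
ACTS = {
    "relu":       (lambda z: np.maximum(0.0, z),           lambda z: (z > 0).astype(float)),
    "leaky_relu": (lambda z: np.where(z > 0, z, 0.01 * z), lambda z: np.where(z > 0, 1.0, 0.01)),
    "sigmoid":    (lambda z: 1 / (1 + np.exp(-z)),          lambda z: (s := 1 / (1 + np.exp(-z))) * (1 - s)),
    "tanh":       (np.tanh,                                 lambda z: 1 - np.tanh(z) ** 2),
}

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def init(n_in, n_hidden, n_out, seed=0):
    rng = np.random.default_rng(seed)
    params = {"W1": rng.normal(0, np.sqrt(2 / n_in), (n_in, n_hidden)), "b1": np.zeros(n_hidden),
              "W2": rng.normal(0, np.sqrt(2 / n_hidden), (n_hidden, n_out)), "b2": np.zeros(n_out)}
    # Adam keeps first- and second-moment estimates per parameter
    state = {"m": {k: np.zeros_like(v) for k, v in params.items()},
             "v": {k: np.zeros_like(v) for k, v in params.items()}}
    return params, state

def forward(X, params, act):
    f, _ = ACTS[act]
    z1 = X @ params["W1"] + params["b1"]
    a1 = f(z1)
    a2 = softmax(a1 @ params["W2"] + params["b2"])
    return a2, {"X": X, "z1": z1, "a1": a1, "a2": a2}  # cache for back propagation

def backward(Y, params, cache, act):
    # Y is one-hot; Softmax + cross-entropy yields the simple gradient a2 - Y
    _, df = ACTS[act]
    m = Y.shape[0]
    dz2 = (cache["a2"] - Y) / m
    dz1 = (dz2 @ params["W2"].T) * df(cache["z1"])
    return {"W2": cache["a1"].T @ dz2, "b2": dz2.sum(axis=0),
            "W1": cache["X"].T @ dz1,  "b1": dz1.sum(axis=0)}

def adam_step(params, grads, state, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    for k in params:
        state["m"][k] = b1 * state["m"][k] + (1 - b1) * grads[k]
        state["v"][k] = b2 * state["v"][k] + (1 - b2) * grads[k] ** 2
        m_hat = state["m"][k] / (1 - b1 ** t)  # bias-corrected moment estimates
        v_hat = state["v"][k] / (1 - b2 ** t)
        params[k] -= lr * m_hat / (np.sqrt(v_hat) + eps)
```

A training step is then forward → backward → `adam_step`, with the step counter `t` incremented on every update.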
Plotting the accuracy and loss over time, as well as the Confusion Matrix.
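A sketch of this plotting step, assuming the training loop collects per-epoch accuracy and loss in a `history` dict (a hypothetical layout) and that predictions are available as integer class labels:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_training(history, y_true, y_pred, n_classes):
    """Plot accuracy and loss per epoch, plus a confusion matrix."""
    fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(14, 4))
    ax1.plot(history["acc"]);  ax1.set(title="Accuracy", xlabel="Epoch")
    ax2.plot(history["loss"]); ax2.set(title="Loss", xlabel="Epoch")
    # Build the confusion matrix by counting (true, predicted) pairs
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    ax3.imshow(cm, cmap="Blues")
    ax3.set(title="Confusion matrix", xlabel="Predicted", ylabel="True")
    plt.tight_layout(); plt.show()
```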