Implementing Neural Network from scratch (MLP and CNN), purely in numpy with optimizers
Updated Apr 27, 2022 - Python
This is a repository implementing different types of optimizers (SGD, RMSprop, Adam, etc.) with three use cases: function approximation, multi-class single-label classification, and multi-class multi-label classification.
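Several of the repositories above implement these update rules directly in numpy. As a rough illustrative sketch (my own example, not code from any listed repo), plain SGD and Adam steps might look like:

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    """Vanilla SGD: step against the gradient."""
    return w - lr * grad

def adam_step(w, grad, state, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; `state` carries the running moment estimates and step count."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad         # first moment (mean)
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2    # second moment (uncentered variance)
    m_hat = state["m"] / (1 - beta1 ** state["t"])               # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy run: minimize f(w) = ||w||^2, whose gradient is 2w.
w_sgd = np.array([1.0, -2.0])
for _ in range(100):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd, lr=0.1)   # each step shrinks w by a factor of 0.8

w_adam = np.array([1.0, -2.0])
state = {"m": np.zeros_like(w_adam), "v": np.zeros_like(w_adam), "t": 0}
for _ in range(400):
    w_adam = adam_step(w_adam, 2 * w_adam, state)
```

The per-parameter scaling by `sqrt(v_hat)` is what distinguishes Adam (and RMSprop) from plain SGD: coordinates with consistently large gradients take proportionally smaller steps.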
ConvNN is used to predict digits for the MNIST dataset
Testing out Schedule vs Schedule Free Optimizer
Fruit classification on an image dataset using ANN and CNN models
This GitHub repository contains the code used for CS-671: Introduction to Deep Learning course offered by IIT Mandi during the Even Semester of 2022. The repository includes the implementations of various deep learning algorithms and techniques covered in the course.
Learn DSPy framework by coding text adventure game
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including Transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Code to conduct experiments for the paper "Modified Gauss-Newton Method for Solving a Smooth System of Nonlinear Equations"
Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance
This is an application for showing how optimization algorithms work
Evaluating optimization algorithms in IVHD method (interactive visualization of high-dimensional data)
Python Library for creating and training CNNs. Implemented from scratch.
Tutorials on optimizers for deep neural networks
Zeroth-order optimizers, gradient chaining, and random gradient approximation
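When gradients are unavailable or expensive, zeroth-order methods like those mentioned above estimate them from function values alone. A minimal SPSA-style sketch of random gradient approximation (my own illustration under stated assumptions, not code from the listed repo):

```python
import numpy as np

def spsa_grad(f, w, eps=1e-3, rng=None):
    """Random gradient approximation (SPSA-style): estimate the gradient of f
    at w from just two function evaluations along a random +/-1 direction."""
    rng = np.random.default_rng() if rng is None else rng
    delta = rng.choice([-1.0, 1.0], size=w.shape)       # random perturbation direction
    # Central difference along delta; multiplying by delta recovers a
    # per-coordinate estimate (1/delta_i == delta_i for +/-1 entries).
    return (f(w + eps * delta) - f(w - eps * delta)) / (2 * eps) * delta

# Toy run: minimize f(w) = ||w - 3||^2 without ever computing its gradient.
f = lambda w: np.sum((w - 3.0) ** 2)
rng = np.random.default_rng(0)
w = np.zeros(4)
for _ in range(2000):
    w -= 0.05 * spsa_grad(f, w, rng=rng)
```

Each iteration needs only two evaluations of `f` regardless of dimension, which is the appeal of SPSA over coordinate-wise finite differences; the trade-off is noisier steps and slower convergence in high dimensions.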
Cardis Optimizer is a simple-to-use yet sophisticated optimizer that will have your PC running better in minutes!
Cognitive bias inspired GLVQ optimizers