This is a collection of the homework I am doing for Hung-yi Lee's online courses on machine learning and deep learning. So far I have finished the first homework, and I will continue to complete the rest in my spare time at university. Updates coming soon...
The homework aims to use the PM2.5 data of the first nine days to predict the PM2.5 value on the 10th day. Instead of simply calling the functions encapsulated in existing libraries, I implemented the regression myself. Therefore, the code only supports simple models such as y = ax + bx^2 and cannot be applied to more complex models.
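For readers who are curious what "coding the regression myself" amounts to, the core idea is just building the feature matrix and computing the loss gradient by hand. The snippet below is only a minimal sketch under that assumption, with my own function and variable names, not the actual homework code:

```python
# Minimal sketch (not the homework code) of a hand-built regression for a
# model of the form y = a*x + b*x^2: build the design matrix [x, x^2]
# and compute the mean-squared-error gradient manually with NumPy.
import numpy as np

def mse_gradient(X, y, w):
    """Gradient of 0.5 * ||X @ w - y||^2 / n with respect to w."""
    n = X.shape[0]
    residual = X @ w - y          # prediction error for each sample
    return X.T @ residual / n

# toy data: x and x^2 are the two features of the model
x = np.linspace(0.0, 1.0, 50)
X = np.stack([x, x ** 2], axis=1)   # design matrix for y = a*x + b*x^2
y = 3.0 * x + 2.0 * x ** 2          # synthetic targets
w = np.zeros(2)                     # parameters [a, b]
print(mse_gradient(X, y, w))
```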
These are the main steps of the project:
- Preprocessing: clean and preprocess the raw data into the form we need
- Normalization
- Training: build the model and use Adagrad to optimize it (a sketch of this step, together with normalization, follows this list)
- Testing: use the test data to see how well the model predicts
- Prediction
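As a rough illustration of the normalization and training steps above, here is a minimal sketch that z-score normalizes a feature matrix and fits a linear model with an Adagrad loop. It uses random data in place of the real nine-day PM2.5 features, and the names and hyperparameters are my own, not the actual homework code:

```python
# Sketch of z-score normalization followed by an Adagrad training loop
# on a linear model (hypothetical names, not the actual homework code).
import numpy as np

def normalize(X):
    """Z-score normalization: zero mean and unit variance per feature."""
    mean = X.mean(axis=0)
    std = X.std(axis=0) + 1e-8        # avoid division by zero
    return (X - mean) / std, mean, std

def train_adagrad(X, y, lr=1.0, epochs=1000, eps=1e-8):
    """Fit w by gradient descent with the Adagrad adaptive learning rate."""
    n, d = X.shape
    w = np.zeros(d)
    grad_sq_sum = np.zeros(d)         # running sum of squared gradients
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n  # MSE gradient
        grad_sq_sum += grad ** 2
        w -= lr * grad / (np.sqrt(grad_sq_sum) + eps)  # Adagrad update
    return w

# toy usage: random data standing in for the nine-day PM2.5 features
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(240, 9))     # 240 samples, 9 features (one per day)
y = X_raw @ rng.normal(size=9)        # synthetic targets
X_norm, mean, std = normalize(X_raw)
w = train_adagrad(X_norm, y)
```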
If you want to see my code, please switch to the master branch.
Since I have learned about many optimizers in the course, I wanted to compare the performance of four popular ones: Adagrad, RMSprop, SGDM, and Adam.
Please view Adagrad, RMSprop, and Adam if you want to learn more about the optimizers.
Please view optimizer.py if you want to see the details of how I implemented these four optimizers.
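The four optimizers differ only in how they turn a gradient into a parameter update. The update rules below are a minimal sketch for reference, not the contents of optimizer.py; the function names and default hyperparameters are my own choices:

```python
# Sketch of the four update rules (hypothetical names, not optimizer.py).
import numpy as np

def adagrad_step(w, grad, state, lr=0.1, eps=1e-8):
    state["g2"] = state.get("g2", 0.0) + grad ** 2        # sum of squared grads
    return w - lr * grad / (np.sqrt(state["g2"]) + eps)

def rmsprop_step(w, grad, state, lr=0.01, beta=0.9, eps=1e-8):
    state["g2"] = beta * state.get("g2", 0.0) + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(state["g2"]) + eps)   # decaying average

def sgdm_step(w, grad, state, lr=0.01, momentum=0.9):
    state["v"] = momentum * state.get("v", 0.0) - lr * grad  # velocity
    return w + state["v"]

def adam_step(w, grad, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    t = state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * grad       # 1st moment
    state["v"] = b2 * state.get("v", 0.0) + (1 - b2) * grad ** 2  # 2nd moment
    m_hat = state["m"] / (1 - b1 ** t)   # bias correction
    v_hat = state["v"] / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)
```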
And the result is :
From the graph above, we can see that SGDM performs the best on this task. However, for most other machine learning models, Adam is actually the most common and robust choice for optimization.