39. References
Bengio, Y., Louradour, J., Collobert, R., Weston, J. Curriculum Learning. ICML, 2009.
Bengio, Y. Evolving Culture vs Local Minima. arXiv, 2012.
Bengio, Y. Deep Learning of Representations: Looking Forward. arXiv, 2013.
Duchi, J., Hazan, E., Singer, Y. Adaptive Subgradient Methods for Online Learning and
Stochastic Optimization. COLT, 2010.
Gulcehre, C., Bengio, Y. Knowledge Matters: Importance of Prior Information for
Optimization. arXiv, 2013.
Hinton, G. E. Training Products of Experts by Minimizing Contrastive Divergence. Neural
Computation, 2002.
Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., Salakhutdinov, R. R.
Improving neural networks by preventing co-adaptation of feature detectors. arXiv, 2012.
LeCun, Y., Bottou, L., Bengio, Y., Haffner, P. Gradient-based learning applied to
document recognition. Proc. IEEE, 1998.
Schaul, T., Zhang, S., LeCun, Y. No More Pesky Learning Rates. ICML, 2013a.
Schaul, T., LeCun, Y. Adaptive learning rates and parallelization for stochastic, sparse,
non-smooth gradients. ICLR, 2013b.
Tang, Y. Deep Learning using Support Vector Machines. arXiv, 2013.
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A. Extracting and Composing
Robust Features with Denoising Autoencoders. ICML, 2008.
Vinyals, O., Jia, Y., Deng, L., Darrell, T. Learning with Recursive Perceptual
Representations. NIPS, 2012.
Wang, S. I., Manning, C. D. Fast dropout training. ICML, 2013.
47. Appendix: References
Elman, J. Finding structure in time. Cognitive Science, 1990.
Jordan, M. Serial order: A parallel distributed processing approach. Tech. Rep., 1986.
Mesnil, G., He, X., Deng, L., Bengio, Y. Investigation of Recurrent-Neural-Network
Architectures and Learning Methods for Spoken Language Understanding.
INTERSPEECH, 2013.
Socher, R., Lin, C. C.-Y., Ng, A. Y., Manning, C. D. Parsing Natural Scenes and Natural
Language with Recursive Neural Networks. ICML, 2011.
Sutskever, I., Martens, J., Hinton, G. Generating Text with Recurrent Neural Networks.
ICML, 2011.
Editor's Notes
1:00
2:00
3:00
4:00
5:00
6:00 Skip through quickly
7:00
8:00
9:00
10:00, explain briefly
11:00
12:00 Lightly skip this part
13:00
14:00
15:00
16:00, logistic regression, 20-newsgroup subtask alt.atheism vs. religion.misc, Batch GD