ã¯ãã㫠深層å¦ç¿ã®å¾é æ³ã«ã¯æ§ã ãªææ³ãæå±ããã¦ãã¾ãããã®ä¸ã§ãã©ã®ææ³ãç¨ããã®ãé©åã§ãããèªåã§ããã¾ãç解ã§ãã¦ããªãé¨åããããä»åã¯å¾é æ³ã®ä¸ã§ãå®éã«æ·±å±¤å¦ç¿ã§ä¸»ã«ç¨ãããã¦ããææ³(SGD, Momentum SGD, AdaGrad, RMSprop, AdaDelta, Adam)ã«ã¤ãã¦ãå®è£ ãããã¨ãåæã«èª¿ã¹ã¦ã¾ã¨ãã¦ã¿ã¾ãããå®è£ ãã¬ã¼ã ã¯ã¼ã¯ã¯Chainerãæ³å®ãã¦ãã¾ãã SGD SGD(Stochastic Gradient Descent : 確ççå¾é éä¸æ³)ã¯ãOptimizerã®ä¸ã§ãåæã«æå±ãããæãåºæ¬çãªã¢ã«ã´ãªãºã ã§ããéã¿$\mathbf{w}$ã®æ´æ°ã¯ä»¥ä¸ã®ããã«è¡ã£ã¦ããã¾ãããã®ã¨ãã$E$ã¯èª¤å·®é¢æ°ã$\eta$ã¯å¦ç¿ä¿æ°ã表ãã¦ãã¾ãã \mathbf{w}^{t + 1} \gets \mathbf{w}^{t} -