A look at the traps that lie in wait when machine learning is put into real-world use ("social implementation") and how to get around them (2024 edition). This time, in an era now often called the age of generative AI, the material is meant to serve as a compass for how we should frame machine learning projects and how we should tackle them. ※This material is from the University of Tokyo Metaverse School of Engineering reskilling…
Hello! I'm Kame (@usdatascientist), a data scientist working in the US! (Update) A video version is now out too, a massive three-part production totaling 38 hours. I've published Japan's highest-rated "Machine Learning Super-Introduction" course (Part 1 & Part 2)!! And the three-part series is now complete with the release of the main course!! I've finally finished all 35 articles of the rather long-titled "Introduction to Data Science: Machine Learning" series!! This article is the summary; please use it as a table of contents! Table of contents — Linear regression. Part 1: What is machine learning, and what is it actually doing? Part 2: The loss function of linear regression, explained simply. Part 3: Understanding gradient descent with figures and equations (very important). Part 4: The normal equation explained in full (with derivation). Part 5: Building a linear regression model with scikit-learn. Part 6: How to interpret linear regression coefficients (p-values). Evaluation. Part 7: (Very important) Understanding overfitting and generalization performance (
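As a taste of Parts 4 and 5 above, here is a minimal sketch (my own illustration on toy data, not code from the course) comparing the normal-equation solution with scikit-learn's LinearRegression:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y = 2x + 1 plus noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.5, size=100)

# Part 4: normal equation w = (X^T X)^{-1} X^T y, with a bias column added.
Xb = np.hstack([np.ones((len(X), 1)), X])
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)   # [intercept, slope]

# Part 5: the same fit with scikit-learn.
model = LinearRegression().fit(X, y)
print(w, model.intercept_, model.coef_)     # the two solutions should agree
```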
Here is a roundup of the large language models that have been in the news lately. 1. Cloud services. 1-1. GPT-4. "GPT-4" is a large language model developed by "OpenAI". It is now multimodal, accepting both text and image prompts, the maximum token count has grown from 4K to 32K, and its reasoning ability has improved dramatically. It is currently available, with limits, on "ChatGPT Plus" (the paid plan), and access via the "OpenAI API" is being rolled out to users registered on the waitlist.
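For readers coming off the waitlist, a minimal sketch of what a GPT-4 call through the OpenAI API looked like around that time (assuming the 0.x-era openai Python package and that your account has access to the model):

```python
import openai  # pip install openai (0.x-era ChatCompletion interface assumed)

openai.api_key = "YOUR_API_KEY"  # placeholder

# One request to GPT-4 via the ChatCompletion endpoint.
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize what makes GPT-4 multimodal."}],
)
print(response["choices"][0]["message"]["content"])
```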
This article takes about 20 minutes to read. Hello, this is Umemoto from the TC3 data science team. I normally use PyTorch, but trying a new library is a good way to learn, so today I'll train a model on MNIST with JAX/Flax, an up-and-coming deep learning library. Introduction: as you all know, a range of deep learning libraries have been in use in recent years: TensorFlow, Keras, PyTorch (and Chainer…). JAX has been getting attention lately, but with so many libraries already available, why did JAX appear at all, and why should you use it? One answer is the advantage any late-comer library has: it can fix the problems of the libraries that came before it. At the moment JAX offers benefits such as: speed from XLA compilation; reproducibility guaranteed by strict management of random numbers
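To make those two advantages concrete, here is a tiny sketch of my own (not from the article) showing jax.jit compilation and JAX's explicit PRNG-key handling:

```python
import jax
import jax.numpy as jnp

# Explicit random-number management: the key is passed around, never hidden state.
key = jax.random.PRNGKey(0)
key, subkey = jax.random.split(key)
x = jax.random.normal(subkey, (1000, 784))   # reproducible given the same key

# XLA compilation: jit traces the function once and runs the compiled version.
@jax.jit
def mse(w, x, y):
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

w = jnp.zeros((784,))
y = jnp.zeros((1000,))
grad_fn = jax.grad(mse)              # gradient with respect to w via autodiff
print(mse(w, x, y), grad_fn(w, x, y).shape)
```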
Yahoo Japan Corporation became LINE Yahoo Corporation on October 1, 2023. LINE Yahoo's new blog is here: LINEヤフー Tech Blog. Vector search is a technique in which object data such as images and audio are represented as vectors, for example via machine learning models, and similar vectors are found by computing the distances between vectors. Because similarity search over high-dimensional vectors is computationally expensive, ANN (Approximate Nearest Neighbor) search is widely used rather than exact kNN (k-Nearest Neighbor). As for the data that can be searched, anything that can be converted into a vector will do: text, images, audio, video, binary data, and so on. Vector search is useful not only for similar-image search but also for recommendation and data analysis. At Yahoo, the later-described "Yaho…
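As a minimal illustration of the idea (plain NumPy, brute-force kNN rather than the ANN indexes the article discusses), cosine-similarity search over stored embeddings looks like this:

```python
import numpy as np

def cosine_knn(query, vectors, k=5):
    """Brute-force k-nearest-neighbor search by cosine similarity.

    Real systems replace this O(N) scan with an ANN index once N and the
    dimensionality grow large, trading a little accuracy for speed.
    """
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = v @ q                       # cosine similarity to every stored vector
    top = np.argsort(-sims)[:k]        # indices of the k most similar vectors
    return top, sims[top]

# Example: 10,000 stored embeddings of dimension 128 and one query embedding.
rng = np.random.default_rng(0)
db = rng.normal(size=(10_000, 128))
idx, scores = cosine_knn(rng.normal(size=128), db)
print(idx, scores)
```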
From Jay Alammar's blog. AI image generation is the most recent AI capability that is blowing people's minds (mine included). The ability to create striking visuals from text descriptions has a magical quality to it and points clearly at a shift in how humans create art. The release of Stable Diffusion was a clear milestone in this development because it put a high-performing model in the hands of the general public (high-performing in terms of image quality, but also speed and relatively low resource/memory requirements). Having tried AI image generation, many of you may have started to wonder how it actually works. Here is a gentle introduction to how Stable Diffusion works. Stable Diffusion is versatile and can be used in many ways; to start with, we focus on image generation from text alone (text2img). The image above shows a text input and the gen…
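If you want to try text2img yourself before reading on, a minimal sketch using the Hugging Face diffusers library (not part of Alammar's article; checkpoint name and a CUDA GPU are assumptions) looks roughly like this:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained Stable Diffusion checkpoint (downloads weights on first run).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# text2img: a prompt in, an image out.
image = pipe("an astronaut riding a horse, oil painting").images[0]
image.save("astronaut.png")
```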
"I completely understand Stable Diffusion." You'd like to reach the state of completely understanding Stable Diffusion, the image-generation AI everyone is talking about, wouldn't you? So would I. After losing sleep night after night over Stable Diffusion, my understanding so far is as follows. The Stable Diffusion model is, as shown above, made up of two models. The essential one is what is called the Diffusion Model, which can generate high-quality pictures out of random, noise-like images. Left on its own, though, you have no idea what kind of picture it will generate, so to control the output, the prompt (natural language) is fed into a Transformer model called CLIP and converted into an embedding vector. Feeding this vector into the Diffusion Model lets you generate the image you want. Forcing a comparison with a camera…
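The two-model flow described above can be sketched as a runnable toy (all functions here are stand-ins of my own invention, not the real Stable Diffusion implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def clip_text_encoder(prompt: str) -> np.ndarray:
    """Stand-in for CLIP: map a prompt to embedding vectors (random here)."""
    return rng.normal(size=(77, 768))

def diffusion_model(image, t, text_embedding):
    """Stand-in for the denoiser: predict the noise to remove at step t.
    The real model is a U-Net that attends to the text embedding."""
    return 0.1 * image

def generate(prompt: str, steps: int = 50) -> np.ndarray:
    text_embedding = clip_text_encoder(prompt)   # 1. prompt -> embedding vectors
    image = rng.normal(size=(64, 64, 4))         # 2. start from random noise
    for t in reversed(range(steps)):             # 3. iteratively denoise, conditioned
        predicted_noise = diffusion_model(image, t, text_embedding)
        image = image - predicted_noise
    return image

print(generate("a cat wearing a hat").shape)
```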
Now, as a foundation model for vision and language, the breakthrough came with CLIP in 2021. CLIP consists of two encoders that embed text and images into the same feature space. With CLIP, you can solve an arbitrary image classification problem without any additional training, as follows. First, each candidate class is phrased as a sentence (e.g., "a photo of a dog") and passed through the text encoder. Next, the image to be classified is passed through the image encoder. Finally, you compute the cosine similarity between the vector obtained from the image and the vectors obtained from the candidate classes, and the class with the highest similarity is the output. [Figure: how zero-shot image classification with CLIP works, quoted from the OpenAI Blog.] CLIP made it possible to tie together information from different modalities, images and text, through semantic closeness. Using CLIP as a kind of teacher, models that generate images from text can be trained…
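The zero-shot procedure above maps almost one-to-one onto code. A minimal sketch, assuming OpenAI's clip package (installed from the openai/CLIP repository), PyTorch, and a placeholder image path:

```python
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# 1. Put each candidate class into sentence form and encode it with the text encoder.
classes = ["dog", "cat", "bird"]
text = clip.tokenize([f"a photo of a {c}" for c in classes]).to(device)

# 2. Encode the image to classify with the image encoder ("example.jpg" is a placeholder).
image = preprocess(Image.open("example.jpg")).unsqueeze(0).to(device)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)

# 3. Cosine similarity between the image vector and each class vector; take the argmax.
image_features /= image_features.norm(dim=-1, keepdim=True)
text_features /= text_features.norm(dim=-1, keepdim=True)
similarity = (image_features @ text_features.T).squeeze(0)
print(classes[similarity.argmax().item()])
```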
Interpretable Machine Learning: A Guide for Making Black Box Models Explainable. Christoph Molnar, 2021-05-31. Summary: machine learning has great potential for improving products, processes, and research. But computers usually do not explain their predictions, and that is a barrier to the adoption of machine learning. This book is about making machine learning models and their decisions interpretable. After exploring what interpretability means, you will learn about simple, interpretable models such as decision trees, decision rules, and linear regression. Later chapters cover feature importance, ALE (accumulated local effects), and model-agnostic methods such as LIME and Shapley values that explain individual predictions (mo…
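As one concrete instance of the model-agnostic ideas listed above, permutation feature importance can be computed with scikit-learn (a minimal sketch of my own, not code from the book):

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A "black box" model whose predictions we want to explain.
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Permutation importance: shuffle one feature at a time and measure how much the
# score drops; a large drop means the model relies heavily on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, imp in sorted(zip(X.columns, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name}: {imp:.4f}")
```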
1 Introduction. Lately I've been spending whatever spare minutes I have studying a field called sparse modeling. I'll save the details for another article; this time the focus is ridge regression, a stepping stone toward sparse modeling.¹ This may be preaching to the choir, but ridge regression is also known as L2 regularization and is one of the most standard concepts in machine learning. Even so, handling regularization a bit more rigorously taught me a surprising number of things I didn't know, so I've written them up. First, the loss function of ridge regression is written as follows:
\begin{align} E = \|y - X \vec{w}\|^2 + \alpha \vec{w}^T \vec{w} \end{align}
We estimate the coefficient weight vector \(\vec{w}\) that minimizes this loss. Analytically, differentiating with respect to \(\vec{w}\) and…
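The cut-off sentence presumably continues with the standard derivation: setting the gradient of E with respect to \(\vec{w}\) to zero (a textbook result, reconstructed here rather than quoted from the article) gives the closed-form ridge solution:

\begin{align}
\frac{\partial E}{\partial \vec{w}} = -2 X^T (y - X \vec{w}) + 2 \alpha \vec{w} = 0
\quad\Longrightarrow\quad
\vec{w} = (X^T X + \alpha I)^{-1} X^T y
\end{align}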
Understanding the backward pass through Batch Normalization Layer At the moment there is a wonderful course running at Stanford University, called CS231n - Convolutional Neural Networks for Visual Recognition, held by Andrej Karpathy, Justin Johnson and Fei-Fei Li. Fortunately all the course material is provided for free and all the lectures are recorded and uploaded on Youtube. This class gives
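For readers who want the punchline of that course material in code form, here is a compact NumPy sketch of the batch-norm forward pass and the backward pass it derives (standard formulas, my own wording of them):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Forward pass over an (N, D) mini-batch; returns output plus a cache for backward."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    xhat = (x - mu) / np.sqrt(var + eps)
    out = gamma * xhat + beta
    return out, (x, xhat, mu, var, gamma, eps)

def batchnorm_backward(dout, cache):
    """Backward pass: gradients w.r.t. the input and the learnable parameters."""
    x, xhat, mu, var, gamma, eps = cache
    N = x.shape[0]
    std_inv = 1.0 / np.sqrt(var + eps)

    dgamma = np.sum(dout * xhat, axis=0)
    dbeta = np.sum(dout, axis=0)

    dxhat = dout * gamma
    dvar = np.sum(dxhat * (x - mu) * -0.5 * std_inv**3, axis=0)
    dmu = np.sum(-dxhat * std_inv, axis=0) + dvar * np.mean(-2.0 * (x - mu), axis=0)
    dx = dxhat * std_inv + dvar * 2.0 * (x - mu) / N + dmu / N
    return dx, dgamma, dbeta
```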
Introduction. When building regression models, regularization is often used to improve generalization performance. It is effective when the number of features is very large relative to the number of data points, or when features are strongly correlated with one another (multicollinearity). In such cases, the problem can be resolved by adding a regularization term (also called a penalty term), such as a norm penalty on the coefficients (with a positive-integer exponent), to the usual objective function such as the squared error, and optimizing that instead. Fitting a model (i.e., estimating its parameters) with such a regularization term added is called a regularization method. Typical examples are Lasso, Ridge, and Elastic Net regression. My own view is that these have their merits, interpretability among them, but are not necessarily the most accurate choice; on the other hand, here I use {caret} to adopt whichever of them turns out most accurate.
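The article does this with R's {caret}; as a rough Python analogue (my own sketch, not the article's code), scikit-learn can cross-validate the three regularized regressions and keep the best one:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LassoCV, RidgeCV, ElasticNetCV
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

# Each *CV estimator tunes its own regularization strength internally; we then
# compare the three families by cross-validated R^2 and keep the best performer.
candidates = {
    "lasso": LassoCV(cv=5),
    "ridge": RidgeCV(),
    "elastic_net": ElasticNetCV(cv=5),
}
scores = {name: cross_val_score(est, X, y, cv=5).mean() for name, est in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```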
Machine learning is a hot topic these days, but when you actually try it, isn't it common to find that you don't get the accuracy or results you expected? A property peculiar to machine learning is that the data itself changes the model, and that in turn can affect the business. Even if the machine learning folks say they have built a model that achieves good accuracy, once you actually deploy it you struggle to maintain quality across the system as a whole. So I studied a methodology that seems to offer a backbone for reducing risk in highly uncertain machine-learning development through design and test strategy, and this post is my notes on it. The reference I leaned on most is this: arxiv.org. For testing techniques themselves, general books on test-driven development are also worth consulting. Test-Driven Development, by Kent Beck. Release date: 2017/10/14. Format: paperback (softcover). Test-Driven P…
Conversations about the nature of intelligence, consciousness, love, and power. Subscribe to Lex Fridman or Lex Clips YouTube channels. Subscribe on Apple Podcasts, Spotify, RSS. If you enjoy it, consider rating it 5 stars. Connect on Twitter, LinkedIn, Instagram, TikTok, Facebook, Reddit. The best way to support this podcast is to support the Sponsors. They're awesome! But also, there's monthly d