Introduction

I built a Docker image that lets you try out BERT, aimed at people who think "I'd like to play with a pretrained BERT model, but setting up an environment for Japanese-capable BERT is a hassle."

BERT is used via Transformers (formerly pytorch-transformers, formerly pytorch-pretrained-bert).

It uses the Whole Word Masking version of the Japanese pretrained model published on the Kurohashi-Kawahara Lab website.

Transformers — transformers 2.2.0 documentation
BERT日本語Pretrainedモデル - KUROHASHI-KAWAHARA LAB

The Docker image is available here:
https://hub.docker.com/r/ishizakiyuko/japanese_be
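For reference, here is a minimal sketch (not taken from the image itself) of loading the lab's pretrained model with transformers 2.x once the downloaded files have been extracted locally. The directory name and the example sentence are placeholders, and the sentence is assumed to be pre-segmented with Juman++ as the model expects.

```python
# Minimal sketch: load the Kurohashi-Kawahara Lab Japanese BERT (Whole Word Masking)
# with transformers 2.x. MODEL_DIR is a hypothetical path -- point it at the
# extracted model directory (containing vocab.txt, config.json, pytorch_model.bin).
import torch
from transformers import BertTokenizer, BertModel

MODEL_DIR = "/path/to/japanese_bert_wwm"  # hypothetical path

tokenizer = BertTokenizer.from_pretrained(MODEL_DIR, do_lower_case=False)
model = BertModel.from_pretrained(MODEL_DIR)
model.eval()

# The model expects text already split into words (e.g. by Juman++);
# here a pre-segmented sentence is joined with spaces.
text = "吾輩 は 猫 で ある"
input_ids = tokenizer.encode(text, add_special_tokens=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(input_ids)

last_hidden_state = outputs[0]  # shape: (1, seq_len, 768)
print(last_hidden_state.shape)
```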