Abstract

This page publishes a Bidirectional Encoder Representations from Transformers (BERT) model that was pre-trained on a large corpus of Japanese clinical text (approximately 120 million lines). The model is released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0). To develop the model, we leveraged the TensorFlow implementation of BERT published by Google on this page.
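As a minimal usage sketch (not part of the original release), the checkpoint could be loaded with the Hugging Face `transformers` library, assuming it has been converted from the original TensorFlow checkpoint to a format the library accepts. The model identifier below is a hypothetical placeholder; substitute the actual path or hub name of the released model, and note that Japanese text generally requires the tokenizer distributed with the model rather than a generic one.

```python
from transformers import AutoTokenizer, AutoModel

# Hypothetical identifier -- replace with the actual path/name of the
# released checkpoint.
model_name = "path/to/japanese-clinical-bert"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short clinical sentence ("The patient has a history of
# hypertension.") and obtain contextual embeddings.
inputs = tokenizer("患者は高血圧の既往がある。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```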