Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
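As a quick illustration of how checkpoints like these are typically consumed, here is a minimal sketch using the Hugging Face transformers library; the model id dbmdz/electra-base-turkish-cased-discriminator is an assumption about what the repository publishes on the Hub, not something the entry above confirms.

```python
# A minimal sketch, assuming the Turkish ELECTRA discriminator is published
# on the Hugging Face Hub under the id below (assumed, not confirmed).
from transformers import AutoModel, AutoTokenizer

model_id = "dbmdz/electra-base-turkish-cased-discriminator"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Merhaba dünya!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch size, sequence length, hidden size)
```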
STS sentence-similarity measurement project (Naver BoostCamp AI Tech 7th cohort, Level 1 project)
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Public files for electra.one project
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
MSc thesis project: classification of Italian certified electronic mails using state-of-the-art machine learning, fine-tuning of pre-trained deep learning models, and data augmentation techniques
Official implementation of "Using Pre-Trained Language Models in an End-to-End Pipeline for Antithesis Detection" accepted in LREC-2024
AI and Memory Wall
Pretrained ELECTRA Model for Korean
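ELECTRA checkpoints like this one expose a discriminator head that scores, per token, whether the token was replaced by a small generator during pretraining. A minimal sketch with transformers follows; the Hub id monologg/koelectra-base-v3-discriminator is an assumption, not confirmed by the entry above.

```python
# A minimal sketch of ELECTRA's replaced-token-detection head.
import torch
from transformers import AutoTokenizer, ElectraForPreTraining

model_id = "monologg/koelectra-base-v3-discriminator"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ElectraForPreTraining.from_pretrained(model_id)

inputs = tokenizer("안녕하세요, 반갑습니다.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # one replaced-token score per input token
print(torch.sigmoid(logits))  # values near 1 mean "looks replaced"
```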
ML and Natural Language Processing
Factuality checking of SemRep predications
Simple NER Pipeline Using KoCharELECTRA
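For a rough idea of what such a pipeline does, here is a generic token-classification sketch with transformers; the model id is hypothetical, and KoCharELECTRA's character-level tokenizer is not reproduced here.

```python
from transformers import pipeline

# Hypothetical Hub id; the actual repo wires up KoCharELECTRA's own
# character-level tokenizer rather than a stock one.
ner = pipeline(
    "token-classification",
    model="monologg/koelectra-base-v3-naver-ner",
    aggregation_strategy="simple",  # merge sub-token pieces into entity spans
)
print(ner("이순신은 조선 중기의 무신이다."))
```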
Pretrain and fine-tune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
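In plain transformers (without the fastai layer that repository adds), the fine-tuning half of that recipe looks roughly like the sketch below; google/electra-small-discriminator is Google's public checkpoint, and CoLA matches the task in the SJTU entry further down.

```python
# A fine-tuning sketch, not the repository's actual fastai-based code.
from datasets import load_dataset
from transformers import (AutoTokenizer, ElectraForSequenceClassification,
                          Trainer, TrainingArguments)

model_id = "google/electra-small-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ElectraForSequenceClassification.from_pretrained(model_id, num_labels=2)

dataset = load_dataset("glue", "cola")  # binary acceptability classification
dataset = dataset.map(
    lambda batch: tokenizer(batch["sentence"], truncation=True,
                            padding="max_length", max_length=128),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="electra-cola",
                           per_device_train_batch_size=32,
                           num_train_epochs=3),
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```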
randomnerd_esp32_wifi_manager
Shanghai Jiao Tong University, Natural Language Understanding, Spring 2020: CoLA task