
# NLP Paper Review

A repository of notes on NLP-related papers I have read and summarized.

## Contents

| Category | Paper | Date | Link |
|---|---|---|---|
| Seq2seq | Sequence to Sequence Learning with Neural Networks | 2024.08.30 | Link |
| Transformer | Attention is All You Need | 2024.09.13 | Link |
| PLM-Encoder | Deep contextualized word representations (ELMo) | 2024.09.13 | Link |
| PLM-Encoder | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2024.09.27 | Link |
| PLM-Decoder | Language Models are Unsupervised Multitask Learners (GPT-2) | 2024.09.27 | Link |
| PLM-Seq2seq | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (T5) | 2024.10.17 | Link |
| PLM-Decoder | Training language models to follow instructions with human feedback (InstructGPT) | 2024.10.17 | Link |
| PLM-Decoder | LLaMA: Open and Efficient Foundation Language Models | 2024.10.24 | Link |
| Alignment Tuning | Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO) | 2024.10.24 | Link |
| Instruction Tuning | LIMA: Less Is More for Alignment | 2024.10.31 | Link |
| PEFT | LoRA: Low-Rank Adaptation of Large Language Models | 2024.10.31 | Link |
| Scaling Law | Scaling Laws for Neural Language Models | 2024.11.07 | Link |
| Scaling Law | Training Compute-Optimal Large Language Models (Chinchilla) | 2024.11.07 | Link |
