🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
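As a rough illustration of the library's pipeline API, a minimal sentiment-analysis sketch (the checkpoint name is an assumption; leaving `model` unset lets the library pick its own default):

```python
# Minimal sentiment-analysis sketch with the 🤗 Transformers pipeline API.
# The checkpoint name is an assumption; omit `model=` to use the library default.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```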
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
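A rough sketch of distributed generation with Petals, following the pattern in its README (the model name is an assumption, and a reachable public swarm is required):

```python
# Sketch of BitTorrent-style distributed inference with Petals.
# Model name follows the project's README example; a public swarm must be reachable.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "petals-team/StableBeluga2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A language model is", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0]))
```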
Open-source offline translation library written in Python
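A minimal offline-translation sketch with Argos Translate, assuming network access for the one-time download of the English→Spanish language package:

```python
# One-time setup: fetch and install the en->es package, then translate offline.
import argostranslate.package
import argostranslate.translate

argostranslate.package.update_package_index()
available = argostranslate.package.get_available_packages()
en_es = next(p for p in available if p.from_code == "en" and p.to_code == "es")
argostranslate.package.install_from_path(en_es.download())

print(argostranslate.translate.translate("Hello, world.", "en", "es"))
```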
Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT2, BERT, RoBERTa, T5, and T0).
Reasoning in Large Language Models: Papers and Resources, including Chain-of-Thought and OpenAI o1 🍓
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
[Paper List] Papers integrating knowledge graphs (KGs) and large language models (LLMs)
LangChain & Prompt Engineering tutorials on Large Language Models (LLMs) such as ChatGPT with custom data. Jupyter notebooks on loading and indexing data, creating prompt templates, CSV agents, and using retrieval QA chains to query the custom data. Projects for using a private LLM (Llama 2) for chat with PDF files and tweet sentiment analysis.
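In the spirit of these tutorials, a small prompt-template sketch (the import path matches older LangChain releases; newer versions expose the same class from langchain_core.prompts):

```python
# Build a reusable retrieval-QA style prompt with a LangChain PromptTemplate.
from langchain.prompts import PromptTemplate

qa_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Answer the question using only the context below.\n\n"
        "Context:\n{context}\n\n"
        "Question: {question}\nAnswer:"
    ),
)

print(qa_prompt.format(
    context="LangChain chains prompts, LLM calls, and retrievers together.",
    question="What does LangChain do?",
))
```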
A family of diffusion models for text-to-audio generation.
Overview of Japanese LLMs (日本語LLMまとめ)
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A
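Not necessarily the linked project's exact stack, but one common way to run a quantized Llama 2 on CPU is llama-cpp-python; the GGUF path below is a placeholder:

```python
# CPU-only inference with a quantized Llama 2 checkpoint via llama-cpp-python.
# The model path is a placeholder; download a GGUF file separately.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)
result = llm("Q: What is document Q&A? A:", max_tokens=128, stop=["Q:"])
print(result["choices"][0]["text"])
```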
💁 Awesome Treasure of Transformers: models for Natural Language Processing, with papers, videos, blogs, and official repos, along with Colab notebooks. 🛫☑️
[ACL 2023] Reasoning with Language Model Prompting: A Survey
[ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
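PhoBERT checkpoints are published on the Hugging Face Hub, so a minimal loading sketch via 🤗 Transformers looks like the following (the model expects word-segmented Vietnamese input; the example sentence mirrors the project's README):

```python
# Load PhoBERT and encode a pre-segmented Vietnamese sentence.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")
model = AutoModel.from_pretrained("vinai/phobert-base")

inputs = tokenizer("Chúng_tôi là những nghiên_cứu_viên .", return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).last_hidden_state
print(features.shape)
```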
LLM-based ontological extraction tools, including SPIRES
Official implementation for HyenaDNA, a long-range genomic foundation model built with Hyena
Active Learning for Text Classification in Python
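As a concept illustration only (not that library's API), an uncertainty-sampling loop for text classification might look like this, using scikit-learn on toy data:

```python
# Generic uncertainty-sampling loop illustrating active learning;
# NOT the small-text API, just a concept sketch with scikit-learn.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["great movie", "terrible plot", "loved it", "boring", "fantastic", "awful"]
labels = np.array([1, 0, 1, 0, 1, 0])

X = TfidfVectorizer().fit_transform(texts)
labeled = [0, 1]                                   # indices we pretend are labeled
pool = [i for i in range(len(texts)) if i not in labeled]

for _ in range(2):                                 # two query rounds
    clf = LogisticRegression().fit(X[labeled], labels[labeled])
    probs = clf.predict_proba(X[pool])
    query = pool[int(np.argmin(probs.max(axis=1)))]  # least-confident sample
    labeled.append(query)                          # "oracle" supplies labels[query]
    pool.remove(query)

print("queried indices:", labeled[2:])
```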
This repository contains a collection of papers and resources on Reasoning in Large Language Models.
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
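A minimal generation sketch, assuming the HappyGeneration class documented by the project (model type and checkpoint name follow its standard GPT-2 example):

```python
# Generate text with Happy Transformer's high-level wrapper around GPT-2.
from happytransformer import HappyGeneration

happy_gen = HappyGeneration("GPT2", "gpt2")
result = happy_gen.generate_text("Language models can be used to ")
print(result.text)
```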