A comprehensive paper list of Vision Transformer/Attention work, including papers, code, and related websites
Updated Jul 30, 2024
Fast inference engine for Transformer models
FLOPs counter for convolutional networks in the PyTorch framework
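For intuition on what such a counter computes: a convolutional layer's cost is dominated by its multiply-accumulate operations, which can be estimated in closed form. The sketch below is a simplified illustration of that formula (function name and the factor-of-2 MAC convention are assumptions, not this repository's actual implementation, which also handles bias terms, groups, and other layer types):

```python
# Hypothetical sketch of how a FLOPs counter estimates one Conv2d layer.
# Simplification: ignores bias, stride/padding are folded into h_out/w_out.

def conv2d_flops(c_in, c_out, kernel_size, h_out, w_out):
    """Estimate FLOPs for one Conv2d forward pass, counting one
    multiply and one add per multiply-accumulate (hence the factor 2)."""
    macs_per_position = c_in * kernel_size * kernel_size * c_out
    return 2 * macs_per_position * h_out * w_out

# Example: 3 -> 64 channels, 3x3 kernel, 224x224 output feature map
print(conv2d_flops(3, 64, 3, 224, 224))  # -> 173408256 (~0.17 GFLOPs)
```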
The Enterprise-Grade Production-Ready Multi-Agent Orchestration Framework Join our Community: https://discord.com/servers/agora-999382051935506503
[NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
solo-learn: a library of self-supervised methods for visual representation learning powered by PyTorch Lightning
A curated list of foundation models for vision and language tasks
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
🌕 [BMVC 2022] You Only Need 90K Parameters to Adapt Light: A Lightweight Transformer for Image Enhancement and Exposure Correction. SOTA for low-light enhancement; runs in 0.004 seconds — try it for pre-processing.
How to use our public wav2vec2 dimensional emotion model
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
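Converting an abstractive summarization dataset to the extractive task is typically done by greedily selecting the source sentences that best match the reference summary. A minimal sketch of that idea, using plain word overlap in place of a ROUGE score (all names are hypothetical, not this repository's API):

```python
def word_overlap(candidate, reference):
    """Fraction of reference words covered by the candidate
    (a crude stand-in for ROUGE-1 recall)."""
    cand, ref = set(candidate.split()), set(reference.split())
    return len(cand & ref) / max(len(ref), 1)

def to_extractive(source_sentences, abstract, max_sents=3):
    """Greedily pick the source sentences whose addition most
    improves overlap with the abstractive reference summary."""
    selected, current = [], 0.0
    remaining = list(range(len(source_sentences)))
    while remaining and len(selected) < max_sents:
        gains = []
        for i in remaining:
            text = " ".join(source_sentences[j] for j in selected + [i])
            gains.append((word_overlap(text, abstract) - current, i))
        gain, best = max(gains)
        if gain <= 0:  # no sentence improves the overlap; stop early
            break
        selected.append(best)
        remaining.remove(best)
        current += gain
    return sorted(selected)  # extractive labels: indices of chosen sentences

source = ["the cat sat on the mat",
          "stocks fell sharply today",
          "the dog barked loudly"]
print(to_extractive(source, "the cat sat and the dog barked"))  # -> [0, 2]
```

Real converters use sentence-level ROUGE and tokenization, but the greedy selection loop is the core of the transformation.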
Efficient Inference of Transformer models
FlashAttention (Metal Port)
The official code repo of "HTS-AT: A Hierarchical Token-Semantic Audio Transformer for Sound Classification and Detection"
Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
LLM notes covering model inference, Transformer model structure, and LLM framework code analysis
MinT: Minimal Transformer Library and Tutorials
Punctuation Restoration using Transformer Models for High- and Low-Resource Languages
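Punctuation restoration is usually framed as token classification: the model reads unpunctuated text and predicts, for each token, which punctuation mark (if any) should follow it. The toy sketch below shows only the post-processing step that applies such predictions; the label set and function names are assumptions for illustration, not this repository's code:

```python
# Hypothetical post-processing for a token-classification punctuation model:
# each token carries a predicted label for the punctuation that follows it.
LABELS = {"O": "", "COMMA": ",", "PERIOD": ".", "QUESTION": "?"}

def restore(tokens, labels):
    """Re-attach predicted punctuation after each token and
    capitalize the first word of every resulting sentence."""
    out, sentence_start = [], True
    for tok, lab in zip(tokens, labels):
        word = tok.capitalize() if sentence_start else tok
        out.append(word + LABELS[lab])
        sentence_start = lab in ("PERIOD", "QUESTION")
    return " ".join(out)

print(restore(["how", "are", "you", "i", "am", "fine"],
              ["O", "O", "QUESTION", "O", "O", "PERIOD"]))
# -> How are you? I am fine.
```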