《李宏毅深度学习教程》 (the Lee Hung-yi deep learning tutorial, recommended by Prof. Hung-yi Lee 👍 and nicknamed the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
The GitHub repository for the paper "Informer", accepted at AAAI 2021.
A comprehensive paper list on Vision Transformers/Attention, including papers, code, and related websites.
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
My implementation of the original GAT paper (Veličković et al.). I've additionally included playground.py for visualizing the Cora dataset, GAT embeddings, the attention mechanism, and entropy histograms. Both Cora (transductive) and PPI (inductive) examples are supported!
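The three GAT entries above all implement the same attention rule from the linked paper: each node attends to its neighbors with coefficients computed from a shared linear map and a learned attention vector. Below is a minimal single-head sketch in PyTorch; the class name `GATHead`, the dense-adjacency layout, and the hyperparameters are illustrative, not taken from any of these repositories.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATHead(nn.Module):
    """Minimal single graph-attention head (Veličković et al., 2017)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared linear map
        self.a = nn.Parameter(torch.empty(2 * out_dim))  # attention vector
        nn.init.xavier_uniform_(self.a.unsqueeze(0))

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        z = self.W(h)                                    # (N, out_dim)
        d = z.size(1)
        # e[i, j] = LeakyReLU(a^T [z_i || z_j]), split into a source
        # term and a destination term so it broadcasts to (N, N)
        e = F.leaky_relu(
            z @ self.a[:d].unsqueeze(1)                  # (N, 1): z_i term
            + (z @ self.a[d:].unsqueeze(1)).T,           # (1, N): z_j term
            negative_slope=0.2,
        )
        e = e.masked_fill(adj == 0, float("-inf"))       # attend only over edges
        alpha = torch.softmax(e, dim=1)                  # attention coefficients
        return alpha @ z                                 # aggregated node features
```

Masking non-edges with -inf before the softmax is what restricts attention to each node's neighborhood; the self-loops keep every row of the softmax well defined.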
Datasets, tools, and benchmarks for representation learning of code.
The implementation of DeBERTa
CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
Recent Transformer-based CV and related works.
Implementations of various self-attention mechanisms for computer vision; an ongoing repository.
A list of efficient attention modules.
BERT-style pre-training ("Pre-training of Deep Bidirectional Transformers for Language Understanding") applied to TextCNN.
Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
Text classification using deep learning models in PyTorch.
(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
A PyTorch implementation of Speech Transformer, an end-to-end ASR system built on the Transformer network, for Mandarin Chinese.
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
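Most of the Transformer repositories above build on the scaled dot-product self-attention of "Attention Is All You Need": queries, keys, and values are linear projections of the same sequence, and each position's output is a softmax-weighted sum of the values. Here is a minimal single-head sketch in PyTorch for reference; the function name `self_attention` and all shapes are illustrative.

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention.

    x:   (batch, seq_len, d_model) input embeddings
    w_*: (d_model, d_k) projection matrices for queries, keys, values
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v                        # project inputs
    scores = q @ k.transpose(-2, -1) / math.sqrt(k.size(-1))   # (batch, seq, seq)
    weights = torch.softmax(scores, dim=-1)                    # one distribution per query
    return weights @ v                                         # weighted sum of values

# Example: 2 sequences of 5 tokens with d_model = 8, d_k = 4
x = torch.randn(2, 5, 8)
w_q, w_k, w_v = (torch.randn(8, 4) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)                         # shape (2, 5, 4)
```

Scaling by the square root of the key dimension keeps the dot products from growing with d_k, which would otherwise push the softmax into regions with vanishing gradients.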