Encoding position with the word embeddings.
Jupyter Notebook, updated May 17, 2018.
Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
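Both entries above concern encodings that are added to the word embeddings to inject token order. As background, a minimal sketch of the classic sinusoidal scheme from "Attention Is All You Need" (Vaswani et al., 2017) is shown below; it is illustrative only and not taken from either repository, and the function name is my own:

```python
import math

def sinusoidal_positional_encoding(seq_len, d_model):
    """Classic sinusoidal positional encoding (Vaswani et al., 2017).

    Returns a seq_len x d_model matrix (list of lists). In practice this
    matrix is added elementwise to the word-embedding matrix so that each
    position gets a distinct, smoothly varying signal.
    """
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        # Even dimensions get sine, odd dimensions get cosine, with
        # wavelengths forming a geometric progression from 2*pi to 10000*2*pi.
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe

pe = sinusoidal_positional_encoding(seq_len=4, d_model=8)
```

CAPE, by contrast, augments such positional information with continuous shifts and scalings during training so the model generalizes better to unseen sequence lengths; the sketch above covers only the fixed sinusoidal baseline.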