The Wayback Machine - https://web.archive.org/web/20190704131718/http://textsynth.org:80/tech.html

Text Synth - Technical Notes

Text Synth is built on the GPT-2 language model released by OpenAI, a neural network with 345 million parameters based on the Transformer architecture.

GPT-2 was trained to predict the next word on a large corpus of about 40 GB of internet text.

This implementation is unusual in that it runs without a GPU, using only 4 cores of a Xeon E5-2640 v3 CPU at 2.60 GHz. With a single user, it generates about 40 words per second. It is programmed in plain C using the LibNC library.

The user interface is inspired by talktotransformer.com. Thanks to OpenAI for providing their GPT-2 model.

[Back to the main page]