ChatGPT Is a Blurry JPEG of the Web | The New Yorker
A very astute framing by Ted Chiang—large language models as a form of lossy compression for text.
When we’re dealing with sequences of words, lossy compression looks smarter than lossless compression.
A lot of uses have been proposed for large language models. Thinking about them as blurry JPEGs offers a way to evaluate what they might or might not be well suited for.
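To make the analogy concrete for myself: lossless compression must hand back exactly what went in, while a lossy scheme only has to hand back something close, and with text "close" can read deceptively fluent. The toy Python sketch below is mine, not Chiang's; the "keep every other word" scheme is a deliberately crude stand-in for lossy compression, just to show the behavioral difference.

```python
# Toy contrast between lossless and lossy compression of text.
# (My sketch; the lossy scheme is deliberately crude and only illustrative.)
import zlib

text = (
    "Large language models are trained on enormous amounts of text, "
    "and what they retain is closer to a compressed summary than to "
    "an exact archive of the web."
)

# Lossless: the round trip reproduces the original byte for byte.
lossless = zlib.compress(text.encode("utf-8"))
assert zlib.decompress(lossless).decode("utf-8") == text

# Crude lossy scheme: store only every other word, then "reconstruct" by
# padding the gaps. The result resembles the original, but the discarded
# detail is gone for good.
kept = text.split()[::2]
reconstruction = " ".join(w + " ..." for w in kept)

print(f"original: {len(text.encode('utf-8'))} bytes")
print(f"lossless: {len(lossless)} bytes (exact on decompression)")
print(f"lossy:    {len(' '.join(kept).encode('utf-8'))} bytes (approximate)")
print(reconstruction)
```

The point of the comparison is the failure mode, not the byte counts: zlib either gives you the original or an obvious error, whereas the lossy reconstruction is a plausible-looking approximation, which is roughly the sense in which lossy compression of text "looks smarter".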