Nov 20, 2024

6 Posts

Two people reading in bed, one with a book on library functions, the other with a head labeled with AI layers.
Nov 20, 2024

AI Is Part of Your Online Audience: Some webpages are written not for humans but for large language models to read. Developers can benefit by keeping the LLM audience in mind.

A small number of people are posting text online that’s intended for direct consumption not by humans, but by LLMs (large language models).
Two people reading in bed, one with a book on library functions, the other with a head labeled with AI layers.
Nov 20, 2024

Next-Gen Models Show Limited Gains, Real-Time Video Generation, China AI Chips Blocked, Transformer Training Streamlined

The Batch AI News and Insights: A small number of people are posting text online that’s intended for direct consumption not by humans, but by LLMs (large language models).
Efficient Foundations animation showing layered AI model components.
Nov 20, 2024

More-Efficient Training for Transformers: Researchers reduce transformer training costs by 20% with minimal performance loss

Researchers cut the processing required to train transformers by around 20 percent with only a slight degradation in performance.
Close-up of a Chinese-made server chip labeled with the logo and text ‘710’ mounted on a motherboard.
Nov 20, 2024

Further Chip Restrictions on China: TSMC stops advanced chip production for China on U.S. orders

The largest manufacturer of AI chips told its Chinese customers it would stop fabricating their most advanced designs, further limiting China’s access to AI hardware.
Comparison of Minecraft terrain with and without player modifications.
Nov 20, 2024

No Game Engine Required: AI creates an interactive Minecraft-like world in real time

A real-time video generator lets you explore an open-ended, interactive virtual world — a video game without a game engine.
Graph showing test loss decreasing with more tokens and larger model sizes (10³–10⁹ parameters).
Nov 20, 2024

Next-Gen Models Show Limited Gains: AI giants rethink model training strategy as scaling laws break down

Builders of large AI models have relied on the idea that bigger neural networks trained on more data and given more processing power would show steady improvements. Recent developments are challenging that idea.
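For readers who want the formula behind the scaling idea in question: empirical scaling laws model a network's test loss as a smooth power law in model size and training data. A widely cited form is the Chinchilla fit of Hoffmann et al. (2022), shown below as a general illustration; the constants are fitted per model family and are not values taken from the article above:

    L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

Here N is the parameter count, D is the number of training tokens, and E, A, B, \alpha, \beta are fitted constants. The "limited gains" above refer to reports that newer models are improving less than curves of this shape predict.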
