Google I/O is where Google previews its plans for Android, Pixel, and beyond for the rest of the year. This year, we’re expecting a heavy focus on AI, as Google explains how its Gemini and Gemma models will be implemented on the web and on Pixel phones. The event kicks off on May 14th with a keynote at 1PM ET / 10AM PT.
As Google plugs AI into search, what happens to the web? Nilay Patel discussed that topic with Google CEO Sundar Pichai this week on the Decoder podcast. It quickly became a deeper discussion about the new AI Overviews results, but you can start with a small bite here.
Google CEO Sundar Pichai on AI-powered search and the future of the web
The head of Google sat down with Decoder last week to talk about the biggest advancements in AI, the future of Google Search, and the fate of the web.
AI assistants are so back
On The Vergecast: what’s new from OpenAI and Google, the future of search, and more.
Google I/O 2024: all the news from the developer conference
It’s Google’s most AI-focused developer conference yet, featuring a faster Gemini, a more capable Search, and a scam call detector.
In response to malware and social engineering attacks that work by snooping notifications or activating screen sharing, Google says Android 15 will hide notifications with one-time passwords (with some exceptions, like wearable companion apps).
They’re also hidden automatically during screen sharing, and developers can have their apps check whether Google Play Protect is active or whether another app might be capturing the screen.
The blink-and-you-missed-it AR glasses at Google I/O? “The glasses shown are a functional research prototype from our AR team at Google. We do not have any launch plans to share,” Google spokesperson Jane Park tells The Verge.
However: “Looking further ahead, we anticipate that capabilities demonstrated with Project Astra could be utilized through wearables and other next-generation devices.”
We cut down the nearly two-hour presentation just for you, ICYMI. You can also read about everything that was announced if you prefer words. Happy Wednesday!
We have to stop ignoring AI’s hallucination problem
AI might be cool, but it’s also a big fat liar, and we should probably be talking about that more.
Google is distributing these little handbooks for prompting AI, which is kind of adorable? It has color-coded highlights breaking down the basic components of a prompt. There’s an early internet “How to use a search engine” vibe about it — I’m gonna hang on to this one for posterity.
Sergey Brin posted up outside the area where Google was giving demos of Project Astra multimodal chats. He said he thinks Sundar is doing a good job making hard decisions as CEO, said he mostly uses AI for coding tasks, and politely declined to answer a question from Bloomberg’s Shirin Ghaffary about Larry Page accusing Elon Musk of being a “speciesist.”
At Google I/O 2024 today, Google announced a multimodal version of Gemini Nano, allowing the on-device AI model to recognize images, sounds, and spoken language in addition to text.
Those multimodal capabilities are also coming to the Android accessibility feature TalkBack, using AI to fill in missing information about unlabeled images, without requiring a connection to the internet.
Google’s Gemini video search makes factual error in demo
Google even highlighted the wrong answer in the video!
Obviously, someone noticed our video from last year that clipped every single AI mention at I/O 2023. Sundar Pichai closed the 2024 keynote by showing how AI can save us that work, using it to keep count this time. At that point, the tally stood at 121 AI mentions.
...by the time they were finished, it was probably more like 124.
Head over to Google’s Vertex AI Studio site and click “Try it in console” to goof around with some of the AI tools Google talked about at I/O today. The site is meant for developers who want to test out the company’s models while deciding what works best for their software, but anyone can play with it.
The scam detection feature Google just announced requires Android users to opt in, and Google claims it’s on-device only, but it’s still essentially listening to your every conversation to look for fraudulent-sounding language.
Are we really ready to swap scamming concerns for privacy-related ones?
Toward the end (maybe?) of the I/O keynote, Google threw in a cute little ditty about all the things you can do with Gemini prompts: generate photos of cats playing guitar, find smart things to say about Renoir, etc.
It includes the phrase “There’s no wrong way to prompt,” which, have you met people?
Here’s a quick look at Astra, the new multimodal AI project Google just announced, and how it can help you find misplaced glasses.
Note: this video was edited for length and clarity, but the original video was one single take.