
Google’s AI ‘Reimagine’ tool helped us add wrecks, disasters, and corpses to our photos



The new feature on the Pixel 9 series is way too good at creating disturbing imagery — and the safeguards in place are far too weak.


Magic Editor’s new tool helped us add the bike and car with nothing more than a text prompt.
Photo by Chris Welch / The Verge

As it turns out, a rabbit wearing an AI-generated top hat was just the tip of the iceberg.

Google is the latest phone company this year to announce AI photo editing tools, following Samsung’s somewhat troubling, mostly delightful sketch-to-image feature and Apple’s seemingly much tamer Image Playground, coming this fall. The Pixel 9’s answer is a new tool called “Reimagine,” and after using it for a week with a few of my colleagues, I’m more convinced than ever that none of us are ready for what’s coming.

Reimagine is a logical extension of last year’s Magic Editor tools, which let you select and erase parts of a scene or change the sky to look like a sunset. It was nothing shocking. But Reimagine doesn’t just take it a step further — it kicks the whole door down. You can select any nonhuman object or portion of a scene and type in a text prompt to generate something in that space. The results are often very convincing and even uncanny. The lighting, shadows, and perspective usually match the original photo. You can add fun stuff, sure, like wildflowers or rainbows or whatever. But that’s not the problem.

A couple of my colleagues helped me test the boundaries of Reimagine with their Pixel 9 and 9 Pro review units, and we got it to generate some very disturbing things. Some of it required creative prompting to work around the obvious guardrails; if you choose your words carefully, you can get it to create a reasonably convincing body under a blood-stained sheet.

It took very little effort to turn the original image on the left into the one on the right.

In our week of testing, we added car wrecks, smoking bombs in public places, sheets that appear to cover bloody corpses, and drug paraphernalia to images. That seems bad. As a reminder, this isn’t some piece of specialized software we went out of our way to use — it’s all built into a phone that my dad could walk into Verizon and buy.

When we asked Google for comment on the issue, company spokesperson Alex Moriconi responded with the following statement:

Pixel Studio and Magic Editor are helpful tools meant to unlock your creativity with text to image generation and advanced photo editing on Pixel 9 devices. We design our Generative AI tools to respect the intent of user prompts and that means they may create content that may offend when instructed by the user to do so. That said, it’s not anything goes. We have clear policies and Terms of Service on what kinds of content we allow and don’t allow, and build guardrails to prevent abuse. At times, some prompts can challenge these tools’ guardrails and we remain committed to continually enhancing and refining the safeguards we have in place.

To be sure, our creative prompting to work around filters is a clear violation of these policies. It’s also a violation of Safeway’s policies to ring up your organic peaches as conventionally grown at the self-checkout, not that I know anyone who would do that. And someone with the worst intentions isn’t concerned with Google’s terms and conditions, either. What’s most troubling about all of this is the lack of robust tools to identify this kind of content on the web. Our ability to make problematic images is running way ahead of our ability to identify them.

When you edit an image with Reimagine, there’s no watermark or any other obvious way to tell that the image is AI-generated — there’s just a tag in the metadata. That’s all well and good, but standard metadata is easily stripped from an image simply by taking a screenshot. Moriconi tells us that Google uses a more robust tagging system called SynthID for images created by Pixel Studio since they’re 100 percent synthetic. But images edited with Magic Editor don’t get those tags.
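To illustrate just how fragile a metadata-only tag is, here is a minimal sketch of a byte-level check. The marker strings and file names are assumptions for illustration (Google doesn’t publish the exact values Magic Editor writes); the point is that a screenshot re-encodes the pixels and writes fresh metadata, so any check like this comes up empty.

```python
# Minimal sketch: scan a file's raw bytes for metadata markers that *might*
# indicate an AI edit. The marker strings below are assumptions for
# illustration, not documented values written by Magic Editor.

AI_METADATA_MARKERS = [
    b"compositeWithTrainedAlgorithmicMedia",  # IPTC digital-source-type value for AI-composited media
    b"GCamera",                               # hypothetical vendor XMP namespace
]

def has_ai_edit_tag(path: str) -> bool:
    """Return True if the file's embedded metadata contains any assumed marker."""
    with open(path, "rb") as f:
        data = f.read()
    return any(marker in data for marker in AI_METADATA_MARKERS)

# Placeholder file names. A Magic Editor export might return True; a screenshot
# of that same image returns False, because the OS writes the screenshot's
# metadata from scratch and none of the original tags survive the re-encode.
print(has_ai_edit_tag("reimagined.jpg"))                # possibly True
print(has_ai_edit_tag("screenshot_of_reimagined.png"))  # False
```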


Photos: The Verge

To be sure, tampering with photos is nothing new. People have been adding weird and deceptive stuff to images since the beginning of photography. But the difference now is that it has never been this easy to add these things realistically to your photos. A year or two ago, adding a convincing car crash to an image would have taken time, expertise, an understanding of Photoshop layers, and access to expensive software. Those barriers are gone; all it now takes is a bit of text, a few moments, and a new Pixel phone.

It’s also never been easier to circulate misleading photos quickly. The tools to convincingly manipulate your photos exist right inside the same device you use to capture them and publish them for all the world to see. We uploaded one of our “Reimagined” images to an Instagram story as a test (and quickly took it down). Meta didn’t automatically tag it as AI-generated, and I’m sure nobody would have been the wiser if they’d seen it.

Who knows, maybe everyone will read and abide by Google’s AI policies and use Reimagine to put wildflowers and rainbows in their photos. That would be lovely! But just in case they don’t, it might be best to apply a little extra skepticism to photos you see online.