
Report child sexual abuse imagery

What is CSAM?

Nude or sexually explicit imagery of someone under the age of 18 may constitute Child Sexual Abuse Material (CSAM). CSAM consists of any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct.

CSAM is illegal. If you suspect a child is in immediate danger, contact the police immediately.

Google is deeply committed to ensuring our platforms are safe for all users, including children. We invest heavily in fighting child sexual abuse and exploitation online and use the tools we’ve developed to deter, detect, remove, and report offenses on our platform. To learn more about the steps Google is taking to fight child sexual abuse and exploitation on our own platforms and services, please visit our Transparency Report Help Center and our Protecting Children site.

How to request CSAM be removed from Google Search

For our teams to consider content for removal from Search, it must meet either of the following requirements:

  1. The URL(s) contain content that you believe depicts an individual who was under 18 years old at the time the content was created, and that individual is engaged in nude or sexually explicit conduct.
  2. The URL(s) contain content that appears to offer or advertise CSAM (such as a download link).

Important: To comply with local laws, your removal experience may differ depending on the country you are located in. For example, if you indicate that you’re located in an EU member state when you start the removal process, you’ll be directed to the Legal Help Center.

Start removal request

If your request for removal doesn’t meet the requirements above but involves nude or sexual imagery depicting someone over the age of 18 that was created or shared without consent, go to Google Search’s Report a Problem page for more options.

What happens after you submit the removal request

  1. Automated email confirmation: If you choose to provide your contact information in the webform, you’ll receive an email confirming that we received your request. If you choose to report anonymously, we won’t be able to send you any responses.
  2. We review your request: Each request is evaluated based on the requirements above.
  3. We request more information, if needed: In some cases, we may ask you for more information. If the request doesn’t have enough information for us to evaluate it, such as missing URLs, we’ll share specific instructions and ask you to resubmit the request.
  4. We remove and report confirmed CSAM: If the information provided in your report is assessed by Google to be apparent CSAM, it will be removed from Search and a CyberTipline report will be made to the National Center for Missing and Exploited Children (NCMEC), a clearinghouse and comprehensive reporting center in the US for issues related to child exploitation.
  5. You get a notification of the action taken: If the request doesn’t meet the requirements for removal, the notification will include a brief explanation. If your request is denied and you later have additional information to support it, you can resubmit the request.

Other reporting options

If you would prefer not to report directly to Google or you didn’t find the content on Google Search, you can contact one of the organizations listed here.

Frequently asked questions

What does Google do to deter users from seeking out CSAM on Search?
Google deploys safety-by-design principles to deter users from seeking out CSAM on Search. It’s our policy to block search results that lead to CSAM that appears to sexually victimize, endanger, or otherwise exploit children, and we’re constantly updating our algorithms to combat these evolving threats.
We apply extra protections to searches that we understand are seeking CSAM. We filter out explicit sexual results if the search query seems to be seeking CSAM, and for queries seeking adult explicit content, Search won’t return imagery that includes children, to break the association between children and sexual content. In addition to removing CSAM from Search’s index when it is identified, we also demote all content from sites with a high proportion of CSAM.
In many countries, users who enter queries clearly related to CSAM are shown a prominent warning that child sexual abuse imagery is illegal, along with information on how to report this content to trusted organizations. When these warnings are shown, we have found that users are less likely to continue looking for this material.
What about computer-generated imagery?
AI-generated CSAM or computer-generated imagery depicting child sexual abuse is a threat Google takes very seriously. Our work to detect, remove, and report CSAM has always included violative content involving actual minors, modified imagery of an identifiable minor engaging in sexually explicit conduct, and computer-generated imagery that is indistinguishable from an actual minor engaging in such conduct.
The intake form has multiple options for removals. Which option do I choose?
When asked “Why are you requesting personal content removal from Google Search?” you may select either “Content contains nudity or sexual material” or “Content shows a person under 18.” On the next screen, you’ll be asked to confirm that “The content shows someone who is both under 18 and in a nude or sexually explicit manner.”
Can I submit a report anonymously?
Yes. To ensure your report is anonymous, make sure you’re logged out of your Google account(s) before submitting the report and do not provide your contact information in the optional contact information boxes.
Please note we can’t provide status updates on anonymous reports. If you would like to know the status of your report, including if the content has been removed, your name and email address must be provided in the webform.
Which URLs do I submit for review?
Submit the URL(s) of the webpage, image, or video that contains the suspected CSAM imagery, or URL(s) that ask users to click and download content that may contain CSAM.
How do I find the URL of the content I want to report?
To find the URL of the content, search for the page or image you want to report, open it, and copy the URL from your browser’s address bar.
How do I submit more than one URL for review?
Add one URL per line. You can submit up to 1,000 URLs.
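If you’re compiling a long list of URLs from notes or a spreadsheet, a small script can help put the list in the expected shape before you paste it into the webform. The sketch below is a hypothetical helper, not part of Google’s form; it assumes plain-text input and simply keeps one URL per line, drops blanks, malformed entries, and duplicates, and caps the list at the form’s 1,000-URL limit.

    # Minimal sketch (hypothetical helper, not a Google tool) for tidying a
    # URL list before pasting it into the removal form.
    from urllib.parse import urlparse

    MAX_URLS = 1000  # the form accepts up to 1,000 URLs

    def prepare_url_list(raw_text: str) -> str:
        seen = set()
        urls = []
        for line in raw_text.splitlines():
            url = line.strip()
            if not url:
                continue  # skip blank lines
            parsed = urlparse(url)
            if parsed.scheme not in ("http", "https") or not parsed.netloc:
                continue  # skip anything that isn't a well-formed web URL
            if url in seen:
                continue  # skip duplicates
            seen.add(url)
            urls.append(url)
        return "\n".join(urls[:MAX_URLS])  # one URL per line, capped

    # Example: three input lines yield one valid, deduplicated URL.
    print(prepare_url_list("https://example.com/a\nhttps://example.com/a\nnot a url"))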
How do I request removal of content that’s no longer live?
If the content no longer appears on a webpage, but appears in Google Search results or as a cached page, you can request a refresh of outdated content.
What happens when content is removed from Google Search?
Google Search organizes information published on the open web. We don’t have control over the content on third-party web pages. Even if Google removes the content from our Search results, it still exists on the original site hosting it, which means it may still be found through the site’s URL, social media sharing, or other search engines.
Your best option to remove content is to contact the website owner if you’re comfortable doing so, because they may be able to remove the content entirely.
