Amazon and its subsidiaries are committed to preventing child sexual abuse material (CSAM) across all of Amazon's services. We devote significant resources to combating CSAM, and we continue to invest in new technologies and tools to improve our ability to prevent, detect, respond to, and remove it.

As we strive to be Earth’s Most Customer-Centric Company, Amazon and its subsidiaries provide services directly to customers and also enable businesses to use our technology and services to sell and provide their own products and services. In all cases, our policies, teams, and tools work together to prevent and mitigate CSAM.

Our consumer services use a variety of tools and technologies, such as machine learning, keyword filters, automated detection tools, and human moderators, to screen images, videos, and text in public-facing content for policy compliance before it’s allowed online. These measures enforce multiple policies, including prohibitions on CSAM, and they appear to be an effective deterrent, as reflected in the low number of CSAM reports we receive. As one example, Amazon Photos uses Thorn’s Safer technology to detect hash matches in images uploaded to the service and verifies positive matches with human reviewers.
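The hash-matching flow described above can be illustrated with a minimal sketch. This is a hypothetical example, not Amazon's or Safer's actual implementation: the hash set, function names, and decision labels are invented for illustration, and real systems typically use perceptual hashes (which match visually similar images) rather than the exact cryptographic hash used here as a stand-in.

```python
import hashlib

# Hypothetical set of digests of known prohibited images, as would be
# supplied by a detection service. Real deployments use perceptual
# hashes; SHA-256 is used here only to keep the sketch self-contained.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a known image.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_upload(data: bytes) -> str:
    """Compute a hex digest for one uploaded file."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> str:
    """Return a moderation decision for one upload.

    A hash match is only a *suspected* positive: it is routed to
    human reviewers for verification before any further action,
    mirroring the review step described in the text.
    """
    if hash_upload(data) in KNOWN_HASHES:
        return "queue_for_human_review"
    return "allow"
```

The key design point reflected here is that automated matching never acts alone: a match changes the content's state to "pending review" rather than triggering an irreversible action directly.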

We also work directly with businesses that use Amazon technologies and services to mitigate abuse in their products and services. For example, we make Thorn’s Safer technology available to businesses via the Amazon Web Services (AWS) Marketplace so they can proactively identify and address CSAM.

Our services enable anyone to report inappropriate, harmful, or illegal content to us. When we receive reports of prohibited content, we act quickly to investigate and take appropriate action. We have relationships with U.S. and international hotlines, like the National Center for Missing & Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), to allow us to receive and quickly act on reports of CSAM.

How we partner with our subsidiaries and non-profits to keep our communities safe

Depending on the service, we remove or disable URLs, images, chat interactions, or accounts as applicable. Here’s what we did in 2022.

Amazon and its subsidiaries collectively submitted 67,073 reports of CSAM to NCMEC in 2022. In particular, Amazon Photos detected and reported 52,633 images using Safer. Amazon also received 23 reports from third parties of other content, such as chat interactions and URLs. In some cases, the content we removed was reported to us by trusted entities like NCMEC, IWF, the Canadian CyberTipline, and INHOPE hotlines. Trusted reporters submitted a total of 398 reports for content that we quickly removed. All of the content that Amazon actioned was associated with a total of 7,322 accounts. Twitch publishes information about its efforts separately.

Commitments and partnerships

As part of our work to fight CSAM, we engage with a variety of organizations and support their work to protect children.

Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is part of the WePROTECT Global Alliance. In 2022, we joined NCMEC’s board while continuing to serve as a board member and active participant of the Tech Coalition.

Beginning in 2023, Amazon is providing NCMEC with millions of dollars in AWS technology and services to reliably operate the mission-critical infrastructure and applications that help missing and exploited children. Amazon continues to partner closely with Thorn, including by providing millions of dollars in free advanced cloud services and technical support from AWS for Thorn to develop and operate its services.

More information on our partnerships

  • National Center for Missing & Exploited Children
    Amazon is a member of the National Center for Missing & Exploited Children (NCMEC) Board and regularly reports CSAM to NCMEC. NCMEC relies on AWS technology and services to reliably operate its mission-critical infrastructure and applications. Our subsidiary Ring has partnered with NCMEC on safety education content and training, and makes missing child posters visible directly in the Ring Neighbors app to users whose defined neighborhoods fall within the area of interest. Posts prompt users to contact the proper authorities if they have any information that could help in the search for a missing child.
  • Tech Coalition
    Amazon is a member of the Tech Coalition (TC), which convenes technology companies in an effort to find solutions to combat CSAM. We serve on the Board of the TC and actively participate in each working group. In 2022, we co-chaired the TC’s research working group and the Transparency Framework working group.
  • WePROTECT Global Alliance
    Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is a part of the WePROTECT Global Alliance. Publishing the information here is part of our commitment under Principle 11 to share meaningful information about how we combat CSAM.
  • Internet Watch Foundation
    Amazon supports the Internet Watch Foundation (IWF), an internet hotline through which the public and IT professionals can report potentially criminal online content within its remit; the IWF also serves as the "notice and takedown" body for this content. IWF works in partnership with the internet and tech industries, global law enforcement, governments, the education sector, charities and nonprofits across the world, and the public to minimize, disrupt, and stop the availability of CSAM.
  • INHOPE Network of Hotlines
    Amazon is a member of the International Association of Internet Hotlines. INHOPE is made up of hotlines around the world which enable the rapid identification and removal of child sexual abuse material from the digital world.
  • Family Online Safety Institute
    Amazon is a member of the Family Online Safety Institute, an international, nonprofit organization that works to make the online world safer for kids and their families. FOSI convenes leaders in industry, government, and the nonprofit sectors to collaborate and innovate new solutions and policies in the field of online safety.
  • Paris Peace Forum
    Amazon was proud to stand up for child safety online alongside other companies, civil society groups, and governments at the 2021 Paris Peace Forum. In 2022, Amazon joined the Children Online Protection Lab as a founding member, an initiative that gathers governments, industry, NGOs, researchers, and other stakeholders to identify, assess, and develop concrete protocols and solutions enabling children to use digital tools safely and benefit from their full potential without being exposed to abuse.

Learn more

  • Where can I report suspected CSAM?
    If you encounter CSAM on an Amazon service, you can report it through the abuse reporting tools provided by that service. You can also report suspected CSAM that you encounter anywhere online directly to NCMEC or to your local hotline. In many cases, you can report anonymously.
  • What is NCMEC? What is a CyberTip?
    The National Center for Missing & Exploited Children (NCMEC) is a nonprofit corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization. NCMEC works with families, victims, private industry, law enforcement, and the public to assist with preventing child abductions, recovering missing children, and providing services to deter and combat child sexual exploitation. NCMEC operates the CyberTipline®, a national mechanism for the public and electronic service providers to report instances of suspected child sexual exploitation. Since its inception in 1998, the CyberTipline has received more than 100 million reports.
  • What is Thorn and what is Safer? How can I learn more?
    Thorn is a tech nonprofit focused on stopping the spread of CSAM on the internet. Its Safer tool, powered by AWS since 2019, is a CSAM detection product built and made available by Thorn to AWS customers. Safer provides tech companies with both detection and wellness-forward post-detection moderation tools, and is available in AWS Marketplace. The application works within customer storage environments to detect known and unknown CSAM globally; after that, it elevates suspected CSAM for review and helps report confirmed material to NCMEC, which is uniquely positioned to engage law enforcement to rescue victims. AWS customers can find Safer in AWS Marketplace; other companies can contact Thorn for more information.
  • What is Amazon’s approach to children’s safety?
    We are committed to developing age-appropriate products and services for kids to help parents and guardians navigate their families through an ever-changing digital landscape. Read more about Amazon's approach to children's safety.

Read the 2021 update.
Read the 2020 update.