Amazon and its subsidiaries are committed to preventing child sexual abuse material (CSAM) in every segment of our business. We expressly prohibit the use of our services to create, store, or distribute CSAM.

Across our services, we devote significant resources to combating CSAM and continue investing in new technologies and tools to improve our ability to prevent, detect, respond to, and remove it.

Prevent. We use a variety of tools and technologies that prevent the distribution of CSAM on our consumer services. Some of our services use technologies such as machine learning and automated detection tools to screen content. Other services use human moderators to ensure prohibited content, including CSAM, is not shared or displayed.

Detect. Amazon Photos uses Thorn’s Safer technology to detect hash matches in images uploaded to the service and verifies these positive matches using human reviewers.
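
As a purely illustrative sketch (not Thorn’s Safer or Amazon’s implementation; the hash list, function names, and review queue below are hypothetical assumptions), hash matching of this general kind compares a digest of each uploaded image against a set of hashes of known material and escalates any match to human review:

```python
# Illustrative sketch only: a generic hash-matching flow, not Thorn's Safer or
# Amazon's implementation. KNOWN_HASHES and review_queue are hypothetical stand-ins.
import hashlib

# Hypothetical set of digests of known prohibited images, e.g. supplied by a
# trusted hotline's hash list.
KNOWN_HASHES: set[str] = set()

# Hypothetical queue of uploads awaiting verification by human reviewers.
review_queue: list[str] = []


def screen_upload(upload_id: str, image_bytes: bytes) -> bool:
    """Compare the upload's digest against the known-hash set and, on a match,
    queue it for human review instead of acting on the automated match alone."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        review_queue.append(upload_id)
        return True
    return False
```

In practice, systems of this kind typically rely on industry hash lists and robust hashing rather than a plain SHA-256 digest; the cryptographic hash here only keeps the sketch simple.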

Respond. Our services enable anyone to report inappropriate, harmful, or illegal content to us. When we receive reports of prohibited content, we act quickly to investigate and take appropriate action. We maintain relationships with U.S. and international hotlines, such as the National Center for Missing and Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), that allow us to receive and quickly act on reports of CSAM.

As part of our work to fight CSAM, we engage with a variety of organizations and support their work to protect children.

Amazon has partnered closely with Thorn. We provide Thorn with millions of dollars in free advanced cloud services and technical support from AWS to develop and operate its services. We also make Thorn’s Safer technology available to AWS customers via AWS Marketplace so they can identify and address CSAM on their own services and resources.

Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is part of the WePROTECT Global Alliance. Amazon is a member of the Tech Coalition, which convenes technology companies in an effort to find solutions to the problem of online CSAM, and the Family Online Safety Institute, an international, nonprofit organization that works to make the online world safer for children and their families.

We remove or disable URLs, images, chat interactions, and accounts as appropriate. Here’s what we did in 2021:

Amazon and its subsidiaries collectively submitted 33,848 reports of CSAM to NCMEC in 2021. Amazon’s investments in new technology and the growth of our services drove an increase in detected material. In particular, Amazon Photos detected 25,540 images using Safer. Amazon also received 1,704 reports of other content, such as chat interactions and URLs, from third parties. In some cases, the content we removed was reported to us by trusted entities like NCMEC, the Internet Watch Foundation, the Canadian CyberTipline, and other INHOPE reporters. Trusted reporters submitted a total of 780 reports, and in each case the content was removed within 24 hours of notice. All of the content we actioned was found in a total of 2,451 accounts. Information about Twitch’s efforts is available here.

Read our update from 2020.

More information on our partnerships

  • Tech Coalition
    Amazon is a member of the Tech Coalition (TC), which convenes technology companies to find solutions to combat CSAM. We serve on the Board of the TC and participate in each working group. In 2021, we were closely involved in the TC’s research working group, which awarded grants to five global organizations combating CSAM from many different angles, and in a forum that convened stakeholders to discuss emerging vectors of abuse.

  • WePROTECT Global Alliance
    Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is a part of the WePROTECT Global Alliance. Publishing the information here is part of our commitment under Principle 11 to share meaningful information about how we combat CSAM.
  • National Center for Missing and Exploited Children (NCMEC)
    In addition to our relationship with NCMEC as a CSAM reporting mechanism, Amazon invests in NCMEC’s child sex trafficking recovery services, and NCMEC relies on AWS technology and services to reliably operate its mission-critical infrastructure and applications. Our subsidiary Ring has partnered with NCMEC on safety education content and training, and makes missing child posters visible directly in the Ring Neighbors app to users whose defined neighborhoods fall within the area of interest. Posts prompt users to contact the proper authorities if they have any information that could help in the search for a missing child.
  • Internet Watch Foundation
    Amazon supports the Internet Watch Foundation (IWF), an internet hotline through which the public and IT professionals can report potentially criminal online content within its remit; the IWF also serves as the "notice and takedown" body for this content. The IWF works in partnership with the internet and tech industries, global law enforcement, governments, the education sector, charities and nonprofits across the world, and the public to minimize, disrupt, and stop the availability of CSAM.
  • Family Online Safety Institute
    Amazon is a member of the Family Online Safety Institute (FOSI), an international, nonprofit organization that works to make the online world safer for kids and their families. FOSI convenes leaders in industry, government, and the nonprofit sector to collaborate on new solutions and policies in the field of online safety.
  • Paris Peace Forum
    Amazon was proud to join other companies, civil society groups, and governments at the 2021 Paris Peace Forum in standing up for child safety online.

Learn more

  • Where can I report suspected CSAM?
    If you encounter CSAM on an Amazon service, you can report it through the abuse reporting tools provided by that service. You can also report suspected CSAM that you encounter anywhere online directly to NCMEC or to your local hotline. In many cases, you can report anonymously.
  • What is NCMEC? What is a CyberTip?
    The National Center for Missing and Exploited Children (NCMEC) is a nonprofit corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization. NCMEC works with families, victims, private industry, law enforcement, and the public to assist with preventing child abductions, recovering missing children, and providing services to deter and combat child sexual exploitation. NCMEC operates the CyberTipline®, a national mechanism for the public and electronic service providers to report instances of suspected child sexual exploitation. Since its inception in 1998, the CyberTipline has received more than 100 million reports.
  • What is Thorn and what is Safer? How can I learn more?
    Thorn is a tech nonprofit focused on stopping the spread of CSAM on the internet. Its Safer tool, powered by AWS since 2019, is a CSAM detection product that Thorn makes available to AWS customers. Safer provides tech companies with both detection and wellness-forward post-detection moderation tools, and is available in AWS Marketplace. The application works within customer storage environments to detect known and unknown CSAM globally; it then elevates suspected CSAM for review and helps report confirmed material to NCMEC, which is uniquely positioned to engage law enforcement to rescue victims. AWS customers can find Safer in AWS Marketplace; other companies can contact Thorn for more information.
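
As a generic, hedged sketch of the post-detection workflow described above (this is not Safer’s actual API; every name below is a hypothetical placeholder), suspected matches are elevated to a human reviewer, and only confirmed material is reported and then removed:

```python
# Generic post-detection moderation sketch; not Safer's actual API.
# All names (SuspectedItem, report_to_ncmec, remove_content) are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto


class ReviewDecision(Enum):
    CONFIRMED = auto()      # reviewer confirms the flagged material
    NOT_CONFIRMED = auto()  # reviewer determines it is not prohibited


@dataclass
class SuspectedItem:
    item_id: str
    matched_known_hash: bool  # False when flagged as possible "unknown" material


def report_to_ncmec(item_id: str) -> None:
    """Hypothetical placeholder for filing a CyberTipline report."""


def remove_content(item_id: str) -> None:
    """Hypothetical placeholder for removing or disabling the content."""


def handle_review(item: SuspectedItem, decision: ReviewDecision) -> None:
    """Act only after human review: report confirmed material to NCMEC, then remove it."""
    if decision is ReviewDecision.CONFIRMED:
        report_to_ncmec(item.item_id)
        remove_content(item.item_id)
```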