Amazon and its subsidiaries are committed to preventing child sexual abuse material (CSAM) in every segment of our business. We expressly prohibit the use of our services to create, store, or distribute CSAM.

Across our services, we devote significant resources to combating CSAM and continue investing in new technologies and tools to improve our ability to prevent, detect, respond to, and remove it.

Prevent. We use a variety of tools and technologies to prevent the distribution of CSAM on our consumer services. Some of our services use technology, such as machine learning and automated detection tools, to screen content. Other services rely on human moderators to ensure that prohibited content, including CSAM, is not shared or displayed.
Detect. Amazon Photos uses Thorn’s Safer technology to detect hash matches in images uploaded to the service and verifies positive matches with human reviewers (a simplified sketch of hash matching follows this list).
Respond. Our services enable anyone to report inappropriate, harmful, or illegal content to us. When we receive reports of prohibited content, we act quickly to investigate and take appropriate action. We have relationships with U.S. and international hotlines, like the National Center for Missing & Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), that allow us to receive and quickly act on reports of CSAM.
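
To illustrate the kind of hash matching described in the Detect step, the sketch below shows exact cryptographic hash matching in Python: a file’s SHA-256 digest is compared against a set of known hashes, and any match is flagged for human review. This is a minimal illustration only, not Amazon’s or Thorn’s implementation; Safer is a hosted service that also supports perceptual hashing to catch near-duplicate images, and the hash set and function names here are hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical known-hash set for illustration. Real hash lists are
    # provisioned by hash-sharing programs (e.g., Thorn's Safer, NCMEC),
    # never hard-coded into an application.
    KNOWN_HASHES: set[str] = {
        "0" * 64,  # placeholder digest
    }

    def sha256_of_file(path: Path) -> str:
        """Return the hex SHA-256 digest of a file, read in 8 KiB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_match(path: Path) -> bool:
        """Return True if the file's digest appears in the known-hash set.

        As described above, a positive match would be routed to human
        review before any enforcement action is taken.
        """
        return sha256_of_file(path) in KNOWN_HASHES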

As part of our work to fight CSAM, we engage with a variety of organizations and support their work to protect children.

Amazon has partnered closely with Thorn. We provide millions of dollars in free, advanced cloud services and technical support from AWS so Thorn can develop and operate its services. We also make Thorn’s Safer technology available to AWS customers via AWS Marketplace so they can identify and address CSAM on their own services and resources.
Amazon has endorsed the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse and is part of the WePROTECT Global Alliance. Amazon is also a member of the Tech Coalition, which convenes technology companies to develop solutions to online CSAM, and of the Family Online Safety Institute, an international nonprofit that works to make the online world safer for children and their families.

We remove or disable URLs, images, chat interactions, and accounts as appropriate. Here’s what we did in 2021:

Amazon and its subsidiaries collectively submitted 33,848 reports of CSAM to NCMEC in 2021. Amazon’s investments in new technology and the growth of our services drove an increase in detected material; in particular, Amazon Photos detected 25,540 images using Safer. Amazon also received 1,704 reports from third parties covering other content, such as chat interactions and URLs. In some cases, the content we removed was reported to us by trusted entities like NCMEC, the Internet Watch Foundation, the Canadian CyberTipline, and other INHOPE reporters. Trusted reporters submitted a total of 780 reports; in every case, the reported content was removed within 24 hours of notice. All of the content we actioned was found in a total of 2,451 accounts. Information about Twitch’s efforts is available here.

More information on our partnerships

FAQs