Across Amazon’s services, we devote significant resources to combating child sexual abuse material (CSAM), and we continue to invest in new technologies and tools to improve our ability to prevent, detect, respond to, and remove it.

As we strive to be Earth’s Most Customer-Centric Company, Amazon and its subsidiaries provide services directly to customers as well as enable businesses to use our technology and services to sell and provide their own products and services. In all cases, our policies, teams, and tools work together to prevent and mitigate CSAM.
Our consumer services use a variety of tools and technologies, such as machine learning, keyword filters, and automated detection tools, alongside human moderators, to screen images, videos, and text in public-facing content for policy compliance before it is allowed online. These measures enforce multiple policies, including prohibitions on CSAM, and they appear to be an effective deterrent, as reflected in the low number of reports of CSAM. As one example, Amazon Photos uses Thorn’s Safer technology to detect hash matches in images uploaded to the service and verifies positive matches using human reviewers.
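Hash matching of this kind is conceptually simple: a fingerprint (hash) of each uploaded file is compared against a list of fingerprints of known CSAM, and any match is queued for human verification rather than acted on automatically. The sketch below is illustrative only, not Safer’s actual implementation; the KNOWN_HASHES set and the review queue are hypothetical, and production systems typically pair exact cryptographic hashes with perceptual hashes that also catch near-duplicate images.

```python
import hashlib
from pathlib import Path

# Hypothetical list of fingerprints of known CSAM, as distributed by
# industry hash-sharing programs. SHA-256 is used here only to keep the
# sketch self-contained; it matches exact duplicates, whereas perceptual
# hashes used in production also match visually similar images.
KNOWN_HASHES: set[str] = set()

# Matches awaiting human verification, as (file path, digest) pairs.
review_queue: list[tuple[str, str]] = []

def screen_upload(path: Path) -> bool:
    """Hash an uploaded file and queue it for human review on a match."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    if digest in KNOWN_HASHES:
        # A positive match is verified by human reviewers before any
        # report is filed or enforcement action is taken.
        review_queue.append((str(path), digest))
        return True
    return False
```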
We also work directly with businesses that use Amazon technologies and services to mitigate abuse in their products and services. For example, we make Thorn’s Safer technology available to businesses via the Amazon Web Services (AWS) Marketplace so they can proactively identify and address CSAM.
Our services enable anyone to report inappropriate, harmful, or illegal content to us. When we receive reports of prohibited content, we act quickly to investigate and take appropriate action. We maintain relationships with U.S. and international hotlines, such as the National Center for Missing & Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), that allow us to receive and quickly act on reports of CSAM.
Here’s what we did in 2023.
Amazon Photos improved the actionability of its reports to NCMEC by including additional information, which enables law enforcement to better investigate potential violations.
Depending on the specifics of the service, we remove or disable offending URLs, images, chat interactions, or accounts, as applicable.
Amazon and its subsidiaries collectively submitted 31,281 reports of CSAM to NCMEC in 2023. Amazon Photos saw a decline in uploaded CSAM in 2023, detecting and reporting 24,653 images using Safer, a 44% year-over-year decline. Amazon received 103 reports from third parties of potential CSAM, including chat interactions and URLs. Hotlines, such as those administered by NCMEC, IWF, the Canadian CyberTipline, and INHOPE, submitted a total of 611 reports of content that we promptly reviewed and actioned as appropriate, with an average response time of 3.8 hours. All of the content that Amazon actioned in 2023 was found in a total of 4,111 accounts. Information about Twitch’s efforts is available here.

Commitments and partnerships

As part of our work to fight CSAM, we engage with a variety of organizations and support their work to protect children.
Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is part of the WePROTECT Global Alliance. We sit on the boards of NCMEC and the Tech Coalition. Together with Thorn and All Tech is Human, we have committed to safely designing our generative AI services to reduce the risk that they will be misused for child exploitation.
Amazon provides NCMEC with millions of dollars in AWS technology and services to reliably operate the mission-critical infrastructure and applications that help missing and exploited children. In 2023, Amazon provided financial support to NCMEC’s Exploited Child Division to advance its hash-sharing and Notice and Tracking initiatives, which help remove CSAM from the internet. Amazon continues to partner closely with Thorn, including by providing millions of dollars in free advanced cloud services and technical support from AWS so that Thorn can develop and operate its services. In 2023, with financial support from AWS, Thorn developed Safer Essential and added it to the AWS Marketplace so customers can easily detect known CSAM.

More information on our partnerships

FAQs