Across Amazon’s services, we devote significant resources to combating CSAM, and we continue to invest in new technologies and tools to improve our ability to prevent, detect, respond to, and remove it.

As we strive to be Earth’s Most Customer-Centric Company, Amazon and its subsidiaries provide services directly to customers as well as enable businesses to use our technology and services to sell and provide their own products and services. In all cases, our policies, teams, and tools work together to prevent and mitigate CSAM.
Our consumer services use a variety of tools and technologies, such as machine learning, keyword filters, automated detection tools, and human moderators, to screen images, videos, and text in public-facing content for policy compliance before that content is allowed online. These measures enforce multiple policies, including prohibitions on CSAM, and they appear to be an effective deterrent, as reflected in the low number of CSAM reports we receive. As one example, Amazon Photos uses Thorn’s Safer technology to detect hash matches in images uploaded to the service and verifies positive matches using human reviewers.
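In simplified form, hash-matching detection of the kind described above compares a fingerprint of each uploaded image against a list of fingerprints of known prohibited content, and routes matches to human review. The sketch below is purely illustrative: it uses exact SHA-256 matching and a made-up hash list, whereas production systems such as Safer typically rely on vetted hash sets from organizations like NCMEC and on perceptual hashes that tolerate re-encoding or resizing.

```python
import hashlib

# Hypothetical hash list of known prohibited images (illustrative only;
# real systems use vetted hash sets and often perceptual hashing).
KNOWN_HASHES = {
    # SHA-256 of the stand-in byte string b"test"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(image_bytes: bytes) -> str:
    """Flag an upload for human review if its hash matches a known entry."""
    if sha256_hex(image_bytes) in KNOWN_HASHES:
        return "flagged_for_human_review"
    return "allowed"
```

Routing matches to human reviewers, rather than acting on the automated match alone, mirrors the verification step described above for Amazon Photos.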
We also work directly with businesses that use Amazon technologies and services to mitigate abuse in their products and services. For example, we make Thorn’s Safer technology available to businesses via the Amazon Web Services (AWS) Marketplace so they can proactively identify and address CSAM.
Our services enable anyone to report inappropriate, harmful, or illegal content to us. When we receive reports of prohibited content, we act quickly to investigate and take appropriate action. We have relationships with U.S. and international hotlines, like the National Center for Missing & Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), to allow us to receive and quickly act on reports of CSAM.
Depending on the specifics of the service, we remove or disable, as applicable: URLs, images, chat interactions, or accounts. Here’s what we did in 2022.
Amazon and its subsidiaries collectively submitted 67,073 reports of CSAM to NCMEC in 2022. In particular, Amazon Photos detected and reported 52,633 images using Safer. Amazon also received 23 reports from third parties concerning other content, such as chat interactions and URLs. In some cases, the content we removed was reported to us by trusted entities like NCMEC, IWF, the Canadian CyberTipline, and INHOPE hotlines; trusted reporters submitted a total of 398 reports for content that we quickly removed. The content Amazon actioned was found across a total of 7,322 accounts. Information about Twitch’s efforts is available in Twitch’s own transparency reporting.

Commitments and partnerships

As part of our work to fight CSAM, we engage with a variety of organizations and support their work to protect children.
Amazon has endorsed the Voluntary Principles to Counter Child Sexual Exploitation and Abuse and is part of the WePROTECT Global Alliance. In 2022, we joined NCMEC’s board while continuing to serve as a board member and active participant of the Tech Coalition.
Beginning in 2023, Amazon is providing NCMEC with millions of dollars in AWS technology and services to reliably operate mission-critical infrastructure and applications that help missing and exploited children. Amazon also continues to partner closely with Thorn, including by providing millions of dollars in free advanced cloud services and technical support from AWS for Thorn to develop and operate their services.

More information on our partnerships

FAQs

Read the 2021 update.
Read the 2020 update.