Google Teams Up with UK Nonprofit StopNCII to Combat Nonconsensual Intimate Images on Search

TechCrunch

Key Points

  • Google partners with U.K. nonprofit StopNCII to fight nonconsensual intimate images.
  • The partnership uses StopNCII’s image hashes to proactively detect and remove such content from Search.
  • Only digital fingerprints (hashes) are shared, keeping original images on users’ devices.
  • Google’s existing tools already allow user‑requested removals and ranking improvements.
  • The move follows Microsoft’s similar integration with Bing and aligns Google with many other platforms using StopNCII’s system.
  • Feedback from survivors and advocates highlights the need for continued action.
  • Google aims to reduce the visibility of harmful content and ease the burden on affected individuals.

Google is joining forces with the U.K. nonprofit StopNCII to strengthen its fight against nonconsensual intimate images, often called revenge porn. By using StopNCII’s image hashes, Google will proactively identify and remove such content from its search results. The partnership builds on existing tools that let users request removals and on ranking improvements that reduce the visibility of such content. The approach keeps original images on users’ devices, uploading only the digital fingerprint. The move follows a similar integration by Microsoft Bing and aligns Google with a growing list of platforms that have adopted StopNCII’s system.

Partnership Overview

Google announced a partnership with the U.K. nonprofit StopNCII to enhance its ongoing efforts to curb the spread of nonconsensual intimate images, commonly referred to as revenge porn. The collaboration will see Google incorporate StopNCII’s hashes—digital fingerprints of images and videos—into its search infrastructure. These hashes enable the search engine to proactively detect and remove matching content from its results, reducing the likelihood that such material appears in response to user queries.

How the Technology Works

StopNCII creates a unique identifier, or hash, for each intimate image submitted by an adult who wishes to protect their privacy. Only the hash, not the original image, is uploaded to StopNCII’s system, ensuring that the private content never leaves the user’s device. Partner platforms, including Google, can then compare these hashes against content they host or index. When a match is found, the offending material is removed or de‑ranked, effectively limiting its visibility.
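The matching flow described above can be sketched in a few lines of code. This is a simplified illustration only: the function names are hypothetical, and where this sketch uses a cryptographic hash of the raw bytes, StopNCII's actual system is reported to use perceptual hashing (such as Meta's PDQ algorithm), which stays stable across resizing and re-encoding.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic hash of the raw bytes.
    # Real systems use perceptual hashes that survive re-encoding,
    # cropping, and resizing; a byte-level hash does not.
    return hashlib.sha256(image_bytes).hexdigest()

def should_remove(indexed_content: bytes, shared_hashes: set[str]) -> bool:
    # A partner platform compares the fingerprint of content it hosts
    # or indexes against the shared hash list. Only hashes are ever
    # exchanged; the original image never leaves the user's device.
    return fingerprint(indexed_content) in shared_hashes

# On the user's device: the hash is computed locally and uploaded.
private_image = b"\x89PNG...example bytes"
shared_hashes = {fingerprint(private_image)}

# On the platform: matching content is flagged; other content is not.
print(should_remove(private_image, shared_hashes))
print(should_remove(b"unrelated content", shared_hashes))
```

The key privacy property is visible in the sketch: the platform side only ever receives and compares hash strings, never image data.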

Google’s Existing Tools and New Enhancements

Google highlighted that its current tools already allow individuals to request the removal of nonconsensual intimate images from Search. In a blog post, the company noted, "Our existing tools allow people to request the removal of NCII from Search, and we’ve continued to launch ranking improvements to reduce the visibility of this type of content." The partnership adds an additional layer of automation, enabling proactive identification rather than relying solely on user‑initiated removal requests.

Industry Context and Broader Adoption

The partnership arrives after Microsoft integrated StopNCII’s system into Bing, making Google’s move a later but significant addition to the industry’s response. StopNCII’s technology has also been adopted by a range of social media and content platforms, including Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, X, and others. By joining this coalition, Google aligns itself with a growing ecosystem of companies committed to combating the distribution of nonconsensual intimate media.

Impact on Survivors and Advocates

Google cited feedback from survivors and advocates, stating, "We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it." The partnership aims to ease that burden by reducing the exposure of harmful content and streamlining the removal process.

Future Outlook

While Google’s integration of StopNCII’s hashes marks a notable step forward, the company acknowledges that ongoing work is needed to address the broader challenges of nonconsensual intimate image distribution. Continued collaboration with nonprofits, technology improvements, and expanded adoption across the internet ecosystem are positioned as key components of Google’s longer‑term strategy to protect privacy and safety online.

#Google#StopNCII#nonconsensual intimate images#revenge porn#search#hash technology#online privacy#digital safety#Microsoft Bing#social media platforms
Generated with  News Factory -  Source: TechCrunch
