On Wednesday, Google announced a partnership with StopNCII.org to combat the spread of non-consensual intimate imagery (NCII). Over the next few months, Google will start using StopNCII's hashes to proactively identify nonconsensual images in search results and remove them. Hashes are algorithmically generated unique identifiers that allow services to identify and block imagery flagged as abuse without sharing or storing the actual source. StopNCII says it uses PDQ for images and MD5 for videos.
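To make the hash-matching idea concrete, here is a minimal sketch of how a service might check media against a shared hash list, using MD5 (as StopNCII does for videos). The function names and the blocklist are hypothetical; real PDQ perceptual hashing for images is considerably more involved than this.

```python
import hashlib

def md5_hash(data: bytes) -> str:
    # Compute the MD5 digest of the raw media bytes as a hex string.
    # StopNCII uses MD5 for video hashes; PDQ (a perceptual hash) for images.
    return hashlib.md5(data).hexdigest()

def matches_blocklist(data: bytes, hash_list: set[str]) -> bool:
    # The service compares its locally computed hash against the shared list,
    # so the source media itself never needs to be transmitted or stored.
    return md5_hash(data) in hash_list

# Hypothetical example: a previously flagged file's hash is in the list.
blocklist = {md5_hash(b"flagged-video-bytes")}
print(matches_blocklist(b"flagged-video-bytes", blocklist))  # True
print(matches_blocklist(b"some-other-bytes", blocklist))     # False
```

Because only the hash circulates between participating services, the same flagged item can be recognized everywhere it is re-uploaded without any copy of the imagery changing hands.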
As Bloomberg points out, Google has been called out for being slower than others in the industry to take this approach and its blog post seemed to acknowledge that. “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it,” the post reads. Facebook, Instagram, TikTok, and Bumble all signed on with StopNCII as early as 2022, and Microsoft integrated it into Bing in September of last year.