The tech giant has “heard concerns from victims, experts and other stakeholders” over the years about people’s intimate images being shared online without their consent, and has teamed up with StopNCII.org to combat the issue.
In a blog post, Microsoft said: “On July 30, Microsoft released a policy White Paper, outlining a set of suggestions for policymakers to help protect Americans from abusive AI deepfakes, with a focus on protecting women and children from online exploitation.
“Advocating for modernised laws to protect victims is one element of our comprehensive approach to address abusive AI-generated content – today, we also provide an update on Microsoft’s global efforts to safeguard its services and individuals from non-consensual intimate imagery (NCII).
“We have heard concerns from victims, experts and other stakeholders that user reporting alone may not scale effectively for impact or adequately address the risk that imagery can be accessed via search. As a result, today, we are announcing that we are partnering with StopNCII to pilot a victim-centred approach to detection in Bing, our search engine.”
The collaborative platform enables adults across the globe to protect themselves, and Microsoft is encouraging those affected to report to StopNCII if necessary.
“StopNCII is a platform run by SWGfL that enables adults from around the world to protect themselves from having their intimate images shared online without their consent. StopNCII enables victims to create a ‘hash’ or digital fingerprint of their images, without those images ever leaving their device (including synthetic imagery). Those hashes can then be used by a wide range of industry partners to detect that imagery on their services and take action in line with their policies.
“In March 2024, Microsoft donated a new PhotoDNA capability to support StopNCII’s efforts. We have been piloting use of the StopNCII database to prevent this content from being returned in image search results in Bing. We have taken action on 268,899 images up to the end of August. We will continue to evaluate efforts to expand this partnership. We encourage adults concerned about the release – or potential release – of their images to report to StopNCII.”
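The hash-matching approach described above can be illustrated with a minimal sketch. This is not Microsoft’s actual PhotoDNA implementation – PhotoDNA uses a proprietary perceptual hash that tolerates resizing and re-encoding – so a plain SHA-256 digest stands in here for the “digital fingerprint”, and the function names are illustrative only:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. In the real StopNCII pipeline the
    # hash is computed on the victim's own device, so the image itself
    # never leaves it; only this short digest is shared.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes submitted by victims form the shared database that
# industry partners can query.
ncii_hashes = {fingerprint(b"victim-image-bytes")}

def should_suppress(candidate_bytes: bytes) -> bool:
    # A partner service (such as a search index) checks content against
    # the hash database and removes matches in line with its policies.
    return fingerprint(candidate_bytes) in ncii_hashes
```

Unlike this exact-match sketch, production systems rely on perceptual hashes precisely so that slightly altered copies of an image still match the victim’s fingerprint.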
BANG SHOWBIZ