
Microsoft's AI makes racist error and then publishes stories about it

Source:Feature Flash Editor:synthesize Time:2025-07-03 02:30:25

Hey, at least Microsoft's news-curating artificial intelligence doesn't have an ego. That much was made clear today after the company's news app highlighted Microsoft's most recent racist failure.

The inciting incident for this entire debacle appears to be Microsoft's late May decision to fire some human editors and journalists responsible for MSN.com and have its AI curate and aggregate stories for the site instead. Following that move, The Guardian reported earlier today that Microsoft's AI confused two members of the pop band Little Mix, who both happen to be women of color, in a republished story originally reported by The Independent. Then, after being called out by band member Jade Thirlwall for the screwup, the AI published stories about its own failing.

So, to recap: Microsoft's AI made a racist error while aggregating another outlet's reporting, got called out for doing so, and then elevated the coverage of its own outing. Notably, this is after Microsoft's human employees were reportedly told to manually remove stories about the Little Mix incident from MSN.com.


Still with me?

"This shit happens to @leighannepinnock and I ALL THE TIME that it's become a running joke," Thirlwall reportedly wrote in an Instagram story, which is no longer visible on her account, about the incident. "It offends me that you couldn't differentiate the two women of colour out of four members of a group … DO BETTER!"

As of the time of this writing, a quick search on the Microsoft News app shows at least one such story remains.

[Image: A story from T-Break Tech covering the AI's failings as it appears on the Microsoft News app. Credit: screenshot / Microsoft News app]

Notably, Guardian editor Jim Waterson spotted several more examples before they were apparently pulled.

"Microsoft's artificial intelligence news app is now swamped with stories selected by the news robot about the news robot backfiring," he wrote on Twitter.

We reached out to Microsoft in an attempt to determine just what, exactly, the hell is going on over there. According to a company spokesperson, the problem is not one of AI gone wrong. No, of course not. It's not like machine learning has a long history of bias (oh, wait). Instead, the spokesperson insisted, the issue was simply that Microsoft's AI selected the wrong photo for the initial article in question.

"In testing a new feature to select an alternate image, rather than defaulting to the first photo, a different image on the page of the original article was paired with the headline of the piece," wrote the spokesperson in an email. "This made it erroneously appear as though the headline was a caption for the picture. As soon as we became aware of this issue, we immediately took action to resolve it, replaced the incorrect image and turned off this new feature."

Unfortunately, the spokesperson did not respond to our question about human Microsoft employees deleting coverage of the initial AI error from Microsoft's news platforms.

Microsoft has a troubled recent history when it comes to artificial intelligence and race. In 2016, the company released a social media chatbot dubbed Tay. In under a day, the chatbot began publishing racist statements. The company subsequently pulled Tay offline, attempted to release an updated version, and then had to pull it offline again.

As evidenced today by the ongoing debacle with its own news-curating AI, Microsoft still has some work to do — both in the artificial intelligence and not-being-racist departments.

Topics Artificial Intelligence Microsoft Racial Justice