Don't worry, militia members will have to wait until after Election Day to be algorithmically pointed to Facebook groups of like-minded individuals.
At Wednesday's Senate hearing on (at least in theory) Section 230, Facebook CEO Mark Zuckerberg let slip a behind-the-scenes change his company has made in the lead-up to Nov. 3. Specifically, Zuckerberg offhandedly mentioned that Facebook has temporarily stopped recommending political issue Facebook groups to its users.
Of course, Facebook intends to spin this presumably dangerous — or, at the very least, worrisome — recommendation feature right back up again after the election. So reports BuzzFeed News, which was able to confirm that the new policy is only temporary.
"This is a measure we put in place in the lead-up to Election Day," Facebook spokesperson Liz Bourgeois told the publication. "We will assess when to lift them afterwards, but they are temporary."
Because obviously we won't have any social media-juiced instances of violence after the election. Heavens no.
Notably, this move comes at a time when Zuckerberg — as expressed in his Thursday earnings call — is "worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there is a risk of civil unrest across the country."
Facebook, which recently attempted to ban QAnon conspiracy groups, has particular reason to be concerned about the upcoming election and possible associated violence. Well, concern for its reputation, anyway. The platform has served as a breeding ground for violent conspiracy theories for years, and a simple QAnon ban isn't going to change that.
There is a real possibility that the next Kenosha-style tragedy is already being planned, coordinated, or hyped with Facebook tools — only now with an Election Day twist. Facebook's attempt to cool things down by pausing an element of its own recommendation system calls attention to the simple fact that Facebook itself is fundamentally problematic.
Facebook knows this. In May of this year, the Wall Street Journal reported that Facebook had ignored its own internal research showing that its algorithms were making the site more divisive.
"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."
No temporary pause of a single recommendation feature, no matter how well intentioned, is going to change that.