Facebook Tests News Feed Control to Reduce Group and Specific Content Display | Today Nation News

On November 18th, Facebook announced that it is testing features that give users more control over what is displayed on the platform.

The test will run on the Facebook app for English-speaking users. Three submenus — “Friends and Family,” “Groups and Pages,” and “Celebrities” — will be added to the menu that manages what appears in Facebook’s news feed. Users in the test can keep the proportion of these posts at “normal” or change it to “more” or “less,” depending on their preference.

Test participants will also be able to specify topics they are interested in or do not want to see. Facebook said in a blog post that the test will initially be available to “a small percentage of people” around the world and will gradually expand in the coming weeks.

Facebook will also extend tools that let advertisers exclude their content from specific topic areas, so that brands do not appear next to “news and politics,” “social issues,” and “crime and tragedy.” “When an advertiser selects one or more topics, the ad isn’t delivered to people who have recently interacted with those topics in their news feed,” the company wrote in a blog post.

Facebook’s algorithms are notorious for promoting sensational content and dangerous misinformation. As a result, Facebook and its new parent company, Meta, are under pressure from regulators to clean up the platform and make its practices more transparent. Congress is examining measures that would give users control over what is displayed and reduce the opacity of algorithmic ranking, but Facebook appears to hope there is still time for self-regulation.

In October 2021, Facebook whistleblower Frances Haugen argued that Facebook’s opaque algorithms are dangerous, especially in countries outside the markets the company scrutinizes most closely.

In the United States and Europe as well, the decision to prioritize engagement in the news feed ranking system has led to a surge in divisive content and politically inflammatory posts.

On “60 Minutes,” which also aired in October, Haugen said that one consequence of how Facebook selects content today is that it is optimized for engagement and reaction, but that the company’s own research shows hateful, divisive, and polarizing content inspires anger in people more easily than other emotions.

Image credit: Facebook

[To the original text]

(Text: Taylor Hatmaker; Translation: Akihito Mizukoshi)
