Facebook quietly made a big admission

Back in February, Facebook announced a small experiment: it would reduce the amount of political content shown to some users in several countries, including the United States, and then ask them about the experience. "Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person's appetite for it at the top of their News Feed," Product Management Director Aastha Gupta explained in a blog post.

On Tuesday morning, the company provided an update: the survey results are in, and they suggest that users want to see less political content in their feeds. Facebook now intends to repeat the experiment in more countries, and says it "will expand further in the coming months." For a company in perpetual trouble over its alleged influence on politics, depoliticizing people's feeds makes sense. After all, the move was first announced one month after Donald Trump's supporters stormed the U.S. Capitol, an episode that some people, including elected officials, tried to blame on Facebook. The change may eventually have major ripple effects for the political groups and media organizations that have come to rely on Facebook for distribution.

However, the most important part of Facebook's announcement has nothing to do with politics.

The basic premise of any AI-driven social feed (Facebook, Instagram, Twitter, TikTok, YouTube) is that you don't need to tell it what you want to see. Simply by observing what you like, share, comment on, or linger over, the algorithm learns what kind of material will interest you and keep you on the platform, and then it shows you more of the same.
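To make the mechanism concrete, here is a minimal sketch of an engagement-optimized ranker. It is not Facebook's actual system; the signal names, weights, and functions below are invented for illustration, and a real feed would use learned models over thousands of features.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    p_like: float     # predicted probability the user likes the post
    p_comment: float  # predicted probability the user comments
    p_share: float    # predicted probability the user shares
    p_dwell: float    # predicted probability the user lingers on it

# Hypothetical weights; a real system would learn these from behavior logs.
WEIGHTS = {"p_like": 1.0, "p_comment": 3.0, "p_share": 5.0, "p_dwell": 0.5}

def engagement_score(post: Post) -> float:
    """Predicted engagement: a weighted sum of behavioral signals."""
    return (WEIGHTS["p_like"] * post.p_like
            + WEIGHTS["p_comment"] * post.p_comment
            + WEIGHTS["p_share"] * post.p_share
            + WEIGHTS["p_dwell"] * post.p_dwell)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """"Show you more of the same": sort candidates by predicted engagement."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

Nothing in that objective asks whether a post is true, healthy, or something the user would endorse on reflection; it only asks whether the user will react, which is the crux of the criticism that follows.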

In a sense, this design gives social media companies and their defenders a convenient answer to critics: if something is thriving on a platform, that's because users like it. If you have a problem with that, perhaps your problem is with the users.

At the same time, though, engagement optimization sits at the heart of many criticisms of social platforms. An algorithm that optimizes too aggressively for engagement may push users toward content that is hyper-engaging but of low social value. It may serve them ever more engaging posts because those posts are ever more extreme. And it may encourage the viral spread of false or harmful material, because the system selects first for what will trigger engagement rather than for what ought to be seen. The list of ills associated with engagement-first design helps explain why Mark Zuckerberg, Jack Dorsey, and Sundar Pichai would not admit, at a congressional hearing in March, that the platforms they control are built this way at all. Zuckerberg insisted that "meaningful social interactions" are Facebook's true goal. "Engagement," he said, "is only a sign that if we deliver that value, then it will be natural that people use our services more."

However, in a different context, Zuckerberg has admitted that things may not be so simple. In a 2018 post explaining why Facebook suppresses "borderline" posts, the ones that push right up to the edge of the platform's rules without breaking them, he wrote: "no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average, even when they tell us afterwards they don't like the content." But that observation seems to have stayed confined to the question of how to enforce Facebook's content bans, rather than prompting a broader rethink of how its ranking algorithm is designed.
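As a rough sketch of the demotion approach Zuckerberg described (again with invented names and numbers, not Facebook's actual code), the idea is a distribution penalty that grows as a classifier's violation estimate approaches the removal line:

```python
REMOVE_THRESHOLD = 0.9  # hypothetical: at or above this score, content is removed

def borderline_penalty(violation_score: float) -> float:
    """Dampen distribution as content approaches the policy line.

    violation_score is assumed to be a classifier's 0..1 estimate of how
    close a post is to breaking a content rule. The penalty ramps from
    1.0 (no effect) for harmless content down toward 0.0 at the line.
    """
    if violation_score >= REMOVE_THRESHOLD:
        return 0.0  # over the line: removed outright, no distribution
    return 1.0 - (violation_score / REMOVE_THRESHOLD)

def adjusted_score(engagement: float, violation_score: float) -> float:
    """Final ranking score: raw predicted engagement times the penalty."""
    return engagement * borderline_penalty(violation_score)
```

The point of the sketch is the shape of the curve: the closer a post gets to the line, the more its reach is cut, even though its raw engagement would otherwise be climbing.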


