Facebook’s success has been built on algorithms. Can they also fix it?


Now, hours of testimony and thousands of pages of documents posted by Facebook whistleblower Frances Haugen have renewed scrutiny of the impact of Facebook and its algorithms on teens, democracy and society in general. They have also raised the question of just how much Facebook, and perhaps similar platforms, can or should rethink using a multitude of algorithms to determine which images, videos and news we see.

Haugen, a former Facebook product manager with a background in “algorithmic product management,” focused her critiques mainly on the company’s algorithm designed to show users the content they are most likely to engage with. She said it is responsible for many of Facebook’s problems, including polarization, misinformation and other toxic content. In an appearance on “60 Minutes,” she said Facebook understands that if it makes the algorithm safer, “people will spend less time on the site, they’ll click on less ads, they’ll make less money.” (Facebook CEO Mark Zuckerberg pushed back against the idea that the company prioritizes profit over user safety and well-being.)
Facebook’s chief global policy officer Monika Bickert said in an interview with CNN after Haugen’s Senate subcommittee hearing on Tuesday that it is “not true” that the company’s algorithms are designed to promote inflammatory content, and that the company actually does “the opposite” by demoting so-called click-bait.
At times in her testimony, Haugen appeared to suggest a radical overhaul of how the News Feed should work to address the issues she documented extensively from inside the company. “I am a strong believer in chronological ranking, ordering by time,” she said when testifying before a Senate subcommittee this week. “Because I think we don’t want computers to decide what we focus on.”

But the algorithms that choose what we see are at the heart of not just Facebook but many of the social media platforms that have followed in its footsteps. TikTok, for example, would be unrecognizable without the content-recommendation algorithms running the show. And the larger a platform grows, the greater the need for algorithms to filter and sort its content.

Algorithms are not going to disappear. But there are ways Facebook can improve them, experts in algorithms and artificial intelligence told CNN Business. Doing so, however, will require something Facebook has so far seemed reluctant to offer (despite talking points from executives): more transparency and control for users.

What’s in an algorithm?

The Facebook we know today, with a constant flow of algorithmically selected information and ads, is a very different social network from the one it was at its start. In 2004, when Facebook first launched as a site for college students, navigation was both simpler and more tedious: if you wanted to see what your friends were posting, you had to visit their profiles one by one.
This started to change in a major way in 2006, when Facebook introduced the News Feed, giving users a fire hose of updates from family, friends and that guy they went on a few bad dates with. From the start, Facebook reportedly used algorithms to filter the content users saw in the News Feed. In a 2015 Time Magazine article, the company’s product manager Chris Cox said curation was necessary even then because there was too much information to show everything to every user. Over time, Facebook’s algorithms evolved, and users grew accustomed to algorithms determining how Facebook content would be presented.

An algorithm is a set of steps or mathematical instructions, especially for a computer, that tell it what to do with certain inputs to produce certain outputs. You can think of it as something like a recipe, where the ingredients are the inputs and the finished dish is the output. On Facebook and other social media sites, however, you and your actions (what you write or which images you post) are the input. What the social network shows you, whether it’s a post from your best friend or an advertisement for camping gear, is the output.
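The recipe analogy can be made concrete with a minimal sketch: a fixed set of steps that turns inputs into an output. The function name, the interaction counts used as inputs, and the weights below are all hypothetical, invented purely for illustration; they do not reflect any actual Facebook formula.

```python
# A toy "recipe": combine a post's interaction counts (the inputs)
# into a single relevance score (the output). All weights are
# hypothetical and chosen only to illustrate the idea.

def score_post(likes: int, comments: int, shares: int) -> float:
    """Weight each ingredient and sum the result."""
    return 1.0 * likes + 2.0 * comments + 3.0 * shares

print(score_post(likes=10, comments=4, shares=1))  # → 21.0
```

Feeding different interaction counts into the same fixed steps produces different scores, which is all an algorithm is: the steps stay constant while the inputs, and therefore the outputs, vary.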

At their best, these algorithms help personalize feeds so that users discover new people and content matching their interests based on past activity. At worst, as Haugen and others have pointed out, they risk steering people down harmful rabbit holes that expose them to toxic content and misinformation. Either way, they keep people scrolling longer, which potentially helps Facebook make more money by showing users more ads.
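The contrast between an engagement-optimized feed and the chronological ordering Haugen advocates can be sketched in a few lines. The posts, scores and timestamps below are entirely invented for illustration, and real ranking systems involve many interacting models rather than one sort key.

```python
# Hypothetical feed data: each post has a timestamp and a
# predicted-engagement score (both values invented for this sketch).
posts = [
    {"author": "friend_a", "timestamp": 100, "engagement_score": 0.2},
    {"author": "page_b",   "timestamp": 90,  "engagement_score": 0.9},
    {"author": "friend_c", "timestamp": 110, "engagement_score": 0.5},
]

# Engagement-optimized feed: the post predicted to keep you
# scrolling comes first, regardless of when it was posted.
by_engagement = sorted(posts, key=lambda p: p["engagement_score"], reverse=True)

# Chronological feed: newest first, with no prediction involved.
by_time = sorted(posts, key=lambda p: p["timestamp"], reverse=True)

print([p["author"] for p in by_engagement])  # → ['page_b', 'friend_c', 'friend_a']
print([p["author"] for p in by_time])        # → ['friend_c', 'friend_a', 'page_b']
```

The same three posts produce two different feeds; the only thing that changed is which property the algorithm was told to optimize.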

Many algorithms work together to create the experience you see on Facebook, Instagram, and elsewhere online. That can make it even more complicated to untangle what is going on inside such systems, especially at a large company like Facebook, where multiple teams build various algorithms.

“If a higher power were to go to Facebook and say, ‘Fix the algorithm in XY,’ that would be really difficult because they have become really complex systems with many inputs, many weights, and they are like several systems working together,” said Hilary Ross, senior program manager at Harvard University’s Berkman Klein Center for Internet & Society and head of its Institute for Rebooting Social Media.

More transparency

However, there are ways to make these processes clearer and give users more say in how they work. Margaret Mitchell, who leads AI ethics at AI model builder Hugging Face and formerly co-led Google’s AI ethics team, believes this could be done by allowing you to view details about why you see what you see on a social network, such as in response to the posts, ads and other material you view and interact with.

“You can even imagine having your say. You might be able to select preferences for the type of things you want to optimize for you,” she said, such as how often you want to see content from your immediate family, high school friends or baby photos. All of these things can change over time. Why not let the users control them?
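The user-controlled preferences Mitchell describes can be sketched as a ranking whose weights come from the user rather than the platform. The categories, weight values and function name below are hypothetical, made up for this illustration.

```python
# A toy sketch of user-controlled feed preferences: the user, not the
# platform, supplies the weights that decide what surfaces first.
# Categories and weights are hypothetical.

def rank_feed(posts, preferences):
    """Order posts by how much the user says they care about each category."""
    return sorted(posts, key=lambda p: preferences.get(p["category"], 0.0), reverse=True)

# This user wants lots of family content and very few baby photos.
my_preferences = {"immediate_family": 3.0, "high_school_friends": 1.0, "baby_photos": 0.1}

feed = [
    {"id": 1, "category": "baby_photos"},
    {"id": 2, "category": "immediate_family"},
    {"id": 3, "category": "high_school_friends"},
]

print([p["id"] for p in rank_feed(feed, my_preferences)])  # → [2, 3, 1]
```

Because the weights live outside the ranking function, the user could change them at any time, which is exactly the kind of control Mitchell suggests platforms could expose.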

Transparency is essential, she said, because it encourages good social media behavior.

According to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League, social media could also be pushed toward greater transparency by independent audits. They envision this including fully independent researchers, investigative journalists, or people within regulatory bodies (not the social media companies themselves or companies they hire) who have the knowledge, skills and legal authority to demand access to algorithmic systems in order to ensure laws are not violated and best practices are followed.

James Mickens, a professor of computer science at Harvard and co-director of the Berkman Klein Center’s Institute for Rebooting Social Media, suggests looking at how elections can be audited without revealing private information about voters (such as whom each person voted for) for insights into how algorithms could be audited and reformed. He thinks this could offer clues for building an auditing system that lets people outside Facebook provide oversight while protecting sensitive data.

Other indicators of success

A major barrier to meaningful improvement, experts say, is social media’s current emphasis on engagement: the time users spend scrolling, clicking, and otherwise interacting with posts and ads on social networks.

Haugen revealed internal Facebook documents showing the social network is aware that its “core product mechanics, such as virality, recommendations and optimizing for engagement, are a significant part” of why hate speech and misinformation “flourish” on its platform.

Changing that is tricky, experts said, though several agreed it may involve accounting for how users feel when using social media, not just how long they spend there.

“Engagement is not a synonym for good mental health,” said Mickens.

Can algorithms really help solve Facebook’s problems? Mickens, at least, hopes the answer is yes. He thinks they can be optimized further toward the public interest. “The question is: What will convince these companies to start thinking this way?” he said.

In the past, some might have said this would require pressure from the advertisers whose dollars back these platforms. But in his testimony, Haugen seemed to bet on a different answer: pressure from Congress.


