[image_credit]REUTERS/Johanna Geron/Illustration[/image_credit]
From online extremism that helped trigger the U.S. Capitol riot to false coronavirus claims and vaccine misinformation, it’s no secret that social media platforms like Facebook spread outrageous and erroneous content. What’s more, the Facebook Files, a collection of internal documents leaked by former Facebook product manager Frances Haugen, show that the company played down these harmful effects. If there was ever a moment to push for social media reform, it’s now.

So far, companies like Facebook have said they just need to keep hiring more content moderators to screen and remove dangerous posts and lies. Yet content moderation hasn’t stopped extremism and misinformation from spreading; it remains an endless game of whack-a-mole. Here’s why.

First, consider the problem of scale. Facebook, for instance, has billions of users, many of whom post daily. The sheer volume of user-generated content makes it impractical to hire a team of moderators large enough to screen and remove every harmful post.
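
To make the scale problem concrete, here is a back-of-envelope sketch in Python. Every figure in it (user count, posting rate, review time) is an illustrative assumption, not a real Facebook number.

```python
# Back-of-envelope sketch of the moderation-scale problem.
# All figures here are illustrative assumptions, not Facebook's numbers.

users = 3_000_000_000              # assumed monthly active users
posting_rate = 0.01                # assume just 1% of users post once per day
posts_per_day = users * posting_rate

seconds_per_review = 30            # assumed time to screen a single post
shift_seconds = 8 * 60 * 60        # one 8-hour moderator shift
reviews_per_shift = shift_seconds / seconds_per_review   # 960 posts per day

moderators_needed = posts_per_day / reviews_per_shift
print(f"{posts_per_day:,.0f} posts/day -> ~{moderators_needed:,.0f} moderators")
# 30,000,000 posts/day -> ~31,250 moderators, just to glance once at the
# daily posts of 1% of users. Real posting volumes are far higher.
```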

Second, consider another problem working against content moderators: the social media algorithms that distribute content online. The same algorithms that make Facebook so effective for advertising and sharing also make the platform increasingly difficult, if not impossible, to moderate. That’s because the algorithms spreading (mis)information work faster than the content moderators trying to screen and remove it.

To wrap our heads around this dilemma, consider what social media algorithms are designed to do. First, hijack your attention when you log on, say, with click-bait ads or viral content. Next, collect the data traces you leave online, including your “likes” and the time you spend scrolling and glancing at ads or content. Then, sell access to your data to advertisers and outside parties, whose goal is to target you with more ads and content that’ll keep you “liking” and scrolling.
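
To make that loop concrete, here is a minimal sketch of an engagement-maximizing feed ranker. The names and weights are hypothetical, and this is not Facebook’s actual algorithm; it only illustrates why a feed that optimizes purely for attention tends to surface outrage over accuracy.

```python
# Illustrative sketch of an engagement-maximizing feed ranker.
# Not Facebook's code; all names and weights here are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int = 0             # click-throughs the post attracted
    likes: int = 0              # "like" reactions
    dwell_seconds: float = 0.0  # time users spent looking at it

def engagement_score(post: Post) -> float:
    """Score a post purely by the attention it captured.
    Truthfulness is not an input: outrage scores just as well as accuracy."""
    return 1.0 * post.clicks + 2.0 * post.likes + 0.1 * post.dwell_seconds

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    """Return the posts most likely to keep a user 'liking' and scrolling."""
    return sorted(posts, key=engagement_score, reverse=True)[:limit]

if __name__ == "__main__":
    feed = [
        Post("Calm, accurate report", clicks=40, likes=10, dwell_seconds=200.0),
        Post("Outrageous false claim!", clicks=400, likes=300, dwell_seconds=900.0),
    ]
    for post in rank_feed(feed):
        print(f"{engagement_score(post):7.1f}  {post.text}")
```

Notice the design choice: the score never asks whether a post is true, only whether it holds attention.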

By harvesting your private data to manipulate what you see online, social media algorithms work around the clock to target people with nonstop ads and content, especially information that keeps everyone addicted to “liking” and scrolling.

Unfortunately, this addictive design rewards outrageous and erroneous information. After all, what addicts people to “liking” and scrolling is often what’s outrageous, and what’s outrageous isn’t necessarily truthful. The result is a society left vulnerable to bad actors who provoke mobs by ginning up online outrage and spreading falsehoods.

Content moderation may sound good in theory, but it repeatedly fails in practice. That’s because social media’s problem isn’t merely an epidemic of extremism and misinformation. It’s that these platforms manipulate what people see online, with the goal of addicting users to “liking” and scrolling through endless content. As a result, social media platforms end up generating more information — including extreme and misleading info — than can possibly be taken down.

[image_caption]Christopher Cocchiarella[/image_caption]
If we want to mitigate extremism and misinformation on social media, we need to change the manipulative algorithms and addictive design that characterize social networking sites like Facebook. Fortunately, it’s a solvable problem. Here are two reforms that could make a difference.

First, lawmakers should implement policies that enhance personal privacy and data protection. Regulating how social media algorithms harvest private data would put a reasonable check on how these platforms manipulate users online. Such regulation may include limiting how much private data can be collected by social media companies, as well as restricting how data can be accessed or used by advertisers and outside parties.

[image_caption]Terry Chaney[/image_caption]
To date, some states have passed laws that move us in this direction, including the California Consumer Privacy Act and Virginia’s Consumer Data Protection Act. Minnesota might be the next state to lead the way with the Minnesota Consumer Data Privacy Act. This legislation may compel national action to protect our private data, not unlike the European Union’s General Data Protection Regulation.

Second, social media companies like Facebook should redesign their platforms to eliminate addictive features, such as “like” buttons and infinite scrolling. Curtailing addictive design would transform social networking sites for the better: instead of hooking users on outrageous and erroneous information, these platforms could help people encounter credible and accurate information.
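
As a purely hypothetical illustration of what one such redesign could look like in code, here is a sketch of a feed that replaces auto-loading infinite scroll with explicit, user-initiated pagination. The function names and page size are invented for the example.

```python
# Hypothetical sketch: a feed with a natural stopping point, instead of an
# infinite scroll that silently auto-loads the next batch of posts.

from typing import Optional

PAGE_SIZE = 10  # a finite page the user finishes before choosing to continue

def get_page(posts: list[str], cursor: int = 0) -> tuple[list[str], Optional[int]]:
    """Return one page of posts plus a cursor for the next page, or None.

    The next page loads only when the user explicitly requests it, so every
    page ends with a built-in pause rather than more content."""
    page = posts[cursor:cursor + PAGE_SIZE]
    next_cursor = cursor + PAGE_SIZE
    return page, (next_cursor if next_cursor < len(posts) else None)

if __name__ == "__main__":
    posts = [f"post {i}" for i in range(25)]
    first_page, next_cursor = get_page(posts)
    print(f"{len(first_page)} posts shown; more only on request: {next_cursor is not None}")
```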

As we continue losing people to a prolonged pandemic, finding credible and accurate information can often mean the difference between life and death. In this way, our future well-being may depend on reforming social media.

Christopher Cocchiarella is a training and development specialist with a background in technical communication and user experience. Terry Chaney is a policy analyst with a background in economics and technology policy. They live and work in the Twin Cities.

Comments

  1. The only solution is to quit Facebook. That was easy.

    The problem is getting your grandma to quit Facebook.

  2. A solution should focus on those who post content: ban those who consistently post and make money off lies and hate speech. Apparently, some well-known vaccine deniers are getting rich off their lies. Ban them, and if someone attempts to repost their lies, ban them as well. Of course, Facebook will resist losing eyeballs, but if something causes mass departures, its profitability is gone.

  3. If only “disinformation” were something everyone agreed upon. Or didn’t change from month to month. Or do we really want to crush opinion or dissent?

    1. I get that there is a slippery slope, but a lot of misinformation is objectively false. For example, Trump’s election fraud claims. There is zero doubt whatsoever that it’s all completely false. That would be easy to fix.

      Covid is harder, but there are a lot of parts that are objectively false – microchips in vaccines, etc.

  4. Nice to see an article with good ideas, and one that pushes back on the trend of censorship.

  5. What’s worse than misinformation, propaganda and general nonsense online is the government trying to invent rules to curb them. Who gets to decide what is misinformation and what is the truth? A committee set up by Amy Klobuchar?

    I don’t care for misinformation so I don’t read it. Try it.

  6. Content moderation as practiced by the likes of Twitter and Facebook will only silence half the extremists. Of course, that’s the whole point.

  7. The social networking companies are very good at tracking users. Any page that gets to, say, 10,000 daily impressions is now public media like a cable network or newspaper and should be subject to the same liability provisions as FOX, CNN, MSNBC, the NY Times, or the WAPO.

    Look at the reaction to false stories about Dominion Voting Systems once they started suing for billions. All of a sudden even NewsMax started to be concerned about their content and its truthfulness.

    A few billion-dollar lawsuits and Facebook will figure things out pretty quickly.
