What’s a Platform To Do?

I’ve written a few threaded tweets on this subject this week, the first before reading about the special event Facebook held to tell journalists all about the actions it is taking to limit the spread of fake news or, as I prefer to call it, lies, conspiracy theories and propaganda. That event turned into a PR disaster, but it revealed a lot more about Facebook’s thinking and its apparent lack of either understanding of, or commitment to fixing, the problem. Tweets, even threaded ones, are not a great place to lay out detailed thoughts on a subject as complex and important as this, so I am writing them up here.

How did we get here?

Before we talk about solutions it is important to get a good understanding of the problem, and how it came about. Where these platforms are concerned there have been two distinct issues with their products that, when combined, allowed various groups to mislead people on a large scale, and keep reinforcing their lies even today.

The first was allowing third parties to gain access to personal information on millions of their users via their APIs. While I believe naïveté was probably the root cause of this, the platforms affected must take responsibility for it, and ensure that any data they do provide to third parties is limited to just what those parties need and nothing more. Users must explicitly grant permission, and that permission must be limited to just their own data (i.e. my permission cannot also release information about my contacts, including the fact that they are my contacts).

The second is more egregious in my opinion: taking the stance that they will be neutral platforms that allow their users to say and share anything under the guise of free speech. Maybe this was naïveté too, but it feels much more like a business decision to maximize the number of users and to keep even extreme organizations, like InfoWars, paying to promote content and/or generating ad revenue.

Once governments around the world started looking into these activities, Facebook in particular reverted to its tech roots and handled it like any other bug report: it promised to tighten up rules for data access and to implement new algorithms that demote articles its fact checkers judge to be false.

Demotion is their solution!

In response to a tweet from CNN’s Oliver Darcy, who said that Facebook had no good answer to his question of why InfoWars, a site that pushes little if anything but extreme conspiracy theories and lies, still had a Facebook page, Facebook’s Twitter account posted this:

Later in the same thread, they posted this in response to a tweet from Kevin Roose of the New York Times:

It appears clear from this thread that Facebook has no intention of banning even the most egregious conspiracy theory sites from its platform. I’m not even talking about blocking users from sharing these stories in their own posts; this is about these sites maintaining their own pages on the service. Like it or not, that gives them some degree of legitimacy.

Remember, Facebook is the service that insisted people use their real names, even those with very good reasons to hide their true identities online for fear of persecution. Yet it is happy to host pages that spew conspiracy theories, lies and, in many cases, hate. Talk about a double standard.

Demotion, the solution they have opted for, might reduce the number of people exposed to the lies, but it doesn’t stop a story from spreading. It also appears demotion will only happen after a manual review by a fact checker, which in turn is triggered by user reports. A lot of damage can be done before that chain completes and the post is demoted.

Better Solutions

The most obvious solution is to ban sites that repeatedly spread lies. For some, that could be done immediately. I’m sure Facebook et al could look at their analytics and extract the top 100 sites based on shares. Let their fact checkers spend some time scanning those sites and assessing how trustworthy each one is. If the majority of a site’s content is lies, propaganda or conspiracy theories, simply ban it: ban it from having a page, and ban users from sharing links to it.
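To make the “top 100 sites by shares” idea concrete, here is a rough Python sketch that ranks outbound domains by share count so fact checkers would know where to start. The function name, input format and example domains are all my own invention; a real pipeline would obviously sit on the platform’s internal analytics rather than a list of URLs.

from collections import Counter
from urllib.parse import urlparse

# Hypothetical review queue: rank shared domains by volume so fact checkers
# look at the highest-impact sites first. The input format is an assumption.
def top_shared_domains(shared_urls, limit=100):
    counts = Counter(urlparse(url).netloc.lower() for url in shared_urls)
    return counts.most_common(limit)


shares = [
    "https://example-conspiracy-site.com/story-1",
    "https://example-conspiracy-site.com/story-2",
    "https://reputable-news.example/report",
]
print(top_shared_domains(shares, limit=2))
# [('example-conspiracy-site.com', 2), ('reputable-news.example', 1)]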

Once those sites are dealt with, move down the list. User reports can also trigger a review, and for sites that only sometimes share untruthful content, a three-strikes system could be employed, similar to the mechanism YouTube uses.
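As a rough illustration of how a strikes mechanism might work, here is a minimal Python sketch. The thresholds and the warn/demote/ban escalation are assumptions on my part, not a description of anything Facebook or YouTube has actually built; the point is only that the bookkeeping involved is trivial.

from dataclasses import dataclass, field

# Hypothetical three-strikes tracker; the thresholds and actions are
# assumptions, not any platform's real enforcement pipeline.
@dataclass
class StrikeTracker:
    strikes: dict = field(default_factory=dict)  # site domain -> confirmed strikes

    def record_strike(self, domain):
        """Record one fact-checker-confirmed falsehood and return the action."""
        self.strikes[domain] = self.strikes.get(domain, 0) + 1
        count = self.strikes[domain]
        if count == 1:
            return "warn"    # first offence: notify the page owner
        if count == 2:
            return "demote"  # second offence: demote the page's posts
        return "ban"         # third offence: remove the page, block link shares


tracker = StrikeTracker()
print(tracker.record_strike("example-conspiracy-site.com"))  # warn
print(tracker.record_strike("example-conspiracy-site.com"))  # demote
print(tracker.record_strike("example-conspiracy-site.com"))  # ban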

Maybe over time, artificial intelligence models can be built to help recognize and classify posts as well. At the very least, they should be able to watch for posts that are spreading fast, and perhaps also analyze them, and the comments being posted under them, to determine something about the nature of the content.
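And to sketch the “watch for posts that are spreading fast” idea, here is a simple share-velocity check in Python. The window size, threshold and data shape are invented for illustration; a system like this would only flag posts for human fact checkers, not act on its own.

import time
from collections import deque

# Hypothetical share-velocity monitor: flag a post for human fact-check review
# when its share rate over a sliding window crosses a threshold. The window,
# threshold and data shape are assumptions for illustration only.
class ShareVelocityMonitor:
    def __init__(self, window_seconds=3600, threshold=1000):
        self.window_seconds = window_seconds
        self.threshold = threshold
        self.shares = {}  # post id -> deque of share timestamps

    def record_share(self, post_id, now=None):
        """Record one share; return True if the post should be sent for review."""
        now = time.time() if now is None else now
        q = self.shares.setdefault(post_id, deque())
        q.append(now)
        # Drop shares that have fallen outside the sliding window.
        while q and now - q[0] > self.window_seconds:
            q.popleft()
        return len(q) >= self.threshold


# Example: 1,000 shares of the same post within an hour trips the flag.
monitor = ShareVelocityMonitor(window_seconds=3600, threshold=1000)
flagged = False
for i in range(1000):
    flagged = monitor.record_share("post-123", now=1000.0 + i)
print(flagged)  # True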

Benefits to Society

Banning sites from these platforms sounds a lot like censorship, but it is not really. Nobody is taking the sites down. It is just a third party platform taking some responsibility for the quality of the content it promotes or hosts. Facebook and Twitter are not ISPs. They are not required to be neutral, and, indeed, they have rules and will delete posts and ban users who violate those rules.

Reflect on this quote:

“If you tell a big enough lie and tell it frequently enough, it will be believed.”
― Adolf Hitler

If nothing else, Fox News, Breitbart, InfoWars et al have demonstrated quite convincingly that this is indeed the case. In a retweet earlier this week, I said this:

The viewers of Fox News, along with those hooked on these conspiracy theory sites, are indeed victims. They have been mentally abused by these outlets into believing both that the mainstream media is all lies and that the lies these outlets tell are the truth.

Somehow we need to reach these people and help them see that they have been lied to and taken advantage of. Changing Fox News is a lost cause, but changing how social media handles these sites to prevent them spreading their lies is, I believe, achievable.

Claiming that the reason these sites need to be allowed on social media platforms is to protect freedom of speech is nonsense. These sites have their own servers. Nobody is denying them their freedom to speak.

Facebook needs to consider whether facilitating the sharing of conspiracy theories and lies has any benefit to society at large. Their new mission statement says this:

Founded in 2004, Facebook’s mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.

How does allowing the sharing of lies and conspiracy theories, many of which promote fear or hate of large groups of society, help “bring the world closer together?”

Terms of Service

Spreading lies is also already covered by the Facebook Terms of Service:

You may not use our Products to do or share anything:

  • That violates these Terms, our Community Standards, and other terms and policies that apply to your use of Facebook.
  • That is unlawful, misleading, discriminatory or fraudulent.
  • That infringes or violates someone else’s rights.

The phrase “You may not use our Products to do or share anything … misleading” sounds a lot like it would apply to conspiracy theories and lies. So these sites are already in violation of the terms of service.
