Is Facebook Morally Responsible?

21 August 2021

If you’ve been following the news, you’re aware that the Delta variant of the Coronavirus is all around us. While it poses only a very minor risk to the vaccinated, it is wreaking havoc on the unvaccinated.

At the time of writing, we’re back up to around 50,000 new cases a day in the United States, the vast majority among the unvaccinated (around 99% of those hospitalized with Delta are unvaccinated). So with the easy availability of vaccines (at least in the US), why doesn’t the remainder of the population just get the jab?

There are many and various answers to that question. But (obviously) one major causal factor is misinformation, much of which falsely casts vaccines in a negative light. And it is fair to say—despite Facebook’s positive efforts to spread accurate information—that it is still one of the major conduits of misinformation as well. Perhaps the major conduit. It turns out that just twelve people, including anti-vaxxer Robert F. Kennedy Jr., are responsible for generating 65% of the vaccine misinformation that gets shared on Facebook. Facebook, effectively, is their megaphone.

This brings us to our main question: 

Is Facebook morally responsible for the dissemination of misinformation on its platform? 

Opinion on this is divided. The Biden administration has been laying blame at the feet of social media, and of Facebook in particular. Facebook has argued that its positive efforts at spreading correct information let it off the hook.

What might a philosophical view of moral responsibility say on the matter? Here’s why this is a hard question. Facebook, in strange ways, combines personal communication among friends with widespread broadcasting of (mis)information: some posts end up reaching millions. That makes it very unlike media that have come before, for which we have adequate moral frameworks. To see the difficulty, consider two very different cases.

Case 1: The Paper Company

Suppose you and I own a paper company. We produce and sell paper, on which people can write whatever they like: letters, drawings, poems, histories, or . . . misinformation about vaccines, which may then be widely disseminated through the mail or other channels. Let’s say lots of misinformation is being printed and shared on paper we produced.

If we pose our question in relation to The Paper Company, it is intuitive that the answer is no, The Paper Company is not morally responsible. The Paper Company’s job is to provide a medium for communication—paper—not to police that communication. The responsibility for spreading misinformation in this case falls solely on the individuals who are producing and disseminating it. 

Case 2: The Traditional Media Broadcaster

Now suppose you and I own a company that broadcasts traditional media: we do radio and television news and talk shows. Let’s say also that lots of our guests and on-air personalities are propagating misinformation about vaccines. Consequently, millions of people are seeing, hearing, and in many cases believing it.

Is The Traditional Media Broadcaster morally responsible for the spread of misinformation? In this case yes, absolutely. Traditional broadcasters are in the business of curating content that gets shared on their platforms. Nor would it be hard to weed out misinformation if we tried, since we are choosing and paying the people who go on our stations. 

So what about Facebook?

These two cases help us see why the issue of Facebook’s responsibility is a puzzle. It’s not intuitively clear whether Facebook is more like The Paper Company or The Traditional Media Broadcaster. It is, in different ways, like both.

Facebook is like The Paper Company in two significant ways: (1) basically anyone can put content on it, and (2) it is to a large extent used for personal communication between individuals. But it is like The Traditional Media Broadcaster in two other significant ways: (a) any information put on it can easily spread to millions of people, and (b) there is some capacity for curating the information that goes up (e.g., through algorithms that promote or demote content).

This amalgam situation might seem to make our problem insoluble. Fortunately, however, there is a philosophical framework that can add clarity. 

John Martin Fischer and Mark Ravizza’s now-classic Responsibility and Control: A Theory of Moral Responsibility proposes that what they call “guidance control” is what’s required for a person to have moral responsibility for happenings of various sorts (actions, omissions, consequences, etc.). Guidance control comes down to two conditions (here I paraphrase):

  • First, the mechanism that produces the behavior or outcome in question must be the agent’s own.

  • Second, that mechanism must respond to reasons. 

So, for example, if I steal something, I am morally responsible because the psychological mechanisms that chose that path are mine and are responsive to reasons (had I had strong enough reason, I would have chosen something else). 

How can we extend this analysis to the dissemination of vaccine misinformation on Facebook’s platform?

A lot of work would be needed to extend the theory from individuals to corporate bodies like Facebook. But I think Fischer and Ravizza’s two conditions do illuminate why Facebook is to some degree morally responsible for the dissemination of misinformation—much more so than The Paper Company would be.

First, the mechanisms of dissemination (the whole platform, basically) do indeed belong to Facebook, so Facebook satisfies the first condition in a way that a paper company does not and cannot (once sold, the paper is out of their hands). But note that, since it is designed as a platform, there are many other mechanisms (housed in the minds of users) that are also contributing to dissemination, which Facebook does not own. So this makes Facebook a joint owner in the collective mechanisms of dissemination, rather than sole proprietor. 

Second, Facebook’s design of its platform is indeed responsive to reasons. They can change their algorithms if it suits their purposes, and they regularly do. And those algorithms were designed for reasons in the first place: the goal was to maximize the amount of time users spend on the platform.

So Facebook indeed has some guidance control over the spread of misinformation on its platform, unlike The Paper Company, and to that extent it has moral responsibility as well. 

The only rejoinder I see here for Facebook is that the steps it would have to take to prevent the spread of misinformation would be too draconian and contrary to the spirit of the platform, which is about letting people connect and talk with each other. But I’m not convinced. If Facebook’s mission is indeed to connect people, there are two things it could do better that wouldn’t compromise that mission.

First, they could stop being a megaphone for misinformation super-spreaders entirely. This step could take various forms, but one obvious move would be to hold any page with more than, say, 2,000 followers to much higher standards; doing this wouldn’t compromise what individuals can say to other individuals—it would just force liars with large followings to go elsewhere. Second, they could use their algorithms to comprehensively demote misinformation: under such improved algorithms, if I posted, say, the Gates microchip conspiracy theory on my wall, it would appear in no one’s feed—only people who deliberately came to my wall would see it.
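The two steps above amount to a simple distribution policy once content has been flagged. Here is a minimal, purely illustrative Python sketch; every name in it (`Post`, `distribution_audience`, the misinformation flag, the threshold) is invented for this post and has nothing to do with Facebook's actual systems:

```python
from dataclasses import dataclass

# Hypothetical cutoff: pages above this follower count are held to
# stricter standards, per the first proposed step.
LARGE_PAGE_THRESHOLD = 2_000

@dataclass
class Post:
    author_followers: int
    flagged_misinformation: bool  # assumes some upstream fact-checking process

def distribution_audience(post: Post) -> str:
    """Decide how widely a post circulates under the two proposed rules."""
    if post.flagged_misinformation:
        if post.author_followers > LARGE_PAGE_THRESHOLD:
            # Rule 1: no megaphone for large-audience misinformation pages.
            return "removed"
        # Rule 2: demote -- visible only to people who visit the wall directly,
        # never injected into anyone's feed.
        return "wall_only"
    # Ordinary posts circulate normally.
    return "feed"

# A flagged post from a small personal account stays on the author's wall
# but reaches no feeds.
print(distribution_audience(Post(author_followers=150, flagged_misinformation=True)))
```

The sketch only shows that the distribution policy itself is mechanically simple once content is flagged; the hard part, of course, is the flagging.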

Both steps, I think, are feasible and would have an impact, especially as Delta rages around the globe. Since Facebook is, if my analysis is correct, at least partly morally responsible for the spread of misinformation that is stopping people from being vaccinated, it is also partly morally to blame for the preventable deaths that have followed. 

Image from Wikimedia Commons

Comments (9)


Harold G. Neuman

Tuesday, August 24, 2021 -- 7:15 AM

I offered some remarks on this post. Thought they had been accepted. Guess not. Was it something I said?

Tim Smith

Tuesday, August 24, 2021 -- 1:05 PM

To say that theoretical work is needed to extend Fischer and Ravizza's guidance control model of moral responsibility to Facebook and other commercial interests is an understatement. Neil doesn't do this work here. Still, he makes an intuitive stab that is not only praiseworthy but promising in the face of the artificial intelligence used by FB, Google, and the Chinese government (I am giving the NSA a pass for now, but Neil is coming for them, as am I, soon enough).

Fischer and Ravizza don't do this work. They specifically state as much in Responsibility and Control:

"It should be noted that the term 'responsibility' admits of a variety of uses in addition to causal and moral responsibility. For example, the general term 'responsibility' also is used to refer to legal responsibility, corporate responsibility, role responsibility (i.e., the type of responsibility the captain of a ship assumes for the safety of his vessel), and so forth. In this book, we shall be focusing primarily on the issues surrounding moral responsibility."

If this theoretical work is done, I am unsure where this leaves the moral responsibility of the 12 bad actors, the board of directors, shareholders, or CEOs.

Applying individual morality to commercial concerns is a novel idea to me. Previously I had thought the moral responsibility of Facebook lay in the algorithmic amplification of these voices and not in FB's inherent guidance control.

There is an established literature on corporate social responsibility going back to the 1950s (maybe earlier if you take Quaker sentiments that would articulate with this guidance control idea?)

It would be good to disambiguate misinformation from disinformation. These 12 bad actors are making money off their postings. That money should be captured at the source and redirected to the victims. The intentionality and money-making aspects turn the misinformation into disinformation, in my understanding. It is OK to make mistakes. It is not OK to preach anti-vaccine ideology and get the vaccine in private (as Trump did, despite his previous antibody treatment; this is immorality transcending stupidity).

David Livingstone Smith did a series on the Intellectual Dark Web a while back that gets at some core issues. I'm guilty of not having read his work closely. PT is very lucky to have intelligent philosophers like David and Neil posting in this space. I can't entirely agree with their thought, but I am learning not to dismiss them without rethinking my own. That doesn't always work out well for me.

Read David's entire series on the IDW:
1. Dark Knowledge https://www.philosophytalk.org/blog/dark-knowledge
2. Dark Knowledge: A User's Guide https://www.philosophytalk.org/blog/dark-knowledge-users-guide
3. An Antidote to Bullshit https://www.philosophytalk.org/blog/antidote-bullshit
4. Enlightenment Peddlers https://www.philosophytalk.org/blog/enlightenment-peddlers

Harold G. Neuman

Saturday, August 28, 2021 -- 1:17 PM

They are a commercial enterprise. Moral responsibility is not in the job description. This is not ethics or morality, only economics. Nothing less. If you wish to read about bullshit, read Harry Frankfurt. Or John Dewey's remarks about beliefs... Ahem.

Tim Smith

Sunday, August 29, 2021 -- 12:39 AM

Don't be evil.

Tim Smith

Monday, August 30, 2021 -- 9:51 PM

Harold,

It is perfectly valid for the Taliban to execute a comedian for violating fundamental moral laws. That doesn't make it right and good, but it is a crude morality. When Google took the "Don't be evil." statement into their code of ethics, they didn't comprehend its complications, but they did take a stance. Your contention that commercial interests don't have moral responsibility is the extreme minority view among philosophers but valid nonetheless. I'm not sure if I see selling ice cream in the West Bank settlements in the same light that I do Google's search algorithms limiting results for profit. I do understand the tenet – don't be evil. I think your guidance here is evil if legal and commercial interests disinform people to the point of causing death without any conception of a moral compass.

Frankfurt's definition of bullshit is a disregard for the truth. I'm not sure if you are saying Neil's post is bullshit or my comment, but I welcome the statement if you back it up with detail. I get that you disagree.

Dismissing someone's point of view as bullshit is not something that should be done lightly without taking on the onus of bullshit itself. We may all be slinging BS, but honestly, I don't see your point. The idea of guidance control, or consequentialist reasoning, is more than enough to posit legitimate moral responsibility to corporations.

It is possible to disagree while getting to the truth regardless. I'm not sure why you call people's ideas bullshit or stupid (like you did Nick Riggle's in the awesome thread.)

Dewey was unaware of Citizens United, the powerful and more recent writing on corporate social responsibility, information theory, or AI. You seem to intimate that Dewey's thoughts on belief are definitive. Dewey was, above all, a pragmatist and open to changing his view based on new ideas. It is impossible to say what Dewey would say here, but I think it would be thoughtful. Frankfurt says bullshit comes when people feel compelled to give an opinion on issues. I appreciate the pointer. I've read and even may understand it. Am I wrong?

I see that Frankfurt has a newer book out that I wasn't even aware of, "On Inequality." Maybe there is something there as well to rein in my thoughts. If one reads, I'm sure we can get it right eventually if we reflect a bit. I, at least, will do that.

Harold G. Neuman

Monday, September 13, 2021 -- 2:48 PM

Don't see any of my remarks as evil, Tim. Capitalism is about money. Period. It is not altruistic. It is about money---lots of it. Exclamation point... how wealthy are you, amigo?

Tim Smith

Tuesday, September 14, 2021 -- 3:22 AM

Wealth doesn't buy friendship, amigo. If you want to go down this path, enjoy the ride.

There is a reason Google put 'Don't be evil.' in their code of conduct. There is a reason FB didn't. There's a reason Google distanced itself from the phrase and that Steve Jobs called the sentiment bullshit. Why is FB calling on academic oversight to get it through this next election? The need for guidance control in specific industries is paramount. In others, guidance is foolish. One rarely embargoes safety.

There isn't one capitalist state. Not one. It is an idea. It stops about when it comes time to take out the trash, put out fires, or print money.

Ecuador can give up its bank to Bitcoin. It won't save them. It might help in the short run. But you need to control your money - in the long view.

You are entitled to your belief. Your remarks are valid if not universally shared.

Neil's point is also valid. It is worth discussing and is not bullshit. Ahem...

I encourage you and your amigos not to be evil. Evil is what evil does. One way not to harm is to respect what others think, say, and do enough to allow yourself room to grow or at least see their perspective. That way pays dividends. It is what philosophy does.

Neil Van Leeuwen

Thursday, September 16, 2021 -- 1:20 PM

Just seeing this exchange now!

Harold, I think your position is absurd. I agree that corporations can and do seek to enrich their shareholders. That's fine. But that doesn't entail that morality doesn't apply to them. If, for example, a corporation knowingly pollutes a community's water supply with chemicals that cause sickness (as has happened often enough), then it has done something morally bad. The fact that this action may have saved it money (and hence helped enrich its shareholders) is no excuse.

Tim Smith

Thursday, September 16, 2021 -- 11:10 PM

Neil,

I would push back, pile on, and maybe point to the exceptional case that FB, Google, and others present. In general, though, I appreciate your post and your response here, and I look forward to Harold's reaction. Companies have moral responsibility above and beyond profit.

However, corporations don't have full moral responsibility and rights even if we extend them personhood (which the Supreme Court has done). For example, we don't extend the moral right to life to companies, with the solitary exception of those too big to fail, which would cause more damage by being dissolved. For the most part, companies fail more often than not, and when they do, it is without much moral grief or concern.

In all fairness, and in anticipation of the rebroadcast of the Gandhi as a Philosopher show next week, Gandhi famously did not consider corporations to be moral agents. His opinion on this is a prime example of where he lacked philosophical chops, in my opinion. I doubt he would have felt the same after the Bhopal disaster.

But it doesn't stop there. The US has passed legislation protecting corporations from legal responsibility for vaccine safety in order to incentivize this critical research. Might we consider Johnson and Johnson less morally responsible for the vaccine failures in Baltimore in that light? It did cost lives.

Specifically, the Cutter incident raised the bar for corporate good deeds. Congress then lowered that responsibility when the National Vaccine Injury Compensation Program was introduced in 1986 to protect vaccine manufacturers from litigation. Acts like this seem to argue for reduced moral responsibility for corporations that act in low-profit, high-risk scenarios that serve the public good. (After all, we don't hold doctors and nurses liable for helping people in distress on planes in flight, as long as they act in good faith.)

Harold's argument is too extreme, however, as was Gandhi's. How Volkswagen can be held accountable for falsifying emissions tests while its executives suffer no consequences also eludes my moral compass. Both the guidance controllers and the corporations as a whole need to bear the brunt of the responsibility. The project you are calling for in this blog, extending guidance control to companies, would parse this out.

Your blog here, in addition, has made me think that certain corporations (FB, Alphabet, MS, Apple, Baidu, Twitter, and others) raise specific concerns, as their policies affect the core moral judgment of people, often without their knowledge. They are bordering on Case 2 liability by offering the hypertextual paper for disinformation. If so, this puts a heightened premium on thinking out the details of corporate guidance control, to fix the inhuman forces driving elections, economies, and the environment to the point of catastrophic failure.

Good post and appreciated response here. Thanks for taking the time.
