A blog from the Centre for Research Ethics & Bioethics (CRB)

Month: February 2021

Should social media platforms censor misinformation about COVID-19?

When the coronavirus began to spread outside China a year ago, the Director General of the World Health Organization said that we are not only fighting an epidemic, but also an infodemic. The term refers to the rapid spread of often false or questionable information.

While governments fight the pandemic through lockdowns, social media platforms such as Facebook, Twitter and YouTube fight the infodemic through lockdowns of another kind, restricting and framing information considered misinformation. Content can be provided with warning labels and links to what are considered more reliable sources of information. Content can also be removed, and in some cases accounts can be suspended.

In an article in EMBO Reports, Emilia Niemiec asks if there are wiser ways to handle the spread of medical misinformation than by letting commercial actors censor the content on their social media platforms. In addition to the fact that censorship seems to contradict the idea of these platforms as places where everyone can freely express their opinion, it is unclear how to determine what information is false and harmful. For example, should researchers be allowed to use YouTube to discuss possible negative consequences of the lockdowns? Or should such content be removed as harmful to the fight against the pandemic?

If commercial social media platforms remove content on their own initiative, why do they choose to do so? Do they do it because the content is scientifically controversial? Or because it is controversial in terms of public opinion? Moreover, in the midst of a pandemic with a new virus, the state of knowledge is not always as clear as one might wish. In such a situation it is natural that even scientific experts disagree on certain important issues. Can social media companies then make reasonable decisions about what we currently know scientifically? We would then have a new “authority” that makes important decisions about what should be considered scientifically proven or well-grounded.

Emilia Niemiec suggests that a wiser way to deal with the spread of medical misinformation is to increase people’s knowledge of how social media works, as well as how research and research communication work. She gives several examples of what we may need to learn about social media platforms and about research to be better equipped against medical misinformation. Education as a vaccine, in other words, which immunises us against misinformation. This immunisation should preferably take place as early as possible, she writes.

I would like to recommend Emilia Niemiec’s article as a thoughtful discussion of issues that easily provoke quick and strong opinions. Perhaps this is where the root of the problem lies. The pandemic scares us, which makes us mentally tense. Without that fear, it is difficult to understand the rapid spread of unjustifiably strong opinions about facts. Our fear in an uncertain situation makes us demand knowledge, precisely because it does not exist. Anything that does not point in the direction that our fear demands immediately arouses our anger. Fear and anger become an internal mechanism that, at lightning speed, generates hardened opinions about what is true and false, precisely because of the uncertainty of the issues and of the whole situation.

So I am dreaming of one further vaccine. Maybe we need to immunise ourselves also against the fear and the anger that uncertainty causes in our rapidly belief-forming intellects. Can we immunise ourselves against something as human as fear and anger in uncertain situations? In any case, the thoughtfulness of the article gives hope that we can.

Pär Segerdahl

Written by…

Pär Segerdahl, Associate Professor at the Centre for Research Ethics & Bioethics and editor of the Ethics Blog.

Niemiec, Emilia. 2020. COVID-19 and misinformation: Is censorship of social media a remedy to the spread of medical misinformation? EMBO Reports 21(11): e51420.

This post in Swedish

We recommend readings

How do we take responsibility for dual-use research?

We are more often than we think governed by old patterns of thought. As a philosopher, I find it fascinating to see how mental patterns capture us, how we get imprisoned in them, and how we can get out of them. With that in mind, I recently read a book chapter on something that is usually called dual-use research. Here, too, there are patterns of thought that can capture us.

In the chapter, Inga Ulnicane discusses how responsibility for neuroscientific dual-use research of concern was developed within the Human Brain Project (HBP). I read the chapter as a philosophical drama. The European rules that govern HBP are themselves governed by mental patterns about what dual-use research is. In order to take real responsibility for the project, it was therefore necessary within HBP to think oneself free from the patterns that governed the governance of the project. Responsibility became a philosophical challenge: to raise awareness of the real dual-use issues that may be associated with neuroscientific research.

Traditionally, “dual use” refers to civilian versus military uses. By regulating that research in HBP should focus exclusively on civil applications, it can be said that the regulation of the project was itself regulated by this pattern of thought. There are, of course, major military interests in neuroscientific research, not least because the research borders on information technology, robotics and artificial intelligence. Results can be used to improve soldiers’ abilities in combat. They can be used for more effective intelligence gathering, more powerful image analysis, faster threat detection, more accurate robotic weapons, and to satisfy many other military desires.

The problem is that there are more problematic desires than military ones. Research results can also be used to manipulate people’s thoughts and feelings for non-military purposes. They can be used to monitor populations and control their behaviour. It is impossible to say once and for all what problematic desires neuroscientific research can arouse, military and non-military. A single good idea can give rise to several bad ones in many other areas.

Within HBP, the preference is therefore to talk about beneficial and harmful uses, rather than civilian and military ones. This more open understanding of “the dual” means that one cannot identify problematic areas of use once and for all. Instead, continuous discussion is required among researchers, other actors and the general public to increase awareness of various possible problematic uses of neuroscientific research. We need to help each other see real problems, which can occur in completely different places than we expect. Since the problems moreover move across borders, global cooperation is needed between brain projects around the world.

Within HBP, it was found that an additional thought pattern governed the regulation of the project and made it more difficult to take real responsibility. The definition of dual use in the documents was taken from the EU export control regulation, which is not entirely relevant for research. Here, too, greater awareness is required, so that we do not get caught up in thought patterns about what it is that could possibly have dual uses.

My personal conclusion is that human challenges are not only caused by a lack of knowledge. They are also caused by how we are tempted to think, by how we unconsciously repeat seemingly obvious patterns of thought. Our tendency to become imprisoned in mental patterns makes us unaware of our real problems and opportunities. Therefore, we should take the human philosophical drama more seriously. We need to see the importance of philosophising ourselves free from our self-incurred captivity in enticing ways of thinking. This, I suggest, is what happened in the Human Brain Project when its members felt challenged by the question of what it really means to take responsibility for dual-use research of concern.

Read Inga Ulnicane’s enlightening chapter, The governance of dual-use research in the EU: The case of neuroscience, which also mentions other patterns that can govern our thinking about governance of dual-use research.

Pär Segerdahl

Written by…

Pär Segerdahl, Associate Professor at the Centre for Research Ethics & Bioethics and editor of the Ethics Blog.

Ulnicane, I. (2020). The governance of dual-use research in the EU: The case of neuroscience. In A. Calcara, R. Csernatoni, & C. Lavallée (Eds.), Emerging security technologies and EU governance: Actors, practices and processes (pp. 177-191). London: Routledge / Taylor & Francis Group.

This post in Swedish

Thinking about thinking