Conformity and information cascade

Conformity refers to the tendency to align one’s attitudes, beliefs, and behaviours with those of the people around oneself. It is most commonly understood as trying to “fit in” with a group, whether in response to social pressure or in order to be perceived as “normal”. This phenomenon has been well documented by psychologists since the mid-20th century and has come to be known as ‘conformity bias’.

The study most commonly cited to illustrate conformity bias is that of Solomon Asch (1951). In Asch’s experiments, participants could either conform with the clearly incorrect judgements of their peers (confederates who were “in” on the experiment), or diverge from the group and state the clearly correct answer. Asch found that, on average, participants chose to conform about 30% of the time.

O’Connor and Weatherall (2018) note that subsequent studies have yielded similar results, though outcomes vary with subtle experimental differences. These studies suggest two things: first, that humans do not like to disagree with others, and second, that we sometimes trust the judgement of other agents more than our own (O’Connor & Weatherall, 2018, p. 7258; O’Connor & Weatherall, 2019, p. 81; Zollman, 2010, p. 318).

More recently, epistemologists have taken models from other fields, such as economics, and applied them to epistemic networks. In The Misinformation Age: How False Beliefs Spread (2019), O’Connor and Weatherall apply a framework developed by economists Venkatesh Bala and Sanjeev Goyal (1998) to analyse conformity across networks of both scientists and laypeople. Before considering this model, a quick overview of Bayesian inference will be helpful.

Bayesian inference, in the most basic sense, refers to the application of Bayes’ Rule (a mathematical formula) to understand how an individual agent should change their beliefs as they encounter new evidence. Beliefs, or more precisely, levels of confidence in beliefs, can be understood as degrees of probability, represented by a number between 0 (complete disbelief) and 1 (complete belief), with 0.5 representing indifference. When new evidence is encountered, an epistemic agent makes use of previous beliefs (priors) to assess the probability of the new data (the likelihood), ultimately arriving at an updated (posterior) probability. For example, an agent might hold a belief with a credence of 0.5. After incorporating new evidence, their credence might rise to 0.75, meaning that they are now more confident in that belief. The basic takeaway, however, is that as we gather more evidence, our beliefs change to reflect that evidence.
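To make this concrete, here is a minimal sketch of a single Bayesian update in Python. The numbers are invented for illustration, chosen to match the example above: the agent starts at a credence of 0.5 and observes evidence that is three times as likely if the belief is true as if it is false, which lifts the credence to 0.75.

```python
# A minimal sketch of one Bayesian update. All numbers are invented for
# illustration: a prior credence of 0.5, and evidence three times as
# likely if the belief is true than if it is false.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior P(belief | evidence) via Bayes' Rule."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

posterior = bayes_update(prior=0.5,
                         likelihood_if_true=0.75,
                         likelihood_if_false=0.25)
print(posterior)  # 0.75: the agent is now more confident in the belief
```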

Bala and Goyal’s model, as summarised by O’Connor & Weatherall (2019, p. 54), is as follows. A group of agents is trying to choose between two possible actions, guided not only by their own evidence but also by that of their peers. The two actions are assumed to differ in their likelihood of achieving a desired result. Over a series of rounds, each agent chooses one action or the other based on their current beliefs (priors) and the new information gathered after each round; importantly, this information includes not only the outcomes of their own previous actions but also those of their peers (O’Connor & Weatherall, 2019, p. 54).
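For readers who like to tinker, below is a toy Python sketch of a Bala-Goyal-style learning network, loosely following O’Connor and Weatherall’s informal presentation. Every specific value here (ten agents in a fully connected network, a new action whose success rate is either 0.6 or 0.4 against a familiar action’s known rate of 0.5, ten trials per round) is my own illustrative assumption, not a parameter from the book.

```python
import random

# A toy sketch of a Bala-Goyal-style learning network. All parameter
# values below are illustrative assumptions, not values from the book.

N_AGENTS, N_ROUNDS, N_TRIALS = 10, 50, 10
P_NEW_IF_BETTER = 0.6   # new action's success rate if it truly is better
P_NEW_IF_WORSE = 0.4    # its success rate if it is actually worse
NEW_IS_BETTER = True    # the hidden truth in this run
# The familiar action is assumed to succeed at a known rate of 0.5, so
# only the new action generates informative evidence.

# Each agent's credence that the new action is the better one.
credences = [random.random() for _ in range(N_AGENTS)]

for _ in range(N_ROUNDS):
    # Agents who currently favour the new action try it out and share
    # their results with the whole network.
    evidence = []  # (successes, trials) pairs, visible to every agent
    p_true = P_NEW_IF_BETTER if NEW_IS_BETTER else P_NEW_IF_WORSE
    for c in credences:
        if c > 0.5:
            successes = sum(random.random() < p_true for _ in range(N_TRIALS))
            evidence.append((successes, N_TRIALS))

    # Every agent updates on the pooled evidence via Bayes' Rule.
    for i, c in enumerate(credences):
        for successes, trials in evidence:
            like_better = (P_NEW_IF_BETTER ** successes
                           * (1 - P_NEW_IF_BETTER) ** (trials - successes))
            like_worse = (P_NEW_IF_WORSE ** successes
                          * (1 - P_NEW_IF_WORSE) ** (trials - successes))
            c = c * like_better / (c * like_better + (1 - c) * like_worse)
        credences[i] = c

# The network almost always ends up in agreement, one way or the other.
print([round(c, 2) for c in credences])
```

Running the sketch a few times anticipates the result discussed next: the group nearly always converges on a shared credence, usually on the truth, but occasionally an unlucky early run of evidence drags everyone to the wrong conclusion.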

O’Connor and Weatherall highlight that Bala-Goyal models show how, once evidence is shared, an epistemic network is extremely likely to converge on the same beliefs, whether those beliefs are true or false. They also note that in these models, convergence on a single belief does not happen as a result of individual psychological responses (for example, attempting to “fit in” or “be normal”), as these are not accounted for; instead, it arises solely from the sharing of evidence (O’Connor & Weatherall, 2019, pp. 60-61).

Social influence on beliefs can also be epistemically productive. Kevin Zollman notes that individuals are generally better at forming accurate beliefs when part of a group than when relying solely on their own judgement, though he also notes that this is not always the case (Zollman, 2010, pp. 335-336). That is, while the sharing of evidence can be beneficial to the formation of accurate beliefs, it can also backfire, leading to consensus on a false belief. Bikhchandani, Hirshleifer, and Welch have highlighted how a false belief can spread through a group of individuals as a result of social connections, regardless of contradictory evidence, a phenomenon known as an “information cascade” (Bikhchandani, Hirshleifer, & Welch, 1998, in O’Connor & Weatherall, 2019, p. 82). In such cases, where the spread and acceptance of misleading evidence leads to false beliefs, it would have been better for individual agents not to communicate at all. That buffer would protect some agents from misleading evidence, allowing them to pursue more accurate evidence and, in turn, true beliefs (O’Connor & Weatherall, 2019, p. 63).
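The cascade mechanism can likewise be sketched in a few lines of Python. This is my own simplified illustration, not the authors’ formal model: agents choose in sequence, each counting every earlier public choice and their own private signal as one vote each, breaking ties with their own signal. That simple vote-counting rule is enough to reproduce the lock-in the model describes.

```python
import random

# A toy sketch of an information cascade in the spirit of Bikhchandani,
# Hirshleifer, and Welch (1998). A simplified illustration, not the
# authors' formal model; the parameter values are invented.

SIGNAL_ACCURACY = 0.7    # chance each private signal points to the truth
N_AGENTS = 20
TRUTH, OTHER = "A", "B"  # option A is the genuinely better choice

choices = []  # the public record of choices, visible to later agents
for _ in range(N_AGENTS):
    signal = TRUTH if random.random() < SIGNAL_ACCURACY else OTHER
    # Count earlier public choices plus one's own signal as equal votes.
    votes_a = choices.count("A") + (1 if signal == "A" else 0)
    votes_b = choices.count("B") + (1 if signal == "B" else 0)
    if votes_a != votes_b:
        choices.append("A" if votes_a > votes_b else "B")
    else:
        choices.append(signal)  # tie: fall back on one's own signal

print(choices)
# Once either option leads by two, every later agent's private signal is
# outvoted, so an unlucky early run of signals can lock the whole group
# into the wrong choice even though most private evidence favours "A".
```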

It must also be noted that information cascades differ from conformity bias. In the former, agents are making rational decisions based on the available evidence; in the latter, an agent conforms not as a result of reasoning in light of evidence, but in order to ‘fit in’ with a social group. That is not to say the two processes cannot occur simultaneously. The key takeaway, however, is that both phenomena, in different ways, can lead to the same outcome: as a result of our social embeddedness, we might give some evidence less credence than we ought to, and as such, arrive at false or inaccurate beliefs.

What does all of this mean for beliefs about non-human animals? Consideration of conformity suggests that our beliefs are shaped, in a variety of ways, by those who make up our epistemic network. First, there is conformity in the psychological sense, in which an individual wants to ‘fit in’. For example, if all of my friends, my family, and others I am close to consume non-human animals as part of their diet, it is possible that I will also consume animals, even if I recognise the problems (moral, environmental, health-related, etc.) of doing so. To do otherwise would make me ‘stand out’ from my peers.

Second, there is conformity as explored by the social epistemologist, in which our beliefs come to conform to those of our group as a result of the sharing of evidence. Crucially, this can result in widespread false, inaccurate, or seemingly irrational beliefs. This phenomenon, I argue, plays a considerable role in the formation of our beliefs about non-human animals. More specifically, it reinforces the idea that some animals are food.

Put simply, a number of the misconceptions I discussed in a previous post develop as a result of the widespread dissemination of inaccurate information, which, over time and through repeated sharing, becomes accepted, normalised, and reinforced. At the same time, those who hold a position different from the status quo are dismissed as outliers. But what if the outlier is right?

*This blog post has been adapted from my Master of Research thesis, ‘How Misinformation Reinforces the Status of Animals as Food’ (2021). Thank you to my supervisors Jane Johnson and Mark Alfano.

REFERENCES

Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, leadership, and men. Pittsburgh, PA: Carnegie Press.

Bala, V., & Goyal, S. (1998). Learning from neighbours. Review of Economic Studies, 65(3), 595–621.

Bikhchandani, S., Hirshleifer, D., & Welch, I. (1998). Learning from the behaviour of others: Conformity, fads, and informational cascades. The Journal of Economic Perspectives, 12(3), 151–170.

O’Connor, C., & Weatherall, J. O. (2019). The Misinformation Age: How False Beliefs Spread. London, UK: Yale University Press.

Zollman, K. J. S. (2010). Social structure and the effects of conformity. Synthese, 172(3), 317–340.