Conformity

Cass R. Sunstein & Robert H. Frank


Introduction

  • A Republican-appointed judge sitting with two other judges appointed by Republican presidents becomes much more likely to vote in a stereotypically conservative direction in cases that involve civil rights, sexual harassment, environmental protection, and much more. Perhaps more remarkably, a Democratic-appointed judge sitting with two Republican appointees also becomes more likely to vote in a stereotypically conservative direction. And something important happens when three Republican appointees sit together: the likelihood of a stereotypically conservative result skyrockets. Democratic appointees show a similar pattern. When three such appointees sit together, a stereotypically liberal leaning is highly likely.
  • If uninformed people are trying to decide whether immigration or climate change is a serious problem, or whether they should be concerned about existing levels of arsenic in drinking water, they are likely to be responsive to the views of confident and consistent others.
  • But there is an important qualification to these claims, to which I will return: the conformity Sherif found decreases significantly if the experimenter uses a confederate whose membership in a different social group is made salient to subjects. If you know that the confident person belongs to a group different from yours—one that you distrust or dislike—you might not be influenced at all.
  • When asked to decide on their own, subjects erred less than 1 percent of the time. But in rounds in which group pressure supported the wrong answer, subjects erred 36.8 percent of the time. Indeed, over a series of twelve questions, no less than 70 percent of subjects went along with the group, defying the evidence of their own senses at least once. We should not overstate this finding. Most people, most of the time, say what they actually see. But Asch's most noteworthy finding is that most people, some of the time, are willing to yield, even in the face of clear evidence that the group is wrong.
  • Note that Asch's findings contain two conflicting lessons. First, a significant number of people are independent all or much of the time. About 25 percent of people are consistently independent; such people are uninfluenced by the group. Moreover, about two-thirds of total individual answers do not conform. Hence "there is evidence of extreme individual differences" in susceptibility to group influences, with some people remaining completely independent and others "going with the majority without exception." While independent subjects "present a striking spectacle to an observer," giving "the appearance of being unshakable," other people show a great deal of anxiety and confusion. Second, most subjects, at least some of the time, are willing to yield to the group even on an apparently easy question on which they have direct and unambiguous evidence.
  • We should stress a separate point here: many people are not willing to disclose their own information to the group, even though it is in the group's interest to learn what is known or thought by individual members. To see this point, imagine a group almost all of whose members believe something to be true even though it is false. Imagine too that one member of the group or a very few members of the group know the truth. Are they likely to correct the dominant view? If Asch's findings generalize, the answer is that they may not be. They are not reticent because they are irrational. They are making a perfectly sensible response to the simple fact that the dominant view is otherwise—a fact that suggests either that the small minority is wrong or that they are likely to risk their own reputations if they insist they are right.
  • Some of the most interesting work makes a sharp distinction between compliance and acceptance. People comply when they defer to others whom they believe to be wrong. In that case, they will conform in public but not in private. People accept when they internalize the view of the group. As we have seen, Asch's findings involve a degree of both compliance and acceptance.
  • If, for example, people are reminded of circumstances in which they have acted without inhibition, they are more likely to conform.
  • The deadening effect of public opinion was of course a central concern of John Stuart Mill, who insisted that protection "against the tyranny of the magistrate is not enough" and that it was also important to protect "against the tyranny of the prevailing opinion and feeling; against the tendency of society to impose, by other means than civil penalties, its own ideas and practices as rules of conduct on those who dissent from them."
  • Mill's focus here is on the adverse effects of conformity not only on the individuals who are thus tyrannized but also on society itself, which is deprived of important information.
  • Consistent with Sherif's findings, people are less likely to conform if they have high social status or are extremely confident about their own views. They are more likely to conform if the task is difficult or if they are frightened.
  • When the task is difficult, a financial incentive rewarding correct answers actually increases conformity: people are more willing to follow the crowd when the question is hard and they stand to profit from a correct answer. Perhaps most strikingly, when financial incentives are absent, the level of conformity is about the same in low-difficulty and high-difficulty tasks, but the introduction of financial rewards splits the results on those tasks dramatically apart, with significantly decreased conformity for low-difficulty tasks and significantly increased conformity for high-difficulty tasks.
  • A certain number of people, in the Asch experiments, actually know the right answer and give conforming answers only because it is not worthwhile to reject the shared view of others in public. But when a financial incentive is offered, peer pressure is outweighed by the possibility of material gain. The implication is that an economic reward can counteract the effects of social pressures. There is a lesson here for groups of all kinds—schools, private employers, and governments. If people know they will gain if they say what they know, then groups are more likely to obtain crucial information.
  • By contrast, difficult tasks leave people with a great deal of uncertainty about whether they are right. In such circumstances, people are all the more likely to give weight to the views of others, simply because those views may well be the most reliable source of information. If you are asked to solve a difficult math problem, or to describe the most sensible approach for reducing deaths on the highways, you might defer to the wisdom of the room.
  • There is a disturbing implication. A "majority consensus" is "often capable of misleading individuals into inaccurate, irrational, or unjustified judgments." Such a consensus "can also produce heightened confidence in such judgments as well." It follows that "so long as the judgments are difficult or ambiguous, and the influencing agents are united and confident, increasing the importance of accuracy will heighten confidence as well as conformity—a dangerous combination."
  • Extremists are often following one another.
  • Asch's original studies found that varying the size of the group of confederates unanimously making the erroneous decision mattered only up to a group size of three; increases beyond that point had little effect. Using one confederate did not increase subjects' errors at all; using two confederates increased errors to 13.6 percent; and using three confederates increased errors to 31.8 percent, not substantially different from the level that emerged from further increases in group size.
  • More significantly, a modest variation in the experimental conditions makes all the difference. The existence of at least one compatriot, or voice of sanity, dramatically reduces both conformity and error. When one confederate made a correct match, errors were reduced by three-quarters, even if there was a strong majority the other way. There is a clear implication here: If a group is embarking on an unfortunate course of action, a single dissenter might be able to turn it around, by energizing ambivalent group members who would otherwise follow the crowd.
  • Consider Brooke Harrington's brilliant study of the performance of investment clubs—small groups of people who pool their money to make joint decisions about stock market investments. The worst-performing clubs were built on affective ties and were primarily social; the best-performing clubs had limited social connections and were focused on increasing returns. Dissent was far more frequent in the high-performing clubs. The low performers usually had unanimous votes, with little open debate. Harrington found that the votes in low-performing groups were "cast to build social cohesion rather than to make the best financial choice." In short, conformity resulted in significantly lower returns.
  • In the real world, would-be dissenters might silence themselves when and because they are in a group of like-minded others—partly because they do not want to risk the opprobrium of those others and partly because they fear they will, through their dissent, weaken the effectiveness and reputation of the group to which they belong.
  • Later variations on the original experiments produced even more remarkable results. In those experiments, the victim expresses a growing level of pain and distress as the voltage increases. Small grunts are heard from 75 volts to 105 volts, and at 120 volts, the victim shouts to the experimenter that the shocks are starting to become painful. At 150 volts, the victim cries out, "Experimenter, get me out of here! I won't be in the experiment any more! I refuse to go on!" At 180 volts, the victim says, "I can't stand the pain." At 270 volts he responds with an agonized scream. At 300 volts he shouts that he will no longer answer the questions. At 315 volts he screams violently. At 330 volts and after, he is not heard. In this version of the experiment, there is no significant change in Milgram's results: twenty-five of forty participants went to the maximum level, and the mean maximum level was over 360 volts. In a somewhat gruesome variation, the victim says, before the experiment begins, that he has a heart condition, and his pleas to discontinue the experiment include repeated references to the fact that his heart is "bothering" him as the shocks continue. This too did not lead subjects to behave differently. Notably, Milgram's basic findings were generally replicated in 2009, with only slightly lower obedience rates than Milgram found forty-five years earlier; men and women did not differ in their rates of obedience.
  • Milgram himself explains his results as showing obedience to authority, in a way reminiscent of the behavior of many Germans under Nazi rule, and indeed Milgram was partly motivated by the goal of understanding how the Holocaust could have happened.
  • An expert or an authority can be a lot like unanimous others. And on this account, some or many of the subjects might have put their moral qualms to one side, not because of blind obedience but because of a judgment that those qualms are likely to have been ill founded. That judgment might be based in turn on a belief that the experimenter is not likely to ask subjects to proceed if the experiment is truly harmful or objectionable.
  • In short, Milgram's subjects might be responding to an especially loud informational signal—the sort of signal sent by a specialist or a crowd. And on this view, Milgram was wrong to draw an analogy between the behavior of his subjects and the behavior of Germans under Hitler. His subjects were not simply obeying a leader but responding to someone whose credentials and good faith they thought they could trust.
  • Why was the defiance of peers so potent? I suggest that the subjects, in this variation, were very much like those subjects who had at least one supportive confederate in Asch's experiments. One such confederate led Asch's subjects to say what they saw; so too, peers who acted on the basis of conscience freed Milgram's subjects to give less weight to the instructions of the experimenter and to follow their consciences as well. Milgram himself established, in yet another variation, that without any advice from the experimenter and without any external influences at all, the subject's moral judgment was clear: do not administer shocks above a very low level.
  • In Milgram's experiments, it was the experimenter's own position—that the shocks should continue and that no permanent damage would be done—that had a high degree of influence, akin to the influence of Asch's unanimous confederates. But when the subject's peers rejected the position of Milgram's experimenter, the informational content of that position was effectively negated by the information presented by the refusals of peers. Hence subjects could rely on their own moral judgments or even follow the moral signals indicated by the peers' refusals.
  • When the morality of a situation is not evident, people are likely to be influenced by someone who seems to be an expert, able to weigh the concerns and risks involved. But when the expert's questionable moral judgment is countered by reasonable people who bring their own moral judgments to bear, people become less likely to follow experts. They are far more likely to do as their conscience dictates.
  • In short, Watts and his coauthors were exploring the relationship between social influences and consumer choices. What do you think happened? Would it make a small or a big difference, in terms of ultimate numbers of downloads, if people could see the behavior of others? The answer is that it made a huge difference. While the worst songs (as established by the control group) never ended up at the very top, and the best songs never ended up at the very bottom, essentially anything else could happen. If a song benefited from a burst of early downloads, it could do exceedingly well. If it did not get that benefit, almost any song could be a failure. As Watts and his coauthors later demonstrated, you can manipulate outcomes pretty easily, because popularity is a self-fulfilling prophecy. If a site shows (falsely) that a song is getting downloaded a lot, that song can get a tremendous boost and eventually become a hit.
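
The self-fulfilling dynamic described above is easy to make concrete in a small simulation. The sketch below is only loosely inspired by the Music Lab design, not a reconstruction of it: the song count, the linear mix of intrinsic quality and accumulated downloads, and the social_weight parameter are all illustrative assumptions.

```python
import random

def simulate_market(n_songs=20, n_listeners=2000, social_weight=0.0, seed=0):
    """Sequential download market. Each listener downloads one song,
    chosen with probability proportional to its appeal; appeal mixes a
    song's intrinsic quality with its current download count (the social
    signal). With social_weight = 0, listeners choose independently."""
    rng = random.Random(seed)
    quality = [rng.uniform(0.1, 1.0) for _ in range(n_songs)]
    downloads = [0] * n_songs
    for _ in range(n_listeners):
        appeal = [q + social_weight * d for q, d in zip(quality, downloads)]
        song = rng.choices(range(n_songs), weights=appeal)[0]
        downloads[song] += 1
    return quality, downloads

if __name__ == "__main__":
    # Replay the same market under different random histories: with social
    # influence switched on, different songs become hits even though the
    # underlying qualities never change.
    for seed in range(4):
        quality, downloads = simulate_market(social_weight=0.05, seed=seed)
        hit = max(range(len(downloads)), key=downloads.__getitem__)
        rank = sorted(quality, reverse=True).index(quality[hit]) + 1
        print(f"history {seed}: hit = song {hit} (quality rank {rank})")
```

With social_weight set to zero, download counts track quality closely across histories; with it switched on, a lucky burst of early downloads can carry a mediocre song to the top, which is the manipulability Watts and his coauthors later demonstrated.
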
  • John F. Kennedy's father, Joe Kennedy, was said to have purchased tens of thousands of early copies of his son's book, Profiles in Courage. The book became a bestseller.
  • Their experiment showed that early popularity can have long-term effects, because people learn from what other people do and seem to like. As people learn from early popularity, they can make something into a huge hit, even if the same song would do poorly in another world in which the early listeners were unenthusiastic.
  • The system of legal precedent also produces cascades, as early decisions lead later courts to a certain result, and eventually most or all courts come into line, not because of independent judgments but because of a decision to follow the apparently informed decisions of others. The sheer level of agreement will be misleading if most courts have been influenced, even decisively influenced, by their predecessors, especially in highly technical areas.
  • In an informational cascade, people cease relying, at a certain point, on their private information or opinions. They decide instead on the basis of the signals conveyed by others. Once this happens, the subsequent statements or actions of few or many others add no new information. They are just following their predecessors.
  • A particular problem arises if people think the large number of individuals who say or do something are acting on independent knowledge; this can make it very hard to stop the cascade. Because so many people have done or said something—a politician is great, a product is dangerous, or someone is a criminal—people think to themselves, How can they all be wrong? The reality is that they can be, if they are mostly reacting to what others have said or done, and so are amplifying the volume of a signal by which they have themselves been influenced.
  • Suppose that both Anderson and Barber have prescribed hormone therapy but that Carlton's own information suggests that the risk is high. At least if he is not confident, Carlton might well ignore what he knows and prescribe the therapy. After all, both Anderson and Barber apparently saw a low risk, and unless Carlton thinks his own information is better than theirs, he should follow their lead. If he does, Carlton is in a cascade.
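
Carlton's reasoning can be written down as a toy model. The sketch below is a minimal Python rendering of the standard sequential-cascade logic, illustrative rather than drawn from the text: the signal accuracy, the two-signal cascade threshold, and the convention that a tied agent follows her own signal are all assumptions.

```python
import random

def run_cascade(n_agents=12, signal_accuracy=0.7, true_state=1, seed=0):
    """Agents choose +1 or -1 in sequence. Each receives a private signal
    that matches the true state with probability signal_accuracy and sees
    all earlier choices. Pre-cascade choices reveal the chooser's signal,
    so agents track a running lead; once the lead reaches two in either
    direction, one private signal can no longer tip the balance, every
    later agent herds, and later choices add no new information."""
    rng = random.Random(seed)
    lead = 0                       # net inferred signals favoring +1
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < signal_accuracy else -true_state
        if lead >= 2:
            choice = 1             # up cascade: private signal ignored
        elif lead <= -2:
            choice = -1            # down cascade: private signal ignored
        else:
            total = lead + signal
            choice = signal if total == 0 else (1 if total > 0 else -1)
            lead += choice         # a pre-cascade choice still reveals a signal
        choices.append(choice)
    return choices

if __name__ == "__main__":
    for seed in range(6):
        choices = run_cascade(seed=seed)
        print(f"seed {seed}: {choices} -> settles on {choices[-1]}")
```

Two misleading signals at the start are enough to lock every later agent, however rational, into the wrong choice; that is the pattern the urn experiments discussed below reproduce in the laboratory.
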
  • Importantly, participants in cascades act rationally in suppressing their private information, whose disclosure would benefit the group more than the individual who has it. The failure to disclose private information therefore presents a free-rider problem. To overcome that problem, some kind of reform seems to be necessary; it might involve changing institutional arrangements.
  • But even among specialists and indeed doctors, cascades are common. "Most doctors are not at the cutting edge of research; their inevitable reliance upon what colleagues have done and are doing leads to numerous surgical fads and treatment-caused illnesses." Thus an article in the prestigious New England Journal of Medicine explores "bandwagon diseases" in which doctors act like "lemmings, episodically and with a blind infectious enthusiasm pushing certain diseases and treatments primarily because everyone else is doing the same."
  • Consider a legal analog, which offers lessons for those engaged in activities outside of law: There is a disputed issue under the Endangered Species Act. The question is what exactly the government has to do to protect endangered species. The stakes are high; environmental groups argue that the government has to do much more than the government is now doing. The first court of appeals to decide the question finds the issue genuinely difficult but resolves the issue favorably to the government. The second court of appeals tends to favor, very slightly, the view that the government is wrong, but the holding of the previous court of appeals is enough to tip the scales in the government's favor. A third court of appeals is also slightly predisposed to rule against the government, but it lacks the confidence to reject the shared view of two circuits. Eventually all circuits come into line, with the final few feeling the great weight of the unanimous position of others, and perhaps insufficiently appreciating the extent to which that weight is a product of an early and somewhat idiosyncratic judgment. Because the courts of appeals are in agreement, the Supreme Court refuses to get involved in the dispute. This can happen a lot—and it makes for bad law.
  • In terms of improving current practice, the implication is clear: judicial panels should be cautious about giving a great deal of weight to the shared view of two or more courts of appeals. A patient who seeks a second opinion should not disclose the first opinion to his new doctor; the goal is to obtain an independent view. In a similar vein, a court of appeals should be alert to the possibility that the unanimity of previous courts does not reflect independent agreement.
  • First, people will often neglect their own private information and defer to the information provided by their predecessors. Second, people are alert to whether their predecessors are especially informed; more informed people can shatter a cascade. Third, and perhaps most intriguingly, cascade effects are greatly reduced if people are rewarded not for correct individual decisions but for correct decisions by a majority of the group to which they belong. Fourth, cascade effects, and blunders, are significantly increased if people are rewarded not for correct decisions but for decisions that conform to the decisions made by most people. As we shall see, these general lessons have implications for institutional design. They suggest that errors are most likely when people are rewarded for conforming and least likely when people are rewarded for helping groups and institutions to decide correctly.
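
The third and fourth lessons can be illustrated by rerunning a cascade model under different reward schemes. The sketch below is again a stylized assumption rather than a reconstruction of the experiments: being rewarded for the group's accuracy is modeled as agents simply reporting their private signals, and being rewarded for conformity as agents copying the standing majority.

```python
import random

def majority_accuracy(scheme="cascade", n_agents=11, accuracy=0.7,
                      n_trials=2000, seed=0):
    """Fraction of trials in which a majority of agents picks the true
    option (+1) under three stylized schemes:
      - "truthful":   every agent reports her private signal, as if
                      rewarded for the group's decision being correct
      - "cascade":    agents follow the standard sequential-cascade logic
      - "conformist": agents copy the current majority whenever one
                      exists, as if rewarded for matching most people"""
    rng = random.Random(seed)
    correct_trials = 0
    for _ in range(n_trials):
        lead, votes = 0, []
        for _ in range(n_agents):
            signal = 1 if rng.random() < accuracy else -1   # true state is +1
            if scheme == "truthful":
                vote = signal
            elif scheme == "conformist":
                majority = sum(votes)
                vote = signal if majority == 0 else (1 if majority > 0 else -1)
            else:  # "cascade": herd once the inferred lead reaches two
                if abs(lead) >= 2:
                    vote = 1 if lead > 0 else -1
                else:
                    total = lead + signal
                    vote = signal if total == 0 else (1 if total > 0 else -1)
                    lead += vote
            votes.append(vote)
        if sum(votes) > 0:          # odd n_agents, so never a tie
            correct_trials += 1
    return correct_trials / n_trials

if __name__ == "__main__":
    for scheme in ("truthful", "cascade", "conformist"):
        print(f"{scheme:10s} {majority_accuracy(scheme=scheme):.3f}")
```

Under these assumptions, truthful reporting lets majority rule aggregate eleven independent signals and beats both alternatives, while rewarding conformity collapses the group's accuracy to roughly that of a single private signal.
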
  • But the existence of two early signals, producing rational but incorrect judgments, led all others to fall in line. "Initial misrepresentative signals start a chain of incorrect decisions that is not broken by more representative signals received later."
  • Now turn to the actions of followers. In the hormone therapy case given above, none of the doctors is assumed to have, or believed to have, more information than his or her predecessors. But in many cases, people know, or think they know, a great deal. It is obvious that such people are far less likely to follow those who came before. Whether they will do so should depend on a comparison between the amount of information provided by the behavior of predecessors and the amount of private information that they have. And in theory, the most informed people will often shatter cascades, possibly initiating new and better ones. Whether this will happen, in practice, depends on whether the people who come later know, or believe, that the deviant agent was actually well informed. If so, the most informed people operate as fashion leaders.
  • In systems with freedom of speech and free markets, it is always possible to debunk supposedly authoritative sources. And within groups, it is possible to structure decision-making to reduce the relevant risks. Votes might, for example, be taken in reverse order of seniority, to ensure that less experienced people will not be unduly influenced by the judgments of their predecessors; this is in fact the practice of the U.S. Supreme Court.
  • This claim has an implication for appropriate institutional arrangements: any system that creates incentives for individuals to reveal information to the group is likely to produce better outcomes. A system of majority rule in which individuals know their well-being will be promoted (or not) depending on the decision of the group therefore has significant advantages.
  • In this light, we might even offer a suggestion about the nature of civic responsibility: in case of doubt, citizens should reveal their private signal, rather than disguising that signal and agreeing with the crowd. Perhaps counterintuitively, this kind of behavior is not optimal from the point of view of the individual who seeks to get things right, but it is best from the point of view of a group or nation that seeks to use all relevant information.
  • By contrast, we can imagine a different kind of person, the contrarian, who thinks he will be rewarded, financially or otherwise, simply for disagreeing with others. There is no reason to celebrate the contrarian.
  • If conformity is rewarded, the problem is especially severe for the earliest disclosers or dissenters, who "may bear especially high costs because they are conspicuous, individually identified, and easy to isolate for reprisals." And if the earliest dissenters are successfully deterred, dissent is likely to be exceedingly rare. Authoritarian governments are well aware of that fact; they try to nip dissent in the bud. But once the number of disclosers or dissenters reaches a certain level, there may be a tipping point, producing a massive change in behavior. Indeed a single discloser, or a single skeptic, might be able to initiate a chain of events by which a myth is shattered.
  • Or consider the question of dissent in wartime. It is important for those who wage war to know what citizens really think and also to have a sense of actual and potential errors. But it is also important, especially in wartime, for citizens to have a degree of solidarity, to be broadly optimistic, and to believe they are involved in a common endeavor; this belief can help solve collective action problems that otherwise threaten success.
  • It is important to acknowledge that the problem I am emphasizing—the failure to disclose accurate information that will benefit the public—is closely paralleled by the problems raised in many cases in which silence, not revelation, is a collective good. And if disclosure will spread inaccurate information, it is unlikely to be beneficial, especially if it negates the beneficial effects of previous decisions or produces a cascade of its own (recall the spread of fake news). Because my focus is on the failure to disclose information, I will not devote attention to situations in which silence is golden, except to note that the basic analysis of those situations is not so different from the analysis here.
  • Dissenters may be self-serving, and they may be trying to spur their stalled careers. It happens all the time. People who run a website might become popular because of their iconoclastic or even wild views. A political dissenter, challenging some widespread practice, sometimes becomes more prominent and more successful as a result. Judges who dissent in high-profile cases might not greatly fear that their reputation will be harmed; they might think the dissent will redound to their benefit.
  • Public dissenters might impair their reputation in one group but simultaneously strengthen it in another. On a radio show, on Facebook, or on Twitter, they might be saying, "Look at me!" And if people look at them, they might be able to advance in some way that matters to them. Of course, some people say and do exactly what they think and do not greatly care about their reputations; they want to add information. They are rebels with a cause.
  • But return to my main concern. Too much of the time, people do not want to lose the good opinion of relevant others, and the result of this desire is to reduce the information that the public obtains. Apart from information, people might have preferences and values. They might believe that new immigrants should be welcomed. They might believe in animal rights. But in either case they might not reveal what they think, simply because of the pressure to conform. I have suggested that from the standpoint of democratic practice, this is a problem as well. Most of the time, it is valuable for people to disclose what they want and what they value. The basic findings, as in the urn experiments, would undoubtedly be the same for preferences and values as well as facts, with rewards for conformity greatly increasing the apparent (not real) degree of agreement.
  • The practice of sexual harassment long predated the idea of sexual harassment, and the innumerable women who were subject to harassment did not like it. But too much of the time they were silent, partly because they feared the consequences of public complaint. It is interesting to speculate about the possibility that many current practices fall in the same general category: those that produce harm, and are known to produce harm, but persist because most of those who are harmed believe they will suffer if they object in public.
  • A memorable claim by the philosopher Joseph Raz clarifies the point: "If I were to choose between living in a society which enjoys freedom of expression, but not having the right myself, or enjoying the right in a society which does not have it, I would have no hesitation in judging that my own personal interest is better served by the first option." The claim makes sense in light of the fact that a system of free speech confers countless benefits on people who do not much care about exercising that right.