by Jennifer Ouellette

From Boom Fall 2015, Vol 5, No 3

“People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others.”
—Blaise Pascal, Pensées

This past July, when California Governor Jerry Brown signed into law the bill ending personal exemptions from state vaccination requirements, many cheered this action to protect public health. But actor Jim Carrey did not, taking to Twitter to lambast Brown for the decision and calling him a “corporate fascist.”

In vain did scientists and other factually minded folk step in to point out the many fallacies in Carrey’s stance, which rested on the trace amounts of mercury contained in some vaccines. In fact, the mercury in vaccines and the mercury we justly fear in fish are completely different, chemically speaking. It makes as much sense to claim that wood alcohol—highly toxic methanol, chemically distinct from the ethanol used in drinkable spirits—and coq au vin are equally bad for you. Presented with the scientific evidence, did Carrey change his views? He did not. He doubled down, instead voicing doubt about the validity of countless studies showing no link between vaccines and autism, tweeting to his fourteen million followers, “A trillion dollars buys a lot of expert opinions. Will it buy you?”

Carrey’s behavior wouldn’t surprise Dartmouth political scientist Brendan Nyhan. It’s just one more piece of mounting evidence that those most stubbornly committed to rejecting well-established science are largely immune to facts that don’t fit their views. Surely, rationalists have thought for decades, those in denial are merely ignorant. If we simply educate them and show them the error of their ways, they will change their minds and embrace scientific truth. (Perhaps they will even thank us profusely for our trouble.) This approach is known as the “information deficit model”—it assumes that a lack of factual understanding is the primary culprit behind staunchly antiscience stances, and hence the solution is to beat the public into submission with a bombardment of cold, hard facts.

The first in a series of photographs by Svend Keller documenting a phase transition from a liquid to a solid. Courtesy Andreas Keller.


The deficit model not only doesn’t work, it can backfire. Badly. So badly that the phenomenon has its own name: the “backfire effect.” Study after study has demonstrated that presenting hardline denialists with facts just makes them dig their heels in deeper.

Unfortunately, as Nyhan’s research also shows, clever alternative strategies to sneak past people’s cognitive biases—appealing to their emotions, or telling a compelling story—aren’t nearly as effective as one would hope. Last year, Nyhan conducted a study exploring attitudes about vaccines and examining what it might take to get people with strongly held beliefs—like Carrey—to change their minds. Nearly two thousand parents with mixed or negative feelings toward vaccines were shown one of four pro-vaccination campaigns, each adopting a different persuasive strategy—facts, science, emotional appeals, or stories—to see which was most effective in changing minds. (There was one control group.) The punchline: none of the above. Nothing changed people’s minds. “It’s depressing,” Nyhan admitted to The New Yorker.

This is the fundamental challenge of creating widespread collective change: it must start with the individual. But as Nyhan’s work makes clear, changing one person’s mind is no easy feat. So should we throw up our hands in despair at ever making a difference in swaying public opinion? Not necessarily. People sometimes do change their minds or alter core beliefs. But whether or not they do so appears to depend on how strongly they connect a particular opinion or belief with their personal identity. Political and religious affiliations, for example, are major factors in one’s personal identity. Thus, our beliefs about certain issues tied to those affiliations will be stronger “core” beliefs, and new information that challenges those core beliefs will be rejected. Opinions or beliefs less central to core identity are easier to discard when new information is received.

But therein lies a glimmer of hope, because identity is fluid. It shifts and evolves continually over a lifetime in response to personal experience. That means the question, “Who are you?” will evoke a different answer at different points in a person’s life. Change a person’s self-identity, even a little, and you just might have a better shot at swaying their opinion.

I like to think of the process as something akin to a phase transition. For any substance, there is a combination of temperature and pressure at which it shifts from one state to another. Take water as a simple example: lower the temperature sufficiently and it will turn into ice; raise the temperature to a boil and it will evaporate into steam. Those are phase transitions. The precise moment when this happens is called the critical point—colloquially, the “tipping point”—when the substance is perfectly poised halfway between one phase and the other.

When this process is plotted out neatly on a graph, it doesn’t produce a continuous curved slope as the substance moves between phases. Rather, it resembles a staircase of sorts: the temperature drops (or rises) a bit, then there’s a long stretch where nothing appears to be happening at all. One might be forgiven for concluding that the old maxim is true: the watched pot will never boil. But then there is a sudden shift again as it hits the critical point and moves into a new phase. The kettle whistles. Teatime.
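That staircase shape can be sketched numerically. The toy calculation below is my own illustration, not anything from the essay: it heats a kilogram of water at constant power using textbook constants, and the temperature climbs steadily, then flatlines at the boiling point while latent heat is absorbed.

```python
def temperature_curve(power_w=2000.0, mass_kg=1.0, t_end_s=2500, dt=1.0):
    """Toy heating curve for water: steady rise, then a plateau at the
    boiling point while latent heat is absorbed. A simplification: once
    the water is fully vaporized, this sketch just stops adding heat."""
    c = 4186.0       # specific heat of water, J/(kg*K)
    latent = 2.26e6  # latent heat of vaporization, J/kg
    boil = 100.0     # boiling point, deg C
    temp, vaporized = 20.0, 0.0
    curve = []
    t = 0.0
    while t < t_end_s:
        if temp < boil:
            # Sensible heating: temperature rises with the energy added.
            temp = min(temp + power_w * dt / (mass_kg * c), boil)
        elif vaporized < mass_kg:
            # Phase transition: energy goes into vaporizing, not warming.
            vaporized += power_w * dt / latent
        curve.append((t, temp))
        t += dt
    return curve
```

Plot the pairs it returns and the long flat stretch appears at 100 °C: the watched pot, doing its invisible work before the kettle whistles.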

In nature, phase transitions are ubiquitous. They’re equally useful when thinking about complex systems in our cultural infrastructure: contagious diseases, electrical power grids, global financial markets. Raissa D’Souza, who studies phase transitions in complex networks, says they are also useful when considering profound shifts in public opinion.

D’Souza is a physicist-turned-computer modeler who straddles multiple departments at the University of California, Davis. She recently examined the role of zealots in swaying public opinion. Zealots are highly passionate, committed, and outspoken people who will never change their beliefs. Researchers also tend to use the labels “influentials” for people who can easily influence others and “susceptibles” for those who are, as D’Souza puts it, “a little easier to sway.” (Carrey would be both a zealot and an influential.) Unlike susceptibles, whose beliefs might evolve over time, zealots are “phase-locked,” says D’Souza—frozen in their core beliefs, like the tightly packed molecules in a crystal lattice of solid ice.

D’Souza’s group has done computational modeling of how opinions change in a population. They found that a small number of zealots can sway a large population over time, if there are no zealots on the opposite side. “The fact that just a few committed individuals can change public opinion is concerning when we think about climate change or vaccinations,” she says, particularly given the unlikelihood of a true zealot ever changing his or her mind. But D’Souza’s models also found that roughly equal numbers of zealots on opposite sides of an issue produce a predictable result: stalemate. When there are more than two competing positions, a stable result is hard to predict.
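D’Souza’s actual models are more sophisticated, but the flavor of the result can be reproduced with a standard voter-style simulation. Everything below—the agent counts, the update rule, the function name—is an illustrative assumption of mine, not taken from her work: a small bloc of unopposed zealots gradually drags a mixed population to its side, while matched zealot blocs hover near stalemate.

```python
import random

def simulate(n=1000, zealots_a=20, zealots_b=0, steps=200_000, seed=42):
    """Toy voter model on a fully mixed population: at each step a random
    susceptible adopts the opinion of a randomly chosen agent. Zealots
    never update. Returns the final fraction holding opinion A."""
    rng = random.Random(seed)
    # Opinions: True = A, False = B. Zealots occupy the first slots.
    opinions = [True] * zealots_a + [False] * zealots_b
    opinions += [rng.random() < 0.5 for _ in range(n - zealots_a - zealots_b)]
    n_zealots = zealots_a + zealots_b
    for _ in range(steps):
        i = rng.randrange(n_zealots, n)  # a susceptible agent
        j = rng.randrange(n)             # anyone, zealot or susceptible
        opinions[i] = opinions[j]        # the susceptible copies j's view
    return sum(opinions) / n

# Unopposed A-zealots: most of the population ends up holding A.
print(simulate(zealots_a=20, zealots_b=0))
# Matched zealots on both sides: the population lingers near 50/50.
print(simulate(zealots_a=20, zealots_b=20))
```

The design choice worth noting is the asymmetry: susceptibles can copy zealots, but never the reverse, which is all it takes for a committed few to tilt the whole system when no one pushes back.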

Which still leaves open the question of how opinions change.

It is one thing to study collective shifts in public opinion. It is quite another to explore the complex murky depths of how a single person forms an identity and how that identity then evolves over time. But the phase transition analogy still applies, except now it is the collective influence of personal experience shaping an individual mind, rather than many individuals shaping public consensus.

Human behavior is inherently unpredictable, so there is no “one size fits all” description of how personal opinions may shift. There is more than one way to change a mind, and there is also more than one kind of phase transition. Take the case of water turning into ice or steam. The final shift may occur abruptly, but the underlying process is smooth and continuous. Similarly, we tend to think of pivotal moments in life as a sudden shift in perspective—a radical conversion of sorts, like Saul on the road to Damascus. At least that’s how it seems. We don’t stop to consider the myriad past experiences that led Saul down that fateful road in the first place, or what role those experiences may have played in his seemingly sudden conversion.

The final photograph in the phase transition series by Svend Keller. Courtesy Andreas Keller.


Sudden-seeming, dramatic conversions are relatively rare. Most people’s opinions shift gradually, by degrees. Earlier this year, the Washington Post ran a wrenching essay by a rabbi named Gal Adam Spinrad, detailing how, over a period of twelve years, she shifted from being fearful of vaccinating her young daughter to insisting that the entire family stay up-to-date on all immunizations, including flu shots. There was no one moment when she changed her mind completely; rather, many different incidents and experiences over the years slowly reshaped her thinking by degrees.

Initially, as a new young mother in San Francisco, she self-identified strongly with her local home birth collective’s views that vaccines could be harmful to newborns, delaying her daughter’s first immunizations until the child was a year old. Then Spinrad developed shingles while abroad, a result of having had chickenpox as a child, and she began to see the value in such protections. A second daughter, born with a serious congenital defect, lived just fifty-eight days, and a broken-hearted Spinrad realized she couldn’t continue to take her children’s health (and her ability to protect them) for granted. Moving to the Midwest and finding a staunchly pro-vaccination doctor sealed the deal. By 2013 she finally understood the concept of herd immunity and why it wasn’t just about protecting her children. It was also about protecting other vulnerable members of society who, for various reasons, couldn’t be vaccinated.

Those are just the episodes Spinrad recalled as she shaped her narrative with the advantage of hindsight. There were likely countless other tiny things, adding up over the years with seemingly imperceptible effects, until that critical threshold was reached.

The idea of applying phase transitions to shifting opinions has already seeped into psychology. It’s been dubbed the “affective tipping point,” akin to the critical point, in this case one that describes a phase change in emotion or feeling. A 2010 paper in the journal Political Psychology involving so-called “motivated reasoners” found intriguing evidence for an affective tipping point in voter opinions. At some point, the authors reasoned, repeated encounters with new information at odds with closely held beliefs can overcome the backfire effect, potentially pushing even the most stubborn denialists out of phase-locked mode, past the critical threshold, to shift their stance, even if just a little.

Nyhan cautions that it is very difficult to get phase-locked people to reach that stage: “It may take overwhelming evidence to the contrary if the belief is deep-seated and psychologically meaningful enough,” he says, “but it can and does happen.” He thinks that ultimately, people are swayed over time by a combination of shifting opinions in their social circles—the herd mentality—and among high-profile “influentials.” He points to the rapid collective shift in public opinion about gay marriage as one example.

The 2010 Political Psychology study also found that voters become more anxious as they approach the tipping point, because their cognitive dissonance increases accordingly. This could be a driving factor behind Carrey’s Twitter rant. Perhaps the person loudly and passionately ignoring all of the scientific evidence contradicting any link between vaccines and autism is so forceful precisely because he is close to the affective tipping point—when cognitive dissonance is at its most intense, and he is closest to shifting to a new phase of thinking. Or perhaps that’s just seriously wishful thinking.

We can’t control what people experience and how their opinions and beliefs evolve in response—the same way we can’t control the stock market, or epidemics, or whether that latest YouTube video goes viral. Your smidgen of input is just one factor among many working to shape a complex system over time. All you can do is sow the seeds and hope some of those seeds find fertile ground. When you get discouraged, remember this: we can’t know another person’s innermost thoughts. Those seeds might not flourish for months, or years. You might not see any outward change at all for a good long while. That doesn’t mean your efforts were wasted; beneath the surface, any number of seeds could be taking root, slowly growing toward that critical threshold. Opinions can and do change, individually and collectively. The California vaccine legislation is proof of that.
