
The four Gormans are all Penn alumni and all involved professionally in the mental-health field. In a new book, two of them—daughter Sara and father Jack—take a careful look at the psychological factors driving science denialism and how to counter them. Hint: more data isn’t the answer.

BY JOANN GRECO | Illustration by Rich Lillash



Earlier this year—and the winter before it—Sara E. Gorman C’07 dutifully trotted off to receive a flu shot. Shortly after, though, she became convinced that she had actually gotten the bug.

“I had the 24 hours of high fever, the aches, and everything else that goes with it,” she recalls with a chuckle. “And what went through my head each time was: What the hell? This is ridiculous! What good did the vaccine do?

“Then I calmed down and realized, even if I did have the flu, it was going to be less severe and of shorter duration because I got the shot.”

Her experiences provide an interesting thought experiment, Gorman continues. “Because I was taxed and stressed about feeling sick, I wasn’t as able to mobilize the forces of my rational brain.” Given her own reaction, she finds it easy to understand the public hysteria when a handful of Ebola cases turned up in the United States a few years ago, or why some parents may be fearful about vaccinating their children.

This empathy toward, and curiosity about, behavior typically characterized as irrational animates Denying to the Grave: Why We Ignore the Facts that Will Save Us (Oxford University Press, 2016). Gorman, a public-health specialist, wrote the book with her father, Jack M. Gorman C’73, a psychiatrist who is CEO and chief scientific officer at Franklin Behavioral Health Consultants—which he founded after a long career in academic medicine at Columbia University and Mount Sinai School of Medicine—and has written The Essential Guide to Psychiatric Drugs and co-edited several psychiatry textbooks.

Denying to the Grave attempts to understand why otherwise intelligent people often fall prey to self-damaging beliefs when it comes to health science. Drawing on case studies, research, and their own observations, the Gormans identify six key drivers of this phenomenon—conspiracy theories, charismatic leaders, confirmation bias, confusing correlation and causality, avoidance of complexity, and risk perception and probability—and examine the behavioral and neuroscientific reasons people are swayed by them.

The book is intended as a guide to the “vast majority of people who are kind of in the middle on a lot of the issues we address, such as GMOs [genetically modified organisms], vaccines, and gun ownership,” says Sara. “Our work is to catch them while they’re in this uncertain state, and make sure we don’t just leave it so they move toward what we see as the irrational side of the argument, or inadvertently push them to doubt us even more.”

By “we” and “us,” the younger Gorman means the scientists and healthcare professionals who, she and Jack believe, often do more harm than good. “We see the experts yelling numbers and data at them, and we’re saying that’s the wrong approach,” Sara continues. “We have to figure out where these beliefs and doubts are coming from. We know the psychological cause of some of these denial tendencies is not that people don’t have access to data or are too dumb to understand it. It’s something else—and the point of our book is to try and identify that something else.”


Growing up Gorman, Sara and her older sister, Rachel Moster C’04, were surrounded by discussions about mental health. Their mother, Lauren Kanter Gorman CW’72, is also a psychiatrist, with a private practice.

“People ask me all the time, what it was like growing up with two parents who were psychiatrists,” says Rachel. “And I say—they were just my mom and dad. We were a very close-knit family, and they were both very involved with our lives.”

Rachel was drawn to a career in medicine, though initially she tried to “avoid psychology out of a delayed teenage rebellion,” she adds. “We couldn’t really get away from it, though. We heard about it all the time, and it was so fascinating.” After considering neurology as a specialty, she eventually entered psychiatry and now works with inpatients who are acutely ill with psychiatric disorders at Columbia’s New York Presbyterian/Allen Hospital.

While Sara shared the family fascination—often pulling the Diagnostic and Statistical Manual of Mental Disorders from a shelf to read along as her mother talked on the phone—she never wanted to be a doctor. “I wanted to become a poet,” she says.

Her mother describes Sara as an “endlessly curious, quirky kid. She loved Latin, she loved languages, loved being on stage.” She was particularly obsessed with opera—a family passion, her father adds. “When she was about eight, we went to see a production of La Traviata, but the guy at the window said the only tickets left were those with an obstructed view,” he recalls. “Sara goes, ‘That’s okay, Daddy, I already know the whole libretto.’ All of a sudden, the guy disappears and comes back with these phenomenal tickets, muttering, ‘No child who loves opera that much …’”

Rather than medical school, Sara followed up her studies in English literature at Penn with a master’s degree from Oxford and a Harvard PhD in the field—only to switch gears and enter the “family business” from a fresh angle by pursuing a career in public health. “I realized that academia was going to be a very solitary life, and a very slowly paced one,” she says. “This author’s still dead, that book’s still 600 years old.”

After adding a master’s degree in public health from Columbia to her resume, she now works as a project manager for the Global Public Health division of pharmaceutical giant Johnson & Johnson, creating programs to improve healthcare around the world, with a particular focus on mental health, behavioral science, and public-health research. In one recent project, Sara traveled to Rwanda to help set up clinics and train healthcare workers as part of the government’s continuing efforts to combat the traumatic effects of the 1994 genocide.

“In my family, everybody’s work touches on actual people and trying to make the world a better place,” she says. “I’ve discovered that, for me, public health offers the perfect marriage of the social sciences and hard science.”

Her father is tickled by the different paths that led to the family’s common focus: “All four of us started out at Penn and then, once we settled on medicine, all four of us changed our fields!”

Born into a middle-class Bronx household where his father was a certified public accountant and his mother an elementary-school teacher, Jack remembers “always being interested in becoming a doctor.” He chose Penn for its well-regarded pre-med program, but decided to major in English. “It might have been the times,” he says. “This was the heyday of the Vietnam War, and I think I had an abnormal college experience all around. I didn’t really have a sense of college life; my time there was mostly spent joining in protests.”

He moved on to medical school at Columbia, where he met Lauren on his first day and discovered that they had both been at Penn at the same time. They married in 1977.

Looking for a research field, Jack interned at Columbia’s Babies Hospital but found “working with inpatients sad and with outpatients boring,” he says.

“Maybe you should think about doing something else,” Lauren suggested.

“But I love little children!” he protested.

“Then that’s a good reason for us to have one,” she retorted.

Soon enough, Rachel was on the way and Jack began exploring a career in psychiatry. Not long after, Lauren, tiring of the “gadget-oriented, surgical focus” of her chosen discipline of ophthalmology, abandoned it for psychiatry as well.

Rachel has collaborated with her father in the past—while studying evolutionary biology at Penn, she coauthored a paper with him about the development of fear and anxiety, and she later contributed to a chapter on combining psychotherapy and medicine in the treatment of anxiety disorders in a textbook he edited—but Lauren says it makes sense that Sara would be the daughter to go ahead and write an entire book with him. “They both love research and writing—after all, Jack was an English major, too,” she observes. “They had a lot of things in common from early on. He would read poetry to her and that became a special love for her, just as it is for him.”

At the time they began their collaboration, Sara had become “really interested in the phenomenon of anti-vaccine advocates and why very intelligent people were buying [into] incorrect ideas about immunization,” Jack recalls, while he had been doing a lot of thinking about gun ownership. “I’ve lived my whole life in New York City—and Philadelphia when I was in college—and so I’ve had no real contact with gun owners. Basically, I thought they were hunters who lived in Montana, so I was surprised that most gun owners have them for protection. The data is very clear on that.” Multiple studies also show that having a gun in the house is more dangerous than not having one—ineffective as protection and more likely to lead to murder or suicide—“so I wanted to understand why someone denies the evidence,” he says. “We realized we were thinking about the same things, and there were lots of other examples.”

Working together, they insist, was a breeze.


“Everyone assumes it must be a nightmare to be in a professional relationship with your dad,” Sara says. “But actually, he’s one of the easiest people I’ve ever worked with. He’s very clear, he gets things done on time—”

“That’s because we have weekly meetings, and she always has an agenda,” Jack finishes. “We get through everything, and then she assigns me a lot of work.”

They divided the drafting of the book’s chapters—each considering a different reason that people ignore or deny scientific evidence to the detriment of their health and safety—and then handed them back and forth “until we were satisfied that they didn’t sound like they were written by different people,” says Jack.

In picking which issues to focus on, the duo was keenly aware of their own liberal biases. “We didn’t want it to seem as if we were only using examples that aligned with our politics,” says Jack. “So, GMOs, for example, is one that angers our left-leaning friends. The same with nuclear energy. It was a very good exercise to really consider what the data reveals, even if we’re pre-programmed to be against these things.”

Sometimes, as with views on nuclear power, that pre-programming has to do with the different ways experts and laypeople view risk. “Experts judge risk in terms of quantitative assessments of morbidity and mortality,” the Gormans write, drawing on research by Paul Slovic, a professor of psychology at the University of Oregon and president of Decision Research. “Yet most people’s perception of risk is far more complex, involving numerous psychological and cognitive processes.”

Scientists can supply all of the facts and figures they want, but people “still think that only what we aren’t in control of can hurt us,” they continue. Since the risk of a nuclear meltdown is perceived as not only being beyond our control but potentially catastrophic, the rational appeals of epidemiologists (such accidents only happen once every few decades) or climatologists (traditional methods of producing electricity are destroying the environment) often fall on deaf ears. “Slovic’s analysis goes a long way in explaining why we persist in maintaining extreme fears of nuclear energy while being relatively unafraid of driving automobiles, even though the latter has caused many more deaths than the former,” the Gormans write. “The risk seems familiar and knowable. There is also a low level of media coverage of automobile accidents, and this coverage never depicts future or unknown events resulting from an accident. There is no radioactive ‘fallout’ from a car crash.” This “nonlinear estimation” of risk is hardwired into our brains, according to research they cite.

Opposition to GMOs, they say, is fueled by the lure of conspiracy theories. “The overwhelming scientific consensus is that GMOs are not harmful to human health, but the details of how a gene is inserted into the genome of a plant to make it resistant to insects, drought, or herbicides and pesticides requires explaining some complicated genetics,” the Gormans write. “On the other hand, Monsanto [the preferred target of GMO opponents] is a huge company and its business practices are aimed at bigger and bigger profits … Hence, we are swayed by the simple belief that Monsanto has a plausible motive to deceive us.”

In another context, liberal opponents might be expected to “wholeheartedly endorse GMOs as a way of saving the lives of millions of impoverished Africans and Asians,” they write. “But at this point, explaining to them that GMO foods are in fact not dangerous and that without them millions of people stand to suffer diseases and hunger that could be averted has virtually no impact.”

Given their political sympathies, the Gormans themselves might easily feel that way—especially nudged along by another driver of scientific denial: confirmation bias.

To illustrate how this concept works, they offer this telling quote: “I wouldn’t have seen it if I didn’t believe it.”

They attribute the saying to the late baseball great and cultural commentator Yogi Berra—a fount of such quips, many of which he didn’t actually say. That one, in fact, as a Google search reveals, is commonly credited to media guru Marshall McLuhan. Nevertheless, they write: “We love the line, it makes our point well, and we are going to stick to the Yogi attribution no matter what kind of discomforting evidence comes up.”

It’s tempting to dig in your heels when challenged, adds Sara. “I don’t want to change my mind, do you?” They cite research showing that “staying put with a point of view activates the pleasure centers of the brain whereas making a change excites areas of the brain associated with anxiety and even disgust.”

Even when we consciously know a belief is incorrect, it can still affect behavior—as illustrated by a story about Sara in the book. She had gotten a replacement charger for her laptop. The first time she used it, the charge happened to begin registering just as she moved the laptop closer to the electrical outlet—in reality, the new charger simply took a moment to register—and she deduced that the “laptop must need to be a certain distance from the electrical socket for the charger to work.” One day she was interrupted before she had a chance to move the laptop into the required proximity, and it began charging anyway. Despite this proof that the pattern was simple coincidence, she continued her ritual of moving the laptop closer to the outlet for charging.

The book uses this anecdote to introduce one more driver of scientific irrationality: a tendency to ignore the maxim that “correlation is not causation.” “We are primed to appreciate and recognize patterns in our environment,” the Gormans write. “Our desire to attribute causality is strong enough to override even our own conscious rationality.”

The anti-vaccine movement represents the most potent example of this phenomenon in the context of health. “The timing of childhood vaccines and the onset of autism are so close that they can be considered synchronous,” they write. “This synchronicity opens up the potential for coincidence to be interpreted as cause.”

Last summer’s outbreak of measles in the Somali community of Hennepin County, Minnesota, is a recent example of the real-world impact of this view. Eventually, 79 cases were confirmed—more than the total recorded for the entire United States the year before—with the great majority of them involving unvaccinated kids. The episode cost officials about $1 million to contain, and required the intervention of local imams to convince parents to defy the anti-vaxxers who have long targeted this immigrant population.

When it comes to personal or public health decisions, these kinds of events—and the people who propel them—can indeed literally kill us, says Sara. Figures like Andrew Wakefield, the disgraced British physician who falsified data to claim a link between the MMR vaccine and autism; once-respected cancer researcher Peter Duesberg, who later denied that HIV causes AIDS; and gun advocate Wayne LaPierre, executive vice president of the National Rifle Association, have “done tremendous harm. They prey on people who are vulnerable because they’re under stress.”

“They encourage people to doubt science,” Jack adds.

At work here is perhaps the most pernicious anti-science driver: the charismatic leader.

“These people are brilliant at appealing to common shared concerns,” observes Jack. “They can put themselves in the position of being a victim, ‘like you.’ There are a lot of basic us-versus-them psychological mechanisms. As an experiment, I recently, and very painfully, went through the experience of listening to a speech by LaPierre, and I was struck by how seldom he said anything like, ‘Get a gun.’ Instead it was about personal freedom, human rights …”

“Out of context, it would sound perfectly okay,” Sara adds.

“Yes, if he was talking about, say, gay marriage, it would be absolutely palatable for certain listeners,” says Jack.

Such leaders “have a good sense of the psychological features of group formation and identity,” they write, and appeal to what neuroscientists call the “emotional brain.” If someone “can make us sufficiently frightened when he or she first gives us misleading or false information, it may set in motion neural processes that inhibit our ability to correct this initial impression.”


How can scientists combat the misguided beliefs that all these fear-mongers and psychological manipulators give rise to? The Gormans offer several proven psychological strategies. One is motivational interviewing, a technique that teases out a kernel of an idea or behavior that the subject is willing to adopt and builds on it. In the case of vaccine-skeptical parents, for example, that might be the statement, “I want to make sure my children are as protected as possible from preventable infectious diseases.”

Starting with these premises, the process “develops along Socratic lines,” in which the interviewer—whether a clinician, concerned friend, or determined debunker—“assesses at each step what the interviewee knows and has heard about the topic and what he or she wants to know.” The idea is not to lose listeners in a barrage of information—on the genetic and environmental causes of autism, say—that may further frighten or confuse them. “Substituting misinformation with new facts is not guaranteed to change minds,” the Gormans write. “The groundwork first needs to be laid to create a receptive, calm audience that is open to considering new ideas.”

Stats don’t cut it, but stories do. When individuals believe that correlation is causation or give in to confirmation bias, they are relying on the power of a personal narrative that they’ve experienced or heard from a friend. “Stories are easier to understand, and so people are more comfortable with them,” offers Jack.

In Denying, the authors demonstrate the concept by discussing how to counter a powerful anti-vaccine narrative: a mother’s description of the death of her infant and the paralysis of her toddler after they received a series of shots. The story was published on the website of a group called the ThinkTwice Global Vaccine Institute.

“How can we mount a human response … that acknowledges the pain and suffering of the family who lost a child but does not use it to sacrifice the value of vaccination?” the Gormans ask. One could come back with a series of questions that dig deeper into the actual cause of the child’s death, or which look at how often vaccines cause death, but “all of this appeals to the prefrontal cortex rather than the amygdala and probably appears heartless.” Instead, the Gormans suggest telling a story “about a 5-year-old with leukemia who is exposed to the [potentially deadly] measles virus because of contact with an unvaccinated playmate.” While public-health officials may be loath to fight emotional fire with more of the same, “neuroscientists, psychologists and behavioral economists document over and over again that emotional messages carry more valence than fact-based ones,” they write.

Such a strategy can also backfire, though, as political scientist Brendan Nyhan and colleagues demonstrated in a 2014 article in the journal Pediatrics. Their study randomly assigned parents to receive pro-vaccine messages at one of four escalating levels of emotional intensity: facts, risks, narratives, and, lastly, graphic images of desperately sickened children. Surprisingly, the latter two tactics actually resulted in upticks in the percentage of respondents who took a negative view of vaccines.

“When fear-based campaigns in public health don’t work, they can be disastrous,” Sara acknowledges. “You’re approaching someone in a very emotional state, and you may be reactivating those same fears without changing the content that’s underneath the emotion.”

It’s a conundrum that has led the Gormans to focus on better understanding how to craft and deliver successful scientific messages to the public. “There’s not a lot of time and resources devoted to this,” Sara observes. “Public health is an evidence-based field, and we’re advocating systematic research on how people change their minds about health-related decisions.” Their own takes on what works and needs to be done appear in the book’s conclusion. In addition to tactics such as motivational interviewing and recognition of people’s discomfort with facts, causality, and risk perception, they call for a renewed emphasis on better scientific education for children and better training for members of the media. And, perhaps most importantly, they suggest that scientific agencies themselves must be bolder and willing to “join the conversation in a much more active way” to more effectively “anticipate public fears and concerns and take action before incorrect ideas become unchangeable.”

Jack and Sara trade off writing a regular blog for Psychology Today and are currently at work on a follow-up book that will deal more specifically with science literacy and the ways that the science establishment (including science journalists) can enhance it. They have also formed a new company, called Critica—motto: Think Deeply, Think Well—through which they hope to bring some of these suggestions to life. Designed “to develop and test new methods of advancing public acceptance of scientific evidence and promoting informed health decision-making,” as their website puts it, the firm will offer consulting, training, and continuing education for science educators, government officials, journalists, and healthcare professionals.

Last fall, Sara appeared on a panel at the annual meeting of the New Jersey Pharmacists Association, and she also represented the company with a talk at TEDMED in Palm Springs. “[Jack’s] very generous about giving me the spotlight when it comes to speaking or other opportunities,” she says. “He knows that I’m very early in my career.”

In addition to his work at Franklin Behavioral Health Consultants, Jack is also writing a neuroscience text that focuses on how life experiences change the physical structure of the brain, while devoting time to philanthropic causes such as the United Jewish Appeal and advocating action on climate change.

Whenever he reads a scientific article on the latter, he says, “I realize I have to trust the authors, because I can’t really understand how they’re modeling and measuring levels of, say, ocean rise. It’s a very sobering thing—to put yourself in the position of a non-scientist who’s trying to figure out something.”

And that’s the point, adds Sara. “This isn’t about ‘those people,’” she says. “Everyone has blind spots. It doesn’t matter how educated you are—it comes down to psychological factors.”


JoAnn Greco writes frequently for the Gazette.
