When Kevin Esvelt, an evolutionary biologist at MIT, started thinking about using genetically engineered mice to fight Lyme disease, among his first stops was a community meeting in the small Martha’s Vineyard town of Chilmark. Esvelt makes regular field trips to talk to the public about his work. If the potential of tools like CRISPR to solve the problems of disease, hunger and environmental catastrophe is ever to be realized, he reasons, first the public will have to be convinced it is not about to usher in the apocalypse.
As technologies like genetic engineering are increasingly poised to alter the world we live in, scientists like Esvelt are finding that public outreach is as much a part of their job as the science itself.
A new study, though, casts doubt on whether more information about science can really change someone’s mind. If a person is already predisposed to disbelieve scientific evidence about topics such as human evolution, climate change or stem cell research because of religious or political views, a study by social scientists at Carnegie Mellon published in the Proceedings of the National Academy of Sciences found, learning more about the subject may actually increase their disbelief.
This, of course, is a major challenge to the idea that education conquers all.
“We find that beliefs are correlated with both political and religious identity for stem cell research, the Big Bang, and human evolution, and with political identity alone on climate change,” the researchers wrote. “Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues.”
To judge how “educated” someone was, researchers looked at markers such as the number of years in school, highest degrees earned, aptitude on general science facts and the number of science classes someone had taken.
There was some good news: those already predisposed to trust peer-reviewed science tend to keep believing it. And two topics the researchers looked at—GMOs and nanotechnology—appeared unaffected by political or religious belief.
In a highly politicized world where fake science news can gain as much traction as the real deal, scientists have increasingly focused on understanding why people don’t accept the findings of scientific experts on controversial issues.
“Science denial, as a behavior rather than a label, is a consequential and not-to-be ignored part of society,” John Cook, a cognitive scientist at George Mason University, wrote earlier this year in the National Review. “When people ignore important messages from science, the consequences can be dire.”
We have recently seen those dire consequences unfold in the US. Though science clearly shows that vaccines are safe, anti-vaccine sentiment has spurred major measles outbreaks in Minnesota. In Texas, that same sentiment has begun to take hold in the state legislature. (Not to mention the White House.)
In Key West, though scientists flew in to community meetings to help explain the science behind a proposal to release genetically modified, Zika-fighting mosquitoes in the area, the community ultimately struck down the proposal despite an ongoing Zika crisis, amid swirling rumors that the mosquitoes might result in undesired effects such as sterile children.
Science already tells us that the opinions of the public are often divorced from those of scientists. According to a Pew Research Center survey in 2015, while 88% of scientists believed that it is safe to eat genetically modified foods, only 37% of the public did. Likewise, 87% of scientists believed the planet is getting warmer because of human activity, while only 50% of the public did. The new study, unfortunately, doesn’t shed light on why rational discussion so often fails to drive home the truth.
What all this means is that scientists like Esvelt have a much harder job than we’d ever realized—they must appeal not just to our rational selves, but to our emotional, political and religious ideologies in arguing for progress.