Research article

Public Understanding of Science, 1–19
© The Author(s) 2020
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/0963662520977700
journals.sagepub.com/home/pus

Knowledge about the nature of science increases public acceptance of science regardless of identity factors

Deena Skolnick Weisberg


Villanova University, USA

Asheley R. Landrum
Texas Tech University, USA

Jesse Hamilton and Michael Weisberg


University of Pennsylvania, USA

Abstract
While people’s views about science are related to identity factors (e.g. political orientation) and to knowledge
of scientific theories, knowledge about how science works in general also plays an important role. To test
this claim, we administered two detailed assessments about the practices of science to a demographically
representative sample of the US public (N = 1500), along with questions about the acceptance of evolution,
climate change, and vaccines. Participants’ political and religious views predicted their acceptance of scientific
claims, as in prior work. But a greater knowledge of the nature of science and a more mature view of how
to mitigate scientific disagreements each related positively to acceptance. Importantly, the positive effect
of scientific thinking on acceptance held regardless of participants’ political ideology or religiosity. Increased
attention to developing people’s knowledge of how science works could thus help to combat resistance to
scientific claims across the political and religious spectrum.

Keywords
climate change, epistemological style, evolution, nature of science, philosophy of science, public
understanding of science

Corresponding author:
Deena Skolnick Weisberg, Department of Psychological and Brain Sciences, Villanova University, 800 East Lancaster
Avenue, Villanova, PA 19085, USA.
Email: deena.weisberg@villanova.edu

Some scientific claims are controversial among members of the public, especially in the United
States. For example, despite overwhelming scientific consensus that evolutionary theory provides
the best explanation for the origin and development of species, at least a quarter of the US public
rejects this explanation (Smith and Son, 2013; Weisberg et al., 2018). In addition, there is a 37-point
gap between members of the US public and members of the scientific community in terms of their
acceptance of anthropogenic climate change (Pew Research Center, 2015). The same pattern can
be seen with respect to vaccine safety (Pew Research Center, 2015; Villa, 2019).
Public resistance to well-established scientific claims is troubling not simply from an epistemic
point of view; vaccine non-compliance, for example, has resulted in many deaths, and failure to
acknowledge climate change can lead to similarly dire results. The recent (lack of) response to
COVID-19 from members of the public and political leaders also underscores the ways in which
science denial can have deadly consequences. It is thus imperative to identify the reasons behind
people’s resistance to science and to uncover effective ways to combat it.

1. Sources of resistance
People’s opposition to science is often associated with identity factors, like political affiliation or
religious identity (McPhetres and Zuckerman, 2018; Rutjens et al., 2018). For example, individu-
als who are more politically conservative and more religious tend to reject evolution at higher rates
than average (Swift, 2017; Weisberg et al., 2018). Similarly, political conservatives are more likely
to deny climate change (Brenan and Saad, 2018). Vaccine safety denial is similarly tied to identity
factors, though these deniers hold extreme views at both ends of the political (Baumgaertner et al.,
2018; Berezow and Campbell, 2012) and religious (Kennedy, 2017) spectrums.
These links between identity factors and science acceptance may occur because people’s
responses to survey questions about scientific issues are subject to pressure from their group
affiliations (e.g. Douglas and Wildavsky, 1982; Kahan et al., 2011; Lewandowsky et al., 2013).
For example, individuals who are more religious know that religion is seen as conflicting with
evolution. So they may report that they do not accept evolution when asked, regardless of their
personal views, because this response conforms to the views of their religious community. If
people are sensitive only to the norms of their community when deciding what they should
believe, then their knowledge about science plays little role, if any. In support of this argument,
some previous work has found that knowledge of evolutionary theory does not relate to accept-
ance of it (e.g. Bishop and Anderson, 1990; Lawson and Worsnop, 1992; Shtulman, 2006). Simply
teaching people the science behind evolutionary theory, anthropogenic climate change, or vaccine
safety, then, will not increase acceptance.
But there is a reason to question this skeptical conclusion. Prior work has found positive correla-
tions between individuals’ knowledge of particular scientific theories and their acceptance of these
theories (McPhetres et al., 2019; Weisberg et al., 2018). Furthermore, some studies have demon-
strated that teaching the mechanisms of climate change (Ranney and Clark, 2016) and evolution
(Ingram and Nelson, 2006; Lawson and Weser, 1990; Shtulman and Calabi, 2013) can increase
acceptance of these theories. So, people’s knowledge about particular scientific theories, not just
their identities, can matter to science acceptance.
However, much of this work has examined only how people’s knowledge about a scientific
theory affects their acceptance of that particular theory. An educational strategy based on these
results would thus involve teaching each theory separately. While it would be beneficial for mem-
bers of the public to gain this knowledge, this strategy might not be the most efficient way to
combat resistance to science. After all, it is impossible to know what sort of science knowledge
may be relevant in the future, as science denial related to the current pandemic has demonstrated.

2. Measuring people’s knowledge about the nature of science


A more promising approach may thus be to focus on people’s knowledge of general science facts
and of the processes and practices of science, known as the Nature of Science (NoS). In general,
people who have higher NoS knowledge may be in a better position to understand the connection
between scientific practices and the generation of knowledge (Nelson et al., 2019; Thanukos and
Scotchmoor, 2012) or the role that the scientific community plays (Slater et al., 2019). This
knowledge may allow for more robust acceptance of the scientific consensus or greater trust in
scientific claims.
Indeed, several prior studies have found such a connection: People who perform better on tests
of general science knowledge and reasoning are overall more likely to accept scientific claims,
such as those about climate change and evolution (Lombrozo et al., 2008; McPhetres and
Zuckerman, 2018; Rutjens et al., 2018; Weisberg et al., 2018). This connection presents a poten-
tially fruitful avenue for intervention: Imparting a broad scientific knowledge base could poten-
tially lead to greater acceptance of a variety of scientific claims without needing to address each of
those claims directly. Furthermore, teaching about science in general may not encounter the kind
of reflexive resistance that teaching directly about a specific controversial claim would.
But this connection between NoS knowledge and acceptance of scientific claims is more com-
plex than it first appears. For example, the Ordinary Science Intelligence Scale (OSI) (Kahan,
2017) is a commonly used measure of NoS. It includes a subset of basic science knowledge
questions (e.g. electrons are smaller than atoms) drawn from the National Science Foundation
(NSF) Science and Engineering Indicators (National Science Board, 2016). The OSI additionally
includes measures of numeracy (e.g. how to express numerically a 1% chance of winning a prize
if 1000 people enter a lottery), and measures of cognitive reflection (Frederick, 2005). While those
who score highly on the OSI tend to accept climate change, in line with the work reviewed above,
those who are additionally politically conservative are less likely to accept it (Kahan, 2015; Kahan
et al., 2012). That is, greater knowledge about aspects of science actually corresponds to greater
polarization in people’s views (see also Drummond and Fischhoff, 2017; Hamilton, 2011).
Why might this be the case? While greater NoS knowledge may simply lead to greater polariza-
tion overall, it is important to note that some extant measures of NoS only ask people about scien-
tific facts, hence may not capture the most probative aspects of NoS. For example, prior work has
sometimes used questions about people’s knowledge of scientific facts from the General Social
Survey (e.g. True or false: More than half of human genes are identical to those of mice). While
people should know these important science facts, they are simply facts, which one could memo-
rize without really understanding. This could explain why greater knowledge of such facts does not
always lead to better acceptance of scientific claims.
Other measures, including the OSI, do present questions that gauge people’s knowledge of the
methods of science (e.g. the need for a control group) or people’s general thinking skills (e.g. gen-
eral numeracy). But these measures do not tend to capture people’s knowledge of how science
generates knowledge or how scientists carry out their work. We hypothesize that those aspects of
NoS are particularly important predictors of acceptance of scientific claims, since knowledge
about the way in which science works may be necessary for making productive connections between
scientific claims and the process of generating and validating those claims. Conversely, individuals
who have poor knowledge about how science works (e.g. believing that there is only a single scientific
method that must be followed rigidly, like a recipe) may thereby fail to recognize that scientific theories
are reliable, valid, and supported by multiple converging lines of evidence. This hypothesis about
the connection between knowledge about the processes and practices of science and acceptance of
scientific claims has so far not been fully tested on a demographically representative sample in the
United States. This study aims to fill this gap. Specifically, we test whether greater knowledge
about the aspects of the nature of science (NoS) described earlier could benefit people’s acceptance
of several publicly controversial scientific claims (evolution, anthropogenic climate change, and
vaccine safety), regardless of their political or religious views.
Furthermore, and crucially, none of this prior work has directly examined how people think
about scientific disagreements and their style of resolving such disagreements. Given that public
discourse about scientific topics is often framed as debates between opposing sides, gaining an
understanding of how members of the public conceptualize these debates is vital. Specifically, indi-
viduals who see the debates as being completely black and white—one side must be incorrect if the
other side is correct—may fail to accept scientific claims that seem controversial because they lack
knowledge about how such claims can be both well-supported and defeasible, or about how differ-
ent interpretations of evidence could possibly be valid.
This study addresses these issues by using two new measures of people’s knowledge about these
aspects of the NoS. One presents a series of 20 statements (e.g. “The process of science is nonlin-
ear; each step can lead to many possible next steps.”) for which participants rate their agreement.
These statements all focus on some aspect of how science is practiced or how theories are devel-
oped, rather than on knowledge of particular scientific facts or general thinking abilities.
Our second measure gauges participants’ epistemological styles. We present a brief vignette
about a scientific disagreement, in which two groups of scientists investigated developmental
deformities in a population of frogs (adapted from Barzilai and Weinstock, 2015). Some scientists
had evidence that these deformities were caused by cysts in the leg area, while other scientists had
evidence that these deformities were caused by chemicals in the water. Crucially, we do not ask our
participants to judge which group was correct. Instead, we ask what it would take to decide which
group was correct.
We do so by presenting four questions about this disagreement (e.g. “Can one know for certain
what happened to the frogs?”). For each question, participants are asked to rate their level of agree-
ment with three possible answers: “Yes. If the topic were to be investigated further, one could know
for certain,” “No. Even if the topic were investigated further, one could never know for certain
because it is not possible to observe what really happened,” and “Maybe. If the topic were to be
investigated further, one could not be completely certain, but one could make a reasonable esti-
mate.” Participants rate their degree of agreement with each of these answers independently, allow-
ing us to determine how much each participant’s views align with each type of claim.
These three answers are designed to capture different epistemic styles (Kuhn et al., 2000, 2008).
The first is absolutism, the idea that knowledge is objectively true and can be straightforwardly
obtained from observation of the world. The second is multiplism (or relativism), the idea that
knowledge is subjective because it is generated by human minds, hence any scientific view is just
as valid as any other. The third is evaluativism, the idea that any body of scientific knowledge has
degrees of certainty, hence scientific claims must undergo a continual process of evaluation in light
of other knowledge and theories. Prior work has found this third idea is the most mature and is
related to better performance on reasoning tasks for both adults and children (Kuhn et al., 2008;
Thomm et al., 2017; Walker et al., 2012).
To our knowledge, these are the most nuanced measures of scientific thinking that have been
presented to a representative sample of the US public, allowing us to fully explore the possibility
that a general knowledge of how science works might provide an avenue toward reducing public
rejection of well-supported scientific claims. It is important to note that, in contrast to prior work
in this area, these two new measures do not test individuals’ knowledge of any particular scientific
theory or claim, or of any discrete facts. Rather, these measures examine higher order knowledge
about how science works and how scientists should resolve disputes. As such, this study provides
a first window into how knowledge of the processes and practices of science could mitigate iden-
tity-based resistance to specific scientific claims, regardless of one’s knowledge about the science
underlying the claims themselves.
Based on prior work, we predict that more politically conservative participants and more
religious participants will tend to reject publicly controversial scientific claims. We additionally
predict that both of our new measures of the NoS will be related to acceptance of scientific theo-
ries. Specifically, participants who have a better knowledge of the nature of theories and partici-
pants who have a more evaluativist style will be more likely to accept these theories. Finally, and
most importantly, we predict that the relationships between politics and acceptance and between
religion and acceptance will be moderated by participants’ NoS knowledge and by their episte-
mological styles. Participants’ knowledge about how science is practiced should provide protec-
tion against the effects of their ideology, making them less polarized in their views about
scientific claims.

3. Methods
We pre-registered our hypotheses with respect to evolutionary theory (https://osf.io/y6amz); par-
enthetical numbering throughout the article refers to the measures, hypotheses, and analyses regis-
tered in that document. We consider our investigations with respect to climate change and vaccines
exploratory.1

Participants
We contracted with YouGov, a survey firm, to recruit a sample of 1500 participants. This sample
size represents the maximum number of participants YouGov could provide to us based on our
available funding. The final sample included 811 women (54%) and 689 men (46%). Participants’
average age was 50 years (SD = 16.4 years, range = 18–92 years).
YouGov originally administered our survey to 1611 participants drawn from their standing pan-
els. Then, blind to our hypotheses, they selected our 1500 final participants by matching cases to a
sampling frame on the variables of gender, age, race, education, political party identification, polit-
ical ideology, and political interest. This sampling frame was constructed from the 2010 American
Community Survey, the November 2010 Current Population Survey, and the 2007 Pew Religious
Life Survey.

Measures
Participants completed 13 measures (pre-registration section 11). In this article, we focus on the
following five: evolution acceptance (11.11), climate change acceptance (11.5), vaccine accept-
ance (11.7), knowledge of the NoS (11.1), and epistemic thinking style (11.2). The remaining eight
measures asked about participants’ views on the consequences of accepting the theory of evolution,
the conflict between science and religion, the appropriate expert (scientific, religious, or other) to
consult about a set of questions, the reasons that they might accept a claim, the reasons that they
might consider a particular claim to be true, whether they identify as a particular kind of person
(e.g. a science person, an arts person), and knowledge of evolutionary theory. See the Supplemental
Materials or the pre-registration site (https://osf.io/y6amz) for the exact text of all questions. As
some of these measures were drawn from prior work, we followed those studies’ practices with
respect to randomization or counterbalancing of questions and answer options within each block.

Evolution acceptance. This multiple-choice question asked participants to choose which of the fol-
lowing options best describes how they think animals and plants (n = 740) or human beings (n =
760) came to exist on earth: (a) they were created by God in more or less their current form (crea-
tionist), (b) they developed through natural processes, which were guided by God the entire time
(theist), (c) they developed through natural processes, which were set up by God but continued on
their own (deist), or (d) they developed entirely through natural processes (naturalist). These
answer options appeared in this order for all participants, following previous surveys of evolution
acceptance that also presented the choices as decreasing in their level of supernatural involvement
(e.g. the Gallup poll). Although three of these four answer options explicitly reference God, we
based the wording of this question on previous measures of public acceptance of evolution and on
our own piloting, both of which found that a majority of participants assent to God having some
involvement in the process of evolution.

Climate change acceptance. We presented three questions about climate change (based on wording
used by the Pew Research Center). All participants first responded to the question, “From what
you’ve read and heard, is there solid evidence that the average temperature on earth has been get-
ting warmer over the past few decades?” This question had three answer choices: “Yes, there is
solid evidence that the earth is getting warmer”; “No, there is no solid evidence that the earth is
getting warmer”; and “Don’t know.”
Participants who responded “yes” to this first question were then asked to choose the primary
cause of climate change from the following options: “Human activity such as burning fossil fuels”;
“Natural patterns in the earth’s environment”; and “Don’t know.” Participants who responded “no”
to the first question were asked to choose whether we just do not know enough yet about whether
the earth is getting warmer or whether the earth just is not getting warmer, or they could indicate that
they did not know. For all three questions, the first two options were presented in a random order.

Vaccine acceptance. We presented participants with a single question asking whether childhood
vaccines were safe. There were four answer options: “very safe,” “somewhat safe,” “not very
safe,” and “not safe at all.” As with the evolution acceptance question, these options were always
presented in the same order, since they have a natural logical progression.

Nature of science index. We presented participants with 20 statements about the practice of science
and the nature of scientific theories (Hofer, 2000; Liang et al., 2006; Lombrozo et al., 2008; Schom-
mer, 1990).2 For example, “The same hypothesis or theory is often tested in many different ways”
and “Scientific theories are just scientists’ guesses” (reverse-scored). For each statement, partici-
pants rated their level of agreement on a five-point scale: strongly disagree, disagree, unsure, agree,
and strongly agree. The order of the 20 statements was randomized between participants.

Epistemic thinking style. This measure (based on Barzilai and Weinstock, 2015) first presented par-
ticipants with a description of two theories about the causes of deformities in a population of frogs.
This description was followed by four general questions about the nature of knowledge and how
knowledge should be justified (e.g. “Must there be only one true explanation about the deformed
frogs?”). These four questions were presented in a random order for each participant.
Each of these four questions had three possible answers, and these answers each reflected one
of the three epistemological styles: absolutism, multiplism, and evaluativism. Participants used a
scale from 1 (“very much disagree”) to 10 (“very much agree”) to rate their agreement with each
statement. As in Barzilai and Weinstock (2015), these three statements were presented in a random
order for each participant, and each statement was presented on a separate page in order to encour-
age participants to respond independently to each.

Procedure
Participants completed the survey online. Each of the 12 measures appeared as its own block in the
survey. These blocks were presented in a random order except for the block about evolution accept-
ance, which always appeared last. This was done so as not to bias responses to our set of questions
about evolution knowledge, which was a focus of our pre-registered analyses.
Questions about demographic factors (e.g. age, gender, political orientation) were presented on
their own in a separate testing session before participants engaged in this survey. Responses from
that session were used to construct a demographically representative sample, as noted earlier.

4. Results
Coding and descriptive statistics
Evolution acceptance. We first tested whether there were differences in responses to the “humans”
and “plants and animals” versions of this question. We found overall differences in the distribu-
tion of responses among our four acceptance categories (χ2(3) = 20.09, p < .001). Specifically,
the “humans” wording of the question received significantly more creationist responses than the
“plants and animals” version (36.6% vs 26.4%; exact proportions test p < .001), significantly
fewer deistic responses (17.4% vs 22.6%; exact proportions test p < .001), and marginally fewer
naturalistic responses (30.9% vs 35.7%; exact proportions test p = .057). This aligns with other
work showing that different versions of an acceptance question can yield different responses,
particularly when comparing humans to other living things (Maitland et al., 2014; Miller et al.,
2006). Given that participants responded generally similarly to the two versions of the question,
and for ease of interpretation, we combined their responses into a single acceptance measure for
our main analyses.
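To illustrate how such a wording comparison can be run, here is a minimal R sketch. The counts are hypothetical values chosen only to approximate the percentages reported above; they are not the study's raw data, and the exact test used here (Fisher's) is one reasonable stand-in for the exact proportions tests reported.

```r
# Hypothetical counts that only approximate the reported percentages
# (n = 760 "humans", n = 740 "plants and animals"); not the study's actual data.
humans <- c(creationist = 278, theist = 115, deist = 132, naturalist = 235)
plants <- c(creationist = 195, theist = 114, deist = 167, naturalist = 264)

# Overall difference in the distribution of responses across the four categories
chisq.test(rbind(humans, plants))

# Comparison of one category (creationist responses) between the two wordings,
# using an exact test on the corresponding 2 x 2 table
fisher.test(matrix(c(humans["creationist"], 760 - humans["creationist"],
                     plants["creationist"], 740 - plants["creationist"]),
                   nrow = 2, byrow = TRUE))
```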
We found that 31.5% of participants agreed with the creationist option, 15.3% of participants
agreed with the theist option, 20.0% of participants agreed with the deist option, and 33.3% of
participants agreed with the naturalistic option.
For our main regression analyses, we split the four response options to this question into two
categories: “created by God” and “guided by God” were coded as “leans creationist” (0) and “set
up by God” and “natural processes” were coded as “leans evolutionist” (1). We pre-registered the
analysis using this coding strategy because these pairs of categories tended to cohere and because
the binary outcome variable is easier to interpret than the relational outcomes. Overall, 53.3% of
our participants were categorized as leaning evolutionist.
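As a concrete illustration of this coding step, the following R sketch collapses the four response options into the binary variable; the labels are illustrative, not the authors' actual variable names.

```r
# Illustrative labels for the four response options described above
evolution_response <- c("creationist", "theist", "deist", "naturalist", "theist")

# "created by God" (creationist) and "guided by God" (theist) -> 0, leans creationist;
# "set up by God" (deist) and "natural processes" (naturalist) -> 1, leans evolutionist
leans_evolutionist <- as.integer(evolution_response %in% c("deist", "naturalist"))

table(leans_evolutionist)
```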

Climate change acceptance. We used responses to the three climate change questions to construct a
scale of responses (1–7), with higher numbers indicating closer agreement to the scientific consen-
sus of anthropogenic climate change. Participants who responded “yes” to the first question (i.e.
there is solid evidence that the Earth is getting warmer) and then attributed that fact to human activ-
ity were assigned the highest score (7) since this matches the scientific consensus. Participants who
responded “yes” to the first question and then attributed that fact to natural patterns were assigned
a score of 6. Participants who responded “yes” to the first question but then said they did not know
why this was happening were assigned a score of 5. Participants who responded “don’t know” to the
first question were coded as 4. Participants who responded “no” to the first question (i.e. there is
no solid evidence that the Earth is getting warmer) but then said they did not know why were
assigned a score of 3. Participants who responded “no” to the first question and then said that we
just do not know enough yet were assigned a score of 2. Participants who responded “no” to the
first question and then said that the Earth just is not getting warmer were assigned a score of 1.

The average overall score on this scale was 5.1 (SD = 2.1), significantly higher than the mid-
point of the scale (4; t(1499) = 20.6, p < .001). We found that 62.5% of our population said that
they accepted climate change, indicating that a majority of US citizens agree with the scientific
consensus on this issue. These numbers are generally in line with reports from other polls: Gallup,
for example, finds that 66% of the population accepts that climate change is happening (Brenan and
Saad, 2018; see also Leiserowitz et al., 2010). In addition, 45.9% of our population agreed that
climate change is caused by human activity.
As with evolution acceptance, we created a binary acceptance variable to use in our main analy-
ses. Participants who responded “yes” to the first question (i.e. there is solid evidence that the Earth
is getting warmer, in agreement with the scientific consensus) were assigned a score of 1. All other
participants were assigned a score of 0.
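Both coding steps for climate change (the 1 to 7 consensus scale and the binary acceptance variable) can be summarized in a short R sketch; the response labels are assumptions about how the raw answers might be stored, not the authors' actual code.

```r
# 1-7 scale: higher scores indicate closer agreement with the scientific consensus
code_climate <- function(evidence, followup) {
  if (evidence == "yes") {                        # solid evidence the earth is warming
    if (followup == "human activity")   return(7)
    if (followup == "natural patterns") return(6)
    return(5)                                     # "don't know" why it is warming
  }
  if (evidence == "don't know") return(4)
  # evidence == "no": no solid evidence of warming
  if (followup == "don't know")             return(3)
  if (followup == "do not know enough yet") return(2)
  return(1)                                       # "the earth just is not getting warmer"
}

code_climate("yes", "human activity")          # 7, matches the scientific consensus
code_climate("no",  "do not know enough yet")  # 2

# Binary acceptance variable: 1 if the participant said there is solid evidence of warming
climate_binary <- function(evidence) as.integer(evidence == "yes")
```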

Vaccine acceptance. We coded participants’ responses so that higher scores were assigned to
answers that more closely reflected the scientific consensus (as we did with climate change).
Responses of “not safe at all” were coded 1, “not very safe” were coded 2, “somewhat safe” were
coded 3, and “very safe” were coded 4. One participant who skipped this question was removed
from analyses that used this scale.
Overall, 88.9% of our participants accepted the safety of vaccines, with 53.1% of all partici-
pants saying that vaccines were very safe. These numbers align well with a 2016 Pew poll, in
which 88% of US respondents judged vaccines to be safe (Villa, 2019). The average score on our
acceptance scale for vaccines was 3.38 (SD = 0.79), which is significantly higher than the mid-
point of the scale (2.5; t(1498) = 42.99, p < .001).
We again created a binary variable for analysis, with scores of 1 and 2 reflecting overall non-
acceptance (coded 0) and scores of 3 and 4 reflecting overall acceptance (coded 1).

Nature of science index. We converted participants’ responses to the 20 items into a scale from 1
(strongly disagree) to 5 (strongly agree). Eight of the items were reverse coded. Each participant’s
responses to these 20 items were averaged into a single score, with higher numbers reflecting a
greater knowledge about NoS. We examined the dimensionality of this index using the nFactors package
in R. An analysis of eigenvalues, a parallel analysis, and an optimal coordinates analysis suggested a
three-factor solution (see Table 1), whereas the acceleration factor suggested a one-factor solution.
Next, we conducted an exploratory factor analysis specifying three factors using the psych package
in R. Examining the factor loadings (see Table 1) suggests that the factors correspond to (1) a rec-
ognition that science is an ongoing and potentially nonlinear process (11 items), (2) (not) viewing
science as a set, stable method (seven items), and (3) (not) dismissing science as mere guesswork
(two items).
It is worth noting that although three factors were suggested, the factors appeared to break down
based on whether the items were reverse coded. Given this and the fact that internal consistency for
the entire index was strong (Cronbach’s alpha = .84), we chose to retain all items for analyses.
Average score on this index was 3.73 (SD = 0.45). Although this is significantly above the midpoint
of the index (2.5; t(1499) = 106.66, p < .001), the distribution of responses is roughly normal.
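A minimal R sketch of this scoring and of the dimensionality checks, using simulated ratings, might look like the following. The nFactors and psych packages are the ones named above, but the item names, the choice of reverse-coded items, and the simulated data are assumptions for illustration only.

```r
library(nFactors)
library(psych)

set.seed(1)
nos <- as.data.frame(matrix(sample(1:5, 1500 * 20, replace = TRUE), ncol = 20))
names(nos) <- paste0("item", 1:20)

reverse_items <- paste0("item", c(1, 2, 4, 8, 10, 14, 18, 20))  # eight reverse-coded items (illustrative)
nos[reverse_items] <- 6 - nos[reverse_items]                    # flip the 1-5 agreement scale

nos_score <- rowMeans(nos)  # one score per participant; higher = greater NoS knowledge

# Eigenvalues, parallel analysis, optimal coordinates, and acceleration factor
ev <- eigen(cor(nos))$values
ap <- parallel(subject = nrow(nos), var = ncol(nos))
print(nScree(x = ev, aparallel = ap$eigen$qevpea))

# Exploratory factor analysis with three factors, and internal consistency for the full index
print(fa(nos, nfactors = 3))
print(alpha(nos))  # Cronbach's alpha
```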

Table 1. Factor loadings for the Nature of Science index. Loadings below 0.30 are not displayed.

| Item | Statement | Factor 1 | Factor 2 | Factor 3 |
|---|---|---|---|---|
| NOS3 | To be accepted, scientific theories must be supported by much evidence. | 0.59 | | |
| NOS7 | The same hypothesis or theory is often tested in many different ways. | 0.67 | | |
| NOS9 | The process of science is nonlinear; each step can lead to many possible next steps. | 0.66 | | |
| NOS11 | Scientific knowledge is built through a complex process that relies, in part, on observations of nature. | 0.63 | | |
| NOS13 | Scientific theories are subject to ongoing testing and revision. | 0.72 | | |
| NOS15 | Scientific investigations usually lead to additional questions for further investigation. | 0.74 | | |
| NOS17 | An important aim of scientific testing is to figure out which explanation for a phenomenon is most likely to be correct. | 0.58 | | |
| NOS19 | The scientific community is essential to the process and progress of science. | 0.56 | | |
| DEF2 | Answers to questions can change as experts gather more information. | 0.71 | | |
| AUTH1 | It is good to question the ideas presented by others. | 0.66 | | |
| NOS5 | Accepted scientific theories are well-supported explanations for a broad set of natural phenomena. | 0.42 | 0.41 | |
| NOS1.rc | Once a scientific theory has been established, it is never changed. | | 0.67 | |
| NOS8.rc | Scientific investigations always require laboratory experiments. | | 0.52 | |
| NOS14.rc | Scientific theories based on accurate experimentation will not be changed. | | 0.57 | |
| DEF4.rc | All researchers always come up with the same answer to a question. | | 0.64 | |
| DEF1 | Scientists can ultimately get to the truth. | −0.38 | 0.39 | |
| NOS10.rc | Scientific research is always conducted in the following order: (1) Observation, (2) Hypothesis, (3) Experiment, (4) Conclusion. | −0.32 | 0.38 | |
| NOS18.rc | Data collected for one experiment can only be used for that experiment. | | 0.43 | |
| NOS2.rc | Scientific theories are just scientists’ guesses. | | | 0.74 |
| NOS4.rc | New hypotheses are basically wild guesses; scientists just dream them up. | | | 0.67 |
| Eigenvalues | | 6.11 | 2.68 | 1.65 |

Epistemic thinking style. This measure asked participants their level of agreement with 12 statements, four for each of the three epistemic thinking styles (absolutist, multiplist, and evaluativist). Each participant’s responses to the four statements reflecting each style were averaged together, creating three scores per participant. Following the study on which this measure was based (Barzilai and Weinstock, 2015), these averages ranged from 1 to 10, with higher numbers reflecting a greater degree of agreement with each thinking style. One participant failed to respond to one of the evaluativism items, so this missing value was filled in with the mean of the sample for that question before constructing this participant’s evaluativism summary score.
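For concreteness, a sketch of this scoring step in R (with illustrative column names and simulated ratings) is shown below; the single missing evaluativism rating is replaced by the sample mean for that item before averaging.

```r
set.seed(2)
ratings <- as.data.frame(matrix(sample(1:10, 1500 * 12, replace = TRUE), ncol = 12))
names(ratings) <- c(paste0("abs", 1:4), paste0("mult", 1:4), paste0("eval", 1:4))
ratings$eval3[42] <- NA                                  # one missing evaluativism response

ratings$eval3[is.na(ratings$eval3)] <- mean(ratings$eval3, na.rm = TRUE)  # mean imputation

# Average the four items per style, yielding three scores per participant
absolutism   <- rowMeans(ratings[paste0("abs",  1:4)])
multiplism   <- rowMeans(ratings[paste0("mult", 1:4)])
evaluativism <- rowMeans(ratings[paste0("eval", 1:4)])
```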
Overall scores on the absolutism scale (M = 6.55, SD = 1.47) and on the evaluativism scale
(M = 7.01, SD = 1.45) were significantly above the midpoint of the scale (5.5; t(1499) = 27.55,
p < .001 and t(1499) = 40.17, p < .001, respectively). Scores on the multiplism scale (M = 4.81,
SD = 1.80) were significantly below the midpoint (t(1499) = −14.78, p < .001). Cronbach’s alpha for the absolutism scale was .52, for the multiplism scale was .66, and for the evaluativism scale was .62. Although these alpha values are somewhat low, we chose to use these scales as they were presented in order to remain consistent with prior work using this measure.

Table 2. Zero-order correlations among all variables.

| | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| 1. Evolution acceptance | X | | | | | | | |
| 2. Climate change acceptance | 0.27** | X | | | | | | |
| 3. Vaccine acceptance | 0.11** | 0.21** | X | | | | | |
| 4. NoS index | 0.33** | 0.33** | 0.30** | X | | | | |
| 5. Absolutism | −0.01 | 0.13** | 0.09** | 0.06* | X | | | |
| 6. Multiplism | −0.28** | −0.16** | −0.09** | −0.49** | 0.08** | X | | |
| 7. Evaluativism | 0.12** | 0.19** | 0.20** | 0.29** | 0.41** | 0.19** | X | |
| 8. Political orientation | −0.42** | −0.54** | −0.19** | −0.32** | −0.08** | 0.24** | −0.17** | X |
| 9. Religiosity | −0.63** | −0.26** | −0.08** | −0.23** | −0.02 | 0.23** | −0.06* | 0.42** |

*p < .05; **p < .01.

Political orientation. Participants were asked to rate their ideology on a five-point scale, which we
scored from 1 (“very liberal”) to 5 (“very conservative”). There were 162 participants who
responded “not sure”; their data were removed from analyses involving this scale. We found that
13.1% of our participants reported being very liberal, 16.7% liberal, 33.2% moderate, 23.5% con-
servative, and 13.5% very conservative. The average overall score on this scale was 3.08 (SD =
1.21), significantly higher (i.e. more conservative) than the midpoint of the scale (t(1337) = 2.29,
p = .02).
For data visualization (although not for analyses), we transformed this into a three-point scale
by labeling participants who responded “very liberal” or “liberal” as liberal and participants who
responded “conservative” or “very conservative” as conservative.

Religiosity. Participants responded to three questions from the Pew Religious Life battery, which
asked about their frequency of attendance at religious services, the importance of religion in their
lives, and their frequency of prayer. Responses to these three items were made on different scales.
To combine them into a single scale, we first normalized the scale for each item. Then we averaged
these scores together and normalized this composite scale, which we used in our analyses. Because
we normalized the scale, the average was 0 and SD was 1.
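The construction of this composite can be sketched in a few lines of R; the item names and response ranges below are assumptions for illustration, not the Pew items' actual codings.

```r
set.seed(3)
religion <- data.frame(
  attendance = sample(1:6, 1500, replace = TRUE),  # frequency of attending religious services
  importance = sample(1:4, 1500, replace = TRUE),  # importance of religion in one's life
  prayer     = sample(1:7, 1500, replace = TRUE)   # frequency of prayer
)

item_z      <- scale(religion)                      # normalize each item separately
religiosity <- as.numeric(scale(rowMeans(item_z)))  # average the items, then normalize the composite

c(mean = mean(religiosity), sd = sd(religiosity))   # approximately 0 and 1 by construction
```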
For data visualization only, we split this scale into three groups, with participants responding
more than one SD above M being labeled “highly religious” (25% of the sample), participants
responding between one SD above and below M being labeled “average religious” (50% of the
sample), and participants responding more than one SD below M being labeled “low religious”
(25% of the sample).
Table 2 provides a correlation matrix for all variables.

Acceptance of evolutionary theory (pre-registered analyses)


Individual predictors. We examined the likelihood that participants leaned evolutionist conditional
on their political ideology, religiosity, knowledge of the NoS, and their epistemic thinking styles
(absolutism, multiplism, and evaluativism scores). Separate logistic regression analyses were
conducted to characterize the relationship between each variable and evolution acceptance (see
sections 4.1.6, 4.2, and 17.2 in our pre-registration).
As predicted, conservative political ideology (b = −0.66, p < .001; hypothesis 4.2.2.3) and
greater religiosity (b = −1.39, p < .001; hypothesis 4.2.2.2) were associated with a decreasing like-
lihood of leaning evolutionist. Greater NoS knowledge was associated with an increasing likelihood
of leaning evolutionist (b = 1.61, p < .001; hypothesis 4.2.1.4). Increasing evaluativism predicted
a greater likelihood of leaning evolutionist (b = 0.14, p < .001; hypothesis 4.2.1.8), whereas
increasing multiplism predicted a lower likelihood of leaning evolutionist (b = −0.33, p < .001;
hypothesis 4.2.1.7). Contrary to our predictions, absolutism was unrelated to whether participants
leaned evolutionist (b = −0.01, p = .774; hypothesis 4.2.1.7).
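The structure of these analyses can be illustrated with a short R sketch on simulated data; the variable names and simulated values are placeholders, not the authors' code or data.

```r
set.seed(4)
d <- data.frame(
  leans_evolutionist = rbinom(1500, 1, 0.53),
  conservatism       = sample(1:5, 1500, replace = TRUE),
  religiosity        = rnorm(1500),
  nos_index          = runif(1500, 1, 5),
  absolutism         = runif(1500, 1, 10),
  multiplism         = runif(1500, 1, 10),
  evaluativism       = runif(1500, 1, 10)
)

predictors <- c("conservatism", "religiosity", "nos_index",
                "absolutism", "multiplism", "evaluativism")

# One logistic regression per predictor, as in the analyses reported above
models <- lapply(predictors, function(p) {
  glm(reformulate(p, response = "leans_evolutionist"), family = binomial, data = d)
})
names(models) <- predictors
lapply(models, function(m) coef(summary(m)))  # b coefficients, z statistics, and p-values
```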

Conditional effects. We next examined whether the relations between leaning evolutionist and NoS
knowledge, and between leaning evolutionist and having an evaluativist thinking style, were con-
ditional on the identity variables (political orientation and religiosity) (hypothesis 4.1.7). We con-
ducted logistic regression analyses predicting the degree of leaning evolutionist from our two
measures of science epistemology and from the two identity variables, including the interaction
terms. Here, following our pre-registration (section 17.1), we examined relations with political
ideology and religiosity separately.
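A sketch of these conditional-effects models, again on simulated placeholder data, is shown below; following the text, political ideology and religiosity enter separate models, each interacting with the two science epistemology measures.

```r
set.seed(5)
d <- data.frame(
  leans_evolutionist = rbinom(1500, 1, 0.53),
  conservatism       = sample(1:5, 1500, replace = TRUE),
  religiosity        = rnorm(1500),
  nos_index          = runif(1500, 1, 5),
  evaluativism       = runif(1500, 1, 10)
)

m_politics <- glm(leans_evolutionist ~ (nos_index + evaluativism) * conservatism,
                  family = binomial, data = d)
m_religion <- glm(leans_evolutionist ~ (nos_index + evaluativism) * religiosity,
                  family = binomial, data = d)

summary(m_politics)   # interaction terms test whether the NoS and evaluativism effects
summary(m_religion)   # depend on political ideology or on religiosity
exp(coef(m_politics)) # coefficients expressed as odds ratios
```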
In terms of political ideology, the probability of leaning evolutionist increased with increasing
NoS knowledge (odds ratio 38.7) and with increasing evaluativist thinking (odds ratio 1.07) for
each level of political orientation (Figure 1, top panels). In addition, we found that the relationship
between leaning evolutionist and NoS knowledge was conditional on political ideology (b =
−0.67, p < .001), such that greater political conservatism was associated with a weaker influence
of NoS knowledge on leaning evolutionist. However, the relationship between leaning evolutionist
and evaluativism was not conditional on political ideology (b = 0.003, p = .929): Increasing one’s
commitment to an evaluativist thinking style increased the likelihood of leaning evolutionist
equally across the political spectrum.
As with political ideology, the probability of leaning evolutionist increased with increasing NoS
knowledge (odds ratio 4.42) and with increasing evaluativist thinking (odds ratio 1.18) for each
level of religiosity (Figure 1, bottom panels). The relationship between leaning evolutionist and
NoS knowledge was conditional on religiosity (b = −0.50, p = .005); the predicted probability of
leaning evolutionist increased with increasing NoS knowledge even for those who
scored in the top 25% of our measure of religiosity, but their increase was less steep. The relation-
ship between leaning evolutionist and evaluativism was also conditional on religiosity (b = −0.12,
p = .013) with the probability of leaning evolutionist increasing with greater evaluativist thinking
only for those with middle to lower religiosity scores.

Acceptance of climate change (exploratory analyses)


Individual predictors. In parallel to our analyses of evolution acceptance, we conducted separate
logistic regression analyses predicting our binary climate change acceptance variable from the
other variables individually. Because these analyses were not pre-registered, we used a Bonferroni
correction to adjust our alpha level to .0083, which accounts for the six tests that we ran. We found
that conservative political ideology (b = −1.05, p < .001) and greater religiosity (b = −0.50, p <
.001) were significantly associated with a decreasing likelihood of accepting climate change.
Greater knowledge of the NoS (b = 1.82, p < .001), increasing absolutism (b = 0.21, p < .001),
and increasing evaluativism (b = 0.31, p < .001) predicted a greater likelihood of acceptance.
Increasing multiplism predicted a lower likelihood of acceptance (b = −0.18, p < .001).

Figure 1. Relations between evolution acceptance, science epistemology measures, and demographic
factors. Shaded areas represent 95% confidence intervals.

Conditional effects. As for evolution acceptance, increasing NoS knowledge (odds ratio 48.8) and
increasing agreement with evaluativist statements (odds ratio 1.52) were associated with an
increased likelihood of accepting climate change across all levels of political orientation. This pat-
tern also held across all levels of religiosity (odds ratio 5.66 for NoS knowledge and 1.38 for evalu-
ativism) (Figure 2). Again, we used a Bonferroni correction to adjust the alpha level to account for
multiple comparisons across these four tests (new alpha = .0125).
We additionally found that the relationship between NoS knowledge and accepting climate
change was conditional on political ideology (b = −0.74, p < .001), whereas the relationship
between evaluativist thinking and accepting human-caused climate change was not (b = −0.05, p
= .26). Likelihood of acceptance increased with increasing NoS knowledge and increasing evalu-
ativist thinking style for each of the political leanings. This increase was sharper for liberals than
for conservatives, but only for the NoS index.
The relationship between accepting climate change and NoS knowledge was also conditional on
religiosity (b = −0.64, p < .001), but the relationship between accepting climate change and evalu-
ativism was not (b = −0.06, p = .12). The predicted probability of accepting climate change
increased with increasing NoS knowledge and with increasing evaluativist thinking, even for those
who scored in the top 25% of our measure of religiosity. This increase was sharper for low-religiosity individuals than for high-religiosity individuals, but only for the NoS index.

Figure 2. Relations between climate change acceptance, science epistemology measures, and demographic factors. Shaded areas represent 95% confidence intervals.

Acceptance of the safety of vaccines (exploratory analyses)


Individual predictors. With an adjusted alpha level of .0083, conservative political ideology (b =
−0.33, p < .001) was significantly associated with a decreasing likelihood of accepting that vac-
cines are safe. Greater NoS knowledge (b = 1.50, p < .001) and increased evaluativism signifi-
cantly predicted acceptance of vaccines’ safety (b = 0.32, p < .001). Greater religiosity (b =
−0.08, p = .34), absolutism (b = 0.07, p = .19), and multiplism (b = −0.06, p = .21) were not
significant predictors.

Conditional effects. As in our climate change analyses, we adjusted our alpha level to .0125 to
account for multiple comparisons. Both increasing NoS knowledge and increasing evaluativism
were associated with increased probability of acceptance across the political and religious spec-
trum (Figure 3). The likelihood of accepting that vaccines are safe increased with increasing NoS
knowledge (odds ratio 8.3), and this relationship did not differ across all levels of political ideology (b = −0.19, p = .34). The same was true for the relationship between vaccine acceptance and evaluativism (odds ratio 1.86), which also did not significantly differ based on political ideology (b = −0.12, p = .035).

Figure 3. Relations between vaccine acceptance, science epistemology measures, and demographic factors. Shaded areas represent 95% confidence intervals.
In terms of religiosity, the relationship between vaccine acceptance and NoS knowledge was
significantly conditional on religiosity (b = −0.56, p = .007). Although individuals at all levels of
religiosity increased their likelihood of accepting vaccines’ safety with increasing NoS knowledge
(odds ratio 4.3), individuals of high religiosity experienced the increase in acceptance with increas-
ing NoS knowledge at a less dramatic rate (see Figure 3, bottom panels). The relationship between
vaccine acceptance and an evaluativist thinking style was not significantly conditional on religios-
ity (b = −0.07, p = .18); individuals across the religious spectrum were more likely to accept vac-
cines’ safety with increasing evaluativism (odds ratio 1.39).

5. Discussion
Previous investigations of public acceptance of science have focused on identity factors, like
political orientation, finding that such factors play a strong role in whether individuals accept or
reject scientific claims (e.g. Lewandowsky et al., 2013; McPhetres and Zuckerman, 2018).
Furthermore, although some prior work suggests greater knowledge of the NoS is related to
greater acceptance (e.g. Weisberg et al., 2018), other studies find that greater NoS knowledge is
associated with greater polarization (e.g. Kahan et al., 2012). This study focuses specifically on
knowledge of the processes and practices of science and finds that this higher-order knowledge
about science does relate to acceptance of specific scientific claims. Indeed, in many cases, this
relationship was not attenuated by identity factors. This latter point is critical, as it suggests that
it might be possible to increase one’s knowledge of how science works and one’s acceptance of
scientific claims without interference from one’s political orientation or religiosity. This work
thus suggests possible avenues for developing effective interventions that could address the lack
of public acceptance of well-confirmed scientific claims.
More specifically, using a representative sample of the US population, we found that people’s
responses to our NoS index significantly predicted their acceptance of evolution, climate change,
and the safety of vaccines. Importantly, the likelihood of accepting the scientific consensus
increased with increasing NoS knowledge across all political identity groups and all levels of
religiosity. That is, as predicted, individuals with greater knowledge about aspects of modern sci-
entific practice and the nature of scientific theories were more likely to agree with the scientific
consensus on all three publicly controversial scientific claims.
We found the same pattern when examining people’s epistemic thinking styles; higher degrees
of evaluativism (the understanding that claims can have degrees of correctness and must be evalu-
ated in the light of multiple sources of evidence) were positively related to acceptance of all three
scientific claims. By contrast, multiplism was negatively related to acceptance of all three claims,
indicating that a view of science as a collection of opinions does not provide a helpful basis for
accepting the scientific consensus. Unexpectedly, absolutism was not related to acceptance of evo-
lution, although it was positively related to acceptance of climate change and to acceptance of
vaccines’ safety. These results suggest that a view of science as having a single correct answer
could assist people in agreeing with the scientific consensus. However, evolution may be too
closely linked to identity factors to allow for this view to have a helpful influence. Importantly,
however, increased commitment to evaluativism was consistently associated with increased accept-
ance across all levels of political conservatism and of religiosity, often with no attenuation from
these identity factors.
Crucially, for no political ideology or level of religiosity was increasing NoS knowledge or
increasing evaluativism related to decreased acceptance of science claims. This does not mean that
political and religious worldviews had no influence on acceptance, however. We did find that the
effects of knowledge on acceptance were weakened for members of the most highly conservative
and the most highly religious groups; these individuals’ acceptance did not increase as much as that of
liberal individuals or less-religious individuals as the knowledge factors increased, especially in
the context of evolution and climate change. That is, political and religious identification still mat-
ter to the acceptance of scientific claims, and a greater knowledge of the NoS does not entirely
remove their influence. Importantly, however, the identify factors never reversed the effect of the
knowledge factors. These results point to the importance of a general knowledge about the episte-
mology of science to one’s acceptance of scientific claims.
This conclusion, on its face, may seem to be in tension with earlier work in this area, which has
found that individuals who were ideologically pre-disposed to reject a scientific finding were even
less likely to accept that scientific finding when they had more science knowledge, at least for
climate change (Drummond and Fischhoff, 2017; Kahan et al., 2012; Rutjens et al., 2018). But this
difference can be explained by the differences in the measures of NoS used in these various studies.
Kahan’s work used the OSI scale, which includes questions about basic science facts, numeracy,
and aspects of scientific methods; other work that has come to similar conclusions has examined
only knowledge of science content (Drummond and Fischhoff, 2017; Rutjens et al., 2018). In con-
trast, our measures assess participants’ knowledge of the processes and practices of science and
their epistemological styles, which are different dimensions of people’s thinking about science. By
capturing these aspects of participants’ knowledge, our new measures complement this prior work
by demonstrating that individuals’ knowledge of how science works is a strong predictor of science
acceptance.
One obvious limitation to this study is that it is correlational; it is not possible to establish how
these relations came about using the current data, and we are not able to make claims about the
direction of causality. Individuals with a greater orientation toward scientific thinking may be more
likely to accept science, or individuals who already accept scientific claims may be more likely to
educate themselves about the workings of science, or there may be some other common cause.
Future work should investigate this question through interventions that teach aspects of the NoS
and then measure acceptance of particular science theories. Indeed, many current guidelines for
science education emphasize imparting an understanding of how science works and a familiarity
with scientific reasoning skills (e.g. NGSS Lead States, 2013), on the assumption that this will
provide students with the tools for properly evaluating scientific claims. The current data provide
preliminary evidence in favor of this conclusion. Future work should build on these results to
uncover exactly how one’s knowledge about the NoS may influence and be influenced by one’s
views about scientific claims.

Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publica-
tion of this article: This work was supported by the National Science Foundation (SES-1455425).

ORCID iDs
Deena Skolnick Weisberg https://orcid.org/0000-0002-4000-4941
Asheley R. Landrum https://orcid.org/0000-0002-3074-804X
Jesse Hamilton https://orcid.org/0000-0002-6571-4325
Michael Weisberg https://orcid.org/0000-0002-3944-1167

Supplemental material
Supplemental material for this article is available online.

Notes
1. These data were collected as part of a larger study on factors influencing public acceptance of evolution-
ary theory. The hypotheses and pre-registered analyses associated with that larger study are reported in
full in the Supplemental Materials; we choose not to report all of those measures and tests here in order
to focus this article more sharply on the relations among identity factors, understanding of the process of
science, and acceptance of science.
2. Our original version of the index, as reported in our pre-registration, had 23 items. We removed three of
these items because they asked about participants’ opinions (e.g. “When thinking about how the world
works, I am more likely to accept the ideas of someone with firsthand experience than the ideas of
researchers”). Because these items do not have an objectively correct answer, they cannot be coded in
the same way as the rest of the items in the index. We filed a Transparent Changes document on OSF to
report this deviation from our pre-registered analysis plan.

References
Barzilai S and Weinstock M (2015) Measuring epistemic thinking within and across topics: A scenario-based
approach. Contemporary Educational Psychology 42: 141–158.
Baumgaertner B, Carlisle JE and Justwan F (2018) The influence of political ideology and trust on willingness
to vaccinate. PLoS ONE 13(1): e0191728.
Berezow A and Campbell H (2012) Science Left behind: Feel-Good Fallacies and the Rise of the Anti-
Scientific Left. New York: Public Affairs Press.
Bishop BA and Anderson CW (1990) Student conceptions of natural selection and its role in evolution.
Journal of Research in Science Teaching 27(5): 415–427.
Brenan M and Saad L (2018) Global warming concern steady despite some partisan shifts. Available at:
https://news.gallup.com/poll/231530/global-warming-concern-steady-despite-partisan-shifts.aspx
Douglas M and Wildavsky A (1982) Risk and Culture: An Essay on the Selection of Technical and
Environmental Dangers. Berkeley, CA: University of California Press.
Drummond C and Fischhoff B (2017) Individuals with greater science literacy and education have more
polarized beliefs on controversial science topics. Proceedings of the National Academy of Sciences of
the United States of America 114(36): 9587–9592.
Frederick S (2005) Cognitive reflection and decision making. Journal of Economic Perspectives 19(4): 25–42.
Hamilton LC (2011) Education, politics and opinions about climate change evidence for interaction effects.
Climatic Change 104(2): 231–242.
Hofer BK (2000) Dimensionality and disciplinary differences in personal epistemology. Contemporary
Educational Psychology 25(4): 378–405.
Ingram EL and Nelson CE (2006) Relationship between achievement and students’ acceptance of evolution
or creation in an upper-level evolution course. Journal of Research in Science Teaching 43(1): 7–24.
Kahan DM (2015) Climate-science communication and the measurement problem. Political Psychology 36:
1–43.
Kahan DM (2017) “Ordinary science intelligence”: A science-comprehension measure for study of risk and
science communication, with notes on evolution and climate change. Journal of Risk Research 20(8):
995–1016.
Kahan DM, Jenkins-Smith H and Braman D (2011) Cultural cognition of scientific consensus. Journal of Risk
Research 14(2): 147–174.
Kahan DM, Peters E, Wittlin M, Slovic P, Ouellette LL, Braman D, et al. (2012) The polarizing impact of sci-
ence literacy and numeracy on perceived climate change risks. Nature Climate Change 2(10): 732–735.
Kennedy B (2017) Majorities in all major religious groups support requiring childhood vaccination. FactTank.
Available at: https://www.pewresearch.org/fact-tank/2017/02/07/majorities-in-all-major-religious-groups-
support-requiring-childhood-vaccination/
Kuhn D, Cheney R and Weinstock M (2000) The development of epistemological understanding. Cognitive
Development 15(3): 309–328.
Kuhn D, Iordanou K, Pease M and Wirkala C (2008) Beyond control of variables: What needs to develop to
achieve skilled scientific thinking? Cognitive Development 23: 435–451.
Lawson AE and Weser J (1990) The rejection of nonscientific beliefs about life: Effects of instruction and
reasoning skills. Journal of Research in Science Teaching 27(6): 589–606.
Lawson AE and Worsnop WA (1992) Learning about evolution and rejecting a belief in special creation:
Effects of reflective reasoning skill, prior knowledge, prior belief and religious commitment. Journal of
Research in Science Teaching 29(2): 143–166.
Leiserowitz A, Smith N and Marlon JR (2010) Americans’ Knowledge of Climate Change. New Haven, CT:
Yale Project on Climate Change Communication.
Lewandowsky S, Gignac GE and Oberauer K (2013) The role of conspiracist ideation and worldviews in
predicting rejection of science. PLoS ONE 8(10): e75637.
Liang LL, Chen S, Chen X, Nafiz Kaya O, Dean Adams A, Macklin M, et al. (2006) Student understanding
of science and scientific inquiry (SUSSI): Revision and further validation of an assessment instrument.
Paper presented at the annual conference of the National Association for Research in Science Teaching
(NARST), San Francisco, CA, 3–6 April.

Lombrozo T, Thanukos A and Weisberg M (2008) The importance of understanding the nature of science for
accepting evolution. Evolution: Education and Outreach 1(3): 290–298.
McPhetres J and Zuckerman M (2018) Religiosity predicts negative attitudes towards science and lower lev-
els of science literacy. PLoS ONE 13(11): e0207125.
McPhetres J, Rutjens BT, Weinstein N and Brisson JA (2019) Modifying attitudes about modified
foods: Increased knowledge leads to more positive attitudes. Journal of Environmental Psychology
64(February): 21–29.
Maitland A, Tourangeau R, Yan Y, Bell R and Muhlberger P (2014) The Effect of Question Wording on
Measurement of Knowledge about Evolution: An Examination of Survey Experiment Data Collected
for the National Center for Science and Engineering Statistics. Washington, DC: National Center for
Science and Engineering Statistics.
Miller JD, Scott EC and Okamoto S (2006) Public acceptance of evolution. Science 313(5788): 765–766.
National Science Board (2016) Science and Engineering Indicators 2016. Arlington, VA: National Science
Foundation.
Nelson CE, Scharmann LC, Beard J and Flammer LI (2019) The nature of science as a foundation for foster-
ing a better understanding of evolution. Evolution: Education and Outreach 12(1): 6.
NGSS Lead States (2013) Next Generation Science Standards: For States, by States. Washington, DC:
National Research Council.
Pew Research Center (2015) Public and scientists’ views on science and society. Available at: http://www.
pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society/
Ranney MA and Clark D (2016) Climate change conceptual change: Scientific information can transform
attitudes. Topics in Cognitive Science 8(1): 49–75.
Rutjens BT, Sutton RM and van der Lee R (2018) Not all skepticism is equal: Exploring the ideological ante-
cedents of science acceptance and rejection. Personality and Social Psychology Bulletin 44(3): 384–405.
Schommer M (1990) Effects of beliefs about the nature of knowledge on comprehension. Journal of
Educational Psychology 82(3): 498–504.
Shtulman A (2006) Qualitative differences between naïve and scientific theories of evolution. Cognitive
Psychology 52(2): 170–194.
Shtulman A and Calabi P (2013) Tuition vs. intuition: Effects of instruction on naïve theories of evolution.
Merrill-Palmer Quarterly 59(2): 141–167.
Slater MH, Huxster JK and Bresticker JE (2019) Understanding and trusting science. Journal for General
Philosophy of Science 50(2): 247–261.
Smith TW and Son J (2013) General social survey 2012 final report: Trends in public attitudes about confi-
dence in institutions. Chicago, IL. Available at: https://gssdataexplorer.norc.org/documents/879/display
Swift A (2017) In US, belief in creationist view of humans at new low. Available at: http://www.gallup.com/
poll/210956/belief-creationist-view-humans-new-low.aspx
Thanukos A and Scotchmoor J (2012) Making connections: Evolution and the nature and process of sci-
ence. In: Rosengren KS, Brem SK, Evans EM and Sinatra GM (eds) Evolution Challenges: Integrating
Research and Practice in Teaching and Learning about Evolution. Oxford: Oxford University Press, pp.
410–427.
Thomm E, Barzilai S and Bromme R (2017) Why do experts disagree? The role of conflict topics and epis-
temic perspectives in conflict explanations. Learning and Instruction 52: 15–26.
Villa V (2019) 5 facts about vaccines in the U.S. Available at: https://www.pewresearch.org/fact-tank/2019/
03/19/5-facts-about-vaccines-in-the-u-s/
Walker CM, Wartenberg TE and Winner E (2012) Engagement in philosophical dialogue facilitates chil-
dren’s reasoning about subjectivity. Developmental Psychology 49(7): 1338–1347.
Weisberg DS, Landrum AR, Metz SE and Weisberg M (2018) No missing link: Knowledge predicts accept-
ance of evolution in the United States. BioScience 68(3): 212–222.

Author biographies
Deena Skolnick Weisberg is an Assistant Professor in the Department of Psychological and Brain Sciences
at Villanova University, where she directs the Scientific Thinking and Representation (STAR) Laboratory
and co-directs the Pennsylvania Laboratory for Understanding Science (PLUS). Her research focuses on
scientific thinking and imaginative cognition in children and adults. She has published over 50 peer-
reviewed articles and her work has received funding from the National Science Foundation and the
Templeton Foundation.
Asheley R. Landrum is an assistant professor of science communication in the College of Media and
Communication at Texas Tech University. Prior to this, she was a Howard Deshong Postdoctoral Fellow at
the Annenberg Public Policy Center of the University of Pennsylvania. Her research focuses on the role of
individuals’ views and values in their interpretation of scientific information and she is currently a principal
investigator on two National Science Foundation grants examining young adults’ engagement with educa-
tional science media.
Jesse Hamilton is a doctoral student in Philosophy at the University of Pennsylvania. Jesse focuses on ethics,
political philosophy, and philosophy of science. His specific research interests within those areas include
human rights, distributive justice, just war, and climate change.
Michael Weisberg is a Professor and Chair of Philosophy, as well as Senior Faculty Fellow and Director of
Post-Graduate Programs at Perry World House, at the University of Pennsylvania. His research focuses on
how highly idealized models and simulations can be used to understand complex systems. He leads efforts to
better understand the interface between humans and wildlife and between humans and the climate system, and
he studies how scientific issues are understood by communities in the Americas and in East Asia.
