Inquiry
An Interdisciplinary Journal of Philosophy
Scepticism and the value of distrust
Maria Baghramian & Silvia Caprioglio Panizza
To cite this article: Maria Baghramian & Silvia Caprioglio Panizza (2022): Scepticism and the value of distrust, Inquiry, DOI: 10.1080/0020174X.2022.2135821
To link to this article: https://doi.org/10.1080/0020174X.2022.2135821
© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group
Published online: 16 Nov 2022.
Scepticism and the value of distrust
Maria Baghramian and Silvia Caprioglio Panizza
School of Philosophy, University College Dublin, Belfield, Dublin, Ireland
ABSTRACT
Faced with urgent calls for more trust in experts, especially in high-impact and politically sensitive domains such as climate science and COVID-19, the
complex nature of public trust in experts and the need for a more critical
approach to the topic are easy to overlook. Scepticism – at least in its
Humean mitigated form that encourages independent, questioning attitudes
– can prove valuable to democratic governance, but stands in opposition to
the cognitive dependency entailed by epistemic trust. In this paper, we
investigate the tension between the value of mitigated scepticism and the
need for trust in experts. We offer four arguments in favour of mitigated
scepticism: the argument from loss of intellectual autonomy; the argument
from democratic deficit; the argument from the normative failures of science;
and the argument from past and current injustices. One solution, which we
reject, is the idea that reliance, rather than trust, is sufficient
for accommodating experts in policy matters. The solution we endorse is to
create a ‘climate of trust’, where questioning experts and expertise is
welcomed, but the epistemic trust necessary for acting upon information
which the public cannot obtain first-hand is enabled and encouraged
through structural, institutional and justice-based measures.
KEYWORDS Trust; distrust; scepticism; experts; injustice; vaccines
1. Introduction
The uncritical tend to believe too much that is unsubstantiated, the overcritical
tend to believe too little that is true. (Audi 2011)1
That there is a strong ethical dimension to what we believe, how we
justify our beliefs and how or when we are willing to modify them is
beyond dispute, but relatively little is said about the ethics of disbelief.
CONTACT Maria Baghramian maria.baghramian@ucd.ie School of Philosophy, University College Dublin, Dublin 4, Ireland
1. We would like to thank our colleagues at UCD School of Philosophy and PEriTia for their comments and questions on presentations of earlier drafts of this paper. Particular thanks go to Ben Almassi, Michel Croce, Jim O’Shea, and Matthew Shields for their comments on earlier drafts.
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.
This paper focuses on one well-known form of disbelief, namely scepticism, and connects it with the topic of epistemic trust, the type of trust that is frequently required in accepting the testimony of others. The paper has five further sections: §2 discusses the value of moderate scepticism, what Hume called ‘mitigated scepticism’, particularly for democratic
governance; §3 applies these general considerations to the specific case
of trust in experts and outlines four sources of sceptical doubt about
scientific experts and their role in policy decisions; §4 discusses the
imperative of trust in experts, particularly for cases that are pressingly
urgent and require a great deal of specialist input; §5 entertains and
rejects the idea that reliance on experts rather than a thicker normative
attitude of trust is sufficient for accepting their testimony; §6 attempts
to resolve the tension between trust and moderate scepticism we have
outlined by suggesting that we need to create a collective climate of
trust which can both accommodate and address justified scepticism
about experts.
2. The value of scepticism
Scepticism as a philosophical problem, particularly in its Cartesian global
version, has long been seen as a source of philosophical anxiety in need of
resolution or dissolution, rather than a doxastic stance of any value.
Indeed, the strong version of scepticism is beyond the reach of value judgements, for if all judgements are open to doubt, then mutatis mutandis,
so are judgements about the value of scepticism. The more plausible versions of scepticism have better circumscribed domains and are motivated
by specific arguments, as is the case with the mitigated scepticism advocated by Hume.
To Hume, mitigated scepticism, in contrast with the self-refuting radical
or extreme scepticism, is invaluable because ‘the greater part of mankind
are naturally apt to be affirmative and dogmatical in their opinions … they
see objects only on one side, and have no idea of any counterpoising arguments’ (E 207). Mitigated scepticism counters such tendencies by encouraging non-dogmatism, fallibilism and intellectual humility, because it
‘would naturally inspire [the dogmatists] with more modesty and reserve, and diminish their fond opinion of themselves, and their prejudice against antagonists’ (Hume 1975, Section XII, Part III, p. 161).
Moderate scepticism is valuable, not just as an intellectual virtue, but
also as a civic virtue promoting democratic governance by facilitating
non-dogmatism, tolerance and open mindedness. As Alan Hazlett
(2015) has argued, dogmatism and unwavering claims to knowledge can
hamper political engagement with those of different persuasion, while
willingness to acknowledge the possibility that we all know less than
we think we do can facilitate more open and tolerant dialogues,
because where ‘disputants take themselves to know, entrenchment
may ensue, whereas when disputants take themselves to be ignorant,
or suspend judgment about whether they know, respectful engagement
may be possible’ (Hazlett 2015: 90). Motivated and self-conscious exercise
of doubt is the starting point of both global and moderate scepticism, but
doubt is also an effective antidote to political extremism. For this reason,
Quassim Cassam (2021) sees doubt, and thereby presumably mitigated
scepticism of the type we have discussed, as a corrective virtue in the
public domain.
Scepticism towards political authorities, often manifesting itself in the
form of distrust, can also be instrumental in propelling the citizenry into
action. For instance, based on the example of the Black Civil Rights movement in America, Meena Krishnamurthy (2015) argues that it was distrust
of the White moderates’ willingness to act to promote racial justice that
led Martin Luther King to start a campaign of direct action using new
forms of political participation. The value of scepticism, in this case, lies
in its ‘tendency to bring about justice by tempering tyranny’ (2015, 400).2
But mitigated scepticism can also have negative political and social
impact. Climate and vaccine scepticism are currently the two most prominent examples, as is, more generally, the type of scepticism toward authorities that breeds conspiracy theories. We will return to these below.
3. Scepticism and trust
There is a nascent tension between the value of scepticism, on the one
hand, and the need for epistemic trust on the other. Trust is an essential
ingredient of social life: by exercising warranted trust we can learn from
each other, cooperate, collaborate with each other, and facilitate the
much needed division of cognitive and other forms of labour (Alfano
2016). Scepticism, however, even in its mitigated form, is not always hospitable to cultivating an attitude of trust. Epistemic trust, the most relevant species of trust for this discussion, involves accepting and relying
2. Other examples can be found in the protests against the Iraq war, which were motivated in large part by scepticism of the American government’s claims about the need for a war. The same considerations often apply to local-level politics, where activism often starts with scepticism about the claims and deeds of political authorities. (We owe this point to Matthew Shields.)
on knowledge claims and testimony of others under conditions where
one is not in possession of full facts or complete evidence and takes
others to be more knowledgeable (Baghramian and Croce 2021). Scepticism, as we have known since Descartes, if not before, begins with the
method of doubt. The absence of indisputable evidence and a justified
horizon for doubt are crucial to the engine of scepticism.
Epistemic trust is required exactly where we are not in possession of
full evidence and have at least a residual doubt about the testimony
we are to accept. To be sure, such trust does not need to be blind and
fall into gullibility. Indeed, often, but not always, trust calls for a degree
of vigilance and a low-level monitoring of the risks involved in our interactions with others (Sperber et al. 2010). But, when we are in full possession of all the relevant information, we do not need either to demand or
to rely on trust. Trust, as Anthony Giddens argues, ‘is only demanded
where there is ignorance – either of the knowledge claims of technical
experts or of the thoughts and intentions of intimates upon whom a
person relies. Yet ignorance always provides grounds for scepticism or
at least caution’ (Giddens 1990: 89).
It is this acceptance of doubt that introduces an element of vulnerability into the relation of trust: trust involves a risk – the risk of betrayal, of being let down, of not obtaining the outcome we had hoped for (Becker 1996; Baier 1986). In trusting, we give the benefit of the doubt, rather than
encouraging an attitude of doubt. Any resistance to scepticism includes
trust in the information we receive – including self-trust about the
reliability of our faculties that lead us to such information. In short,
trust involves a willing acceptance of uncertainty: by contrast, the sceptical stance starts with taking uncertainty as a genuine challenge to the
possibility of knowledge. For these reasons, trust and scepticism about
the information we receive, and its sources, cannot readily coexist.
Similar considerations have led philosophers to conclude that ‘Doubt is
the enemy of trust. To say that one doubts the motives or competence of
one’s political leaders or institutions is to say that one does not trust them’
(Cassam, forthcoming, n.p.) and that scepticism is the antonym of trust
(Scott 1999: 276). Yet the arguments in this paper do not depend on
establishing such strong logical or conceptual opposition between trust
and sceptical doubt. Rather, what we wish to emphasise is that the call
for public and civic trust, so common to the political discourse of today,
is not always compatible with the exercise of the type of scepticism
that is both politically virtuous and of intellectual value. And that, as philosophers, we should be concerned about this incompatibility.
The incompatibility between scepticism and trust takes a practical
turn in the context of the much-discussed requirement of epistemic
trust in experts. Here, trust is demanded not only of individuals (the expert) but also of groups and institutions (the scientific community). In
this case, competence and sincerity have been identified as the
main grounds of epistemic trust. But as Heidi Grasswick (2018) has
argued, on the one hand, trusting experts’ competence comes with
trusting their judgment and their adherence to social ethical norms,
avoiding misconduct; on the other, trusting their sincerity, in cases
of epistemic trust when something important is at stake, also involves
trusting that they care for us enough to select and deliver information
accordingly, even if that requires putting aside their own specific
interests.
Grasswick’s observations about epistemic trust in experts apply significantly in the case of vaccine (dis)trust discussed in the paper. They introduce an ethical element that is at stake in both trust and distrust of
experts. Further, given that in the case of vaccines experts assume the
role of advisers on policy matters, in such circumstances epistemic trust
takes on both ethical and political significance. For these reasons, much
discussion has gone into how to build trust in experts occupying public
policy roles, but the flip side of such trust should be considered too.
What price might we pay in suppressing scepticism about the role and
function, if not the findings, of experts?
These considerations suggest that we should take a look at the value of
mitigated scepticism. As Hume, Russell, and Austin among others have
pointed out, mitigated scepticism, like belief, needs to be justified – to
avoid falling prey to what Torcello (2016) calls pseudo scepticism, a
worthy companion to ‘pseudoscience’. In what follows we outline four
types of concerns regarding the position of experts, the consequences
of trust in their role, and the grounds provided for their advice, which
go some way towards showing the value of moderate scepticism and
an attitude of distrust.
3.1. Argument from loss of intellectual autonomy3
As Sandy Goldberg puts it,
3. Linda Zagzebski (2013) has distinguished between epistemic and intellectual autonomy and thinks that while the former could be acceptable, the latter is not. Here we are bypassing the distinction for the simple reason that, at least when it comes to political decision making, some level of epistemic autonomy is a pre-condition for full-blown intellectual autonomy.
an epistemically autonomous subject is one who judges and decides for herself,
where her judgments and decisions are reached on the basis of reasons which
she has in her possession, where she appreciates the significance of these
reasons, and where (if queried) she could articulate the bearing of her
reasons on the judgment or decision in question (Goldberg 2013, 169).
An autonomous person not only determines the course of her life for herself (see Raz 1998), but as an autonomous thinker she also decides
the course of her thinking. The concern then is that in trusting the epistemic authority of experts who have policy and political roles we
choose to make ourselves reliant on their judgements, and in the more
extreme cases, intellectually subservient to them; in other words, we
sacrifice our autonomy and compromise our ability for critical thought.4
It may be objected that the call for intellectual autonomy is not relevant to the case of trust in experts. After all, experts are by definition
those people to whom we rightly attribute a special epistemic authority
and rely on their judgment as the best source of information in a particular domain. If that is the case, the traditional tension between intellectual
autonomy and trust, based on the argument that trusting makes us intellectually subservient or compromises our ability for critical thought, does
not seem to apply here. Yet this objection depends on the assumption of
an antecedent acceptance of the role of experts as a source not only of
guidance, but also of decision making. And it is exactly the latter role
that is being questioned: it is possible to acknowledge that individuals
or groups have greater knowledge and competence in certain areas,
while, at the same time, wanting to exercise intellectual autonomy – for
instance, by insisting that we can learn from different expert sources
and accept, reject or combine their recommendations.
The desire for retaining our intellectual autonomy and the value we put
on it are evident in our daily conduct. Each time we resort to an internet
search engine to check our doctor’s advice, despite the medical profession’s frequent recommendation to avoid reliance on online medical
information, we show a desire for intellectual and epistemic autonomy
by engaging in activities that rightly or wrongly seem to decrease our
blind dependence on others – even though we may be just falling back
on other, and less reliable, sources of information. The wish for intellectual
autonomy is exploited shamelessly by conspiracy scams, such as QAnon,
with their motto ‘Do your own research’. The ‘research’ advocated has
4. The point is not new: philosophers have written about the value of epistemic autonomy as far back as Descartes (1968 [1637]), who forbids inquiring minds from relying on the ideas of others. Similar sentiments have been expressed by Locke and Kant.
some of the academic and scientific trappings of gathering evidence and
confirming hypotheses; it promises autonomous expertise to its supporters and counters the feeling of intellectual subservience to experts
with whose political positions they may disagree. While the intellectual
autonomy promised by QAnon and its like is nothing but a sham, it is a
sham that exploits a desire to retain intellectual independence or at
least not to feel subservient – particularly to those who are seen as the
elite in a system of governance that perpetuates, indeed creates, inequalities – both economic and epistemic.
Elizabeth Fricker (1994) has rightly argued that we should accept an
assertion on the basis of testimony only if we recognise correctly that
the testifier is epistemically better placed than we are in making that
assertion. One important question facing us now is what and how
much of our intellectual autonomy, and of the ability for critical
thought that follows from it, we are sacrificing when we stop being sceptical about the hierarchies of knowledge and simply accept that members
of certain groups are better placed to offer judgements on matters that
affect us socially and personally. The second, and maybe an even more
crucial question, is about the social and political costs of this recognition.
Knowledge is power, as we know so well since Machiavelli (1940 [1640]),
and by ceding the power of knowledge to experts in policy decisions,
the fear is that we may be forfeiting at least some of our rights, and
instead empowering unelected members of the society to have direct
impact on policy decisions. When experts play a central role in public
policy matters, to accept their intellectual authority involves relinquishing
not just our intellectual but also our political autonomy. This is a dual
concern: sacrificing our intellectual autonomy to trust will also compromise our ability to deliberate critically about political and social matters.
This worry links the concern about intellectual autonomy with a worry
about democratic participation, to which we now turn.
3.2. Argument from democratic deficit
Experts’ involvement in policy advice has become a constant feature of
modern governance. For a time, epistocracies were seen as harbingers
of a new era of politics, where ideologies would give place to technical
problem solving and expert-driven, problem-oriented thinking would
replace both the capitalist and the socialist states (e.g. Price 1965).
While these high hopes proved ill founded, reliance on experts in policy
decisions in a wide range of areas, from economics to health, from
technological know-how to agriculture, remains central to contemporary
governance. The so-called ‘knowledge economies’ – where intellectual capital is a key source of economic growth – accentuate this reliance.
There are two interconnected concerns specifically about the role of
experts in democratic governance: first, expert knowledge, by definition,
is not open to assessment by non-experts and therefore the very idea of
expertise is premised on epistemic inequality which will have social and
political consequences. Second, experts, when they function as unelected
contributors to political governance, are immune from the type of
accountability we impose on elected members of governments.
Experts, by definition, know more than the general public about their
area of expertise, and particularly in knowledge economies those with
greater epistemic resources and access to such resources are rewarded
financially and accrue prestige and status – they become the new
elites, the aristocrats of the intellect with all the power and prestige
that go with the status. The top-down reliance on a small group of
experts in policy decisions leaves less space for the participation of the citizenry in political decision making.
This gives rise to a democratic deficit of trust in experts: if policy
decisions are to be taken based on advice from experts, and if the
correct epistemic position towards experts is one of trust rather than critical questioning, then the space available not just for political contestation
but also for political decision making will shrink in proportion to the
extent that we allow expert-driven trust-based policies to become the
guiding principle of political governance. These worries offer grounds
for scepticism, and indeed distrust, of experts, not primarily because of the content and sources of their advice, but because of the impact of their involvement in political decision making. Yet there is no easy distinction between these two reasons: in promoting trust in experts, the entirety of the short- and long-term impact of that trust is pertinent to our decision to trust.5
5. The danger of over-reliance on experts was known to John Dewey, who worried, already in the early twentieth century, that ever-increasing reliance on expert advice can diminish the scope of participation by ordinary citizens in the political process (Dewey 1927). This worry was echoed, in even stronger terms, by Hannah Arendt (1972), members of the Frankfurt School (Habermas 1985), and Michel Foucault (2003), who, in their various ways, saw over-reliance on experts as a side-effect of the type of ‘scientism’ that holds Western societies in its grip and constitutes an inherently ideological stance masquerading under the banner of objectivity. As Stephen Turner (2001) puts it, ‘the political threat to democracy posed by the existence of expert knowledge [is that] expertise is treated as a kind of possession which privileges its possessors with powers that the people cannot successfully control, and cannot acquire or share’ (Turner 2001, 123). Such inequality is not only detrimental but indeed inimical to the type of equality of participation that democracies presuppose.
As we will see in §4, there are measures to address the democratic
deficit of reliance on experts in governance, but the increasingly easy dismissal of any sign of distrust in experts ignores the complexities of the role
assigned to experts in democracies. At the same time, the reasons for
such a dismissive attitude are easy to understand. In recent years,
experts and expertise have been under attack by the worst of the right-wing populist leaders, attacks that have led to innumerable preventable
deaths in the US and Brazil from COVID-19, to name just two cases, and
have accelerated the threats of global warming. This new political order
has turned the question of trust in experts into a liberal cause and politicised expertise unduly, turning it, particularly in the US, into a partisan
political issue. But it is the very complexity of the standing of experts in
a democratic governance that makes it open to exploitation by populist
politicians. Ignoring the difficulty and simply calling for more trust in
experts by offering further demonstrations of the professional trustworthiness of experts will not address the worries that fuel such
scepticism.
3.3. Argument from the normative shortcomings of science
Science sceptics claim, on a variety of grounds, that science does not
deliver the objective, interest-free knowledge it promises. Such scepticism may be directed at the corrupt practices of individual scientists, or
it may be seen as inherent in the very methodology of science. Both
types of criticism result in the accusation that the scientific theories and
their resultant technical knowledge that inform policy decisions are
never pure or value neutral but, on the contrary, are often infused
with personal and ideological biases that support the interests of the individual scientists and/or the existing economic and power hierarchies. We
will look at each concern in turn.6
The simplest reason for scepticism and distrust of experts is the suspicion that their advice is informed by their personal and sectoral biases or
financial interests. Examples of fraud, personal bias, and incompetence,
while not widespread, are part of the landscape of expertise. Undoubtedly, there are bad actors among experts whose advice is motivated by
personal or professional gain rather than the best scientific evidence,
but moral or professional failures at this personal level are not a very
6. We should point out, maybe unnecessarily, that the mere uncertainty of the sort inherent in the methodology and practice of science, and readily acknowledged by scientists, is not a sufficient reason for scepticism about scientific expert advice.
good reason for scepticism about the science/policy nexus. While blatant
corruption is a sad social reality, there are reliable ways of detecting and
addressing such corruption in scientific practice – for instance through
peer review and expectations of the replicability of results, particularly
where the results are surprising or indicate significant breakthroughs;
whistle-blowers have also played an important role in exposing corrupt
practices at institutional level. For these reasons, fraudulent practices
within science are not very common.
There is a host of deeper and more subtle reasons for doubts about the
purity and value neutrality of scientific advice, reasons that are embedded
within the operational and theoretical frameworks of science rather than
in the psychology or the proclivities of individual scientists. First, there are
concerns about the underdetermination of scientific hypotheses by existing data, i.e. that there are empirically equivalent rival theories that are
equally adequate in explaining experimental results or observations
(Quine 1970, 179). Underdetermination poses the question of how scientists choose between different empirically adequate rivals and the extent
to which values play a role in such decisions. The so-called Problem of
Unconceived Alternatives, based on the historical evidence that ‘typically
[there] are alternatives to our best theories equally well confirmed by the
evidence, even when we are unable to conceive of them at the time’
(Stanford 2001, S9), shines a different light on the same problem. The
‘new pessimistic meta induction’ from past failures in imagining better
theoretical alternatives opens the possibility that we may be in the grip
of similar failures now, that our biases and values may not allow us to
imagine and give preference to alternative scientific theories. Finally,
the inductive risk argument (Hempel 1965; Douglas 2013), starts with
the position that scientists never have conclusive proof for their theories
or complete evidence for their hypotheses, but always face a degree of
uncertainty regarding scientific knowledge, so they need to use their judgement to make a final call on how much uncertainty they are going to
accept, and concludes that scientific judgements have strong normative
elements.
What the three arguments show is that theory choice is not fully determined by available evidence, that scientists use their judgement, exercise
their imagination and make risk assessments in prioritising a particular
theory over and above alternatives, and in doing so they inevitably rely
on value judgements. Moreover, contra Kuhn (1977), such judgements
are not restricted to epistemic values only. As Heather Douglas (2013)
has argued, values, both epistemic and moral, come to play an important
role in framing the problems scientists are addressing, deciding on the
range of evidence they take into consideration, the scope they assign
to a hypothesis, the levels of uncertainty they are willing to accept, and
how they calculate the consequences of any error they may make. The
optimists about science, and we think Douglas is among them, believe
that scientists should, and are in a position to, introduce values such as
benevolence (or a concern for the welfare of others), the principles of
least harm, and due diligence in assessing negative consequences of
their decisions to ensure that the value gap in science is addressed in
ways that are beneficial to the recipients of their advice. Pessimists,
among them some feminist epistemologists, on the other hand, point
out that scientists’ theoretical choices are frequently coloured by unacknowledged gender, race, class and other ideological biases. Moreover,
in most cases, only historical distance will allow us to detect the full
range of pernicious values that are brought to bear on the choices of
theory and evidence. On both accounts, the upshot is that, one way or
another, theory choice is not value neutral, and the values that are brought to bear on the scientists’ choices have not just epistemic but also social and moral dimensions.
The sceptical concerns outlined in this section should not be seen as an
invitation to science denial but as a valuable corrective to a range of pernicious practices in science. It is the illusion that science is value-free that
often fuels pernicious science denialism. As Philip Kitcher writes, ‘The
deepest source of the current erosion of scientific authority consists in
insisting on the value-freedom of Genuine Science while attributing
value-judgments to the scientists whose conclusions you want to deny’
(2011, 40). In fact, science denialism, be it in the shape of vaccine resistance or the rejection of climate science, is the exact opposite of scepticism, for it speaks of a certainty that the sceptic wishes to avoid, and it
does so in the face of quite overwhelming evidence in favour of vaccines.
3.4. Argument from past and present injustices
Distrust of scientific experts is also often rooted in experiences of harm
and suffering brought about through collusions between experts and political powers, where the damage has been inflicted, in particular, on the
marginalised, the defenceless and the vulnerable. We do not need to
revisit the horrors of Nazi Germany to find examples of injustices and
suffering that can be directly linked with experts as policy advisers or
as facilitators of nefarious state policies. Pharmaceutical companies
testing their new vaccines, without the mothers’ consent, on children incarcerated in Mother and Baby Homes in Ireland well into the 1960s and 1970s (Ireland Department of Children, Equality, Disability, Integration and Youth 2021), or the Tuskegee experiments on Black subjects in the United States (Razai et al. 2021; Bajaj and Stanford 2021), are
among fairly recent examples of how experts, with or without the intervention of the State, have abused their expertise as well as the position
of trust accrued by it, to exploit the vulnerable and the unknowing for
the sake of scientific advancement and possible societal good (or the
good of part of society).
What differentiates these cases from the worries discussed in 3.3 is that
the abuse of power by the state, its institutions, and scientists themselves,
was not the result of values implicitly permeating theory choice, nor does
it have to do with scientific fraud. Rather, these are instances of how
science, even when potentially of benefit to the general public, is also a
potential source of harm to the marginalised. As Naomi Scheman has
observed, when institutions are unjust, the trustworthiness of what is
embedded within those institutions suffers – and, she claims, ought to
suffer – even when such associated or embedded practices and individuals are faultless (Scheman 2011, 223).
These cases illustrate the extreme end of Maya Goldenberg’s (2016)
advice not to take a paternalistic approach to the question of trust in
experts in the context of vaccine hesitancy. Vaccine hesitancy, as Goldenberg points out, may not be driven solely by ignorance and stubbornness,
but by a distrust grounded in the failure on the part of experts to address
people’s particularistic concerns, as in the case she describes of parents’
understandable concern about the health of their children. Karen Jones
(2013) has spoken rightly of the epistemic and moral costs of misplaced
distrust. But the cost of misplaced trust can be just as high, if not
higher, and not just epistemically and morally, but as a matter of life
and death: for all those subjected to vaccine trials and other medical
experiments by members of the very profession they are now called
upon to trust, distrust is not a sign of irrationality or epistemic failure,
but a reasonable, and potentially valuable, stance.
4. The requirement of trust
The sceptical arguments in 3.1–3.4 show how the question of trust in
expert advice could be seen as a legitimate terrain of both normative
and political contestation. To withhold trust in experts, particularly in
the political domain, can be and often is a political act, but the act should
not be dismissed only as a by-product of right-wing populism, even if in
many instances, in recent times, it has been. The cumulative impact of
3.1–3.4 is to demonstrate that there are legitimate grounds for scepticism
towards experts, particularly in the context of the roles they assume in
public life. Yet it remains true that epistemic trust, like other forms of
trust, is an inescapable and beneficial feature of our social life.
Without epistemic trust we cannot learn from each other, nor can we
enrich our collective intellectual life by allowing a division of epistemic
labour where not everyone is expected to be in possession of all that is
open to investigation. Equally inescapable is the need for expert advice
on complex matters of health, safety, ecology, the environment, AI, and
more, often at a global level. As the COVID-19 pandemic has demonstrated dramatically, the intervention of experts in policy decisions can
be of great urgency and the need for their advice, literally, a matter of
life and death. The point can be generalised beyond the current emergency and be applied to other threats facing us, with climate change
and vaccination – for COVID-19 but also other diseases – being among
the most notable examples.
In such instances, trust in experts takes on specific features, which are
relevant to assessing its ethical significance. First, COVID-19 health advice
and vaccination, like the science of climate change, are backed by widespread consensus within the scientific community, offering grounds for
reduced scepticism. Second, the urgent need for action in the face of an
imminent threat reduces the opportunity for independent reflection and
assessment and heightens the need to exercise trust as a shortcut to
immediate action. Third, trust is not just expedient but essential in these
matters, because first-hand knowledge of medical and climate science
among the public is scarce, and well-informed attitudes about these questions essentially depend on trust in the testimony of experts (see Almassi
2012). Furthermore, the requirement of compliance with expert advice
for the public good in these cases is not just a question of individual autonomous choice, but a matter of significance to all members of society, in the
local as well as global community. As Heidi Grasswick observes, this closer
adherence of information and action at times of great uncertainty
presents particular challenges for laypersons, who must find ways to responsibly trust scientific institutions, since the boundaries between the knowledge
produced and policy implications begin to blur and with that, political interests
play a prominent role in the development and presentation of the knowledge.
(Grasswick 2014, 542–543)
In the case of vaccine hesitancy, while reluctance to accept vaccination is
not something new, the mass vaccination campaign started at the end of
2020 to immunise against COVID-19 has brought the issue to the forefront, globally, as never before. From an ethical perspective, one of the
most salient features of vaccine hesitancy is that, in countries with a
high degree of diversity such as the UK and the US, minority groups are
more likely to reject vaccination, based on lack of trust in the healthcare
system (Laurencin 2021); yet, at the same time, minority groups are also
among those who have suffered most from the impact of COVID-19.7
Historical and current inequalities both motivate and exacerbate the
consequences of vaccine hesitancy among these groups. As we have
seen, vulnerable and marginalised groups have routinely received
unjust treatment from the medical system, and in some cases have
even had their human rights flagrantly violated. Such cases show, in a
stark manner, the internal tension between rational and ethical distrust,
in this instance based on experiences of overt injustices and more
covert biases, and the potentially destructive consequences of the same
distrust, lending further moral urgency to the question: how can we
reconcile the requirement of trust with justified scepticism towards
experts and their policy advice?
5. Possible resolution 1: replacing trust with reliance
One possible solution is to argue that, when it comes to scientific matters,
trust is actually not a suitable doxastic or emotional attitude. Rather, in
such contexts, the thinner notions of epistemic deference and reliance
are the more appropriate stances. It seems natural to think that the
vaccine hesitant simply refuses to rely on the advice of the medical
experts and the climate sceptic does not rely on the predictions of the
climate scientists when deciding on buying green cars or cutting down
their consumption of beef. If this is true, any talk of trust or distrust
over-complicates their attitudes towards science.
Another reason to reject the relevance of trust to matters of science
policy is that trust is often seen as a stance we take towards individuals
rather than groups or institutions, while scientific policy and advice are
frequently the product of advisory groups and policy institutions. It is
7 In the UK, the impact on BAME communities has been more severe, with a 10–50% higher risk of death (Public Health England 2020). Black Americans are also 3.57 times more likely to die from COVID-19 than white people (Razai et al. 2021). As Razai et al. argue in the recent BMJ editorial just mentioned, re-building their trust is key.
intuitively easy to think about trust in individuals, but less so to think of
trust in groups, not least because some core features of trust – such as
mutual relationships, personal acquaintance, feelings associated with
trust such as betrayal – do not transfer easily to cases of institutional
and procedural trust.8 Inkeri Koskinen (2020) uses these ideas to claim
that the ethical dimensions of trust are such that trust is not a fitting
concept for science and the detached objectivity that science aims at.
Arguing against those who believe that objectivity requires a shared
basis for trust, and that trust is necessary on the part of the public to
believe experts (e.g. Scheman 2011), Koskinen claims that the trust-based approach fails to distinguish between trust and reliance in the
case of science, and that the latter is more properly applicable to scientific
objectivity: identifying scientific information as objective enables us to
rely on it, while trusting it would be a different and unwarranted step.9
This solution, however, does not work if we consider the situational
demands of specific cases of the science/policy nexus under consideration here. Trust, rather than mere reliance, seems appropriate in the
cases of COVID-19’s stringent public policy measures, vaccination and
climate change, for they involve the acceptance of risk in the face of possible harm, or substantial sacrifice as the cost of relying on expert advice.
Unlike reliance, trust invokes a range of emotional and evaluative
responses, including a sense of risk-taking, the possibility of feeling
betrayed (Baier 1986), and an engaged attitude rather than a spectatorial
one (Holton 1994), which are all at play in these cases. When we trust, we
make ourselves vulnerable, and this is especially true when trust requires
us to act in specific ways, and when these actions could affect our lifestyle
or our health. Baier (1986) shows that there are degrees of vulnerability in
trust-relationships (ranging from infant-parent relationships to the relative safety of contracts), but that in every case trust impacts power positions. Following Baier, Lawrence Becker (1996) has argued for a noncognitivist account of trust in the political sphere, where trust involves
‘confidence about the benevolence, conscientiousness, and reciprocity
of others’ (1996, 53). These moral values are also part of Grasswick’s
(2018) account of epistemic trust, which depends not only on expertise
but also on the expert’s willingness to protect the public’s interests;
8 See also the discussion on trust in States and individuals representing States and other groups (e.g. Booth and Wheeler 2007) and organizational trust (e.g. Saunders 2010).
9 Koskinen denies that we can trust science, but not that we can trust scientists. However, this introduces a difficulty insofar as the object of trust in scientists is science, and scientists are representatives of scientific objectivity.
making correct but irrelevant information salient, for instance, may contribute to false beliefs, so value-based choices are part and parcel of
the public role of experts.
Conversely, breaches of trust are harmful, in Becker’s view, not only for
individuals but also for societies, because our responses to them tend to
be more ‘volatile and disruptive’ than responses to mere unreliability. The
trust required, of course, is not blind, and is informed, among other
things, by the reputation, the track record of past performances, and
the success or failures of the experts. But, as noted above, in cases such
as COVID-19, as well as climate change, the extent of the knowledge
gap between the experts and lay people, coupled with the urgency of
the decisions to follow scientific advice and the high stakes of such
decisions, calls for the richer attitudes of trust rather than mere informed
reliance.
6. Possible resolution 2: towards a climate of trust
Reasoned scepticism about the role of science in policy decisions should
be taken seriously, both because of the worries about the politics of trust
raised above, and because distrust may signal and reveal social imbalances and wrongs in the way scientific knowledge is placed in the
service of the political and social interests of some sectors of the
society at the expense of others. In such instances, scepticism and
indeed distrust are not only justified, but may have the value of pointing
in the direction of something that needs rectifying. As Grasswick (2018)
argues, what is distinctive about ‘impersonal’ trust such as epistemic
trust in the scientific community is the importance of the trustworthiness
of the practices of the institution. If these practices are part of unjust structures, their trustworthiness is also eroded. In these cases, according to
Grasswick, we can talk of ‘epistemic trust injustice’, where injustice lies
not only in not being listened to, but in not possessing the resources to
trust appropriately: it is not only a rational, but a social and ethical
issue to be in a position to trust experts.
At other times, distrust is not a sort of protest against injustice, but
stems from a fear of losing autonomy and hence power.10 This
source of distrust is different, but it too points at social imbalances that
need to be acknowledged. On the other hand, as we have seen, distrust
of experts, even when it originates from experiences of injustice and long-standing grievances, becomes a source of concern when scientific advice
plays a crucial and urgent role in securing the wellbeing of a population
or a planet. The question is how to reconcile these Janus faces of distrust.
10 Such as the case of some climate sceptics; see McCright and Dunlap (2011).
Much of the reaction to the real or perceived breakdown of trust in
experts, over the last few years, has called for better communication or
messaging by scientists and science journalists (Lewenstein and Brossard
2006; Jasanoff 2014), greater scientific literacy on the part of the general
public (Lombrozo et al. 2008; Miller 1983; Miller 2004) and countering
the impact of motivated cognition (Oberauer and Lewandowsky 2016).
Addressing these flaws, it is assumed, will address issues of distrust in
science. Others have rightly emphasised the need both to substantiate
and to increase the trustworthiness of experts by demonstrating not
just their knowledge and competence, but also their honesty in communication and their responsibility or responsiveness to the evidence
(Anderson 2011, 145–146). But, as we have seen, distrust is not occasioned only by perceptions of incompetence or failures in performance.
Nor is it only the result of normative and ethical failures, such as dishonesty or irresponsibility, on the part of experts.
Distrust arising from socially based concerns shows that what is at
stake is not just professional credibility or competence, nor an intellectual
desire to master more information. In fact, distrust in experts in these
cases can arise from the rejection of any information coming from a particular source, not simply because of worries about the accuracy of the
information or the personal credibility of the experts responsible, but
because of the background values and political structures within which
the information is created and shared. What is at issue is not only the
content of the message of distrust or scepticism, but also the identity of
those who are distrustful,11 as well as the social, political and historical
factors influencing the creation and reception of the scientific message.
To reiterate the point, while ensuring the trustworthiness of the
experts and policy makers who are advised by them is essential in countering unwarranted distrust, we also need to take into account the varying
factors that go into legitimising distrust. To achieve the complex goal of
countering the call of distrust, we argue, we need to create a climate of
trust, a social and political environment where the concerns that motivate
and legitimise distrust are acknowledged and, to the extent possible, addressed, and where legitimate trust is allowed to flourish. Trust in
11 The approach follows the suggestion to take a ‘situated’ perspective advocated by feminist epistemologists such as Code (2006) and Wylie (2003).
experts, particularly but not only in cases where they guide policy
decisions, cannot be treated in isolation, for it is interwoven with other
social and political forms and requirements of trust. For this reason, as
Scheman (2011) has argued, the impersonal discourse of science, its universal concept of objectivity, and the impersonal demonstrability of trustworthiness that goes with it, are not enough in a context where social
hierarchies exclude some individuals and groups from the production
of knowledge and the exercise of power. Trust, as Scheman puts it,
‘needs to be convincingly demonstrated – not just abstractly demonstrable’, and justified belief in the trustworthiness of the scientific
methods practiced by institutions depends on
the justified belief that those institutions do in practice what they are supposed
to do in theory: ground knowledge claims that are acceptable to all of us, not
just to those of us with certain forms of privilege, who see the world through
certain lenses, from certain biased perspectives. (Scheman 2011, 221)
A climate of trust will facilitate trust and trustworthiness at a collective
level rather than focusing only on individual experts and
their trustworthiness, or on the attitudes of trust or distrust evinced at the individual level. Trustworthiness is a feature of the wider institutional practices that make legitimate acts of trusting possible. It is often said
that trust is the glue that binds the members of society. A climate of
trust is what ensures that the glue is spread evenly, reaching all segments,
and not excluding or marginalising particular groups or individuals.
While public trust is a collective phenomenon, it is worth noting that
the requirement of trustworthiness is not spread equally across all
members of the society: greater demands and more onerous conditions
are placed on policy makers, the experts of various kinds, the media,
and medical carers, to name a few. This does not, however, let individual
consumers of expertise off the hook. Consumers of information, in particular, have a responsibility to show due diligence in accepting the testimony of their sources or in transmitting such information to others.
In the remainder of the paper, we will briefly outline how elements of a
climate of trust can address some of the worries, listed in 3.1–3.4, that give
rise to justified scepticism about experts and even justified distrust.
6.1. Addressing the loss of intellectual autonomy
The Cartesian conception of intellectual autonomy was a product of an
individualist conception of the mind; once we come to think of ourselves
fully as social animals, dependent on others physically, socially and linguistically, then the Cartesian view of autonomy as self-reliance begins
to lose its hold. Acknowledging a desire for intellectual autonomy does
not mean denying the place of epistemic dependence and co-dependence or the need for experts in facing the complexities of the world
and its attendant knowledge landscape. What we hope to find is the
possibility of a balance between reliance and even dependence on
others, epistemic and practical, on the one hand, and the capacity for
independent thinking and decision making, including the ability to
engage in political contestation, on the other. The complexity of
finding such a balance is often forgotten in the loud contemporary call
for trust and trustworthiness. This means that independent thinking
and decision making have to be protected, but not simply by offering
more information and providing more transparency about the process
of knowledge creation.
As Onora O’Neill (2002) has pointed out, we can retain and manifest
our epistemic and intellectual autonomy by choosing with due diligence
whom we rely on, or to whom we intellectually defer. In other words, the
exercise of autonomy is prior to the act of trust or reliance.12 Following
O’Neill’s strategy of distinguishing types of autonomy, Nguyen (2018)
has argued that we can preserve autonomy in trust: the kind of intellectual autonomy that is threatened by trust in experts, according to
Nguyen, is direct autonomy, where we seek to understand the information
and process it by ourselves. However, there are two other kinds of autonomy which are consistent with trust: delegational autonomy and management autonomy. In the former, we trust others to provide information we
cannot arrive at ourselves but remain autonomous insofar as our trusting
is active, justified, and we take responsibility for it and for our choice to
delegate; in the latter, we put together, for ourselves, information from
different sources, and take responsibility for the whole system of knowledge rather than the constituent pieces of information. However,
Nguyen’s and O’Neill’s conceptions of autonomy are not always equally
available, but can be best exercised within a climate of trust where
there are assurances that the sources of our knowledge, or the larger
system of knowledge, are indeed trustworthy. Moreover, taking a lesson
from the success of QAnon, genuine
alternative expert advice should be made available to interested
members of the public, enabling them to do ‘their own research’ and so preserve their intellectual autonomy.
12 We would like to thank Ben Almassi for pointing out the strength of this type of objection.
Expert bodies should acknowledge contrary scientific positions, where
applicable, but also point out the extent to which there is a consensus
among the scientific community at large on the position they are advocating. The result may not have the intoxicating game-playing appeal of a
conspiracy theory mystery, but it will be more respectful of the intellectual autonomy of those who are called upon to show trust in expert
opinion. It may be argued that recent COVID-19 related examples of
scientists sharing and discussing expert disagreement in the media
could reduce trust in experts and be detrimental to the public
debate.13 But the pretence that scientists are in full agreement and the
attendant denial of contrary voices is not only dishonest but can also
undermine the very trust we wish to establish or strengthen.
6.2. Decreasing the democratic deficit
Citizen participation, at various levels of knowledge creation and transmission (or what Dewey calls popularisation of knowledge), is an essential
element of creating a climate of trust. The point is echoed by many contemporary thinkers. Philip Kitcher (2011), for instance, argues that to
counter value-based distrust we need more inclusive participation of
the public in the workings of scientific research from the very beginning,
including, in particular, the facilitation of a more transparent and open
discussion about the values that inform such research.
An effective way of including the public in expert-informed policy
making is the routine use of citizen assemblies and mini fora (Farrell
and Suiter 2019), where experts and representatives of the general
public engage in publicly accessible conversations and deliberations
about the policy choices available. Making intellectual space for the contestation of dominant views, monitoring the performance of the experts
and their commitment to honesty, transparency and good will are further
means of both creating a climate of trust and enabling a participatory
form of democracy where experts and policy makers are held answerable
to the citizenry.
A division of epistemic labour is essential for the smooth running of
any society, but the division does not need to be purely hierarchical. A
horizontal model, where multidisciplinary panels, including lay
members of society, share the responsibility for policy advice, is more
in line with the egalitarian aspirations of democratic governance. Such
13 This point, as well as several others critical of our position, was raised by Michel Croce.
‘horizontal’ models do not deny the role and significance of specialist
knowledge in decision making, nor do they flatten the idea of expertise
by placing experts’ knowledge on a par with lay people’s.
Rather, they allow for input from and debate between a variety of
sources and voices. The complaints about the elitism of experts, more
often than not, are directed at the exorbitant financial rewards they
receive, as well as the air of arrogance surrounding them, rather than
the knowledge and information that they possess. The horizontal conception of the division of epistemic labour on expert panels can also
have beneficial consequences for structuring the financial rewards that
experts receive and thus address some of the grave economic inequalities
that the knowledge economy has engendered. The horizontal model, put
into action through citizens assemblies, is part of a move towards a more
cooperative rather than hierarchical division of epistemic labour, which is
also constitutive of the reciprocal nature of the demands and commitments that are part of a climate of trust.
6.3. Accommodating values
This element consists of two steps. The first step is to acknowledge the
uncertainties of science and the value gap between evidence and
theory, and to ensure that the gap is filled in such a way that the
well-being and best interest of those affected become central to the
conduct of science. This transparency and acknowledgment of value
would go some way towards addressing the concerns expressed in 3.3.
The second step is the acknowledgment of the values which may
inform the public’s inclination, or disinclination, to trust specific scientific
information. The creation of a climate of trust requires the awareness that
distrust in scientists may be the result of a genuine difference between
the public and the experts about the values that shape scientific research
(Kitcher 2011). In what is presented as a purely scientific question,
different values, priorities, and preferences are involved, and bringing
them out may lead to a more respectful, but also more mutually trusting
practice. Also helpful is addressing the lack of diversity and broad representation in the scientific community, and the insufficient acknowledgement of the plurality of values that determine trust and distrust in science.
To take one example, climate sceptics appear highly susceptible to messages delivered by their own social group, particularly their political group.
This demonstrates not only reluctance to change, but also the importance
of group belonging when it comes to trust. Group belonging includes the
acknowledgement of shared values which in turn play an important role
in whom to trust. For these reasons, David Hall (2019) has suggested
that in order to address climate scepticism it is necessary to ‘frame’
the message without denying or eradicating the fundamental values
of some groups. Drawing on Bernard Williams’s (1979) ‘internal
reasons’, Hall suggests a model of persuasion which is truthful, but
which also acknowledges the plurality of people’s motivations, connecting facts about climate change with people’s ‘subjective motivational
sets’ (Hall 2019, 41–42). Similarly, Feygina, Jost, and Goldsmith (2010)
suggest that reframing pro-environmental change as, for instance, preserving, rather than challenging, the social system (e.g. the ‘American
way of life’) may encourage those who are motivated to protect the
system to take greater personal responsibility (2010, 333; see also
Kahan 2010).14
6.4. Addressing injustices created by differential power structures
The idea of a climate of trust involves addressing the root causes of distrust not only in the specific domains where distrust manifests itself,
but also the different power structures in which these domains are articulated. It is not just the conduct of science or the scientists that may be
biased, but the broader social and political context within which science is
practiced, communicated and used. Establishing a climate of trust
requires acknowledging both the context and the full range of the
causes of distrust, which observing the social positions of the sceptics
reveals to be more than merely intellectual; and this, in turn, means
being prepared to change deeply rooted social and political structures
and practices. If distrust is at least partly fuelled by power inequalities
and past and current discrimination, then consistent and visible efforts
to rectify such inequalities are part and parcel of building a climate of
trust. How can that be done?
Distrust, as we have noted, can have a beneficial effect, leading to a reconfiguration of unjust power structures. This is primarily the case when
distrust is grounded in actual oppression: there distrust can represent a
wake-up call, leading away from complacency and toward action for
justice (see Krishnamurthy 2015). Even when it is grounded in
14 At the same time, it is important to take values into account in a way that does not deny their incompatibility with others and that does not distort the scientific message. Some values just are incompatible with pro-environmental behaviour and a just system. That is why the social situatedness of the untrusting needs to be taken into account, not only by experts who seek trust, but also by the untrusting themselves. This is Heidi Grasswick’s (2014) proposal to address climate scepticism.
unwillingness to change and lose one’s position of power, distrust can
signal uneven power relationships, and can be acted upon accordingly.
The connecting element here is the vulnerability and disempowerment
that are inherent in trust and also at the root of the distrust arising from
experiences of injustice. The greater vaccine hesitancy among marginalised groups, to return to the example noted above, results in part from
the extreme vulnerability that comes from having trusted the medical
establishment, with catastrophic consequences. To trust, as we saw, is
to make ourselves vulnerable, but the added level of social vulnerability
makes the act of trusting much riskier than it could otherwise be. We
cannot remove the experience of vulnerability inherent in the act of trusting, but in building a climate of trust we can work towards addressing the
excess of such feelings of vulnerability occasioned by experiences of
social injustice.
Two possible ways of working with vulnerability, but changing its
impact, are suggested by Katherine Furman (2020): sharing costs, and
giving up power (on the part of the current elite). Taking as her
example a doctor from Doctors without Borders in South Africa (narrated
in Steinberg 2017), who drew his own blood in front of his patients to
show that the HIV tests he was offering were safe, Furman suggests
that spreading the perceived risk also flattens the hierarchical nature of
the trust relationship. In the case of expertise, giving up power (on the
part of the elites), or more properly re-balancing it, also means greater
inclusivity in the expert class. In the case of vaccines in the US, having a
more proportionate number of Black scientists as well as spokespersons
increases both the sense of inclusion and the availability of in-group role models. According
to Bajaj and Stanford (2021), the focus on current power relations, as
opposed to past injustice, is a more positive way to address distrust,
because it frames the problem in terms of everyday racism (which is
present, and can be challenged) rather than ‘immovable historical occurrences’, which in their view undermines the efforts to combat distrust.
For instance, Alsan et al. (2021) have shown that Black Americans were
more likely to act on COVID-19 prevention advice and seek further information when this was presented by Black doctors rather than by white doctors (see
also Wells and Gowda 2020). The point about seeking information is
important: we do not in the slightest want to suggest that inclusivity is
merely a means to achieve the ideal of full trust in experts by manipulating people into it. That would merely replicate the unquestioning call for
trust we have criticised. Inclusivity, besides being a good in itself, promotes a desirable climate of trust insofar as it also makes room for
autonomous thinking. The tension identified in this paper can only be
addressed if a climate of trust allows for vulnerability but also makes independent inquiry available.
In these ways, besides the obvious move toward greater justice,
members of marginalised groups can come to feel that their vulnerability is
shared, rather than unduly focused on their group membership. Only then
can they be expected to accept the more reasonable degree of vulnerability that is part and parcel of justified trust in trustworthy experts.
7. Conclusion: on the value of distrust
Distrust, like trust, is valuable when it is exercised under the right conditions and directed towards the right individuals and organisations. In
the right context, distrust can be valuable in ensuring individual autonomy and democratic participation. Distrust can also have deeper roots
and be grounded in social and power dynamics that fundamentally determine the ways expert information is shaped and received. Distrust, in
these cases, is not necessarily a problem; in fact, it can point towards
inequalities that influence epistemic practices, and that need to be
rectified. For this reason, we have argued, in cases where trust in
experts is urgently called for, it is not sufficient to clarify information,
share more data, and demonstrate experts’ honesty and reliability.
What is needed is a larger-scale intervention, which, we have suggested,
amounts to fostering a ‘climate of trust’ in which the costs and vulnerabilities of trust are shared, and which enables and encourages the participation of various groups and the inclusion of differing values, not
only in the production of knowledge, but in the process of the application
of such knowledge which – in the cases examined – has such significant
impact on everyone involved.
Acknowledgements
The information and opinions contained herein are those of the authors and do not
necessarily reflect those of the European Commission.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Funding
This work has received funding from the European Union’s Horizon 2020 research and
innovation programme under grant agreement No 870883.
References
Alfano, Mark. 2016. “The Topology of Communities of Trust.” Russian Sociological
Review 15 (4): 30–56.
Almassi, Ben. 2012. “Climate Change, Epistemic Trust, and Expert Trustworthiness.”
Ethics and the Environment 17 (2): 29–49. doi:10.2979/ethicsenviro.17.2.29.
Alsan, Marcella, Fatima Cody Stanford, Abhijit Banerjee, Emily Breza, Arun
Chandreshakar, Sarah Eichmeyer, Paul Goldsmith-Pinkham, et al. 2021.
“Comparison of Knowledge and Information-Seeking Behavior After General
COVID-19 Public Health Messages and Messages Tailored for Black and Latinx
Communities: A Randomized Controlled Trial.” Annals of Internal Medicine 174 (4):
484–492. doi:10.7326/M20-6141.
Anderson, Elizabeth. 2011. “Democracy, Public Policy, and Lay Assessments of
Scientific Testimony.” Episteme 8 (2): 144–164.
Arendt, Hannah. 1972. Crises of the Republic: Lying in Politics; Civil
Disobedience; On Violence; Thoughts on Politics and Revolution. New York: Harcourt
Brace & Co.
Audi, Robert. 2011. “The Ethics of Belief and the Morality of Action: Intellectual
Responsibility and Rational Disagreement.” Philosophy 86 (1): 5–29.
doi:10.1017/S0031819110000586.
Baghramian, Maria, and Michel Croce. 2021. “Experts, Public Policy and the Question of
Trust.” In Routledge Handbook of Political Epistemology, edited by M. Hannon, and J.
De Ridder, 446–457. London: Routledge.
Baier, Annette. 1986. “Trust and Antitrust.” Ethics 96 (2): 231–260. doi:10.1086/292745.
Bajaj, Simar Singh, and Fatima Cody Stanford. 2021. “Beyond Tuskegee — Vaccine
Distrust and Everyday Racism.” New England Journal of Medicine 384 (5): e12.
doi:10.1056/NEJMpv2035827.
Becker, Lawrence C. 1996. “Trust as Noncognitive Security About Motives.” Ethics 107
(1): 43–61.
Booth, Ken, and Nicholas Wheeler. 2007. The Security Dilemma: Fear, Cooperation and
Trust in World Politics. New York: Palgrave Macmillan.
Cassam, Quassim. 2021. “Doubt as a Political Virtue.” Midwest Studies in Philosophy 45:
371–391. doi:10.5840/msp20219104.
Code, Lorraine. 2006. Ecological Thinking: The Politics of Epistemic Location. Oxford &
New York: Oxford University Press.
Descartes, René, and F. E. Sutcliffe. 1968. Discourse on Method and the Meditations.
London: Penguin.
Dewey, John. 1927. The Public and Its Problems. Chicago: Swallow Press.
Douglas, Heather. 2013. “The Value of Cognitive Values.” Philosophy of Science 80 (5):
796–806. doi:10.1086/673716.
Farrell, D. M., and J. Suiter. 2019. Reimagining Democracy: Lessons in Deliberative
Democracy from the Irish Front Line. Ithaca: Cornell University Press.
Feygina, Irina, John T. Jost, and Rachel E. Goldsmith 2010. “System Justification, the Denial
of Global Warming, and the Possibility of “System-Sanctioned Change”.” Personality
and Social Psychology Bulletin 36 (3): 326–338. doi:10.1177/0146167209351435.
Foucault, Michel. 2003. Society Must Be Defended. New York: Picador.
Fricker, Elizabeth. 1994. “Against Gullibility.” In Knowing from Words, edited by A.
Chakrabarti, and B. K. Matilal, 125–162. London: Kluwer Academic Publishers.
Furman, Katherine. 2020. “Emotions and Distrust in Science.” International Journal of
Philosophical Studies 28 (5): 713–730. doi:10.1080/09672559.2020.1846281.
Giddens, Anthony. 1990. The Consequences of Modernity. Stanford, CA: Stanford
University Press.
Goldberg, Sandy. 2013. “Self-Trust and Extended Trust: A Reliabilist Account.” Res
Philosophica 90 (2): 277–292. doi:10.11612/resphil.2013.90.2.11.
Goldenberg, Maya J. 2016. “Public Misunderstanding of Science? Reframing the
Problem of Vaccine Hesitancy.” Perspectives on Science 24 (5): 552–581. doi:10.
1162/POSC_a_00223.
Grasswick, Heidi. 2014. “Climate Change Science and Responsible Trust: A Situated
Approach.” Hypatia 29 (3): 541–557. doi:10.1111/hypa.12090.
Grasswick, Heidi. 2018. “Understanding Epistemic Trust Injustices and Their Harms.”
Royal Institute of Philosophy Supplement 84: 69–91.
Habermas, Jürgen. 1985. The Theory of Communicative Action, Volume 1: Reason and
the Rationalization of Society. Translated by Thomas McCarthy. Boston: Beacon
Press.
Hall, David. 2019. “Internal Reasons and the Problem of Climate Change.” Theoria 66:
160. doi:10.3167/th.2019.6616003.
Hazlett, Alan. 2015. “The Civic Virtues of Skepticism, Intellectual Humility, and
Intellectual Criticism.” In Intellectual Virtues and Education: Essays in Applied Virtue
Epistemology, edited by J. Baehr, 71–94. London and New York: Routledge.
Hempel, Carl G. 1965. “Science and Human Values.” In Aspects of Scientific Explanation
and Other Essays in the Philosophy of Science, edited by Carl G. Hempel, 81–96. New
York: The Free Press.
Holton, Richard. 1994. “Deciding to Trust, Coming to Believe.” Australasian Journal of
Philosophy 72 (1): 63–76. doi:10.1080/00048409412345881.
Hume, David. 1975. Enquiries Concerning Human Understanding and Concerning the
Principles of Morals. 3rd ed., edited by L. A. Selby-Bigge, and P. H. Nidditch.
Oxford: Oxford University Press.
Ireland Department of Children, Equality, Disability, Integration and Youth. 2021.
“Final Report of the Commission of Investigation into Mother and Baby Homes.”
Accessed 20 June 2021. https://www.gov.ie/en/publication/d4b3d-final-report-of-the-commission-of-investigation-into-mother-and-baby-homes/.
Jasanoff, Sheila. 2014. “A Mirror for Science.” Public Understanding of Science 23 (1): 21–
26. doi:10.1177/0963662513505509.
Jones, Karen. 2013. “Distrusting the Trustworthy.” In Reading Onora O’Neill, edited by
D. Archard, M. Deveaux, D. Manson, and D. Weinstock, 186–198. London: Routledge.
Kahan, Dan. 2010. “Fixing the Communications Failure.” Nature 463 (7279): 296–297.
doi:10.1038/463296a.
Kitcher, Philip. 2011. Science in a Democratic Society. Amherst, N.Y: Prometheus.
Koskinen, Inkeri. 2020. “Defending a Risk Account of Scientific Objectivity.” British
Journal for the Philosophy of Science 71 (4): 1187–1207. doi:10.1093/bjps/axy053.
Krishnamurthy, Meena. 2015. “Tyranny and the Democratic Value of Distrust.” The
Monist 98 (4): 391–406. doi:10.1093/monist/onv020.
Kuhn, Thomas S. 1977. “Objectivity, Value Judgment, and Theory Choice.” In The
Essential Tension: Selected Studies in Scientific Tradition and Change, edited by
Thomas S. Kuhn, 320–339. Chicago: University of Chicago Press.
Laurencin, Cato T. 2021. “Addressing Justified Vaccine Hesitancy in the Black
Community.” Journal of Racial and Ethnic Health Disparities 30: 1–4. doi:10.1007/
s40615-021-01025-4.
Lewenstein, Bruce, and Dominique Brossard. 2006. “Assessing Models of Public
Understanding in ELSI Outreach Materials.” USDA Communication. Accessed 7
July 2021. https://portal.nifa.usda.gov/web/crisprojectpages/0190518-assessing-models-of-public-understanding-in-elsi-outreach-material.html.
Lombrozo, Tania, Anastasia Thanukos, and Michael Weisberg. 2008. “The Importance
of Understanding the Nature of Science for Accepting Evolution.” Evolution:
Education and Outreach 1: 290–298. doi:10.1007/s12052-008-0061-8.
Machiavelli, Niccolò. 1940. The Prince, and the Discourses. New York: The Modern
Library.
McCright, Aaron M., and Riley E. Dunlap. 2011. “The Politicization of Climate Change
and Polarization in the American Public’s Views of Global Warming, 2001–2010.”
The Sociological Quarterly 52 (2): 155–194. doi:10.1111/j.1533-8525.2011.01198.
Miller, Jon D. 1983. “Scientific Literacy: A Conceptual and Empirical Review.” Daedalus
112 (2): 29–48.
Miller, Jon D. 2004. “Public Understanding of, and Attitudes Toward, Scientific
Research: What We Know and What We Need to Know.” Public Understanding of
Science 13 (3): 273–294. doi:10.1177/0963662504044908.
Nguyen, C. Thi. 2018. “Expertise and the Fragmentation of Intellectual Autonomy.”
Philosophical Inquiries 6 (2): 107–124.
Oberauer, Klaus, and Stephan Lewandowsky. 2016. “Control of Information in Working
Memory: Encoding and Removal of Distractors in the Complex-Span Paradigm.”
Cognition 156: 106–128. doi:10.1016/j.cognition.2016.08.007.
O’Neill, Onora. 2002. Autonomy and Trust in Bioethics. Cambridge: Cambridge
University Press.
Price, Don K. 1965. The Scientific Estate. Cambridge, MA: Harvard University Press.
Public Health England. 2020. “Disparities in the Risk and Outcomes of COVID-19.”
Accessed 12 June 2021. https://assets.publishing.service.gov.uk/government/
uploads/system/uploads/attachment_data/file/908434/Disparities_in_the_risk_and_
outcomes_of_COVID_August_2020_update.pdf.
Quine, W. V. 1970. “On the Reasons for Indeterminacy of Translation.” Journal of
Philosophy 67 (6): 178–183. doi:10.2307/2023887.
Raz, Joseph. 1998. “Disagreement in Politics.” American Journal of Jurisprudence 43 (1):
25–52. doi:10.1093/ajj/43.1.25.
Razai, Mohammad S., Tasnime Osama, Douglas G. J. McKechnie, and Azeem Majeed
2021. “Covid-19 Vaccine Hesitancy among Ethnic Minority Groups.” BMJ 372
(February): n513. doi:10.1136/bmj.n513.
Saunders, Mark. 2010. Organizational Trust: A Cultural Perspective. Cambridge:
Cambridge University Press.
Scheman, Naomi. 2011. Shifting Ground: Knowledge and Reality, Transgression and
Trustworthiness. Oxford & New York: Oxford University Press.
Scott, J. C. 1999. “Geographies of Trust, Geographies of Hierarchy.” In Democracy and
Trust, edited by M. E. Warren, 273–289. Cambridge: Cambridge University Press.
Sperber, Dan, Fabrice Clément, Christophe Heintz, Olivier Mascaro, Hugo Mercier,
Gloria Origgi, and Deirdre Wilson. 2010. “Epistemic Vigilance.” Mind and
Language 25 (4): 359–393. doi:10.1111/j.1468-0017.2010.01394.x.
Stanford, P. Kyle. 2001. “Refusing the Devil’s Bargain: What Kind of
Underdetermination Should We Take Seriously?” Philosophy of Science 68 (S3): 1–
12. doi:10.1086/392893.
Steinberg, Jonny. 2017. “Re-Examining the Early Years of Anti-Retroviral Treatment in
South Africa: A Taste for Medicine.” African Affairs 116 (462): 60–79. doi:10.1093/
afraf/adw026.
Torcello, Lawrence. 2016. “The Ethics of Belief, Cognition, and Climate Change
Pseudoskepticism: Implications for Public Discourse.” Topics in Cognitive Science 8
(1): 715–715.
Turner, Stephen. 2001. “What Is the Problem with Experts?” Social Studies of Science
31 (1): 123–149.
Wells, Lindsay, and Arjun Gowda. 2020. “A Legacy of Mistrust: African Americans and
the US Healthcare System.” Proceedings of UCLA Health 24: 1–3.
Williams, Bernard. 1979. “Internal and External Reasons.” In Rational Action, edited by
R. Harrison, 101–113. Cambridge: Cambridge University Press.
Wylie, Alison. 2003. “Why Standpoint Matters.” In Science and Other Cultures: Issues in
Philosophies of Science and Technology, edited by R. Figueroa, and S. G. Harding, 26–
48. London: Routledge.
Zagzebski, Linda. 2013. “Intellectual Autonomy.” Philosophical Issues 23 (1): 244–261.
doi:10.1111/phis.12012.