Familiarity with Internet threats: Beyond awareness

Debora Jeske, Paul van Schaik
Article history:
Received 11 December 2016
Accepted 29 January 2017
Available online 3 February 2017

Keywords:
Internet experience
Familiarity
Internet threats
Computer behavior
Cluster analysis
Mediation

Abstract

The degree of familiarity with threats is considered as a predictor of Internet attitudes and security behaviors. Cross-sectional data were collected from 323 student participants about their familiarity with 16 different Internet threats. All participants were presented with definitions of the threats and then asked to state how familiar they were with each. Their responses were used to identify the extent to which threat familiarity differed among the sample. Three different clusters were identified. One set of participants was relatively knowledgeable about all threats; Cluster 1 was therefore labeled experts (n = 92). Clusters 2 (n = 112) and 3 (n = 92) showed very different patterns, as familiarity appeared to depend on the novelty of the threat (with one cluster showing more familiarity with well-known threats and the other more familiarity with new threats). Participants in the expert cluster were more likely to engage in computer security behaviors than the other two groups. Mediation analysis showed that time spent on the Internet and the length of Internet experience were significant predictors of familiarity, and both were significant indirect predictors of computer security use (suggesting a relationship fully mediated by familiarity). Our paper makes several important contributions. First, the research reflects a systematic effort to investigate the relationship between familiarity and engagement in online security activities. Second, we provide evidence that familiarity is a mediator between Internet use and security behaviors – making it a baseline variable to consider in future threat-oriented training interventions aimed at changing security behavior. The study also provides implications for practitioners seeking to improve user familiarity with security risks.
* Corresponding author.
E-mail addresses: d.jeske@ucc.ie (D. Jeske), P.Van-Schaik@tees.ac.uk (P. van Schaik).
http://dx.doi.org/10.1016/j.cose.2017.01.010
Cookies are a feature on many sites. These are text files that are designed to track user activity (BBC, 2011). Cookies may also be set by the browser or by third parties not associated with the browser (for more details see Opentracker, 2014). Due to press coverage regarding corporate privacy disasters (see Clarke, 2014), many users are exposed to information about these threats. However, some threats may be more recent and less well known – which may also affect familiarity and thus potentially the extent to which security measures are taken by individuals. These include sophisticated spear phishing (targeted emails that include personal user details to convince users to provide specific information), keyloggers and rogueware. In addition to more traditional online security threats, a number of additional threats need to be considered, such as cat-fishing, cyber-bullying, social engineering and virtual stalking.

A number of researchers have studied the role of attitudes toward the Internet and information hiding versus information sharing (e.g., Acquisti and Grossklags, 2004). Similarly, precautionary user behavior, such as the use of computer security features, also requires a certain awareness of and familiarity with the threats a user faces (see Dinev et al., 2009; Kruger et al., 2010). We differentiate awareness from familiarity, as being aware of something may not indicate more than a fleeting degree of knowledge that a threat of a certain kind exists. Awareness alone may also be subject to repeated exposure, and thus to habituation, which leads to less and less attention being given to warnings (Brinton Anderson et al., 2016). However, this does not guarantee that the user becomes knowledgeable about or familiar with what the threat entails – they only recognize it. For example, individuals may be aware of email as a communication medium but not realize that it also operates as a storage medium – and that even deleted emails may continue to be accessible via their devices or cloud servers (e.g., Clark et al., 2015). So awareness of one function does not imply that the user really understands all functions – or threats. Threat awareness suggests individuals show realization or perception of knowledge of a threat – but this knowledge is not driven by experience and may not be very deep. Attitudes and behaviors may be shaped by what users think they know, rather than by their actual knowledge. As a result, awareness may be a precursor to familiarity. In contrast, familiarity is linked to knowledge in more concrete ways, in that it implies knowing something through experience or association, and hence an understanding of a threat.

Unfortunately, many individuals are not cognizant of how much personally relevant information they share online (Kurkovsky and Syta, 2010), in line with a low familiarity with the threats that may arise. For example, threats may be dismissed if they appear unlikely to occur (no immediacy), if the user discounts the possibility of being affected, or if the user feels competent and confident enough to tackle potential risks and handle the consequences themselves. These aspects have certainly been observed in relation to password management (e.g., Tam et al., 2010). Security countermeasures such as security policies, security education and awareness, and computer monitoring have also been proposed to affect perceived certainty and severity of sanctions and subsequent misuse of information systems (D'Arcy et al., 2009). This suggests that attitudes and user awareness of consequences play a significant role in determining how risks are perceived and responded to.

1.1. A theoretical perspective to understanding threat familiarity: connecting the human and technical elements

The difficulties associated with encouraging awareness to progress to actual knowledge and understanding of threats may be best explained using a framework as an explanatory metaphor. It is here that actor-network theory (ANT; Latour, 1987) may help explain the reactions, barriers and challenges that arise when we try to understand the many interrelated variables that determine security-related engagement and behavior (e.g., past experience, affordances of technology, and user attributes). A few comments are warranted to define the meaning and relevance of actor-network theory. First, ANT as proposed by Latour (1999, pg. 20) is a "very crude method to learn from actors without imposing on them an a priori definition of their world-building capacities." Second, it is important to avoid misunderstandings about the meaning of actors and networks, as Latour conceptualizes these as interlinked rather than as opposites (hence the hyphen). The actor-network element of Latour's theory does not refer to a dichotomy that differentiates between agency and structure. The actor does not represent a reflection of human agency, nor does the network element reflect society as such. Both are continuously transformed and redefined through their interdependent activities (Hassard and Alcadipani, 2010). Latour (1999, pg. 17) clarifies that the network element captures all "interactions through various kinds of devices, inscriptions, forms and formulae, into a very local, very practical, very tiny locus". Indeed, actors and networks are "two faces" of the same phenomenon (Latour, 1999). In other words, actor-network theory acknowledges and highlights the connections between both macro- and micro-level influencers of social processes (such as societal norms and culture vs. local and personal norms).

We propose that ANT is a useful approach to understanding how threat familiarity relates to online behaviors (e.g., those that shape Internet experience and online engagement) and the adoption of security behaviors. First, ANT clearly rejects the separation of the human, non-human, technical and social elements (Hassard and Alcadipani, 2010) that drive user behavior in various domains. When we focus only on the user (e.g., his or her attitudinal indicators), the technical (e.g., automatic processes rather than those that have to be started by the user) or the social influencers (e.g., social norms), we may explain only some of the variance in behavioral patterns; the interaction of these variables may be particularly informative. ANT therefore considers a combination of variables, in a similar fashion to (but not exactly the same as) many other "models", such as ISO 9241 and the Person-Artifact-Task (PAT) model (see Finneran and Zhang, 2003). Security behavior is essentially the outcome of a combination of all these elements as well. For example, personal characteristics and propensity for risk may shape users' willingness to take risks when online. Technical features may protect a user to different degrees from threats, while social pressures and norms may also influence which activities the user pursues and which precautionary behaviors they adopt.

Second, as ANT suggests, entities and understanding emerge and gain meaning as the result of their interaction and relations with each other. Arnaboldi and Spiller (2011, pg. 645) note the following, in reference to Latour (2005) and Law (1992): "The increasing popularity of ANT arises from a pivotal, though controversial, feature: the symmetrical treatment of human and non-human actors, and of social and technical elements." This might indeed be particularly relevant to cybersecurity behaviors. The creation of threats and the effect of certain cybersecurity threats rely on the interactions of numerous technical and human aspects. For example, in the absence of precautionary tools and given users' unwillingness to engage with threats, negative outcomes due to email harvesting or identity theft are much more likely. This outcome may not be repeated if the precautionary behavior is prompted or prevention tools are automatically triggered, reducing the reliance on the user. However, such settings also reduce users' control over their devices, which is why such mechanisms are not always readily adopted.

Finally, ANT considers the importance of translation in terms of how various, potentially contradictory interests are captured (Hassard and Alcadipani, 2010, pg. 10). This process further recognizes the role of stakeholders, the need for information sharing, and evolution in terms of the roles that actors inherit (see also Arnaboldi and Spiller, 2011). Users are often, by default and unintentionally, designated as recipients of data security training – but not necessarily viewed as active participants. This set-up may ensure that training aims at raising threat awareness, but it does not necessarily foster actual familiarity with threats by involving users directly.

Nevertheless, ANT as an explanatory method to understand cybersecurity is still limited. For example, many will argue that the assumption of symmetry between human and non-technical elements is misplaced – and that in the context of cybersecurity, it may not be feasible to aim for such symmetry, due to the challenges associated with keeping up one's knowledge of emerging and existing threats.

1.2. Rationale and research questions

A particularly relevant target group for interventions is university students, as many are soon entering the workforce – and with that, their lack of knowledge or awareness of online threats may represent an important knowledge gap that needs to be addressed in company inductions and training schemes. Learning what students know about threats is the first step to understanding why and when they adopt computer security behaviors. In line with current knowledge and knowledge gaps, the present paper poses three questions.

The questions are as follows: how familiar are students with the various online threats, and is this similar for UK and US samples (RQ1)? Dinev et al. (2009) noted that awareness of the threats posed by spyware predicted favorable attitudes toward protective information technology, but that this relationship was more pronounced for the US than for the South Korean sample in their research. This suggests that familiarity, and thus awareness of threats, may vary across countries. However, in this study we focus on similar cultures to assess the robustness of findings in the UK and a comparative sample from the USA. We propose that the two samples are unlikely to vary significantly in terms of their familiarity with threats. By extension of these findings, we ask whether we can identify groups that are more or less familiar with certain new vs. well-known (established) threats (RQ2). We are particularly interested to learn how familiarity clusters relate to Internet attitudes and computer security (e.g., differences between the two subsamples or group clusters).

Third and finally, we consider a mediation hypothesis (see Fig. 1). That is, to what extent are past Internet experience and overall familiarity with threats related to the security measures implemented by individuals (RQ3)? Previous research on risk and technology has found evidence in favor of the "familiarity hypothesis" (Lee and Ho, 2015; Satterfield et al., 2009; Wogalter et al., 1991). Accordingly, the perception that the benefits associated with a particular technology outweigh its risks is positively related to people's extent of familiarity with the technology. This means that prior experience, if linked to familiarity, may also have implications for the security measures that are implemented by individuals – in support of a mediation model.
2. Method

2.1. Participants

The data for the current paper were based on a previous dataset collected over the course of 2015 and 2016. Participants were recruited by mailing list in the UK and via their instructors in the USA. Participation was voluntary. As only 323 participants responded to the familiarity items, the current study focuses on this subset. The participants included 169 students taking social science programs in the Midwest of the USA and 154 students completing various social science and other programs at several universities in the UK. Ages ranged from 18 to 60 (M = 22.78; SD = 5.89). As the purpose of the analysis is to compare samples, we also investigated potential differences between samples prior to the main analysis. Overall, 74.9% of the participants were female (n = 242) and 25.1% were male (n = 81). The two groups did not differ in terms of their age characteristics (UK M = 22.92; US M = 22.64; F(1, 321) = .21, p = .648) or gender distribution (Pearson's χ2(1) = .025, p = .874). Over half of the sample was employed (n = 191, 59.5%), with the remainder being either unemployed or looking for work (n = 90, 27.9%) and a small minority selecting the option "other" (n = 41, 12.7%; with one missing value). Students in the USA were more likely to be employed than students in the UK (Pearson's χ2(3) = 12.37, p = .006). In terms of education, 26.6% had a high school diploma or similar, 31% had already obtained an associate degree (at community college), 29.1% had already obtained an undergraduate degree (e.g., BA or BSc), 5.6% had already obtained a postgraduate degree, and 7.7% had obtained other unspecified qualifications. The main difference in education (Pearson's χ2(4) = 56.70, p < .001) arose from the greater number of students in the USA having already completed an associate degree (a two-year degree, e.g. at community colleges), a qualification that is less common in the UK. In addition, more participants in the UK were working toward a postgraduate qualification at the time of the survey than in the USA. Participants had used the Internet for around 12 years (M = 11.72, SD = 3.44), although Internet use ranged from 2 to 24 years. Students in the USA had used the Internet on average one year longer (M = 12.29, SD = 3.22) than students in the UK (M = 11.10, SD = 3.57; F(1, 320) = 9.792, p = .002).

2.2. Procedure

As soon as participants had read the study information and completed the consent form, they were asked to complete items on their attitude toward their personal use of the Internet, their use of security measures, and their general Internet experience. They were then presented with different threats. Each of these 16 threats was also defined for the participants, to ensure they all knew the characteristics of each threat (see Appendix). All participants were asked to indicate their familiarity with each of the threats individually. The survey ended with questions about their demographics and the debrief statement. All participants were eligible for course credits when completing the survey and were also given the option to enter a prize draw (£50 or $50). In order to separate their anonymous responses to the survey from the registration page […]

2.3. Measures

A number of single-item measures were used to assess familiarity, use of computer security, and participants' attitude toward the Internet, Internet experience, and demographics.

2.3.1. Familiarity with online threats
Familiarity with 16 threats was examined in line with previous work (Garg and Camp, 2012; see additional work on familiarity and information security perceptions by Huang et al., 2007). The list consisted of the following threats: cat-fishing, social engineering, e-mail harvesting, zero-day attack, rogueware, botnet, trojan, keylogger, spyware, virus, cyber-bullying, virtual stalking, Internet surveillance, identity theft, phishing, and cookie. Answering options ranged from 1 = fully unfamiliar to 7 = fully familiar.

2.3.2. Use of security measures
The degree to which participants used certain precautions was assessed using the Computer Security Usage scale (Claar and Johnson, 2012). The scale involved five items with a response scale from 1 = never to 7 = always. The items asked participants whether they used anti-virus software, firewall software, anti-spyware software, software updates and security updates. The items were used individually and combined in one composite.

2.3.3. Internet attitude
This was measured using five items starting with "All things considered, my use of the Internet is…" followed by a 7-point response scale (e.g., 1 = good, 7 = bad). An example item is "All things considered, my use of the Internet is good-bad." The items were used individually.

2.3.4. Internet experience
All participants were asked three questions about their Internet experience. The first question asked participants how long they had been using the Internet (in years). The second question asked participants how often they log onto the Internet; answering options ranged from 1 = weekly to 6 = more than 3 times a day. The third question asked participants how much time they spend on the Internet per day; response options ranged from 1 = 1–5 minutes to 7 = several hours.

2.3.5. Internet use
We also asked participants what they used the Internet for. Participants used the Internet for various purposes, specifically e-mail (96.9%), social networking (90.4%), searching for work-related or study-related information (85.8%), as well as education/training (82.7%), shopping (81.4%) and banking (75.9%).

2.3.6. Demographics
Demographics included age, gender, education and employment status.
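To make the scoring concrete, the following is a minimal sketch of how the two composites described above could be computed. The column names, the synthetic data, and the use of item means are illustrative assumptions; the paper only states that the items were "combined in one composite" and does not publish analysis code.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical responses: 323 participants, 16 familiarity items
# (1 = fully unfamiliar to 7 = fully familiar, Section 2.3.1) and five
# computer-security items (1 = never to 7 = always, Section 2.3.2).
fam_items = [f"fam_{i}" for i in range(1, 17)]  # e.g., fam_1 = cat-fishing
sec_items = [f"sec_{i}" for i in range(1, 6)]   # e.g., sec_1 = anti-virus use
df = pd.DataFrame(rng.integers(1, 8, size=(323, 21)),
                  columns=fam_items + sec_items)

# Composite scores as item means across the respective scales.
df["familiarity"] = df[fam_items].mean(axis=1)
df["security_use"] = df[sec_items].mean(axis=1)
print(df[["familiarity", "security_use"]].describe().round(2))
```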
computers & security 66 (2017) 129–141 133
Fig. 2 – (a) Cluster differences in familiarity (for new threats). (b) Cluster differences in familiarity (for established threats).
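Fig. 2 profiles the three clusters by their mean familiarity ratings per threat. The paper does not state which clustering algorithm produced these groups; as an illustrative sketch under that caveat, k-means on the standardized familiarity ratings (here with synthetic data of the same shape, 323 participants by 16 threats) would proceed as follows.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
ratings = rng.integers(1, 8, size=(323, 16)).astype(float)  # synthetic 1-7 ratings

# Standardize the 16 items, then partition participants into three clusters
# (three is taken from the paper; the algorithm itself is an assumption).
z = StandardScaler().fit_transform(ratings)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

# Profile each cluster by its mean rating per threat, as plotted in Fig. 2.
for k in range(3):
    members = ratings[labels == k]
    print(f"Cluster {k + 1}: n = {len(members)}, "
          f"item means = {members.mean(axis=0).round(2)}")
```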
computers & security 66 (2017) 129–141 135
3
Cluster 1
Cluster 2
Cluster 3
1
good to bad harmful to beneficial positive to negative wise to unwise favorable to
unfavorable
Participants who self-identified as less familiar with well-known threats were more likely to report more problematic behavior (Cluster 2) than those who were less familiar with new threats (Cluster 3). These findings further suggest that participants' general self-awareness of their own potentially problematic Internet use, as indicated across all five attitude measures (see Table 3), may be linked to the degree to which participants are (un)familiar with threats – at the same time, it remains a question whether familiarity is the result of more online engagement. However, we only noted a marginally significant difference in terms of the frequency with which participants in the three clusters used the Internet (F(2, 292) = 2.49, p = .085). Internet use was high among the experts (Cluster 1, M = 5.38, SD = 0.87) and those unfamiliar with newer threats (Cluster 3, M = 5.29, SD = 0.96) compared to those more unfamiliar with well-known threats (Cluster 2, M = 5.09, SD = 1.06).

3.2.2. Computer security use
Having obtained evidence that familiarity is also linked to self-evaluations of one's Internet use (Internet attitude), as well as differences in terms of the frequency with which the participants in different clusters use the Internet, the question arose whether familiarity is therefore also linked to whether participants used different security features. In line with previous findings, and as outlined in Table 4 (and Fig. 4), experts (Cluster 1) were more likely to use security features than participants who were less familiar with threats (in Cluster 2 and Cluster 3).

3.3. Familiarity as mediator between Internet use and security behaviors (RQ3)

We considered the role of previous experience as a precursor to familiarity with threats and behaviors: only when threats are known are participants likely to engage in computer security behavior. This mediation hypothesis was tested using the three questions related to Internet experience as predictors, overall familiarity as mediator (composite score), and the composite based on all computer security responses as the outcome. The mediation was assessed using the PROCESS macro from Hayes (2013). We ran three mediation models, one for each of the three Internet experience variables that we predicted were direct predictors of familiarity and potentially indirect predictors (via familiarity) of computer security use. We also considered potential covariates. Gender and employment status were significant covariates, but their inclusion did not significantly change the results. As a result, we report the outcomes of the mediation without these covariates.

Two out of three questions (see Figs. 5 and 6) were significant predictors of familiarity (composite score), as hypothesized.
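The PROCESS macro itself runs in SPSS or SAS; as a minimal sketch, the same simple mediation logic (predictor to mediator to outcome, with a percentile-bootstrap confidence interval for the indirect effect a×b) could be reproduced in Python as below. The variable names and the simulated data are illustrative assumptions, not the study data.

```python
import numpy as np
import statsmodels.api as sm

def indirect_effect(x, m, y):
    """Product-of-paths a*b: x -> m (path a); m -> y controlling for x (path b)."""
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]
    return a * b

rng = np.random.default_rng(2)
n = 323
time_online = rng.normal(size=n)                       # predictor (simulated)
familiarity = 0.08 * time_online + rng.normal(size=n)  # mediator (simulated)
security = 0.43 * familiarity + rng.normal(size=n)     # outcome (simulated)

# Percentile bootstrap of the indirect effect; PROCESS reports the
# corresponding lower and upper limits as LLCI and ULCI.
boot = []
for _ in range(5000):
    i = rng.integers(0, n, n)  # resample participants with replacement
    boot.append(indirect_effect(time_online[i], familiarity[i], security[i]))

print("indirect effect:",
      round(indirect_effect(time_online, familiarity, security), 3))
print("95% CI:", np.percentile(boot, [2.5, 97.5]).round(3))
```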
(Fig. 4 – Cluster differences in the use of computer security measures: anti-virus, firewall, anti-spyware, installing software updates, installing security updates.)
The frequency with which participants used the Internet was not a significant predictor of overall familiarity with risks (although the frequency of Internet use differed across the three clusters, see the previous analysis). The average time spent online was a positive predictor of familiarity (β = .079, p = .013), as was the length of Internet use over several years (β = .055, p < .001). Familiarity was a significant predictor of computer security behavior (β = .433, p < .001). However, time online on average (β = .020, p = .708) and length of Internet use (β = .028, p = .246) were not significant direct predictors of computer security behavior with familiarity held constant. However, we did obtain a significant indirect effect, demonstrating full mediation, for time online (β = .034, p = .032; LLCI = .0078 and ULCI = .0746) and length of Internet use (β = .024, p = .004; LLCI = .0095 and ULCI = .0449). (Each indirect effect equals the product of its constituent paths: for time online, .079 × .433 ≈ .034; for length of Internet use, .055 × .433 ≈ .024.)

Fig. 6 – Mediation results for overall Internet experience, familiarity and security behavior.

4. Discussion

The purpose of the current paper was to examine the role of familiarity with threats in computer security use. In the following section, we briefly summarize our findings and connect them to practical implications.

The first question of interest considered how familiar students were with the various online threats (RQ1). We had assumed that students in the UK and USA would show similar degrees of familiarity. This hypothesis was largely confirmed – except in terms of social engineering and phishing. In each of these two cases, the UK sample appeared to be more familiar with these threats than their counterparts in the USA. Two explanations may be offered. First, the sample in the UK was potentially more diverse in terms of educational goals than the US sample. That is, the participants from the USA were largely recruited from social science courses, while the participants in the UK were recruited as part of a university-wide research call. While we do not have the means to assess degree-specific differences in our dataset, it is possible that familiarity with certain risks is associated with different backgrounds and educational experiences. In addition, the universities may have differed slightly in terms of the content of their educational outreach activities. Many universities realize the value of educating their students about Internet threats, as students' risky or problematic online behavior is likely to also threaten the university's network and Internet infrastructure.

The second question of interest looked at differences in our dataset due to different degrees of familiarity with threats (RQ2). Using cluster analysis, we identified three main clusters. The three clusters were differentiated along their general familiarity with threats and labeled the experts (Cluster 1), those unfamiliar with well-known threats (Cluster 2), and those unfamiliar with newer threats (Cluster 3). All three clusters were less familiar with four particular threats: zero-day attacks, rogueware, botnets, and keyloggers. This identifies a clear area for future training and awareness campaigns in these university settings.

While we found no evidence of significant and consistent differences between the two national samples (except for the two exceptions noted above), the three clusters had clearly very distinct Internet attitudes and computer security behaviors. Specifically, participants who were less familiar with well-known threats overall (Cluster 2) also reported more negative (personally problematic or disadvantageous) compared to positive (and unproblematic) self-evaluated Internet use (Internet attitude). Participants in Cluster 2, together with the group less informed about newer threats (Cluster 3), were also less likely to use specific computer security features than the experts (Cluster 1). The three features more likely to be used by the experts were firewalls, software updates and security updates. However, it is noteworthy that all these features are default features that can be readily enabled on most computers these days. The fact that no significant differences (p < .05) were obtained for anti-spyware and anti-virus software is rather disheartening. However, we should note that since the familiarity questions were located at the end of the survey, it is possible that self-evaluation of one's Internet use may have influenced the familiarity ratings. This is a possible limitation of the study.

Despite these limitations, our results allow us to draw two tentative conclusions. First, greater familiarity with threats does not necessarily go hand in hand with better computer security. What is more, familiarity with threats and different online behaviors may also depend on how recent/new or well-known the threats and behaviors are. This is in line with related evidence suggesting that users often do not understand all the functions of the tools they use (Clark et al., 2015), and thus may not understand the threats that exist. Participants (particularly those in Clusters 2 and 3, see Table 4) did not use readily available computer security features and programs to the same degree that experts did (Cluster 1). It is unclear whether this reflects an overreliance on the university's (or home environment's) infrastructure (e.g., convenience- versus security-driven decision-making, see Jeske et al., 2014). This lack of alignment between what participants know and do may also be attributable to some other factor, such as an inability to grasp the implications of one's behavior for one's computer security (suggesting low risk awareness; see Coventry et al., 2014). Second, further exploration showed that regular Internet experience and use may be particularly relevant for fostering familiarity with threats – in line with the notion of situated learning (Lave and Wenger, 1990).

The mediation (RQ3) provided further evidence that time spent on the Internet and the length of Internet experience (but not daily or weekly frequency) were significant predictors of familiarity with threats and online behaviors. Similarly, these variables were also significant indirect predictors of computer security use (a relationship fully mediated by familiarity). Again, this is a finding in support of situated learning (Lave and Wenger, 1990). Although the effects were quite small overall, the practical implication may be that computer security behavior is an outcome of familiarity, which is not accomplished without significant time investment. This also means that the time spent familiarizing oneself with threats and learning about online options is time spent learning – but also a period of greater vulnerability, until some degree of familiarity with threats is achieved that triggers security behavior. However, again, a limitation of this analysis is the use of single-item measures (e.g., in relation to Internet use), as this might introduce bias due to unreliability (e.g., Petrescu, 2013). As a result, our conclusions need to be interpreted with caution.

The results suggest that numerous factors come into play, both human and technical, in line with the ANT approach (Hassard and Alcadipani, 2010). This was evidenced by age and employment status, both of which were important covariates. Simultaneously, precautionary behaviors require access to, but also familiarity with, both threats and the technical tools to combat them. Internet experience is particularly shaped by both human (user) characteristics and the technical characteristics and tools available to the users.
Table 5 – Concepts of the Health Belief Model as applied to the security context.

Perceived susceptibility – How likely is the threat for the user? This might be reflected in perceptions of risk and the user's information-sharing attitude.
Perceived severity – How serious are the consequences (perceived risks) for the user?
Perceived costs – What are the costs (balance between risks and benefits) that the user might incur as a result of the threat?
Perceived benefits – What are the benefits of precautions (perceived benefits) against a threat?
Self-efficacy – How capable does the user feel about facing these threats? This may be subject to the user's actual familiarity with risks and computer security use.
Cues to action – What triggers (antecedents) lead to more secure user behavior? These triggers may also influence whether and when online users take precautions.
The starting points for interventions, as well as practical implications, are outlined in the next two sections.

4.1. Starting points for interventions

A number of starting points for interventions exist. Work on nudging individuals and research on evidence-based practice (see Michie and West, 2013) focus on helping online users to make better security-related decisions on social media, on mobile devices and in relation to wireless options – and thus enable them to overcome a lack of knowledge and awareness of security threats on the part of the users (e.g., Choe et al., 2013; Kruger et al., 2010; Turland et al., 2015). In addition, research on user perceptions has frequently utilized some of the premises of the Health Belief Model by Rosenstock (1974). This includes studies to learn more about email-related security behavior (e.g., Ng et al., 2009), to explore how safe and secure online users felt about using technology to complete financial transactions (Davinson and Sillence, 2014), to educate users about phishing (Davinson and Sillence, 2010) and to increase the use of computer security software (Claar and Johnson, 2012). As a result, this model may also provide a general explanatory framework for understanding a number of additional threat perceptions, which may also help IT departments in universities and private businesses to identify starting points for interventions and awareness campaigns.

The main purpose of the Health Belief Model is to understand why individuals engage in preventive actions or behaviors (Rosenstock, 1974). The model includes several concepts: self-efficacy (the degree to which individuals feel capable of tackling certain challenges), perceived susceptibility (the opinion that one will contract health issues), perceived severity of the condition and consequences, and the perceived benefits of and barriers to taking action. The model also considers the influence of triggers within the individual and the environment; these serve as "cues to action" (Siepmann, 2008). The model may also provide insight into understanding online users' threat perceptions. It further captures the complexity of individual or situational antecedents to these perceptions – and thus the precautions that users may take. Taking a similar approach to Davinson and Sillence (2014), we outline each of the five concepts of the Health Belief Model as they could be applied in future research and interventions (Table 5). The threats may also influence perceptions of susceptibility, severity, costs, benefits, cues to action and perceived control.

If this model is applied in combination with concepts such as situated learning (Lave and Wenger, 1990) and learning from failure (see Fortune and Peters, 1995; see also Dalcher and Drevin, 2003), individuals as well as trainers may become more well-rounded in their understanding of threats (and how they are perceived). Such training may also enable students and other less security-experienced trainees to gain a better understanding of how threats, if not prevented, impact their data security and system operability. In addition, groupware and online collaborative tools as well as social networks exist – all of which can enable social interaction, access and support across time and location (Harr et al., 2011). For those tasked with teaching cybersecurity to employees and students, various e-mentoring and e-coaching schemes exist, as well as guidance on online tutoring (e.g., Perren, 2003). Furthermore, new platforms allow individuals to interact face-to-face in real time using video chats. Customized configurations of technologies and settings (e.g., Shanks et al., 2003) enable organizations not only to engage others and monitor trainees' progress, but also to provide feedback (e.g., by using analytics and discussion forums).

System-specific interventions may also be an option. Cues to action may be designed with system nudges and similar system interventions to flag issues (see Acquisti, 2009; Coventry et al., 2014; Grossklags et al., 2010). While many cues to action (e.g., warning messages upon installation of unknown or third-party software) are used already, it is clear that many interventions fail – and we suggest that a lack of familiarity with the threats or issues can be a major determinant of individuals' disregard of, or disengagement with, such interventions. Relying on user expertise and understanding of nudges may disregard the first step: ensuring that users are sufficiently familiar with potential threats, so that those who seek to improve their security behavior can be successful.

4.2. Expanding the knowledge base: final comments for practitioners and researchers

Past evidence suggests that while information campaigns may change attitudes, behavior will not necessarily change in line with attitudes (McKenzie-Mohr, 2000). This may be due to various reasons, including the difficulty of changing a behavior (e.g., Constanzo et al., 1986) given environmental constraints and habits. However, when behavioral change would be of personal or even financial benefit, behavioral change is more likely to follow (McKenzie-Mohr, 2000). However, it will be more difficult […]
computers & security 66 (2017) 129–141 139
Appendix (continued)

Internet surveillance: The monitoring of online behavior, activities or other changing information, often in secret and without authorization. This is usually carried out on individuals or groups observed by governmental organizations.

Identity theft: Any kind of fraud on the Internet that results in the loss of personal data, such as passwords, user names, banking information, or credit card numbers.

Cookie: A small piece of text or file that is stored on a user's computer.

REFERENCES

Acquisti A. Nudging privacy. The behavioral economics of personal information. Secur Priv Econ 2009;Nov/Dec:82–5.

Acquisti A, Grossklags J. Privacy attitudes and privacy behavior. Losses, gains, and hyperbolic discounting. In: Camp LJ, Lewis R, editors. The economics of information security. Advances in information security, vol. 12. Norwell (MA); Dordrecht, The Netherlands: Kluwer Academic Publishers Group; 2004. p. 165–78.

Ahmad A, Maynard S. Teaching information security management: reflections and experiences. Inf Manag Comput Secur 2014;22:513–36. doi:10.1108/IMCS-08-2013-0058.

Arnaboldi M, Spiller N. Actor-network theory and stakeholder collaboration: the case of Cultural Districts. Tour Manag 2011;32:641–54. doi:10.1016/j.tourman.2010.05.016.

BBC. New net rules set to make cookies crumble. BBC News Technology, March 8, 2011. Available from: http://www.bbc.co.uk/news/technology-12668552. [Accessed 24 August 2016].

Bonneau J, Herley C, van Oorschot PC, Stajano F. The quest to replace passwords: a framework for comparative evaluation of web authentication schemes. In: Proceedings of the IEEE symposium on security and privacy (SP '12). 2012. p. 553–67. doi:10.1109/SP.2012.44.

Brinton Anderson B, Vance A, Kirwan CB, Eargle D, Jenkins JL. How users perceive and respond to security messages: a NeuroIS research agenda and empirical study. Eur J Inf Syst 2016;25:364–90. doi:10.1057/ejis.2015.21.

Choe EK, Jung J, Lee B, Fisher K. Nudging people away from privacy-invasive mobile apps through visual framing. In: Proceedings of INTERACT, Sep 2–6. Cape Town, ZA: 2013. doi:10.1007/978-3-642-40477-1_5.

Claar CL, Johnson J. Analyzing home PC security adoption behavior. J Comput Inf Syst 2012;Summer:20–9.

Clark JW, Snyder P, McCoy D, Kanich C. "I saw images I didn't even know I had": understanding user perceptions of cloud storage privacy. In: Proceedings of CHI 2015, April 18–23. Seoul, Republic of Korea: 2015. doi:10.1145/2702123.2702535.

Clarke R. Vignettes of corporate privacy disasters. 2014. Available from: http://www.rogerclarke.com/DV/PrivCorp.html. [Accessed 24 August 2016].

Constanzo M, Archer D, Aronson E, Pettigrew T. Energy conservation behavior: the difficult path from information to action. Am Psychol 1986;41:521–8. doi:10.1037/0003-066X.41.5.521.

Coventry L, Briggs P, Jeske D, van Moorsel A. SCENE: a structured means for creating and evaluating behavioral nudges in a cybersecurity environment. In: Marcus A, editor. Design, user experience, and usability. Theories, methods, and tools for designing the user experience, vol. 8517. Lecture notes in computer science. Springer International Publishing; 2014. doi:10.1007/978-3-319-07668-3_23.

Dalcher D, Drevin L. Learning from Information Systems failures by using narrative and ante-narrative methods. In: Proceedings of the 2003 annual conference of the SAICSIT. South African Institute of Computer Scientists and Information Technologists; 2003. p. 137–42.

Davinson N, Sillence E. It won't happen to me: promoting secure behavior among internet users. Comput Human Behav 2010;26:1739–47. doi:10.1016/j.chb.2010.06.023.

Davinson N, Sillence E. Using the health belief model to explore users' perception of "being safe and secure" in the world of technology mediated financial transactions. Int J Hum Comput Stud 2014;72:154–68. doi:10.1016/j.ijhcs.2013.10.003.

D'Arcy J, Hovav A, Galletta D. User awareness of security countermeasures and its impact on information systems misuse: a deterrence approach. Inf Syst Res 2009;20:79–98. doi:10.1287/isre.1070.0160.

Dinev T, Goo J, Hu Q, Nam K. User behavior towards protective information technologies: the role of national cultural differences. Inf Syst J 2009;19:391–412. doi:10.1111/j.1365-2575.2007.00289.x.

Finneran CM, Zhang P. Person-artifact-task (PAT) model of flow antecedents in computer-mediated environments. Int J Hum Comput Stud 2003;59:475–96.

Fortune J, Peters G. Learning from failure. John Wiley & Sons; 1995.

Garg V, Camp LJ. End user perception of online risk under uncertainty. In: Proceedings of the 45th Hawaii international conference on system sciences (HICSS), Manoa, HI, 4–7 January 2012. 2012. doi:10.1109/HICSS.2012.245.

Grossklags J, Radosavac S, Cárdenas AA, Chuang J. Nudge: intermediaries' role in interdependent network security. In: Proceedings of the international conference on trust and trustworthy computing, June 2010. p. 323–36. doi:10.1007/978-3-642-13869-0_24.

Hadar I, Soffer P, Kenzi K. The role of domain knowledge in requirements elicitation via interviews: an exploratory study. Requir Eng 2014;19:143–59. doi:10.1007/s00766-012-0163-2.

Harr R, Wiberg M, Whittaker S. Understanding interaction search behavior in professional social networks. Hum Technol 2011;7:194–215.

Hassard J, Alcadipani R. Actor-network theory. In: Mills AJ, Durepos G, Wiebe E, editors. Encyclopedia of case study research. Thousand Oaks: SAGE Publications; 2010. p. 9–13.

Hayes AF. Introduction to mediation, moderation, and conditional process analysis: a regression-based approach. New York: Guilford Press; 2013.

Huang D-L, Rau P-LP, Salvendy G. A survey of factors influencing people's perception of information security. In: Jacko J, editor. Human-computer interaction, Part IV, HCII 2007, LNCS 4553. Berlin, Heidelberg: Springer-Verlag; 2007. p. 906–15. doi:10.1007/978-3-540-73111-5_100.

Huang D-L, Rau P-L, Salvendy G. Perceptions of information security. Behav Inf Technol 2010;29(3):221–32. doi:10.1080/01449290701679361.

Jeske D, Coventry L, Briggs P. Decision justifications for wireless network selection. In: Proceedings of the 2014 workshop on socio-technical aspects in security and trust (STAST). Vienna, Austria: 2014. p. 1–7. doi:10.1109/STAST.2014.9.

Kim EB. Recommendations for information security awareness training for college students. Inf Manag Comput Secur 2014;22:115–26. doi:10.1108/IMCS-01-2013-0005.

Kruger H, Drevin L, Steyn T. A vocabulary test to assess information security awareness. Inf Manag Comput Secur 2010;18:316–27. doi:10.1108/09685221011095236.

Kurkovsky S, Syta E. Digital natives and mobile phones: a survey of practices and attitudes about privacy and security. In: Proceedings of ISTAS (international symposium on technology and society), 7–9 June 2010. p. 441–9. doi:10.1109/ISTAS.2010.5514610.

Latour B. Science in action: how to follow scientists and engineers through society. Milton Keynes, UK: Open University Press; 1987.

Latour B. On recalling ANT. In: Law J, Hassard J, editors. Actor network theory and after. Oxford: Blackwell Publishers; 1999. p. 15–25.

Latour B. Reassembling the social: an introduction to actor-network-theory. Oxford: Clarendon; 2005.

Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge, UK: Cambridge University Press; 1990.

Law J. Notes on the theory of the actor network: ordering, strategy and heterogeneity. Syst Pract 1992;5:379–93. doi:10.1007/BF01059830.

Lee EWJ, Ho SS. The perceived familiarity gap hypothesis: examining how media attention and reflective integration relate to perceived familiarity with nanotechnology in Singapore. J Nanopart Res 2015;17:1–15. doi:10.1007/s11051-015-3036-z.

McKenzie-Mohr D. Fostering sustainable behavior through community-based social marketing. Am Psychol 2000;55:531–7. doi:10.1037/0003-066X.55.5.531.

Michie S, West R. Behaviour change theory and evidence: a presentation to Government. Health Psychol Rev 2013;7:1–22. doi:10.1080/17437199.2011.649445.

Ng B-Y, Kankanhalli A, Xu Y. Studying users' computer security behavior: a health belief perspective. Decis Support Syst 2009;46:815–25. doi:10.1016/j.dss.2008.11.010.

Opentracker. Third-party cookies vs first-party cookies. 2014. Available from: http://www.opentracker.net/article/third-party-cookies-vs-first-party-cookies. [Accessed 24 August 2016].

Park EH, Kim J, Park YS. The role of information security learning and individual factors in disclosing patients' health information. Comput Secur 2017;65:64–76. doi:10.1016/j.cose.2016.10.011.

Perren L. The role of e-mentoring in entrepreneurial education and support: a meta-review of academic literature. Educ Train 2003;45:517–25. doi:10.1108/00400910310508900.

Petrescu M. Marketing research using single-item indicators in structural equation models. J Mark Analyt 2013;1:99–117. doi:10.1057/jma.2013.7.

Rocha Flores W, Holm H, Svensson G, Ericsson G. Using phishing experiments and scenario-based surveys to understand security behaviors in practice. Inf Manag Comput Secur 2014;22(4):393–406. doi:10.1108/IMCS-11-2013-0083.

Rosenstock I. Historical origins of the Health Belief Model. Health Educ Monogr 1974;2:328–35. doi:10.1177/109019817400200403.

Sadalla E, Berlin A, Neel R, Ledlow S. Priorities in residential water use: a trade-off analysis. Environ Behav 2014;46:303–28. doi:10.1177/0013916512456286.

Satterfield T, Kandlikar M, Beaudrie CEH, Conti J, Herr Harthorn B. Anticipating the perceived risk of nanotechnologies. Nat Nanotechnol 2009;4:752–8. doi:10.1038/nnano.2009.265.

Shanks G, Seddon P, Willcocks L, editors. Second-wave enterprise resource planning systems. Cambridge: Cambridge University Press; 2003.

Siepmann M. Health behavior. In: Encyclopedia of public health. Springer; 2008. p. 515–21.

Stewart G, Lacey D. Death by a thousand facts. Criticising the technocratic approach to information security awareness. Inf Manag Comput Secur 2012;20:29–38. doi:10.1108/09685221211219182.

Tam L, Glassman M, Vandenwauver M. The psychology of password management: a tradeoff between security and convenience. Behav Inf Technol 2010;29:233–44. doi:10.1080/01449290903121386.

Turland J, Coventry L, Jeske D, Briggs P, van Moorsel A. Nudging towards security: developing an application for wireless network selection for android phones. In: Proceedings of the 2015 British HCI conference. ACM; 2015. p. 193–201. doi:10.1145/2783446.2783588.

Ulltveit-Moe N. A roadmap towards improving managed security services from a privacy perspective. Ethics Inf Technol 2014;16:227–40. doi:10.1007/s10676-014-9348-3.

Wogalter MS, Brelsford JW, Desaulniers DR, Laughery KR. Consumer product warnings: the role of hazard perception. J Safety Res 1991;22:71–82. doi:10.1016/0022-4375(91)90015-N.

Debora Jeske is a lecturer in Work and Organisational Psychology at University College Cork, Ireland. The research outlined in this paper was started during her previous appointment at Edinburgh Napier University, United Kingdom. Her research interests include psychology and technology at work (including e-HRM, human-computer interaction, learning and mentoring at work). Contact details: Dr. Debora Jeske, School of Applied Psychology, University College Cork, North Mall, Cork City, Ireland. E-mail: d.jeske@ucc.ie.

Paul van Schaik is a Professor in the Department of Psychology, Sport and Exercise at Teesside University, United Kingdom. His research interests focus on applied cognitive psychology and include the psychology of human-computer interaction, information security and information privacy, technology acceptance and the psychology of judgement and decision-making. Contact details: Prof. Paul van Schaik, Department of Psychology, Sport and Exercise, School of Social Sciences, Business and Law, Teesside University, Middlesbrough TS1 3BA, United Kingdom. E-mail: P.Van-Schaik@tees.ac.uk.