BS ISO/IEC 15026-1:2013
National foreword
This British Standard is the UK implementation of ISO/IEC
15026-1:2013. It supersedes PD ISO/IEC TR 15026-1:2010 which is
withdrawn.
The UK participation in its preparation was entrusted to Technical
Committee IST/15, Software and systems engineering.
A list of organizations represented on this committee can be
obtained on request to its secretary.
This publication does not purport to include all the necessary
provisions of a contract. Users are responsible for its correct
application.
© The British Standards Institution 2014.
Published by BSI Standards Limited 2014
INTERNATIONAL STANDARD ISO/IEC 15026-1
First edition
2013-11-01
Reference number
ISO/IEC 15026-1:2013(E)
© ISO/IEC 2013
Contents
Foreword
Introduction
1 Scope
2 Applicability
2.1 Audience
2.2 Field of applicability
3 Terms and definitions
3.1 Terms related to assurance and properties
3.2 Terms related to product and process
3.3 Terms related to integrity level
3.4 Terms related to conditions and consequences
3.5 Terms related to organization
4 Organization of this International Standard
5 Basic concepts
5.1 Introduction
5.2 Assurance
5.3 Stakeholders
5.4 System and Product
5.5 Property
5.6 Uncertainty and confidence
5.7 Conditions and initiating events
5.8 Consequences
6 Using multiple parts of ISO/IEC 15026
6.1 Introduction
6.2 Initial usage guidance
6.3 Relationships among parts of ISO/IEC 15026
6.4 Authorities
7 ISO/IEC 15026 and the assurance case
7.1 Introduction
7.2 Justification of method of reasoning
7.3 Means of obtaining and managing evidence
7.4 Certifications and accreditations
8 ISO/IEC 15026 and integrity levels
8.1 Introduction
8.2 Risk analysis
9 ISO/IEC 15026 and the life cycle
9.1 Introduction
9.2 Assurance activities in the life cycle
10 Summary
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
International Standards are drafted in accordance with the rules given in the ISO/IEC Directives, Part 2.
The main task of the joint technical committee is to prepare International Standards. Draft International
Standards adopted by the joint technical committee are circulated to national bodies for voting.
Publication as an International Standard requires approval by at least 75 % of the national bodies
casting a vote.
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights.
ISO/IEC 15026‑1 was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 7, Software and systems engineering.
This first edition of ISO/IEC 15026‑1 cancels and replaces ISO/IEC TR 15026‑1:2010, which has been
technically revised.
ISO/IEC 15026 consists of the following parts, under the general title Systems and software engineering —
Systems and software assurance:
— Part 1: Concepts and vocabulary
— Part 2: Assurance case
— Part 3: System integrity levels
— Part 4: Assurance in the life cycle
The IEEE Computer Society collaborated with ISO/IEC JTC 1 in the development of the international
standards of ISO/IEC 15026. IEEE Std 1228-1994, IEEE Standard for Software Safety Plans, was used as a
base document in the development of this standard.
Introduction
Software and systems assurance and closely related fields share concepts but have differing vocabularies
and perspectives. This part of ISO/IEC 15026 provides a unifying set of underlying concepts and an
unambiguous use of terminology across these various fields. It provides a basis for elaboration, discussion,
and recording agreement and rationale regarding concepts and the vocabulary used uniformly across
all parts of ISO/IEC 15026.
This part of ISO/IEC 15026 clarifies concepts needed for understanding software and systems assurance
and, in particular, those central to the use of ISO/IEC 15026‑2 to ISO/IEC 15026‑4. It supports shared
concepts, issues and terminology applicable across a range of properties, application domains, and
technologies.
1 Scope
This part of ISO/IEC 15026 defines assurance-related terms and establishes an organized set of
concepts and relationships to establish a basis for shared understanding across user communities for
assurance. It provides information to users of the other parts of ISO/IEC 15026 including the combined
use of multiple parts. The essential concept introduced by ISO/IEC 15026 is the statement of claims in
an assurance case and the support of those claims through argumentation and evidence. These claims are
in the context of assurance for properties of systems and software within life cycle processes for the
system or software product.
Assurance for a service being operated and managed on an ongoing basis is not covered in ISO/IEC 15026.
2 Applicability
2.1 Audience
A variety of potential users of ISO/IEC 15026 exists, including developers and maintainers of assurance
cases and those who wish to develop, sustain, evaluate, or acquire a system that possesses requirements
for specific properties in such a way as to be more certain of those properties and their requirements.
ISO/IEC 15026 uses concepts and terms consistent with ISO/IEC 12207 and ISO/IEC 15288 and generally
consistent with the ISO/IEC 25000 series, but the potential users of ISO/IEC 15026 need to understand
differences from concepts and terms to which they may be accustomed. This part of ISO/IEC 15026
attempts to clarify these differences.
3.1.2
claim
true-false statement about the limitations on the values of an unambiguously defined property—called
the claim’s property—and limitations on the uncertainty of the property’s values falling within these
limitations during the claim’s duration of applicability under stated conditions
Note 1 to entry: Uncertainties also may be associated with the duration of applicability and the stated conditions.
Note 2 to entry: A claim therefore involves the following:
— claim’s property;
— limitations on the value of the property associated with the claim (e.g. on its range);
— duration-related uncertainty;
— condition-related uncertainty.
Note 3 to entry: The term “limitations” is used to fit the many situations that can exist. Values can be a single
value or multiple single values, a range of values, or multiple ranges of values, and can be multi-dimensional. The
boundaries of these limitations are sometimes not sharp, e.g. they may involve probability distributions and may
be incremental.
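EXAMPLE The following sketch, which is illustrative only and not part of ISO/IEC 15026 (all field names and values are hypothetical), records the elements of a claim noted above as a small Python data structure:

    from dataclasses import dataclass, field

    @dataclass
    class Claim:
        """Elements of a claim (3.1.2): a property, limitations on its values,
        a limitation on uncertainty, a duration of applicability, and stated
        conditions."""
        property_name: str    # the claim's property
        value_limits: tuple   # limitations on the property's values, e.g. a range
        max_uncertainty: float  # limitation on uncertainty (e.g. probability of violation)
        duration: str         # duration of applicability
        conditions: list = field(default_factory=list)  # stated conditions

    claim = Claim(
        property_name="mean time between failures (hours)",
        value_limits=(10_000.0, float("inf")),  # at least 10 000 h
        max_uncertainty=0.01,   # at most 1 % chance of falling outside the limits
        duration="first 5 years of operation",
        conditions=["ambient temperature between 0 °C and 40 °C"],
    )
    print(claim.property_name, claim.value_limits)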
3.1.3
assurance case
reasoned, auditable artefact created that supports the contention that its top-level claim (or set of claims)
is satisfied, including systematic argumentation and its underlying evidence and explicit assumptions
that support the claim(s)
Note 1 to entry: An assurance case contains the following and their relationships:
— one or more claims about properties;
— arguments that logically link the evidence and any assumptions to the claim(s);
— a body of evidence and possibly assumptions supporting these arguments for the claim(s);
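EXAMPLE A minimal sketch, not part of ISO/IEC 15026 and using hypothetical names, of how the relationships in this note can be recorded: a top-level claim is supported by arguments, and each argument links evidence and explicit assumptions to the claim:

    from dataclasses import dataclass, field

    @dataclass
    class Argument:
        statement: str  # why the evidence supports the claim
        evidence: list = field(default_factory=list)     # body of evidence
        assumptions: list = field(default_factory=list)  # explicit assumptions

    @dataclass
    class AssuranceCase:
        top_level_claim: str
        arguments: list = field(default_factory=list)

    case = AssuranceCase(
        top_level_claim="The pump controller never commands an unsafe flow rate",
        arguments=[
            Argument(
                statement="Static analysis and system testing bound the commanded flow rate",
                evidence=["static-analysis report", "system test results"],
                assumptions=["the deployed binary matches the analysed source"],
            )
        ],
    )
    print(case.top_level_claim)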
3.1.4
dependability
collective term used to describe the availability performance and its influencing factors: reliability
performance, maintainability performance and maintenance support performance
Note 1 to entry: Dependability is used only for general descriptions in non-quantitative terms.
Note 2 to entry: ISO/IEC 25010[99] notes that “dependability characteristics include availability and its inherent or
external influencing factors, such as: reliability, fault tolerance, recoverability, integrity, security, maintainability,
durability, and maintenance support.” Several standards address dependability (e.g. [64] and [69]), and many more
address the qualities within it. IEC 60050-191 offers related definitions.[63]
Note 2 to entry: The “result” could in some cases be many related individual results. However, claims usually
relate to specified versions of a product.
Note 2 to entry: In practice, the interpretation of its meaning is frequently clarified by the use of an associative
noun, e.g. aircraft system. Alternatively, the word “system” may be substituted simply by a context-dependent
synonym, e.g. aircraft, though this may then obscure a system principles perspective.
3.3.2
integrity level requirements
set of specified requirements imposed on aspects related to a system, product, or element and associated
activities in order to show the achievement of the assigned integrity level (that is, meeting its claim)
within the required limitations on uncertainty; this includes the evidence to be obtained
Note 1 to entry: Since an integrity level is defined as a claim, the two phrases “achievement of the assigned
integrity level” and “meeting its claim” are equivalent.
Note 2 to entry: In ISO/IEC 15026:1998, 3.3.1 and 3.3.2 are referred to as the “integrity level” and “integrity
requirements” respectively. The latter has been changed to “integrity level requirements” both for increased
clarity and because this is common usage in safety.
Note 3 to entry: IEEE Std 1012:2004 defines “integrity level” as “a value representing project-unique characteristics
(e.g. software complexity, criticality, risk, safety level, security level, desired performance, reliability) that define
the importance of the software to the user.” That is, an integrity level is a value of a property of the target software.
Since both a claim and a statement that a property has a particular value can be regarded as a proposition about a
system or software, the two definitions of integrity levels have substantially the same meaning.
3.1.8
risk
combination of the probability of an event and its consequence
Note 1 to entry: The term “risk” is generally used only when there is at least the possibility of negative consequences.
Note 2 to entry: In some situations, risk arises from the possibility of deviation from the expected outcome or event.
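EXAMPLE As an illustration only (the figures are hypothetical), one common quantification of this definition combines probability and consequence as an expected loss:

    # risk as the probability of an event combined with its consequence
    probability_per_year = 1e-4  # estimated chance of the event in one year
    consequence = 2_000_000      # estimated loss if the event occurs (monetary units)
    risk = probability_per_year * consequence
    print(f"expected annual loss: {risk:.2f}")  # 200.00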
3.4.4
error
erroneous state of the system
3.4.5
fault
defect in a system or a representation of a system that if executed/activated could potentially result in
an error
Note 1 to entry: Faults can occur in specifications when they are not correct.
3.4.6
attack
malicious action or interaction with the system or its environment that has the potential to result in a
fault or an error (and thereby possibly in a failure) or an adverse consequence
3.4.7
violation
behaviour, act, or event deviating from a system’s desired property or claim of interest
Note 1 to entry: In the area of safety, the term “violation” is used to refer to a deliberate human contravention of
a procedure or rule.
3.4.8
failure
termination of the ability of a system to perform a required function or its inability to perform within
previously specified limits; an externally visible deviation from the system’s specification
3.4.9
systematic failure
failure related in a deterministic way to a certain cause that can only be eliminated by a modification of the
design or of the manufacturing process, operational procedures, documentation or other relevant factors
Note 2 to entry: An identified part of an organization (even as small as a single individual) or an identified group
of organizations can be regarded as an organization if it has responsibilities, authorities and relationships.
Note 2 to entry: In two-party situations, approval authority often rests with the acquirer. In regulatory situations,
the approval authority may be a third party such as a governmental organization or its agent. In other situations,
for example, the purchase of off-the-shelf products developed by a single party, the independence of the approval
authority can be a relevant issue to the acquirer.
3.5.3
design authority
person or organization that is responsible for the design of the product
3.5.4
integrity assurance authority
independent person or organization responsible for certifying compliance with the integrity level
requirements
Note 1 to entry: Adapted from ISO/IEC 15026:1998, in which the definition is: “The independent person or
organization responsible for assessment of compliance with the integrity requirements.”
5 Basic concepts
5.1 Introduction
This clause covers the concepts and vocabulary fundamental to all parts of ISO/IEC 15026.
5.2 Assurance
ISO/IEC 15026 uses a specific definition for assurance as being grounds for justified confidence. Generally,
stakeholders need grounds for justifiable confidence prior to depending on a system, especially a system
involving complexity, novelty, or technology with a history of problems (e.g. software). The greater the
degree of dependence, the greater the need for strong grounds for confidence. The appropriate valid
arguments and evidence to establish a rational basis for justified confidence in the relevant claims about
the system’s properties need to be made. These properties may include such aspects as future costs,
behaviour, and consequences. Throughout the life cycle, adequate grounds need to exist for justifying
decisions related to ensuring the design and production of an adequate system and to be able to place
reliance on that system.
Assurance is a term whose usage varies among the communities who use the term. However, all usage
relates to placing limitations on or reducing uncertainty in such things as measurements, observations,
estimations, predictions, information, inferences, or effects of unknowns with the ultimate objective of
achieving and/or showing a claim. Such a reduction in uncertainty may provide an improved basis for
justified confidence. Even if the estimate of a property’s value remains unchanged, the effort spent in
reducing uncertainty about its value can often be cost-effective since the resulting reduced uncertainty
improves the basis for decision-making.
Assurance may relate to (1) whether the system or software as specified would meet real-world needs and
expectations, (2) whether the as-built and operated system would or does meet the specifications, or both
(1) and (2). Specifications may be representations of static and/or dynamic aspects of the system.
Specifications often include descriptions of capability, functionality, behaviour, structure, service, and
responsibility including time-related and resource-related aspects as well as limitations on frequency
or seriousness of deviations by the product and related uncertainties.
Specifications may be prescriptions and/or constraints (e.g. for and on product behaviours) as well
as include measures of merit and directions regarding tradeoffs. Generally, specifications place some
limitations on when they apply such as on the environment and its conditions (e.g. temperature) and
possibly the conditions of the product (e.g. age or amount of wear).
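EXAMPLE A minimal sketch, with hypothetical limits not drawn from any standard, of a specification expressed as a constraint on product behaviour that applies only under stated environmental conditions:

    def spec_applies(ambient_temp_c: float) -> bool:
        # the specification states its own conditions of applicability
        return -10.0 <= ambient_temp_c <= 50.0

    def meets_spec(response_time_ms: float, ambient_temp_c: float) -> bool:
        if not spec_applies(ambient_temp_c):
            return True  # outside its stated conditions the specification imposes no constraint
        return response_time_ms <= 100.0  # prescribed limitation on behaviour

    print(meets_spec(80.0, 25.0))   # True: limit met under applicable conditions
    print(meets_spec(120.0, 25.0))  # False: limit exceeded under applicable conditions
    print(meets_spec(120.0, 60.0))  # True: the specification does not apply at 60 °C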
5.3 Stakeholders
Throughout their life cycle, systems and software have multiple stakeholders who affect or are affected by
the system and the system life cycle processes. Stakeholders might benefit from, incur losses from, impose
constraints on, or otherwise have a “stake” in the system, and therefore are those that provide the requirements
for the system. Stakeholders can include non-users whose performance, results, or other requirements might
be affected, e.g. entities whose software is executing on the same or networked computers.
A different but important kind of stakeholder is an attacker, who certainly imposes constraints or has
interests involved with the system. This International Standard includes the attacker as a stakeholder;
however, some in the security community and elsewhere exclude attackers from their use of the term
“stakeholders.”
The relevant stakeholders whose requirements are of concern include not only the system’s owners and
users, but also developers and operators who need to identify requirements for the development and
operation of the system. Depending on conditions and consequences, the various stakeholders require
grounds for justified confidence in properties of the system for which they identified requirements.
5.5 Property
ISO/IEC 15026 relates assurance to requirements of a property of a system or software product. A
property might include a condition, a characteristic, an attribute, a quality, a trait, a measurement, or a
consequence. A property might be invariant, or dependent on time, situation, or history. In ISO/IEC 15026,
a property is expected to be relevant directly or indirectly to a system or systems and thus have related
requirements.
Properties may have requirements for what they were in the past, what they are presently, or what
they will be in the future. Generally, the last is the most important in ISO/IEC 15026. As this knowledge
involves predicting the future, it is often the most difficult and uncertain to attain; therefore, a system’s
future behaviour and consequences (see 5.8) often become principal issues in its assurance.
Many of the properties with requirements are qualities of the system. Several standards and reports
provide lists and definitions of qualities that could be the subject of assurance including ISO/IEC 9126‑1,
ISO/IEC 25010 and the related series, ISO/IEC 2382‑14, ISO 9241, ISO/TR 18529, and ISO/TS 25238.
This use of the term “property” derives from, is consistent with, and subsumes the broad use of the term
“property” in ISO/IEC 25010 where it is used spanning properties that are inherent or not, internal,
external, and in use or context.
Producers and other stakeholders may prioritize properties such as efficiency and reliability and
perform trade-off studies between them and their related requirements. A number of techniques have
been created for addressing these trades, such as those in [25], [64], [122], [157], and [40]. The specifying of
a top-level claim for a property is sometimes the result of analyses including trade-off studies.
Often the property is specified as behaviour. During performed operations, behaviour-related properties
might be formally specified as a combination of:
— Restriction on allowed system states (sometimes called the “safety property”).
— System states that must be reached; required progress or accomplishment (“liveness property”).
— Constraints on flows or interactions; requirements for separation constraints.
These kinds of properties can be stated as conditions or constraints that must be true of the system.1)
In practice, these are non-trivial and modularized, involving time and starting state(s) as well as state
transitions related to interaction with the system’s or software’s environment.
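EXAMPLE A minimal sketch, illustrative only and with hypothetical state names, of checking a finite behaviour trace against a safety property (no forbidden state is ever reached) and a liveness property (a required state is eventually reached):

    FORBIDDEN_STATES = {"overpressure"}   # restriction on allowed system states
    REQUIRED_STATE = "shutdown_complete"  # state that must be reached

    def satisfies_safety(trace: list) -> bool:
        return all(state not in FORBIDDEN_STATES for state in trace)

    def satisfies_liveness(trace: list) -> bool:
        return REQUIRED_STATE in trace

    trace = ["idle", "pumping", "alarm", "shutdown_complete"]
    print(satisfies_safety(trace))    # True: "overpressure" never occurs
    print(satisfies_liveness(trace))  # True: shutdown is eventually reached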
Many kinds of flows such as of gases, fluids, traffic, or information are of possible interest as well
as constraints on them such as non-interference and separations to be maintained. In addition,
flow constraints are often convenient or necessary to specify aspects of information security[135]
including access control mechanisms and policies and restrictions on information overtly or covertly
communicated.
1) If specified formally, this can allow static analysis of conformity of designs and code, potentially adding
creditable assurance evidence.
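EXAMPLE A minimal sketch, illustrative only, of a two-level flow constraint in the spirit of the non-interference and separation constraints discussed above: information labelled High must never flow to a Low sink (the labels and levels are hypothetical):

    LEVELS = {"Low": 0, "High": 1}

    def flow_allowed(source_label: str, sink_label: str) -> bool:
        # information may flow only to an equal or higher level
        return LEVELS[source_label] <= LEVELS[sink_label]

    for src, dst in [("Low", "High"), ("High", "Low")]:
        verdict = "allowed" if flow_allowed(src, dst) else "violates the constraint"
        print(f"{src} -> {dst}: {verdict}")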
The designers of the system might or might not be cognizant of all the initiating events for a condition
within the environment; however, dangerous conditions may need to be dealt with even though not all
of their initiating events are known or recognizable.
5.8 Consequences
Outside the system, much of the reasoning is based on conditions that could lead to adverse consequences
and their initiating events or preconditions. Inside the system, reasoning is based on conditions that can
lead to dangerous system behaviours and the initiating events or preconditions for these conditions.
In practice, claims can extend beyond the boundaries of the system or its behaviours. In particular,
claims can place limitations on consequences of a system’s behaviour and/or system-related events,
activities, and/or conditions – especially on the values of consequences. One may refer to:
A consequence is desirable or undesirable from a stakeholder’s perspective, viewpoint or interests. A
consequence may occur anywhere in the system’s life cycle.
In complex socio-technical systems, explanations of mishaps or claim violations cannot be limited
to “component” failures. Adverse consequences can result from normal behaviour variability and
unintended or unanticipated interactions.[57][54] Regardless of how they arise, dangerous conditions
and adverse consequences are subjects for mitigation.
Attackers can possess capabilities, resources, motivations, and intentions that enable them to initiate
and carry out malicious efforts to violate a claim. Violators use their capabilities to take advantage of
system-provided and/or environment-provided opportunities called vulnerabilities, i.e. “weaknesses…
that could be exploited or triggered by a threat source”[150].2)
A sometimes misunderstood point is that maliciousness and subversion are concerns even when no
security-related system property is involved. Malicious developers might have an effect on successful
achievement of almost any property.
Several standards or reports mention consequences associated with systems within a specific domain.
Examples include ISO 14620,[79] ISO 19706,[81] and ISO/TS 25238.[121] Risk management standards also
address consequences, for example ISO/IEC 16085[91] and ISO 31000.
6.1 Introduction
ISO/IEC 15026 or its parts can be used alone or with other standards or guidance. The parts of
ISO/IEC 15026 can be mapped to most life cycle standards and can use any set of well-defined qualities
or properties.
2) For many purposes, the meaningfulness and need to separate vulnerabilities from other weaknesses can be
low or non-existent. In addition, a question always exists about the current and future contexts that are relevant for
“could be exploited or triggered”.
includes relating integrity levels to an assurance case. ISO/IEC 15026‑3 concentrates more on the
system itself and its integrity levels rather than on external risk analysis and also includes the creation
of integrity levels. Clause 8 discusses integrity levels.
6.4 Authorities
Parts of ISO/IEC 15026 involve “authorities” as defined in Clause 3, Terms and definitions. For example,
ISO/IEC 15026‑3 includes obtaining agreements between the design authority and the integrity
assurance authority. Additionally, for a new system, the approval authority of the acquirer needs to take
charge of analysing the process of creating assurance cases with the design authority and the integrity
assurance authority of the suppliers.
However, the “approval authority” for the assurance case is not necessarily the judge of conformance
to a Part of ISO/IEC 15026. To the extent possible, claims of conformance to Parts are judged on aspects
that are more straightforward and more difficult to dispute than the quality of artefacts and decisions
judged in the context of the system or project. In practice, contracts can explicitly call for the acquirer to
be the approval authority or the approver of conformance to parts of ISO/IEC 15026.
7.1 Introduction
ISO/IEC 15026‑2 covers the structure and content of an assurance case. It describes the five principal
components of an assurance case: claims, arguments, evidence, justifications, and assumptions. The
purpose of an assurance case is to improve assurance communications by informing stakeholders’
decision-making and supplying grounds for needed stakeholder confidence. The most common use of
an assurance case is to provide assurance about system properties to parties not closely involved in the
system’s technical development processes. Such parties may be involved in the system’s certification or
regulation, acquisition, or audit. Usually, an assurance case addresses the reasons to expect and confirm
successful production of the system, including the possibilities and risks identified as difficulties or
obstacles to developing and sustaining that system.
Unlike logical proofs of the deduction of the claims from the evidence, which covers the absolute truth
or Platonic truth aspects, assurance cases deal with the dialectic aspects of the system where the
truth is always relative or even subjective. In other words, logical proofs are described under a fixed
logical theory, but assurance cases may be rebutted on the basis that the underlying logical theory is
inappropriate. The need for an assurance case arises when one realizes that the properties of systems in the
real world can never be completely formalized in a logical theory, but there is always something which
is not covered by any logical formalization.
NOTE When the top-level claim is about safety, security, dependability or RAM (reliability, availability
and maintainability), assurance cases associated with these claims are called safety cases, security cases,
dependability cases or RAM cases, respectively. See [139], [142], [143], [146], [154], [155], [168], [74], [22], [23] and [24] in the
Bibliography.
Considered as an artefact, an assurance case has quality-related aspects such as the nature of content, its
form or structure (e.g. method of argumentation or modularity), semantic issues such as completeness,
creation and maintenance including tool support, usability and presentation, integrity, validity,
understandability, and having clearly stated conclusions with explicit degrees of uncertainty. One
article[164] covers a substantial list of quality-related characteristics for assurance cases. The quality-
related aspects of an assurance case are not covered in ISO/IEC 15026‑2 or any part of ISO/IEC 15026.
Any substantive modifications in the system, changes in the environment, or changes in the assurance
case’s top-level claims will necessitate recorded changes to the assurance case. Thus, an assurance case
usually contains a progressively expanding body of evidence built up during development and later life
cycle activities that responds as required to all relevant changes [[139], p. 5].
NOTE An assurance case’s claim(s) on the values of properties could include the system’s entire set of
requirements for a property of interest. One example might have a top-level claim composed of (1) required
limitations on consequences and (2) functionality and properties of the system itself (e.g. that this functionality
cannot be bypassed). The qualities defined in the ISO/IEC 25000-series include qualities related to functionality
and constraints. The Common Criteria v. 3.1 Revision 2[30] also addresses both.
Approaches to reasoning differ among communities having differing motivations, mindsets, and often
multiple methods of reasoning.
Examples of methods of reasoning include:
— Quantitative:
— Deterministic (e.g. formal proofs).
— Non-deterministic formal systems for reasoning:
— Probabilistic (a minimal sketch follows this list),
— Game theoretic (e.g. minimax), or
— Other uncertainty-based formal systems of reasoning (e.g. fuzzy sets).
— Qualitative (e.g. staff performance evaluations, court judgements, and qualitative statements of
event causality).
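EXAMPLE A minimal sketch of the probabilistic method of reasoning referred to in the list above, illustrative only and with hypothetical probabilities: Bayes’ rule updates confidence in a claim after a supporting test passes:

    p_claim = 0.90             # prior confidence that the claim holds
    p_pass_given_claim = 0.99  # probability the test passes if the claim holds
    p_pass_given_not = 0.50    # probability the test passes if the claim does not hold

    p_pass = p_pass_given_claim * p_claim + p_pass_given_not * (1 - p_claim)
    posterior = p_pass_given_claim * p_claim / p_pass
    print(f"posterior confidence in the claim: {posterior:.3f}")  # about 0.947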
Complex products and situations—and any involving humans—are beyond the current state of the
art to “quantitatively” create precise and accurate predictions. Subjective judgements are used in the
absence of affordable, suitable, more objective methods and techniques or where needed to supplement
or evaluate the results of such techniques. Supplementing quantitative techniques with expert review
and judgement is widely used and generally accepted. As with other forms of argumentation, subjective
judgements take the form of a claim and its support. While sometimes necessary or advantageous, use of
subjective judgement within the assurance case can lead to additional uncertainties, so, generally, (just
as with assumptions) the less critical the judgement is the better.
The patterns of occurrences of “natural” events and common, non-malicious human behaviours are
usually described probabilistically. However, possibilities for intelligent, malicious actions whose
probability is not determinable or not knowable are particularly a concern if the intelligent, malicious
adversary deliberately violates any probability estimates one could make regarding their behaviour, e.g.
to achieve surprise. This distinction is central to the difference in reasoning between safety and security.
The aviation and nuclear power industries have long histories of standards and certifications, and the
security community in ISO/IEC JTC 1/SC 27 has been working on the topic of assurance for many years.
Security examples include the Common Criteria and FIPS 140 for cryptographic modules; ISO/IEC 27002,
Information technology — Code of Practice for Information Security Management, combined with ISO/IEC 27001
(formerly UK standard BS 7799-2:2002), forms a basis for an Information Security Management
System (ISMS) certification of an operational system. The UK Ministry of Defence and Civil Aviation
Authority have also produced standards of interest including assurance-case-based standards for
reliability, maintainability, and safety—e.g. [139], [142], [143], [22] and [23]. Many standards are listed in the
Bibliography.
The safety community (e.g. commercial aviation) has used certification (designated agent or licensure)
of key personnel as part of its approaches. A number of safety and computer security certifications exist
from management-oriented ones to technical ones about specific products, e.g. certifications from the
International Information Systems Security Certification Consortium ((ISC)²) and the SANS Institute.
8.1 Introduction
Integrity levels are suitable for use for certain levels of risk or to support an assurance case and impose
criteria especially on the project, evidence collected, and system. An integrity level can be viewed as
a representation of the degree of confidence that is used to reach agreement among stakeholders of a
system about risks related to that system.
ISO/IEC 15026‑3 first establishes an integrity level framework. The remainder of the standard covers
defining integrity levels, using integrity levels, determining system or product integrity levels using
risk analyses, assigning system element integrity levels, meeting integrity level requirements using
evidence, and agreements and approvals involving authorities (see 6.4).
Integrity level requirements reflect what is required to achieve and show that the system or system
element has (or had or will have) the properties claimed by its integrity level. A system’s integrity level
states what would be adequate in terms of properties of the entire system. Thus, showing the properties
has a basic role in showing the meeting of larger claims involving the system and its environment
including desirable or undesirable consequences. If such larger claims are not made, then achieving and
showing system element integrity levels supplies a basic part of showing the top-level claim regarding
the system itself.
In practice, integrity levels are often discussed in terms emphasizing the evidence needed to meet
the integrity level requirements and thereby provide evidence for the arguments supporting claims
regarding properties of the system itself. However, the quality of the arguments justifying meeting
integrity level requirements as showing the achievement of its related integrity level is also important
because of the effect of that quality on uncertainties. Argument-, evidence- and assumption-related
uncertainties are a part of establishing integrity level requirements.
NOTE Integrity levels and standards utilizing them have a significant history especially in safety. Integrity
levels in safety-related standards are defined in multi-level sets addressing varying degrees of stringency and/or
uncertainty of their achievement with higher levels providing higher stringency and lower uncertainty. One
example safety standard is IEC 61508, Functional safety of electrical/electronic/programmable electronic safety-
related systems.[70] Elsewhere, similar schemes are used with different labels, e.g. “conformance classes.”
and timings. Thus, integrity levels are a codification of what is needed to be done and shown for various
ranges and severities of limitations on property values and their associated uncertainties.
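EXAMPLE A minimal sketch, illustrative only, of such a codification: an integrity level is assigned from a simple risk matrix of consequence severity and likelihood. The levels and thresholds are hypothetical and not taken from any standard:

    SEVERITY = {"minor": 1, "major": 2, "catastrophic": 3}
    LIKELIHOOD = {"improbable": 1, "occasional": 2, "frequent": 3}

    def integrity_level(severity: str, likelihood: str) -> int:
        score = SEVERITY[severity] * LIKELIHOOD[likelihood]
        if score >= 6:
            return 4  # most stringent requirements, lowest allowed uncertainty
        if score >= 4:
            return 3
        if score >= 2:
            return 2
        return 1

    print(integrity_level("catastrophic", "occasional"))  # 4
    print(integrity_level("minor", "improbable"))         # 1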
ISO/IEC 15026 does not cover risk analysis in detail. Many standards and guidance documents exist
that offer guidelines for risk analysis and can aid in the identification of potential adverse consequences.
IEC 61508[70] and IEC 31010 ed. 1.0 (2009-11-27), Risk management — Risk assessment techniques,
provide approaches to risk analysis. As safety-specific terminology is used in IEC 300‑3‑9, the terms
“hazard” and “harm” should be interpreted as “dangerous condition” and “adverse consequence,”
respectively. IEC 60300, Dependability management,[64] also provides guidance.
Other specialized standards include ISO 13849[78] on machinery, ISO 14620[79] on space systems,
ISO 19706[81] on fire, ISO/TS 25238[121] on health informatics, ISO/IEC 27005[110] on information
security, and UK CAP 760[24] on air traffic and airports. Also of possible interest are the more general
risk management standards ISO/IEC 16085[91] and ISO 31000.
9.1 Introduction
ISO/IEC 15026‑4, Assurance in the life cycle, provides a process view for systems and software assurance
by providing a statement of purpose and a set of outcomes suitable for systems and software assurance.
The concept of a process view is formulated and described in an annex of ISO/IEC 15288, Systems and
software engineering — System life cycle processes. Like a process, the description of a process view
includes a statement of purpose and outcomes. Unlike a process, the description of a process view does not
include activities and tasks. Instead, the description includes guidance and recommendations explaining
how the outcomes can be achieved by employing the activities and tasks of the various processes in
ISO/IEC 15288 and ISO/IEC 12207, Systems and software engineering — Software life cycle processes.
All of the life cycle processes are described in both ISO/IEC 15288 and ISO/IEC 12207 although the
processes in ISO/IEC 12207 are specialized to software and, in some cases, have different names
reflecting that specialization. ISO/IEC 12207 contains processes not contained in ISO/IEC 15288 related
to software implementation processes, support processes, and reuse processes.
The processes, activities, tasks, and the guidance and recommendations all have to be performed in the
context of a life cycle model. The multi-part Technical Report ISO/IEC/TR 24748, Systems and software
engineering – Life cycle management is intended to facilitate the joint usage of the process content of
the two life cycle process standards. ISO/IEC/TR 24748 provides unified and consolidated guidance
on the life cycle management of systems and software. Its purpose is to help ensure consistency in
system concepts and life cycle concepts, models, stages, processes, process application, iteration and
recursion of processes during the life cycle, key points of view, adaptation and use in various domains.
ISO/IEC 24748-1 illustrates the use of a life cycle model for systems in the context of ISO/IEC 15288
and provides a corresponding illustration of the use of a life cycle model for software in the context of
ISO/IEC 12207.
ISO/IEC 15026‑4 gives the user the freedom to choose whether they use a specific artefact called an
“assurance case” or document the assurance-related information in other documents. The point is to
achieve the top-level claim and then to show the achievement of the claim for the value of a critical
property for a relevant stakeholder. Life cycle processes, activities and tasks need to reflect both
realizing an adequate system and being sure that the system is adequate by showing that achievement
to the required confidence of the stakeholders.
Users of ISO/IEC 15026‑4 may require risk assessment and risk management, measurement, and
requirements processes that are more fully elaborated than the treatments provided in ISO/IEC 15288
and ISO/IEC 12207. Three International Standards, ISO/IEC 16085, Risk management, ISO/IEC 15939,
Measurement, and ISO/IEC/IEEE 29148, Requirements engineering, are designed to be used with
ISO/IEC 15288 and ISO/IEC 12207 to provide more detail for these three processes. Other standards
that provide useful requirements and guidance for selected processes are ISO/IEC/IEEE 15289 for
documentation resulting from the execution of life cycle processes and ISO/IEC/IEEE 16326 for the
project management process.
ISO/IEC 15026 is intended to be compatible with these life cycle process standards. The goals of
assurance, the selection of claims to be assured, assurance-related planning, and the construction and
maintenance of the assurance case have influences within all life cycle processes.
10 Summary
This International Standard has been written to provide users of all parts of ISO/IEC 15026 an adequate
understanding of the concepts and terminology used in ISO/IEC 15026 that previously may not have
been shared across the communities served. The explanations of what is covered in each part of
ISO/IEC 15026 should provide a basis for selecting and using those parts as well as a rationale behind
the organization of the ISO/IEC 15026 series of standards itself.
Bibliography
[1] Abran A., & Moore J.W. (Executive editors); Pierre Bourque, Robert Dupuis, Leonard Tripp
(Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos,
California: IEEE Computer Society, Feb. 16, 2004. Available at http://www.swebok.org
[2] Adamski A., & Westrum R. Requisite imagination: The fine art of anticipating what might go
wrong. In: [55], pp. 193-220, 2003
[3] Adelard. The Adelard Safety Case Development Manual. Available at
http://www.adelard.com/web/hnav/resources/ascad
[4] Alexander I. Systems Engineering Isn’t Just Software. 2001. Available at
http://easyweb.easynet.co.uk/~iany/consultancy/systems_engineering/se_isnt_just_sw.htm
[5] Allen J.H., Barnum S., Ellison R.J., McGraw G., Mead N.R. Software Security Engineering: A Guide
for Project Managers. Addison-Wesley, 2008
[6] Altman W., Ankrum T., Brach W. Improving Quality and the Assurance of Quality in the Design and
Construction of Nuclear Power Plants: A Report to Congress. U.S. Nuclear Regulatory Commission:
Office of Inspection and Enforcement, 1987
[7] Anderson J.P. Computer Security Technology Planning Study Volume I, ESDTR-73-51, Vol. I, Electronic
Systems Division, Air Force Systems Command, Hanscom Field, Bedford, MA 01730, Oct. 1972.
[8] Anderson R.J. Security Engineering: A Guide to Building Dependable Distributed Systems. John
Wiley and Sons, Second Edition, 2008
[9] Ankrum T.S., & Kromholz A.H. Structured Assurance Cases: Three Common Standards. Ninth IEEE
International Symposium on High-Assurance Systems Engineering (HASE’05), pp. 99-108, 2005
[10] Armstrong J.M., & Paynter S.P. The Deconstruction of Safety Arguments through Adversarial
Counter-argument. School of Computing Science, Newcastle University CS-TR-832, 2004
[11] Atchison B., Lindsay P., Tombs D. A Case Study in Software Safety Assurance Using Formal Methods.
Technical Report No. 99-31. Sept. 1999
[12] ATSIN Number 17, issued 9 August 2002. Lapses and Mistakes. Air Traffic Services Information Notice,
Safety Regulation Group, ATS Standards Department, UK Civil Aviation Authority
[13] Bahill A.T., & Gissing B. Re-evaluating Systems Engineering Concepts Using Systems Thinking.
IEEE Trans. Syst. Man Cybern. C. 1998 November, 28 (4) pp. 516–527
[14] Berg C.J. High-Assurance Design: Architecting Secure and Reliable Enterprise Applications. Addison
Wesley, 2006
[15] Bernstein L., & Yuhas C.M. Trustworthy Systems through Quantitative Software
Engineering. Wiley-IEEE Computer Society Press, 2005. About reliability, not security
[16] Bishop M., & Engle S. The Software Assurance CBK and University Curricula. Proceedings of the
10th Colloquium for Information Systems Security Education, 2006
[17] Bishop M. Computer Security: Art and Practice. Addison-Wesley, 2003
[18] Bishop P., & Bloomfield R. A Methodology for Safety Case Development. Industrial Perspectives
of Safety-critical Systems: Proceedings of the Sixth Safety-critical Systems Symposium,
Birmingham. 1998
[19] Bishop P., & Bloomfield R. The SHIP Safety Case Approach. SafeComp95, Belgirate, Italy. Oct 1995
[20] Buehner M.J., & Cheng P.W. Causal Learning. In: The Cambridge Handbook of Thinking and
Reasoning, (Morrison R., & Holyoak K.J. eds.). Cambridge University Press, 2005, pp. 143–68.
[21] Cannon J.C. Privacy. Addison Wesley, 2005
[22] CAP 670 Air Traffic Services Safety Requirements. UK Civil Aviation Authority Safety
Regulation Group, 2012
[23] CAP 730 Safety Management Systems for Air Traffic Management A Guide to Implementation. UK
Civil Aviation Authority Safety Regulation Group, 12 September 2002
[24] CAP 760 Guidance on the Conduct of Hazard Identification, Risk Assessment and the Production
of Safety Cases For Aerodrome Operators and Air Traffic Service Providers, 10 December 2010
[25] Chung L. et al. Non-Functional Requirements in Software Engineering. Kluwer, 1999
[26] Clark D.D., & Wilson D.R. A Comparison of Commercial and Military Computer Security Policies,
Proc. of the 1987 IEEE Symposium on Security and Privacy, IEEE, pp. 184-196, 1987
[27] CNSS. National Information Assurance Glossary, CNSS Instruction No. 4009, 26 April 2010.
Available at http://www.cnss.gov/full-index.html
[28] Committee on Information Systems Trustworthiness. Trust in Cyberspace, Computer Science
and Telecommunications Board. National Research Council, 1999
[29] Committee on National Security Systems (CNSS) Instruction 4009: National Information
Assurance (IA) Glossary. Revised May 2003. Available at http://www.cnss.gov/Assets/pdf/cnssi_4009.pdf
[30] Common Criteria Recognition Arrangement (CCRA). Common Criteria v3.1 Revision 2. NIAP
September 2007. Available at http://www.commoncriteriaportal.org.
[31] Common Weaknesses Enumeration. MITRE, 2012. Available at http://cwe.mitre.org
[32] Cooke N.J., Gorman J.C., Winner J.L. Team Cognition. In: [43], pp. 239-268
[33] Courtois P.-J. Justifying the Dependability of Computer-based Systems: With Applications in Nuclear
Engineering. Springer, 2008
[34] Cranor L., & Garfinkel S. Security and Usability: Designing Secure Systems that People Can Use.
O’Reilly, 2005
[35] Dayton-Johnson J. Natural disasters and adaptive capacity. OECD Development Centre
Research Programme on Market Access, Capacity Building and Competitiveness. Working Paper
No. 237, DEV/DOC(2004)06, August 2004
[36] Department of Defense Directive 8500.1 (6 February 2003). Information Assurance (IA),
Washington, DC: US Department of Defense, ASD(NII)/DoD CIO, April 23, 2007. Available at
http://www.dtic.mil/whs/directives/corres/pdf/850001p.pdf.
[37] Department of Defense Strategic Defense Initiative Organization. Trusted Software Development
Methodology, SDI-S-SD-91-000007, vol. 1, 17 June 1992
[38] Department of Homeland Security National Cyber Security Division’s “Build Security In” (BSI)
web site, 2012, http://buildsecurityin.us-cert.gov
[39] Dependability Research Group. Safety Cases. University of Virginia. Available at
http://dependability.cs.virginia.edu/info/Safety_Cases
[40] Despotou G., & Kelly T. Extending the Safety Case Concept to Address Dependability, Proceedings
of the 22nd International System Safety Conference, 2004
[41] Dowd M., McDonald J., Schuh J. The Art of Software Security Assessment: Identifying and
Preventing Software Vulnerabilities. Addison-Wesley, 2006
[42] Dunbar K., & Fugelsang J. Scientific Thinking and Reasoning. In: [59], p. 705–727
[43] Durso F.T., Nickerson R.S., Dumais S.T., Lewandowsky S., Perfect T.J. eds. Handbook of Applied
Cognition 2nd edition . Wiley, 2007
[44] Ellsworth P.C. Legal Reasoning. In: [59], p. 685–704
[45] Ericsson K.A., Charness N., Feltovich P.J., Hoffman R.R. eds. The Cambridge Handbook of
Expertise and Expert Performance. Cambridge University Press, 2006
[46] Fenton N., Littlewood B., Neil M., Strigini L., Sutcliffe A., Wright D. Assessing dependability
of safety critical systems using diverse evidence. IEE Proc. Softw. 1998 145 (1) pp. 35–39
[47] Gasser M. Building a Secure Computer System. Van Nostrand Reinhold, 1988. Available at
http://deke.ruc.edu.cn/wshi/readings/cs02.pdf
[48] Gray J.W. Probabilistic Interference. Proceedings of the IEEE Symposium on Research in Security
and Privacy. IEEE, pp. 170-179, 1990
[49] Greenwell W., Strunk E., Knight J. Failure Analysis and the Safety-Case Lifecycle. IFIP Working
Conference on Human Error, Safety and System Development (HESSD) Toulouse, France. Aug 2004
[50] Greenwell W.S., Knight J.C., Pease J.J. A Taxonomy of Fallacies in System Safety Arguments. 24th
International System Safety Conference, Albuquerque, NM, August 2006
[51] Hall A., & Chapman R. Correctness by Construction: Developing a Commercial Secure System.
IEEE Softw. 2002 Jan/Feb, 19 (1) pp. 18–25
[52] Herrmann D.S. Software Safety and Reliability. IEEE Computer Society Press, 1999
[53] Hoglund G., & McGraw G. Exploiting Software: How to break code. Addison-Wesley, 2004
[54] Hollnagel E., Woods D.D., Leveson N. eds. Resilience Engineering: Concepts and Precepts.
Ashgate Pub Co, 2006
[55] Hollnagel E. ed. Handbook of cognitive task design. Lawrence Erlbaum Associates, 2003
[56] Hollnagel E. Human Error: Trick or Treat? In: [43], pp. 219–238
[57] Hollnagel E. Barriers and Accident Prevention. Ashgate, 2004
[58] Hollnagel E. Human Factors: From Liability to Asset. Presentation, 2007. Available at
www.vtt.fi/liitetiedostot/muut/Hollnagel.pdf
[59] Holyoak K.J., & Morrison R.G. eds. The Cambridge Handbook of Thinking and Reasoning.
Cambridge University Press, 2005
[60] Howard M., & LeBlanc D.C. Writing Secure Code. Microsoft Press, Second Edition, 2002
[61] Howard M., & Lipner S. The Security Development Lifecycle. Microsoft Press, 2006
[62] Howell C. Assurance Cases for Security Workshop (follow-on workshop of the 2004 Symposium
on Dependable Systems and Networks), June, 2005
[63] IEC 60050‑191, International Electrotechnical Vocabulary, Chapter 191: Dependability and
Quality of Service
[64] IEC 60300, Dependability management [several parts]
[65] IEC 60300‑3‑15 ed1.0 (2009-06) Dependability management - Part 3-15 – Application guide -
Engineering of system dependability
[66] IEC 60300‑3‑2 ed.2.0 (2004-11), Dependability management – Part 3-2: Application guide -
Collection of dependability data from the field
[67] IEC 60812 ed2.0 (2006-01), Analysis techniques for system reliability - Procedure for failure mode
and effects analysis (FMEA)
[68] IEC 61025 ed2.0 (2006-12), Fault tree analysis (FTA)
[69] IEC 61078 ed2.0 (2006-01), Analysis techniques for dependability - Reliability block diagram and
Boolean methods
[70] IEC 61508 ed2.0, Functional safety of electrical/electronic/programmable electronic safety-related
systems [several parts]
[71] IEC 61508‑7 ed2.0 (2010-04), Functional safety of electrical/electronic/programmable electronic
safety-related systems – Part 7: Overview of techniques and measures
[72] IEC 61511 ed1.0, Functional safety - Safety instrumented systems for the process industry sector
[several parts]
[73] IEC 61882 ed1.0 (2001-05), Hazard and operability studies (HAZOP studies) - Application guide
[74] IEC CD 62741 ed.1.0, Reliability of systems, equipment, and components. Guide to the demonstration
of dependability requirements. The dependability case
[75] IEEE Std 1228-1994, IEEE Standard for Software Safety Plans
[76] International Council on Systems Engineering (INCOSE). Guide to Systems Engineering Body
of Knowledge (G2SEBoK). Available at http://g2sebok.incose.org/
[77] ISO 12100:2010, Safety of machinery — General principles for design — Risk assessment and
risk reduction
[78] ISO 13849, Safety of machinery — Safety-related parts of control systems [three parts]
[79] ISO 14620, Space systems — Safety requirements [three parts]
[80] ISO 14625:2007, Space systems — Ground support equipment for use at launch, landing or retrieval
sites — General requirements
[81] ISO 19706:2011, Guidelines for assessing the fire threat to people
[82] ISO 20282, Ease of operation of everyday products [four parts]
[83] ISO 2394:1998, General principles on reliability for structures
[84] ISO 28003:2007, Security management systems for the supply chain — Requirements for bodies
providing audit and certification of supply chain security management systems
[85] ISO 9241‑400:2007, Ergonomics of human-system interaction — Part 400: Principles and
requirements for physical input devices
[86] ISO/IEC 12207:2008, Systems and software engineering — Software life cycle processes
[87] ISO/IEC 15288:2008, Systems and software engineering — System life cycle processes
[88] ISO/IEC 15408, Information technology — Security techniques — Evaluation criteria for IT security
[three parts]
[89] ISO/IEC TR 15443, Information technology — Security techniques — Security assurance
framework [two parts]
[90] ISO/IEC 15939:2007, Systems and software engineering — Measurement process
[91] ISO/IEC 16085:2006, Systems and software engineering — Life cycle processes — Risk Management
[92] ISO/IEC/IEEE 16326:2009, Systems and software engineering — Life cycle management -
Project management
[93] ISO/IEC 18014, Information technology — Security techniques — Time-stamping services [three parts]
[94] ISO/IEC 18028, Information technology — Security techniques — IT network security [many parts]
[95] ISO/IEC 19770, Information technology — Software Asset Management [two parts]
[96] ISO/IEC 21827:2008, Information technology — Security techniques — Systems Security
Engineering — Capability Maturity Model® (SSE-CMM®)
[97] ISO/IEC 2382‑14:1997, Information technology — Vocabulary — Part 14: Reliability, maintainability
and availability
[98] ISO/IEC 25000:2005, Software Engineering — Software product Quality Requirements and
Evaluation (SQuaRE) — Guide to SQuaRE
[99] ISO/IEC 25010:2011, Systems and software engineering — Systems and software product Quality
Requirements and Evaluation (SQuaRE) — System and software quality models
[100] ISO/IEC 25012:2008, Software engineering — Software product Quality Requirements and
Evaluation (SQuaRE) — Data quality model
[101] ISO/IEC 25020:2007, Software engineering — Software product Quality Requirements and
Evaluation (SQuaRE) — Measurement reference model and guide
[102] ISO/IEC 25030:2007, Software engineering — Software product Quality Requirements and
Evaluation (SQuaRE) — Quality requirements
[103] ISO/IEC 25040:2011, Systems and software engineering — Systems and software Quality
Requirements and Evaluation (SQuaRE) — Evaluation process
[104] ISO/IEC 25051:2006, Software engineering — Software product Quality Requirements and
Evaluation (SQuaRE) — Requirements for quality of Commercial Off-The-Shelf (COTS) software
product and instructions for testing
[105] ISO/IEC 26702:2007, Systems engineering — Application and management of the systems
engineering process
[106] ISO/IEC 27000:2012, Information technology — Security techniques — Information security
management systems — Overview and vocabulary
[107] ISO/IEC 27001:2013, Information technology — Security techniques — Information security
management systems — Requirements
[108] ISO/IEC 27002:2013, Information technology — Security techniques — Code of practice for
information security controls
[109] ISO/IEC 27004:2009, Information technology — Security techniques — Information security
management — Measurement
[110] ISO/IEC 27005:2011, Information technology — Security techniques — Information security
risk management
[111] ISO/IEC 27006:2011, Information technology — Security techniques — Requirements for bodies
providing audit and certification of information security management systems
[112] ISO/IEC 27011:2008, Information technology — Security techniques — Information security
management guidelines for telecommunications organizations based on ISO/IEC 27002
[135] McGraw G. Software Security: Building Security In. Addison Wesley, 2006
[136] McLean J. Security Models. In: Encyclopedia of Software Engineering, (Marciniak J. ed.). Wiley, 1994
[137] Meier J.D., Mackman A., Vasireddy S., Dunner M., Escamilla R., Murukan A.
Improving Web Application Security: Threats and Countermeasures, Microsoft, 2004.
Available at: http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-
a9f4-e31acb8e3333/Threats_Countermeasures.pdf
[138] Merkow M.S., & Breithaupt J. Computer Security Assurance Using the Common Criteria.
Thomson Delmar Learning, 2005
[139] Ministry of Defence. Defence Standard 00-42 Issue 2, Reliability and Maintainability (R&M)
Assurance Guidance. Part 3, R&M Case, 6 June 2003
[140] Ministry of Defence. Defence Standard 00-55 (PART 1)/Issue 2, Requirements for Safety Related
Software in Defence Equipment Part 1: Requirements, 21 August 1997
[141] Ministry of Defence. Defence Standard 00-55 (PART 2)/Issue 2, Requirements for Safety Related
Software in Defence Equipment Part 2: Guidance, 21 August 1997
[142] Ministry of Defence. Interim Defence Standard 00-56, Safety Management Requirements for
Defence Systems Part 1: Requirements, 17 December 2004
[143] Ministry of Defence. Interim Defence Standard 00-56, Safety Management Requirements for Defence
Systems Part 2: Guidance on Establishing a Means of Complying with Part 1, 17 December 2004
[144] Moore A., Klinker E., Mihelcic D. How to Construct Formal Arguments that Persuade Certifiers.
In: Industrial Strength Formal Methods in Practice. Academic Press. 1999
[145] National Aeronautics and Space Administration (NASA) Software Assurance Guidebook.
September 1989 (NASA-GB-A201). Available at http://www.hq.nasa.gov/office/codeq/doctree/
nasa_gb_a201.pdf
[146] National Offshore Petroleum Safety Authority. Safety case. [Online Document cited on:
20 Jun 2012] Available at http://www.nopsema.gov.au/safety/safety-case/
[147] National Research Council (NRC) Computer Science and Telecommunications Board.
(CSTB). Cybersecurity Today and Tomorrow: Pay Now or Pay Later. National Academies Press,
2002. Available at http://www.nap.edu/topics.php?topic=320&start=10
[148] National Security Agency, The Information Systems Security Engineering Process (IATF) v3.1. 2002
[149] Naval Research Laboratory. Handbook for the Computer Security Certification of Trusted
Systems. US Naval Research Laboratory, 1995
[150] NDIA System Assurance Committee. Engineering for System Assurance. National Defense
Industrial Association, USA, 2008
[151] NIST. Federal Information Processing Standards Publication (FIPS PUB) 200: Minimum Security
Requirements for Federal Information and Information Systems. March 2006. Available at http://
csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf
[152] NIST. NIST Special Publication 800-27, Rev A: Engineering Principles for Information Technology
Security (A Baseline for Achieving Security). Revision A, June 2004. Available at http://csrc.nist.
gov/publications/nistpubs/800-27A/SP800-27-RevA.pdf
[153] NIST. NIST Special Publication 800-33, Underlying Technical Models for Information
Technology Security, December 2001. Available at http://csrc.nist.gov/publications/
nistpubs/800-33/sp800-33.pdf
[154] OPEN Process Framework. Safety Cases. [Online Document cited on: 20 Jun 2012] Available
at: http://www.opfro.org/index.html?Components/WorkProducts/SafetySet/SafetySet.
html~Contents
[155] OPSI. The Offshore Installations (Safety Case) Regulations 2005. [Online Document cited on: 20
June 2012.] Available at http://www.opsi.gov.uk/si/si2005/20053117.htm
[156] Park J., Montrose B., Froscher J. Tools for Information Security Assurance Arguments. DARPA
Information Survivability Conference & Exposition II, 2001. DISCEX ’01. Proceedings, 2001
[157] Petroski H. Design Paradigms. Cambridge University Press, 1994
[158] Prasad D. Dependable Systems Integration using Measurement Theory and Decision Analysis, PhD
Thesis, Department of Computer Science, University of York, UK, 1998
[159] PSM Safety & Security TWG. Security Measurement. Nov 2004
[160] Pullum L.L. Software Fault Tolerance. Artech House, 2001
[161] Randell B., & Koutny M. Failures: Their Definition, Modelling and Analysis. School of Computing
Science, Newcastle University, CS-TR No. 994, Dec 2006; Randell B., & Rushby J.M. Distributed Secure
Systems: Then and Now. School of Computing Science, Newcastle University, CS-TR No. 1052, Oct 2007
[162] Rechtin E. Systems Architecting of Organizations: Why Eagles Can’t Swim. CRC Press, Boca
Raton, FL, 2000
[163] Redwine S.T. Jr. ed. Software Assurance: A Guide to the Common Body of Knowledge to Produce, Acquire,
and Sustain Secure Software Version 1.1. US Department of Homeland Security, September 2006
[164] Redwine S.T. Jr. The Quality of Assurance Cases. Workshop on Assurance Cases for Security: The
Metrics Challenge, International Conference on Dependable Systems and Networks, 2007
[165] Redwine S.T. Jr., & Davis N. eds. Processes for Producing Secure Software: Towards Secure
Software. Vols. I and II. Washington, D.C.: National Cyber Security Partnership, 2004. Available
at http://www.cigital.com/papers/download/secure_software_process.pdf
[166] Ross K.G., Shafer J.L., Klein G. Professional Judgements and ‘Naturalistic Decision Making’. In:
[45], p. 403-420
[167] Ross R. et al. Recommended Security Controls for Federal Information Systems, NIST Special
Publication 800-53, Aug 2009. Available at http://csrc.nist.gov/publications/nistpubs/800-
53-Rev3/sp800-53-rev3-final_updated-errata_05-01-2010.pdf
[168] SAE JA1000, Reliability Program Standard, SAE International, June 1998. Available at http://www.
sae.org
[169] Saltzer J.H., & Schroeder M.D. The protection of information in computer systems. Proc. IEEE.
1975, 63 (9) pp. 1278–1308. Available at: http://cap-lore.com/CapTheory/ProtInf/
[170] Seminal Papers - History of Computer Security Project, University of California Davis Computer
Security Laboratory. Available at: http://seclab.cs.ucdavis.edu/projects/history/seminal.html
[171] Serene. “Safety argument.” [Online Document] [cited on: 13 Feb 2007] Available at: http://www2.
dcs.qmul.ac.uk/~norman/SERENE_Help/sereneSafety_argument.htm
[172] Severson K. Yucca Mountain Safety Case Focus of NWTRB September Meeting. United States
Nuclear Waste Technical Review Board. Aug 2006
[173] Sieck W.R., & Klein G. Decision making. In: [43], p. 195-218
[174] Software and Systems Engineering Vocabulary (sevocab). Available at www.computer.org/sevocab
[175] Sommerville I. Software Engineering. Pearson Education, Eighth Edition, 2006
[176] Stoneburner G., Hayden C., Feringa A. Engineering Principles for Information Technology Security
(A Baseline for Achieving Security), Revision A, NIST Special Publication 800-27 Rev A, June 2004
[177] Storey N. Safety-Critical Computer Systems. Addison Wesley, 1996
[178] Strunk E., & Knight J. The Essential Synthesis of Problem Frames and Assurance Cases. IWAAPF’06,
Shanghai, China. May 2006
[179] Swiderski F., & Snyder W. Threat Modeling. Microsoft Press, 2004
[180] U.S. NRC. “Quality Assurance Case Studies at Construction Projects.”
[181] Vanfleet W.M. et al. MILS: Architecture for High Assurance Embedded Computing. Crosstalk,
August 2005
[182] Viega J., & McGraw G. Building Secure Software: How to Avoid Security Problems the Right Way.
Addison Wesley, Reading, MA, 2001
[183] Walker V.R. Risk Regulation and the ‘Faces’ of Uncertainty, Risk: Health, Safety and Environment.
p. 27-38, Winter 1998
[184] Ware W.H. Security Controls for Computer Systems (U): Report of Defense Science Board Task
Force on Computer Security, The RAND Corporation, Santa Monica, CA (Feb. 1970)
[185] Weaver R. The Safety of Software – Constructing and Assuring Arguments. Doctoral Thesis,
University of York, Department of Computer Science, 2003
[186] Weaver R., Fenn J., Kelly T. A Pragmatic Approach to Reasoning about the Assurance of Safety
Arguments. 8th Australian Workshop on Safety Critical Systems and Software (SCS’03),
Canberra. 2003
[187] Whittaker J.A., & Thompson H.H. How to Break Software Security: Effective Techniques for
Security Testing. Pearson Education, 2004
[188] Williams J., & Schaefer M. Pretty Good Assurance. Proceedings of the New Security Paradigms
Workshop. IEEE Computer Society Press. 1995
[189] Williams J.R., & Jelen G.F. A Framework for Reasoning about Assurance, Document Number ATR
97043, Arca Systems, Inc., 23 April 1998
[190] Yates J.F., & Tschirhart M.D. Decision-Making Expertise. In: [45], p. 421-438
[191] Yee K.-P. User interaction design for secure systems. Proceedings of the 4th International Conference
on Information and Communications Security, Springer-Verlag, LNCS 2513, 2002
ICS 35.080