Rose-SQAP-v1 3 3
CASC/Computing
Version 1.3.3
6/20/2019
ROSE Software Project LLNL-SM-706671-DRAFT, Rev 1.1
Disclaimer
This document was prepared as an account of work sponsored by an agency of the United States
government. Neither the United States government nor Lawrence Livermore National Security, LLC,
nor any of their employees makes any warranty, expressed or implied, or assumes any legal liability
or responsibility for the accuracy, completeness, or usefulness of any information, apparatus,
product, or process disclosed, or represents that its use would not infringe privately owned rights.
Reference herein to any specific commercial product, process, or service by trade name,
trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement,
recommendation, or favoring by the United States government or Lawrence Livermore National
Security, LLC. The views and opinions of authors expressed herein do not necessarily state or
reflect those of the United States government or Lawrence Livermore National Security, LLC, and
shall not be used for advertising or product endorsement purposes.
Auspices Statement
This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore
National Laboratory under Contract DE-AC52-07NA27344.
Revision History
Document Revision
Version Date Originator(s) Revision Description
1.0 9/26/2016 G Pope Initial draft
1.1 10/20/2016 G. Pope Review edits
1.1 11/11/2016 G. Pope Added Figure numbering
1.2 1/24/2017 G. Pope Added ROSE review notes
1.3 7/2/2018 G. Pope Update Org Chart
1.3.1 10/16/2018 G. Pope Typos corrected from Ellen Hill review
1.3.2 2/15/2019 G. Pope Release versioning method, update to org chart.
1.3.3 6/20/2019 G. Pope Update Org Chart
Table of Contents
Approvals
Revision History
1 Purpose
2 Reference documents
3 Management
3.1 Organization
3.2 Tasks
3.3 Roles and responsibilities
3.4 Quality assurance estimated resources
4 Documentation
4.1 Purpose
4.2 Minimum documentation requirements
4.2.1 Software requirements description
4.2.2 Software design description
4.2.3 Verification and validation processes
4.2.4 Verification results report and validation results report
4.2.5 User documentation
4.2.6 Software configuration management activities
4.3 Other documentation
5 Standards, practices, conventions, and metrics
5.1 Purpose
5.2 Content
6 Software reviews
6.1 Purpose
6.2 Minimum requirements
6.2.1 Software specifications review
6.2.2 Design review
6.2.3 Managerial reviews
6.2.4 Post implementation review
6.2.5 Verification and validation plan review
6.2.6 Software configuration management plan review
6.2.7 Baseline configuration audit
6.2.8 Functional audit
6.2.9 Versioning
7 Test
8 Problem reporting and corrective action
9 Tools, techniques and methodologies
10 Media control
11 Supplier control
12 Records collection, maintenance, and retention
13 Training
14 Risk management
15 Glossary
16 SQAP change procedure and history
17 Software application retirement
List of Tables
Table 1, Task Entrance and Exit Criteria
Table 2, Roles and Responsibilities
Table 3, Planned Reviews
Table 4, High-Level Tool Information
Table 5, Media Types and Control
Table 6, Training Requirements
Table 7, Acronym/Definitions
List of Figures
Figure 1 ROSE Project Organization
Figure 2 ROSE Development Lifecycle
Figure 3 ROSE Framework Architecture
Figure 4 Test Matrix Combinations
Figure 5 ROSE Verification and Validation Process
Figure 6 ROSE Configuration Management (CM) Process
1 Purpose
The purpose of this Software Quality Assurance Plan (SQAP) is to document the software quality
assurance (SQA) approach that the application development team will follow. The SQA activities
planned in this document meet the SQA requirements of its sponsors, stakeholders, and users. The
SQAP and its cited documents provide the framework for implementing the software engineering
activities and flowing down SQA requirements.
This plan applies to all software procured, acquired, or developed within the software effort’s
budget. For acquired software applications and/or components and new development for legacy
code, only parts of this SQAP are applicable. It does not address work products produced by end
users that utilize any of the effort’s software. In addition to the tools listed in Table 4, High-Level
Tool Information, this plan applies to the following software products:
• ROSE Software Framework and Special Purpose tools built using the ROSE Framework.
The software effort will follow an IEEE life-cycle model. See Section 3.2 for additional life-cycle
details.
ROSE consists of a library (and set of associated tools) used by computer scientists to apply
compiler techniques to source code to improve application performance and developer productivity,
deepen understanding of the code, and automate porting to current and future platform architectures.
ROSE also contains features to analyze binary code derived from the source languages to provide
further analysis, insights, and assurance of the compiled code's integrity.
The ROSE Framework is a Research and Development tool optimized for exploration and flexibility
and suitable for scientific discovery by researchers. The ROSE framework also supports building
more narrowly focused special purpose tools targeted at application developers who demand
additional rigor and reliability in their analyses, transformations, and optimizations. This
Software Quality Assurance Plan addresses the software development processes for both the research
framework and the special purpose tools built using it.
2 Reference documents
The following are the applicable governing Federal Regulations and DOE Orders:
DOE O 414.1D Admin Chg 1, Quality Assurance
DOE O 200.1A, Information Technology
The IEEE Standard for Software Quality Assurance Plans (IEEE Std 730™-2002) was used as a
guide in the development of this document to flow down the requirements identified in RID-0118,
Software Quality Assurance Consensus Standards for Quality Assurance Criteria, using the graded
approach contained in FRM-3104, SQA Required Practices for Non-830 Software. No claim to
conformance with the standard is made.
In addition, the following institutional documents are used as applicable, unless superseded by
programs, standards, policies, procedures, or processes required by the organization:
• DES-0115, LLNL Quality Assurance Program
• ES&H Manual Document 3.1, Nonnuclear Safety Basis Program
• DES-0048, LLNL Assessment Program
• PRO-0050, Internal Independent Assessment
• PRO-0052, Management Self-Assessments
• PRO-0053, Performing Management Observations & Inspections
• ES&H Manual Document 2.2, LLNL Institutional-Wide Work Planning and Control Process
3 Management
This section describes the software effort’s organizational structure, its tasks, and its roles and
responsibilities.
3.1 Organization
The organizational structure that influences and controls the quality of the software is described
below. The roles and responsibilities for each item in the organizational description are defined in
the table in Section 3.3.
[Figure 1 ROSE Project Organization: organization chart including Researcher/Developers Craig
Rasmussen, Tristan Vanderbruggen, and Peter Pirkelbauer]
The ROSE Project Organization Chart is shown above in Figure 1. The organization reflects the dual
nature of the ROSE project as both a research group and a tool production group. ROSE began as a
research project over 15 years ago and won an R&D 100 award in 2009. ROSE research has led to the
development of a powerful and flexible framework that can be used by researchers to develop custom
transforms and analyses to advance computer science and software engineering. The ROSE framework is
also used by the ROSE team to develop specialized tools for application developers, which requires
a more disciplined approach. The ROSE organization reflects this by having tandem leadership for
each of the two ROSE customer types.
Dan Quinlan heads the research activities of the ROSE project as well as supporting ROSE
specialized products. Gregory Pope heads the programmatic responsibilities for delivering
specialized tools built using the ROSE framework. These specialized tools are subjected to
additional levels of rigor specific to the customer’s requirements.
Dan reports to the CASC (Center for Applied Scientific Computing) Division, primarily a research
organization, and Greg reports to the ASQ (Applications, Simulation, and Quality) Division, an
organization of application developers and software quality engineers. The need for the tandem
management approach evolved out of the popularity of ROSE and the diversity of its customers.
While the ROSE framework gives researchers almost unlimited flexibility, application programmers
demand a more focused set of features targeted for development environments where deadlines and
additional robustness are required. The ROSE project operates at two different Risk Levels (RLs).
The ROSE research framework is an RL4 code targeted for researchers who demand flexibility and is
subjected to rigorous automated testing. The specialized tools built using the ROSE framework are
RL3 codes, since their failure may delay customer software development schedules. These codes are
subjected to an additional level of independent functional, documentation verification, and stress
testing.
3.2 Tasks
The life-cycle followed on ROSE is an Agile/Iterative model, as shown in the process flow diagram
below in Figure 2. The stages/points in the process where quality checkpoints are located are
highlighted in yellow. These are decision points for continuing to the next phase of the life-cycle.
[Figure 2 ROSE Development Lifecycle: process flow from research discovery through JIRA tracking,
core and tool system testing, and release; failed checkpoints route fixes to future releases, and
user feedback from core users and tool users feeds back into the cycle]
The tasks to be performed, with entry and exit criteria for each task, are shown in Table 1.
The following product and assurance tasks will be performed:
The various ROSE projects are planned and tracked using the ROSE master plan shown in the ROSE
development lifecycle. The master plan includes a schedule, milestones, deliverables, and assigned
staff for each task. The master plan also acts as a clearing house for new ROSE projects, since
requests for ROSE resources may come from many different stakeholders. Each request is evaluated
for technical content and programmatic impact to other task commitments before being added to the
ROSE master plan. This reduces the risk of ROSE resources being overcommitted. The Principal
Researcher and Product Realization Manager at a minimum will evaluate each new task request. The
ROSE master schedule is maintained using the Microsoft Project tool.
Tasks on the ROSE master schedule will be tracked using JIRA task tickets assigned to the ROSE
staff. A weekly scrum (a short project task review) is held to evaluate each ROSE staff member’s
task progress using the Kanban Board feature of JIRA. The Kanban Boards show current tasks,
future tasks, and completed tasks for each staff member using the JIRA task tickets. During the
weekly scrum, any obstacles to progress or task slippage are immediately identified so corrective
actions can be implemented. Corrective actions may include staffing changes, de-scoping, make/buy,
reuse, milestone realignment, outsourcing, feature prioritization changes, etc. During the week, any
time a JIRA task ticket is updated an email is sent to other staff or stakeholders that may be
impacted.
Project milestones on the master schedule are further broken down into feature lists needed to
complete each milestone in the requirements analysis step of the life cycle. Requirement reviews
may be conducted with the ROSE team members and stakeholders to clarify requirements. The
feature lists are further divided into sprints (a three- or four-week period) containing the
features to be added during that sprint, called the sprint backlog. This process is conducted in
the design review life cycle step. A feature may be a capability or repair added to the ROSE core
framework, or a feature added to a tool being built with the ROSE framework. Further design review
may also be conducted by assembling the developer team in cases where the feature implementation is
not straightforward or impacts multiple developers.
Next, in the code/debug lifecycle step, each feature and high-level design is converted to C++ code
by developers. The new code is then debugged and committed to the GIT repository, which triggers a
set
of smoke tests to execute on the new source code. Smoke tests are a small but representative set of
the total test suite that execute in 2 to 3 hours and verify basic functionality. If the smoke
tests pass, the new code joins the developer branch, where a much larger set of regression tests
begins execution
in the Continuous Integration step of the ROSE life cycle.
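The commit gate described above can be sketched as a small promotion function. This is an illustrative model of the flow only, not the project's actual Jenkins configuration; the test names and the `promote` callback are hypothetical:

```python
def smoke_gate(smoke_tests, promote):
    """Run the smoke suite against a commit; promote to the developer
    branch (triggering full CI regression) only if every test passes."""
    results = {name: test() for name, test in smoke_tests.items()}
    if all(results.values()):
        promote()          # joins the developer branch; CI regression begins
        return True, results
    return False, results  # commit stays off the developer branch

# Hypothetical smoke suite: a small but representative set of checks.
suite = {"parses_hello_world": lambda: True,
         "unparses_identically": lambda: True}
ok, _ = smoke_gate(suite, promote=lambda: None)
print(ok)  # True
```

A single failing smoke test is enough to hold the commit back, which is why the suite is kept small and fast (2 to 3 hours) rather than exhaustive.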
If the project is an enhancement to the core ROSE framework and the suite of regression tests
passes, the next feature in the sprint backlog is added. If the new code is written using the ROSE
framework to build a tool for a user, the next feature can be started by the developer(s), but when
the sprint backlog is completed (or sooner if appropriate) an additional level of independent
system testing is conducted by the developer in test and/or the independent tester.
The independent testers install the ROSE tool when the current sprint is completed and test it from
the user’s point of view and assure the supplied documentation and tutorial problems work correctly.
Issues discovered in regression testing, independent testing, or after release (see appendix D for
release process) are tracked using the JIRA issue tracker.
In general, the ROSE project lifecycle attempts to identify as early as possible unanticipated
technical challenges or other obstacles so that corrective action can be taken and stakeholder
expectations can be updated weekly. This programmatic approach significantly reduces the risk of
missing delivery release dates for tools built using the ROSE framework by the ROSE team.
3.3 Roles and responsibilities
Table 2, Roles and Responsibilities
Role: Responsibilities
Principal Researcher: Lead Researcher, Technical Direction
Product Realization SQE: Program Management, Software Quality
Researcher/Developer: Computer Science Discovery, Code Development
Developer: Requirements Analysis, Design, Code Development
Developer, Build/Test/Release: Build and Test Automation, CM, Release Management
Developer in Test: Test Design, Integration Test, System Test, User Support
Independent System Test: Independent System Testing, System Test Automation
4 Documentation
NOTE: Throughout this SQAP, the terms “document”, “documents”, and “documentation”, when
used as a noun, may refer to a physical, paper-based or electronic document; artifact repository;
database; web page; or other means employed by the software development team to capture the
indicated information.
This section identifies document deliverables, as well as documents, that describe quality processes.
The level of detail in the documents is commensurate with the complexity and significance of the
system, the work environment, and the personnel proficiency and capability (education, training,
experience). These documents are controlled documents and require review and approval.
4.1 Purpose
This section lists the documents needed for development, verification and validation, use, and
maintenance of the software.
4.2 Minimum documentation requirements
The documentation may be split into multiple documents or combined and/or included in other
documents, as long as the noted content is adequately addressed.
4.2.1 Software requirements description
Each requirement (stakeholder desired feature or bug fix) is uniquely identified and defined such
that its achievement is capable of being objectively verified and validated. These requirements may
be internally and/or externally (e.g., customer) generated. The tracking tools used for this
purpose are the Confluence Wiki and the JIRA tracking tool. Customer requirements may also take the
form of written and controlled documents or meeting notes.
For detailed implemented and released requirements information, see the ROSE User Guide
(http://rosecompiler.org/ROSE_UserManual/ROSE-UserManual.pdf). For requirements being implemented
in the current development branch, see the ROSE Project Confluence for each project's release and
sprint backlogs.
NAP-24A additional considerations: The ROSE framework has been analyzed by the Klocwork static
analyzer, and identified security vulnerabilities have been fixed. The ROSE open source code is
released with a signed hash, and the released and signed ROSE download available to the public may
not be modified by anyone other than the ROSE project team. If a downloaded copy of the ROSE
framework is changed locally, it will no longer match the signed version.
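The integrity check this provides can be sketched as follows. This is an illustrative example only, not the project's actual signing tooling; the file paths and the choice of SHA-256 are assumptions:

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_hash(path, published_hex):
    """True only if the local download is byte-identical to the release."""
    return sha256_of(path) == published_hex
```

Any local modification to the downloaded archive changes the digest, so the comparison fails and the copy is known not to be the signed release.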
4.2.2 Software design description
The software design is reviewed by the code team in a design review. Design information is
contained in C++ comments and in the ROSE Documentation page at
http://rosecompiler.org/?page_id=11 and includes some or all of the following:
• Class definitions
• Methods
• Process flow
• Computational sequences
• Algorithms
• Physical models
• Control flow
• Control logic
• States and state changes
• Data flow
• Data structures
• Inputs and outputs
• Exception/error handling, reporting, and logging
• Design features that ensure proper use of the software; such as access control features, help
text, user interface design features, etc.
• Internal interfaces
• Relationships between the above
[Figure 3 ROSE Framework Architecture: binary code for ARM/AMD64/MIPS/M6800/PowerPC/x86 passes
through a binary disassembler; C++ code is used to construct analyses or transformations]
In addition, the ROSE project can use tools such as Visustin, Klocwork, or McCabe IQ to create
detailed calling diagrams, UML activity diagrams, or flow charts for the as-built C++ code. These
tools are available to the ROSE project through the Product Realization Manager and SQE.
4.2.3 Verification and validation processes
The extent of verification and the methods chosen are a function of the complexity of the software,
the degree of standardization, and the similarity with previously proved software.
The ROSE V&V processes include the description below of the qualification and acceptance testing
that is performed:
• There are multiple levels of testing for ROSE. ROSE uses a continuous integration (CI)
method of testing; the testing process starts as soon as changes are committed to the
developer’s branch.
• Developers are responsible for conducting unit testing on new or modified code. When the
developer is confident the new or modified code is ready for the next level of testing the code
is committed to the GIT repository, where a set of smoke tests run on the new or modified
code using a ROSE baseline on the developer’s branch.
• If the smoke tests pass the new or modified code is committed to the main developer branch
and continuous integration testing commences.
• Continuous integration testing at a minimum consists of Matrix Testing, Regression Testing,
and Standards Testing.
• Orthogonal to the CI testing is the Matrix testing, which checks the new code and baseline
against numerous compiler and Boost version combinations. Additional testing combinations include
operating systems and versions. The total number of combinations can reach into the tens of
millions; hence matrix testing can take up to two weeks of continuous running. Figure 4 indicates
the matrix test names and numbers of combinations:
Name Count
assertions 1
boost 14
build 1
compiler 30
debug 1
dlib 9
doxygen 12
dwarf 1
edg 1
java 1
languages 3
magic 1
optimize 1
python 1
qt 1
readline 3
sqlite 1
warnings 1
wt 4
yaml 4
yices 2
Debian 6.0.10
Debian 8
RHEL 6.7
RHEL 6.8
RHEL 7.2
RHEL 7.3
Ubuntu 14.04
Ubuntu 16.04
Figure 4 Test Matrix Combinations
• Regression testing consists of running over 50,000 individual tests, many derived from prior
errors found on ROSE over many years.
• Standards testing consists of running ROSE against the Plum Hall compiler industry benchmark for
C++ and C codes.
• If the software under test is a tool built from the ROSE framework then an additional level of
independent testing is conducted to assure the tool meets the user’s acceptance criteria, the
documentation agrees with the code, and the installation process works correctly on the
platform(s) of interest.
• The objective acceptance criteria for ROSE Framework development is passing of the smoke
and CI tests. The objective acceptance criteria for ROSE built tools include the additional
layer of user-based tests by independent testers.
• Each automated test case for smoke and CI tests contains an expected correct answer. The
expected correct answer has been verified a priori by the developers. The criteria for the
independent system testing are verification against the user guide, agreement with tutorial
problems, and the stakeholder statement of work.
• Independent assessments of the ROSE project have been performed multiple times in FY 16
and will continue into FY 17 and beyond as part of the robustification effort. Peer reviews,
design reviews, and requirement analysis reviews are conducted as part of the sprint planning
and as needed.
• The workflow for the ROSE development process is automated using the Jenkins and JIRA
tools to assure the lifecycle is followed.
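As a rough illustration of why the matrix grows so large, the combination count is the product of the per-dimension counts listed in Figure 4. Treating every dimension as fully crossed is an assumption here; the actual test scheduler may prune combinations:

```python
from math import prod

# Per-dimension variant counts from Figure 4 (dimensions with count 1
# are omitted, since they do not change the product).
variant_counts = {
    "boost": 14, "compiler": 30, "dlib": 9, "doxygen": 12,
    "languages": 3, "readline": 3, "wt": 4, "yaml": 4, "yices": 2,
}
operating_systems = 8  # Debian 6/8, RHEL 6.7/6.8/7.2/7.3, Ubuntu 14.04/16.04

configurations = prod(variant_counts.values())
print(configurations)                      # 13063680 configurations
print(configurations * operating_systems)  # 104509440 including the OS choice
```

A full cross product in the tens of millions is why matrix testing can run continuously for up to two weeks.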
This section of the SQAP describes the software verification and validation processes and tasks to be
performed to satisfy the V&V requirements.
[Figure 5 ROSE Verification and Validation Process: unit testing and smoke tests gate commits to
the GIT repository; continuous integration then runs matrix/static/dynamic, regression,
application, and standards testing; code that passes and is a ROSE-built tool additionally
undergoes independent system testing before tool release]
4.2.4 Verification results report and validation results report
Test reports contain the test number, date of test, time of test, the expected result of running
the test, the actual result of running the test, and the pass/fail status.
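A minimal sketch of such a test report record, with pass/fail derived from comparing the expected and actual results, might look as follows. The field names are illustrative, not the project's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TestReport:
    test_number: str
    run_at: datetime        # date and time of test
    expected_result: str    # verified a priori by the developers
    actual_result: str

    @property
    def passed(self) -> bool:
        # Pass/fail follows directly from comparing the two results.
        return self.actual_result == self.expected_result

report = TestReport("SMOKE-0001", datetime(2019, 6, 20, 9, 30),
                    expected_result="42", actual_result="42")
print(report.passed)  # True
```

Deriving pass/fail rather than recording it separately keeps the report internally consistent: the verdict can never disagree with the results it is based on.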
The V&V results are approved prior to using the software. Using the software is defined as "use by
users after a formal release of the software". This approval of results signifies that the criteria
in Section 4.2.3 have been met. The approval is done by the Principal Researcher prior to a formal
tool release which goes to users (details of the release process are contained in the ROSE
Confluence Wiki).
4.2.5 User documentation
• Tools built by the ROSE team using the ROSE framework will contain additional user documentation
and tutorial problems.
• Installation instructions are contained on the ROSE Confluence Wiki
(http://rosecompiler.org/ROSE_HTML_Reference/group__installation.html).
• Additional information about ROSE, including a "how to" section, is also available on the ROSE
Confluence Wiki space (http://rosecompiler.org/?page_id=11).
4.2.6 Software configuration management activities
The following lists the types of configuration items (CI) that will be maintained:
• Requirements documentation (existing requirements are contained in the User Guide; requirements
for future versions are tracked in JIRA).
• Design documentation is contained on the ROSE Confluence Wiki or as comments in the
ROSE C++ source code.
• Design drawings are contained in the ROSE Confluence Wiki.
• Interface specifications are contained in the ROSE C++ source code and in the User Guide as
descriptions of the ROSE commands.
• Quality process descriptions (This SQAP)
• Standards that were used to create items, such as C++11
• ROSE source code
• Builds
• Build, execution, test, configuration, and release scripts and other automation mechanisms
• Build data
• Support software such as EDG, Boost, etc.
• Test plans
• Test cases and results
[Figure 6 ROSE Configuration Management (CM) Process: multiple distributed development branches
feed the main development branch in the GIT repository; builds are promoted to test builds and then
to released builds]
5 Standards, practices, conventions, and metrics
5.2 Content
This SQAP is consistent with the standards flow down contained in RID-0118, Software Quality
Assurance Consensus Standards for Quality Assurance Criteria, and the SQA practices as contained
in FRM-3104, SQA Required Practices for Non-830 Software. In addition, the following software
effort specific standards, practices, conventions, and metrics are being used:
• Applicable regulatory requirements are listed in section 2 of this SQAP
• LLNL supplied SQAP template, Doxygen, Confluence Wiki
• Compiler standards for FORTRAN, C++, C, Java, PHP, OpenMP, Python
• Coding standards/conventions
• Variable and module naming conventions
• Commentary standards/conventions use Doxygen
• Programming styles
• Inspection techniques
• Testing standards and practices are in section 4.2.3 of this SQAP
• Quality requirements are listed in this SQAP
• Quality metrics include:
o Matrix testing results
o Static analysis reports
o Code count
o Comment count
o Code cyclomatic complexity
Quality metrics are collected for testing during the Continuous Integration process. Additional code
quality metrics are reported when a static analysis run is completed using the Klocwork tool.
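The code count and comment count metrics above can be approximated with a simple line classifier. This is an illustrative sketch, not the Klocwork tool's actual method; it recognizes only C++ `//` comments and blank lines, and ignores block comments and string contents:

```python
def count_lines(source: str):
    """Classify each line of C++ source as code, comment, or blank."""
    code = comment = blank = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            blank += 1
        elif stripped.startswith("//"):
            comment += 1
        else:
            code += 1  # includes code lines with trailing comments
    return {"code": code, "comment": comment, "blank": blank}

sample = """\
// identity pass
int main() {

    return 0;  // nothing to transform
}
"""
print(count_lines(sample))  # {'code': 3, 'comment': 1, 'blank': 1}
```

Counts like these are useful as trend indicators across releases; cyclomatic complexity, by contrast, requires parsing the control flow and is better left to the static analysis tools named above.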
No additional standards, practices, conventions, or metrics, other than those outlined elsewhere in
this document, will be followed.
6 Software reviews
The types of review to be conducted during the life-cycle of this software effort include, as
applicable (but are not limited to), managerial reviews, technical reviews, inspections, and code
reviews.
Documentation of review comments and their disposition will be retained until they are incorporated
into the updated software or associated document(s). Comments not incorporated and their
disposition will be retained until the software is approved for use. See Section 12 of this plan for
additional records retention requirements.
Review comments, action items, and notes from the status meetings are kept in the ROSE
Confluence Wiki.
6.1 Purpose
Software reviews are conducted throughout the software effort’s life-cycle to satisfy governing
quality assurance and application-level requirements and to provide assurance that the product is
ready to advance to the next life-cycle phase.
This review is conducted to assure the adequacy of the applicable requirements for the ROSE
software. As shown in the ROSE Lifecycle, this review takes place to analyze the list of stakeholder
desired features and to assure that they are adequately understood by the developers. Outcome of the
review is to assign features to the next release and developers to work on those features. A second
possible outcome of the review is to defer features for future releases or deprecate existing features
that are no longer desired. Additional information may be sought from stakeholders if features are
not fully understood during this review. Some features may be straightforward enough that this level
of review is not necessary or can be combined with the ROSE weekly staff meeting.
NAP-24A additional consideration: New features to be added may also be accompanied by one or
more new test cases to verify these features.
Those involved in this review include individuals or groups who did not perform the work being
reviewed. The design approval occurs when this review is deemed successful.
A Design Review may be called by the Principal Researcher, Realization Manager, or Developers
when the implementation of a new feature is sufficiently complex or has multiple possible valid
solutions or implementations. The design may be represented with pseudo-code, C++ or C code,
UML, or flow charts. This optional review takes place after the requirements are analyzed.
Typically, the review is held in a conference room or conducted remotely using Jabber.
Examples of formal managerial reviews that may be performed include, but are not limited to:
• PRO-0053, Performing Management Observations & Inspections
• PRO-0052, Management Self-Assessments
• PRO-0050, Internal Independent Assessments, an option for high-risk software; these are
performed by the MAS SQA office by request.
Formal reviews are held at management’s discretion to assess the execution of actions and items
identified in this Software Quality Assurance Plan. The results of these reviews and the metrics
identified in Section 5.2 of this document form the basis for quality improvement per DES-0115,
LLNL Quality Assurance Program, Section 4.2.3.
Managerial reviews are held weekly or more frequently in person or using the Jabber communicator
tool. Documentation of items to be discussed is distributed prior to the staff meeting. The Product
Realization Manager runs the meeting as the developers discuss progress/obstacles. Anomalies,
risks, and show stopper level defects may also be discussed and inserted as tasks or bugs into the
JIRA tracker tool.
This review is held after a major release. It is a post-mortem review to assess what went well and what
needs to be improved for the next release. This meeting is chaired by the Product Realization
Manager and attended by developers, testers, and optionally users. This review may also be
combined with a weekly staff meeting.
The Verification and Validation plans are contained in section 4.2.3 of this SQAP. The Verification
and Validation process is periodically reviewed as part of the weekly staff meeting. The Jenkins tool
controls the Continuous Integration of new or modified code. When new or modified code is checked
into the GIT repository, the test suite begins running thousands of automated tests. A test
report is generated recording the pass/fail status of all tests run.
Build scripts and build logs are reviewed to assure all of the required software and supporting
libraries and versions are contained in the build.
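The pass/fail roll-up such a test report contains can be sketched as follows. This is a hypothetical illustration in Python; the project's actual reports come from the Jenkins-driven test suite, and the test names below are invented.

```python
from collections import Counter

def summarize(results):
    """Roll up (test_name, passed) pairs into a pass/fail summary.

    The input shape is a hypothetical stand-in for the automated test
    results produced during Continuous Integration.
    """
    counts = Counter("pass" if ok else "fail" for _, ok in results)
    failed = [name for name, ok in results if not ok]
    return {"pass": counts["pass"], "fail": counts["fail"], "failed_tests": failed}

# Invented test names, for illustration only.
report = summarize([
    ("ast_roundtrip", True),
    ("openmp_lowering", False),
    ("fortran_frontend", True),
])
print(report)  # {'pass': 2, 'fail': 1, 'failed_tests': ['openmp_lowering']}
```

Listing the failing tests by name, not just the counts, is what makes the report actionable when defects are logged into JIRA.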
6.2.9 Versioning
Version numbering for the release of ROSE follows a w.x.y.z format: z increments for a
release with bug fixes only; y increments when a release contains feature improvements; x
increments when the release contains a major new feature or features; and w increments for major
public releases.
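The increment rules can be sketched as follows. This is a hypothetical illustration; in particular, the assumption that the fields to the right of the incremented one reset to zero is this sketch's, not stated in the plan.

```python
def bump(version: str, field: str) -> str:
    """Increment one field of a w.x.y.z version string.

    Field names follow the scheme above (w = major public release,
    x = major new feature, y = feature improvements, z = bug fixes).
    Resetting the lower-order fields to zero is an assumption of this
    sketch.
    """
    names = ["w", "x", "y", "z"]
    parts = [int(p) for p in version.split(".")]
    i = names.index(field)
    parts[i] += 1
    for j in range(i + 1, len(parts)):
        parts[j] = 0
    return ".".join(str(p) for p in parts)

print(bump("0.9.10.212", "z"))  # 0.9.10.213 (bug-fix-only release)
print(bump("0.9.10.212", "y"))  # 0.9.11.0  (feature improvements)
```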
7 Test
All testing is covered in Section 4.2.3 of this plan and its referenced documents.
Testing is to be completed and reviewed prior to the software being released to users.
Discovery and removal of code defects is one of the purposes of the Verification and Validation
process described in section 4.2.3 of this SQAP. Minimizing defects delivered to users is one
way to improve the quality of the user's experience. Logging the discovered defects into
JIRA helps assure they will be repaired and retested. However, quality also includes defect
prevention and assurance that the code contains the features users desire. On the prevention side,
requirements reviews may generate a problem report if the desired feature description is not
adequate or conflicts with other features. Requested features themselves can also be entered
and tracked in the JIRA tool. Developers are responsible for developing their own unit tests
and for assuring a peer review is conducted prior to committing their new code to the
development branch. Additional tools are available to conduct static analysis using
Klocwork.
The organizational responsibilities for implementing this process are covered in Table 2, Roles and
Responsibilities, in Section 3.3 of this SQAP.
If problems or issues are reported which need to be tracked at the institutional level, PRO-0042,
Issues and Corrective Action Management, will be followed.
• Confluence
o Purpose: Wiki
o Vendor: Atlassian
o Contact information:
 1098 Harrison St, San Francisco, CA 94103
 https://www.atlassian.com/why-wiki-collaboration-software
 https://www.atlassian.com/resources/support
 (415) 701-1110
o License: Site license, annual renewal
• JIRA
o Purpose: Defect, Task, Requirement Tracker (Software Development Management)
o Vendor: Atlassian
o Contact information:
 1098 Harrison St, San Francisco, CA 94103
 https://www.atlassian.com/why-wiki-collaboration-software
 https://www.atlassian.com/resources/support
 (415) 701-1110
o License: Site license, annual renewal
• GIT
o Purpose: Version Control System
o Vendor: Open source
o Contact information:
 https://github.com/git/git-scm.com
o License: Open source
• Eclipse
o Purpose: Integrated Development Environment (IDE)
o Vendor: Open source
o Contact information:
 http://www.eclipse.org/downloads/packages/eclipse-ide-java-developers/marsr
• Cisco Jabber
o Purpose: Unified Communication
o Vendor: Cisco
o Contact information:
 http://www.cisco.com/c/en/us/products/index.html
 (800) 553-6387
o License: Site license, annual renewal
10 Media control
The software effort deliverables, as well as items provided through acquisition, will use the
following media types and be controlled as indicated below in Table 5, Media Types and Control.
The primary means for dealing with damage (i.e., corruption) is to back-up the software.
Media is protected by being on a private lab network, requiring OUN, badge smart card, and PIN to
access the network. In addition, permissions must be established to access the Confluence Wiki
pages for the ROSE pages which contain the download links to the media.
11 Supplier control
For procured/acquired software, institutional governing requirements and procedures will be
followed. The relevant documents are listed below. They contain information to assure the supplier
receives adequate and complete requirements and the product is suitable for its intended use (i.e., it
meets established requirements).
• LLNS Procurement Standard Practices Manual
• PRO-0097, Acceptance of Procured Items and Services
• PRO-0098, Determining Procurement Quality Levels and Controls
• PRO-0099, Supplier Evaluation
• PRO-0100, Nonconformance of Procured Items and Services
• SCM IQP 0010, Supplier Corrective Action Request Process
All procured software has been acquired in accordance with the LLNL approved procurement
systems.
The following documents will be collected, maintained, and retained in the appropriate software
configuration management repository for at least 10 years from the date of this document:
• Software Quality Assurance Plan (In GIT)
• Software requirements descriptions (In JIRA and Confluence Wiki)
• Software design description (In Confluence Wiki and source code comments)
• Verification and Validation Plan (In SQAP)
• Verification and validation results report(s) (In JIRA)
• Individual V&V activities results (In JIRA)
• Software Configuration Management Plan (In SQAP)
• User and developer documentation (In Confluence Wiki)
Any procurement related documentation will be managed through the Procurement organization.
The Principal Researcher determines the methods for reviewing and approving quality records.
NAP-24A additional considerations: Use of software development tools such as JIRA, Jenkins,
GIT, and the Confluence Wiki produce records and logs to provide evidence of software
development process activities. These records and logs are maintained on the tools for a period of
time to be determined by the Principal Researcher or their designee and/or as required to support
audits as determined by the Product Realization Manager.
13 Training
All developers, testers, and evaluators are hired with commensurate skills, knowledge, and abilities
for the job and are subject to the institutional training requirements as specified in the Personnel
Policy and Procedures Manual, Section I, Employee Development and the ES&H Manual Document
40.1, LLNL Training Program Manual. Software effort-specific training requirements are
determined by the Software Effort Lead consistent with DES-0115, LLNL Quality Assurance
Program, Section 4.2.2. Some training may be accomplished via on-the-job experience. Formal in-
house training is recorded in the institutional tool, Livermore Training Records and Information
Network (LTRAIN). Additional training requirements are specified in Table 6, Training
Requirements.
The ROSE development team members have formal academic education in computer science as well
as many years of relevant experience in scientific software. Additional training requirements are
determined by the functional management for appropriate levels of security and safety training
requirements as documented by LTRAIN. Completion of informal on-the-job training is captured
(e.g., email, memo-to-file, training checklist) and approved (e.g., mentor’s/manager’s signature) for
future auditing purposes.
NAP-24A additional consideration: evidence of training is tracked in the LTRAIN tool. Training
continues as needed to maintain proficiency.
Table 6, Training Requirements
14 Risk management
Risks associated with the actual development effort such as resource availability and funding
shortfalls, are managed according to the ROSE Master plan.
Risks are identified at the ROSE weekly staff meeting. Risks that cannot be mitigated immediately
are added to the JIRA tracker, prioritized, and assigned to a ROSE team member by the Principal
Researcher for mitigation implementation.
NAP-24A additional considerations: Software security related risks as well as structural issues are
identified using the Klocwork static analyzer. The areas of concern in the code will be repaired and
retested to remove important code security vulnerabilities and structural issues.
The application is software risk graded according to PRO-0107, Software Risk Grading. The
Process/Development Environment report, which is part of the software risk grading process, gives a
high-level evaluation of possible risks during the development life-cycle.
15 Glossary
Table 7, Acronym/Definitions
Acronym/Term Description
AST Abstract Syntax Tree
Admin Administrative
Anomaly An observation of an off normal or unexpected
event/behavior. These may be process and/or product related.
Other terms commonly used include: issue, bug, and defect.
CD Compact Disk
Chg Change
CI Continuous Integration
DC Development Control
DES Description document
DOE Department of Energy
DOE O DOE Order
DVD Digital Versatile Disk
EDG Vendor that supplies C/C++ compiler front ends
ES&H Environment, Safety and Health
FRM Form
IDE Integrated Development Environment
IEEE Institute of Electrical and Electronics Engineers, Inc.
LLC Limited Liability Company
LLNL Lawrence Livermore National Laboratory
LLNS Lawrence Livermore National Security, LLC
LTRAIN Livermore Training Records and Information Network
MAS Management Assurance System Organization
Non-830 Software Software that is not Nuclear Safety Software
PRO Procedure
Rev Revision
RID Requirements Interpretation Document
RL Risk Level
SCM IQP Supply Chain Management Internal Quality Procedure
SCMP Software Configuration Management Plan
SQA Software Quality Assurance
SQAP Software Quality Assurance Plan
SQL Structured Query Language
Std Standard
Software: ROSE
Date: 9/26/2016
Author: Gregory Pope
This form documents the software quality assurance (SQA) activities and implementation plans for Risk Level 3 practices for Non-830
Software when there is major development control. A software risk grading was performed in accordance with LLNL Institutional
Software Quality Assurance Program for Non-830 Software 1 requirements implementing DOE O 414.1D Admin Chg 1 and other LLNS
contract requirements.
The flow down of SQA requirements to consensus standards is documented in RID-0118-00, Software Quality Assurance Consensus
Standards for Quality Assurance Criteria. RID-0118 references the following IEEE (Institute of Electrical and Electronics Engineers,
Inc.) standards:
IEEE Std 730™-2002, IEEE Standard for Quality Assurance Plans
IEEE Std 828™-2012, IEEE Standard for Configuration Management in Systems and Software Engineering
IEEE Std 1012™-2012, IEEE Standard for System and Software Verification and Validation
Document templates have been created for flowing down the SQA requirements. These templates may be downloaded from the SWING
wiki. The use of the templates is optional and may be used to ensure requirement flow down.
Documents                                          Templates
Software Quality Assurance Plan (SQAP)             Non-830 Software SQAP-3M

1 DES-0108-00, Non-830 Institutional Software Quality Assurance Program and FRM-3104, SQA Required Practices for Non-830
Software
SQAP – RL3, Non-830 Software, Major DC
Template Rev. 1
The table on the following pages details the required plans and products. Each practice has an indicator of the level to which the practice
is to be implemented. These levels are:
X Indicates the practice is required and fully implemented.
Indicates the practice is required, but may be partially implemented with written justification.
O Indicates the practice is recommended, but optional.
Cross-references to the IEEE requirements that are to be used to implement the practice are indicated in “[]” for each practice. For
example [SQAP S1, S3 (all)] refers to sections 1 and 3 (with all of its subsections) of IEEE 730-2002.
This Implementation form applies to tools built for customers using the ROSE framework
Columns: DOE O 414.1D Work Activity/Practice; Impl Level; Justification for Non-Applicability or Tailoring; Current/Planned Practice Implementation; Gap Closure/Acceptance and/or Change Plans.
SQA Work Activity 1: Software Project Management and Quality Planning
• Practice: Prepare, review, and approve a Software Quality Assurance Plan (SQAP) using the institutional template specific to the
risk grading. Impl Level: O. Current/Planned Implementation: Software Quality Assurance Plan to be written to accommodate work
for others and DOE. Gap Closure/Acceptance: Complies.
• Practice: Plan and manage project activities and commitments (including organizational structure/interfaces; roles and
responsibilities/authority; schedule; budget; and resources). [SQAP S4.3.1, S4.3.2] Current/Planned Implementation: ROSE master
plan is done using MS Project and includes staffing, tasks, subtasks, schedule, budget, resources. Gap Closure/Acceptance: Complies.
Software: ROSE
Date: 9/26/2016
Author: Gregory Pope
This form documents the software quality assurance (SQA) activities and implementation plans for Risk Level 4 practices for Non-830
Software when there is major development control. A software risk grading was performed in accordance with LLNL Institutional
Software Quality Assurance Program for Non-830 Software 2 requirements implementing DOE O 414.1D Admin Chg 1 and other LLNS
contract requirements.
The flow down of SQA requirements to consensus standards is documented in RID-0118-00, Software Quality Assurance Consensus
Standards for Quality Assurance Criteria. RID-0118 references the following IEEE (Institute of Electrical and Electronics Engineers,
Inc.) standards:
IEEE Std 730™-2002, IEEE Standard for Quality Assurance Plans
IEEE Std 828™-2012, IEEE Standard for Configuration Management in Systems and Software Engineering
IEEE Std 1012™-2012, IEEE Standard for System and Software Verification and Validation
Document templates have been created for flowing down the SQA requirements. These templates may be downloaded from the SWING
wiki. The use of the templates is optional and may be used to ensure requirement flow down.
Documents                                                    Templates
Software Quality Assurance Plan (SQAP)                       Non-830 Software SQAP-4M
Software Configuration Management Plan (SCMP)                Non-830 Software SCMP-4M
Software Verification and Validation Plan (SVVP)             Non-830 Software SVVP-4M
The table on the following pages details the required plans and products. Each practice has an indicator of the level to which the practice
is to be implemented. These levels are:
X Indicates the practice is required and fully implemented.
Indicates the practice is required, but may be partially implemented with written justification.
O Indicates the practice is recommended, but optional.
Cross-references to the IEEE requirements that are to be used to implement the practice are indicated in “[]” for each practice. For
example [SQAP S1, S3 (all)] refers to sections 1 and 3 (with all of its subsections) of IEEE 730-2002.
2 DES-0108-00, Non-830 Institutional Software Quality Assurance Program and FRM-3104, SQA Required Practices for Non-830
Software
This Implementation form applies to new code or improvements made to the ROSE framework.
Columns: DOE O 414.1D Work Activity/Practice; Impl Level; Justification for Non-Applicability or Tailoring; Current/Planned Practice Implementation; Gap Closure/Acceptance and/or Change Plans.
SQA Work Activity 1: Software Project Management and Quality Planning
• Practice: Prepare, review, and approve a Software Quality Assurance Plan (SQAP) using the institutional template specific to the
risk grading. Impl Level: O. Current/Planned Implementation: Software Quality Assurance Plan to be written to accommodate work
for others and DOE. Gap Closure/Acceptance: Complies.
• Practice: Plan and manage project activities and commitments (including organizational structure/interfaces; roles and
responsibilities/authority; schedule; budget; and resources). [SQAP S4.3.1, S4.3.2] Current/Planned Implementation: ROSE master
plan is done using MS Project and includes staffing, tasks, subtasks, schedule, budget, resources. Gap Closure/Acceptance: Complies.
• Practice: Identify, define, plan, and track software quality assurance activities, including software development methodology,
configuration management, verification and validation, management reviews, and testing. [SQAP S4.4.2.1, 2, 3, 6; S4.5.1, 2; S4.6.1;
S4.6.2.8; S4.7; S4.8; S4.9; S4.11] Current/Planned Implementation: Covered in ROSE Software Quality Assurance Plan. Gap
Closure/Acceptance: Complies.
• Practice: Determine applicable regulatory requirements. [SQAP S4.5.1, 2] Impl Level: X. Current/Planned Implementation:
Covered in ROSE Software Quality Assurance Plan section 2. Gap Closure/Acceptance: Complies.
• Practice: Identify formal documents. [SQAP S4.4.2.1, 2, 3, 6] Current/Planned Implementation: Covered in ROSE Software
Quality Assurance Plan sections 4.2 and 4.3. Gap Closure/Acceptance: Complies.
Columns: Description; Status; Magnitude of Impact; Priority; Prob of Occurrence; Risk Responses; Mitigation Actions.
• Description: One team member becomes indispensable to the team because of knowledge of sections of the code; modifications of
those sections of the code can only be done by that member. Status: Active. Impact: 1 - Highest. Priority: 1 - Highest. Probability:
1 - Highest. Response: Mitigate. Mitigation: Increased responsibility for Marcus and Liao. Training materials or classes from EDG
for their portion of the front end code.
• Description: Code going directly from researcher/developer to user without any independent system testing; documentation not
verified. Status: Active. Impact: 1 - Highest. Priority: 1 - Highest. Probability: 1 - Highest. Response: Mitigate. Mitigation:
Independent system testers added, funded by ISCP; documentation versions synchronized to software versions.
• Description: System testing not planned or included in ROSE development model. Status: Active. Impact: 1 - Highest. Priority:
1 - Highest. Probability: 1 - Highest. Response: Mitigate. Mitigation: ISCP funding covers 1.5 FTE for system testing and
documentation verification. All tools built for users from the ROSE framework will require independent system testing.
• Description: Deliverable milestones missed and/or capability of tool does not meet customer expectations; tool too hard to install
and/or use. Status: Active. Impact: 1 - Highest. Priority: 1 - Highest. Probability: 1 - Highest. Response: Mitigate. Mitigation: The
customer-based ROSE master plan to be reviewed on a weekly basis, detecting problems, identifying obstacles, and setting
workarounds into place at the earliest point.
• Description: Team or team members attracted to new and exciting-sounding code challenges, diverting resources and causing
longer-term project milestone slip. Status: Active. Impact: 1 - Highest. Priority: 1 - Highest. Probability: 2 - High. Response:
Mitigate. Mitigation: Limit annual time budget for Bluebird requests to a capped amount. Some of these efforts are useful ("eat
your own dog food") and important, so do not want to eliminate them outright.
• Description: Technical debt has accumulated in ROSE code over the past 15 years, making it harder to troubleshoot and maintain.
Status: Active. Impact: 1 - Highest. Priority: 1 - Highest. Probability: 5 - Lowest. Response: Unassigned. Mitigation: Prioritize and
repair the 600 or so defects in the backlog.
• Description: Do not want to retard research and exploration, but want more rigorous code in tools built from ROSE. Status:
Active. Impact: 1 - Highest. Priority: 2 - High. Probability: 1 - Highest. Response: Mitigate. Mitigation: Dual-track risk levels in
SQAP: RL 4 for the ROSE framework (core) for researchers, RL 3 for tools built with the ROSE framework for application
developers.
• Description: ROSE has historically been funded from various sources without long-term continuity, making team leadership
conservative about growth and support. Status: Active. Impact: 1 - Highest. Priority: 2 - High. Probability: 2 - High. Response:
Mitigate. Mitigation: Additional LLNL internal funding applied to help fill in the funding gaps. Expansion of number of tasks for GS
using planning, hiring, and contracting to supplement ROSE.
• Description: ROSE managed as one large project with various add-ons for various customers. Status: Active. Impact: 2 - High.
Priority: 1 - Highest. Probability: 2 - High. Response: Monitor. Mitigation: ROSE master plan breaks out ROSE work by customer
projects.
• Description: Versions of EDG, BOOST, compilers, operating systems, compiler standards, and changing language popularity
impact ROSE performance. Status: Active. Impact: 2 - High. Priority: 2 - High. Probability: 2 - High. Response: Mitigate.
Mitigation: Automated testing reduces the effort necessary to adapt to new conditions. Impacts on previously released ROSE tools
are considered, and updates to a new version are only done when warranted.
• Description: ROSE users may not think like compiler specialists. Status: Active. Impact: 2 - High. Priority: 2 - High. Probability:
2 - High. Response: Mitigate. Mitigation: Orient documentation to support application developers who are not compiler experts.
• Description: Documentation is immense and specific topics are hard to find. Status: Active. Impact: 2 - High. Priority: 2 - High.
Probability: 2 - High. Response: Mitigate. Mitigation: Create cookbook instructions for tools and a knowledge base for common
user problems.
• Description: User service requests may get lost. Status: Identified. Impact: 2 - High. Priority: 2 - High. Probability: 2 - High.
Response: Mitigate. Mitigation: Deploy a service desk tool in JIRA for customers and a workflow for assuring each request gets
serviced.
• Description: Platforms used for testing ROSE are aging and need to be upgraded to be more consistent with customer platforms.
Status: Identified. Impact: 2 - High. Priority: 3 - Medium. Probability: 2 - High. Response: Mitigate. Mitigation: Use a portion of
the project funds to purchase new platforms.
• Description: User support is handled by asking developers for help directly, which may distract them from other commitments.
Status: Active. Impact: 2 - High. Priority: 3 - Medium. Probability: 2 - High. Response: Mitigate. Mitigation: System testers will be
used as the first level of user support; if system testers cannot help the user, they will talk with developers to get additional
information.
• Description: The ROSE install package includes ROSE tests, which use up a majority of the user's loading time and a large
amount of platform memory. Status: Identified. Impact: 2 - High. Priority: 3 - Medium. Probability: 2 - High. Response: Move.
Mitigation: Tests will be removed from user install media; the tests can still be obtained if the user wants them.
• Description: The ROSE tool folder contains dozens of different tools collected over the years; supporting all of these tools is very
time consuming. Status: Identified. Impact: 2 - High. Priority: 3 - Medium. Probability: 2 - High. Response: Avoid. Mitigation:
Pare the tool folder down to fewer than a dozen useful tools; use the system testers to validate that the tools work correctly.
• Description: EDG goes out of business. Mitigation: ROSE demonstrated to work with LLVM (CLANG) for C.
1. All tools will have specific features expected from production level tools. Features
common to all tools released under this process:
a) Command line help
b) Documentation in a README file at the top-level directory
c) Smoke tests
d) One or more examples to use as regression tests
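Item (a), command-line help, can be illustrated with a short sketch. Python's argparse is used here purely for illustration; the tool name and options are hypothetical placeholders, and actual ROSE-based tools define their own interfaces in C++.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Command-line interface for a hypothetical ROSE-based tool.

    The program name and flags below are placeholders invented for this
    sketch; they are not part of any released tool.
    """
    parser = argparse.ArgumentParser(
        prog="exampleTranslator",
        description="Parse a source file and emit it unchanged (smoke test).",
    )
    parser.add_argument("input", help="source file to process")
    parser.add_argument("-o", "--output", default="a.out.c",
                        help="output file name")
    return parser

args = build_parser().parse_args(["input.c", "-o", "out.c"])
print(args.input, args.output)  # input.c out.c
```

A parser built this way provides `-h`/`--help` output automatically, satisfying the command-line help expectation with no extra code.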