Current State of Human Factors in Systems Design
Chair: Karen Feigh, Co-Chair: Zarrin Chua
Georgia Institute of Technology
Panelists: Chaya Garg, Medtronic; Alan Jacobsen, The Boeing Company; John O’Hara, Brookhaven National Laboratory; William Rogers, Honeywell Aerospace; John Shutko, Ford Motor Company
Many papers at previous HFES Meetings have touched on the theme of how to incorporate human
factors earlier in the design process, including the development of new methods and discussion of which existing
methods and tools are applicable at the preliminary and conceptual design phases. To many HFES members,
these questions are not an academic exercise, but daily challenges. The goal of this discussion panel is to
summarize the current state of human factors in systems design, particularly outside of the context of human
factors-centric operations, and to assess the gaps in human factors methods that support the incorporation of
human factors earlier in the design process. Panel members represent industry organizations that work daily
to incorporate good human factors principles into design.
INTRODUCTION
Since the 1950s, several books, but fewer journal articles, have
been written regarding human systems engineering, or the integration of human factors (HF) in systems engineering. This list
of literature includes the extensive works of human factors specialists (HFSs) such as Bailey (1996), Chapanis (1996), Meister (2001), Proctor and van Zandt (1994), Stanton (2005), and
Wickens (2003). Many of these works have proffered discussion of the role of the HFS, the types of methods that should
be used, and what methods are most applicable at each design
stage. Although every author has slight variations on each of
these topics, most follow a general outline featuring four equivalent design stages: conceptual, preliminary, detailed, and final.
It is widely acknowledged that these stages are both archetypal
and iterative. For a thorough description of the four design
stages, the reader is referred to the reference list; for completeness, only brief descriptions are provided here.
The conceptual stage of design is marked by a significant collection of information regarding the user type, the user-system objectives, and the system measures of performance.
The role of the HFS is to profile the user that will employ the
system, to help set system requirements and objectives from the
user’s perspective, and to determine system feasibility and operating
conditions. The HFS tries to answer “what, how, who, when,
where” questions such as: What should the system do? How
should the system achieve these goals? Who should operate
the system? When might the system operate? Where should
the system be used? To achieve these goals, the HFS may observe
users, conduct interviews, or survey potential users. Answering
these questions and specifying user-system requirements progresses the system to a preliminary design.
In the preliminary design stage, the HFS is tasked with
continuing to develop detailed design specifications, allocating
functions between human and automated agents, clearly defining the user’s task, and evaluating alternative design options.
During this stage, questions regarding human performance capabilities are frequently asked: Can the user perform all of the
required functions within the context of each system design option? What information does the user require and how should
this information be presented? What responsibilities should be
performed by the user or the system? The HFS relies on performance evaluations, function allocation methods, and cognitive/task models to fulfill design responsibilities. Once a single
design has been selected, the system matures into more specific
and detailed design.
The detailed design stage is when the HFS is asked to begin quantifying human performance, configuring the physical
layout or system display, and developing documentation for the
user. The questions asked in this stage primarily concern the
ergonomics of the system and the expected user-system performance: Are the tasks designed in such a way that users can
be trained to perform them? Is the environment sufficient for
the user? Cognitive models and rapid prototyping address the
quantitative aspects of this design stage, whereas writing documentation focuses more on training users on the system task.
The system is ready for a conclusive design when each question
is sufficiently answered.
The final design stage is focused on implementing trial
runs, eliminating any previously unforeseen design errors, and
preparing the final design for distribution. This design stage is
most traditionally associated with the HFS, as the required work is
clearly and directly linked to HF. Questions posed in this stage
are focused on the user’s overall experience with the system:
Are the users able to perform their tasks using the system? Are
satisfactory levels of human performance achievable? What design deficiencies, if any, must be corrected? HFSs design and
implement human-subject tests and determine any system inadequacies.
Although a universal approach to infusing HF into systems
engineering has been identified and prescribed in the literature,
in reality, HF is not always included effectively at each design
stage. Multiple authors in this field cite various explanations
for the diminished role, or even exclusion, of HF, with common
themes: differences in design philosophy, misunderstanding of
HF capabilities, lack of quantifiable metrics, difficult-to-follow
HF guidelines, and deficiencies in company support. First, design philosophies may differ between system designers (SDs)
and HFSs; SDs often view humans as external to the system
being designed - “machine-centered, a physical view”, whereas
HFSs consider the system boundaries to include the humans - “at the center of picture” (Wasserman, 1989, as cited by Chapanis, 1996). Therefore, the system is designed first and the user
is then trained to the system, rather than the user applying the system
to achieve a specific goal. Second, the systems design community often does not recognize that HF capabilities extend beyond the detailed and final design stages, i.e., beyond just the
creation of a system interface (Proctor and van Zandt, 1994).
This misunderstanding is often caused or exacerbated by the
lack of quantifiable HF metrics available at the conceptual and
preliminary design stages (Chapanis, 1996). HF features are often implementation-dependent, leaving many HFSs reluctant to
specify achievable performance in early design stages. Additionally, HF design principles and guidelines are not available
in a language understandable to SDs without formal HF training (Chapanis, 1996). Lastly, because of these aforementioned
reasons, the SD may not see the immediate economic benefit
of HF, especially to large projects intended for many diverse
users (Proctor and van Zandt, 1994; Sanders and McCormick,
1993). As such, company organizational structures may not be
arranged in a fashion that is conducive to promoting a user-first
design philosophy.
While progress has been made in addressing many of these
areas, it is useful to assess the status of the effective integration
of HF into system design in industrial applications. The five
experts on this panel, representing the automotive, commercial
aircraft, medical devices, and nuclear industries, provide perspectives on the current state of HF integration into the systems
design process by answering these questions:
1. In addition to the factors mentioned by the authors in the
literature, are there any other contributing factors to the
exclusion or under-utilization of human factors in system
design?
2. At what design phase(s) are you and your HFS colleagues
involved with the design of new products?
3. In what design aspects do you feel there is a gap in the
current HF methods or knowledge?
4. What can be done better to integrate human factors into
systems engineering?
COMPOSITE PANELIST RESPONSES
Q1: In addition to the factors mentioned by the authors in
the literature, are there any other contributing factors to the
exclusion or under-utilization of human factors in system
design?
The panelists highlighted three common themes regarding
the exclusion or under-utilization of HF in system design. First,
they confirmed the findings in the literature - the most obvious
factors are still relevant. SDs may not be aware of HF or of HF specialists within their respective organizations. Due to infrequent
interactions with HF and HFSs, previous poor experiences may
sour the potential for future inclusion of HF or introduce more
misconceptions regarding the uses of HF. Additionally, SDs
may not trust HFSs to respect their processes and thus may not
seek their input. Lastly, there is a need for more HFSs, improved HF processes, and more robust HF tools that mandate
rigor and thoroughness in formal engineering processes.
Beyond these reasons, the panelists noted that there is still
a misunderstanding of what HF is and its distinction from other
relevant specializations. A great deal of diversity exists in the
focus and organization of HF groups in various companies.
There are groups that are focused on particular areas within HF
- task modeling, ergonomics, training, usability testing, interface design, etc. SDs may not appreciate the differences between these groups or even recognize that each group
is a subset of the overall field of HF. Nor may they recognize
the differences between HF and other groups, such as marketing and industrial design, that are also concerned with the users’
experience. As such, including HF appears overly complicated
to SDs and they may feel as if they already consult HFSs.
The unfamiliarity of the contributions from HF is also a result of poor integration of this expertise into the systems design
process. The panelists indicated that, often, HFSs fail to appreciate the skills needed or the constraints imposed by the system
design process. Every design program has real schedule constraints, and the HF effort must recognize and fit within the allocated time and budget, attending to the cost of conducting business and the timeliness of answering questions posed by the SD. Additionally, HFSs need to respect engineering demands, such as the need for a working solution rather than necessarily an optimized one. This may extend even to aesthetics.
Overall, the panelists believe that HFSs must take the lead
on fully integrating themselves in the development process. In
particular, the metrics generated by HFSs must correspond to or
inform metrics used in system design. Most SDs understand the
value of metrics such as weight, volume, and power for a product, but the value of human performance metrics is less obvious. A challenge remains in quantifying the value or practicality
of chosen human performance metrics; additional quantification of these metrics would be useful in bridging the gap.
An interesting dichotomy arose from the panelists’ responses between those in fields governed by certification and those driven by customer demands alone. The certification
of products has resulted in established procedures for the inclusion of HF in the design process and a method to assess the
effectiveness of an overall system. For example, following the
accident at the Three Mile Island nuclear plant, nuclear industry attention focused on the need for better HF. Many of the
contributing factors to the accident can be directly traced to deficiencies in control room design, procedures, and training. As
such, the U.S. Nuclear Regulatory Commission now mandates
a safety evaluation of all existing and new plants that includes
HF. The HF safety review focuses on both
the design process and design products. The design process review encompasses areas such as program management, operating experience, function requirements and allocation, task analysis, reliability, interface design, training, verification and validation, design implementation, and human performance monitoring. The design products review includes an evaluation of
all user interfaces, procedures, and training. Similar regulations
apply to both aircraft and medical devices.
Q2: At what design phase(s) are you and your HFS
colleagues involved with the design of new products?
Q3: In what design aspects do you feel there is a gap in the
current HF methods or knowledge?
The panelists indicated that HFSs are employed in all
phases of the design process, from system conceptualization through system delivery, and often beyond product launch.
The specific roles are wide-ranging: working in research and development to mature technologies for demonstrations; working on
the HF aspects of new products once product design program
approval is achieved; final product testing and validation; and customer support and in-service evaluation. HFSs are also involved
during initial concept of operations, preliminary functional requirement specification, and operation analyses. Separate HF
teams may address different concerns; for example, the work
completed during research and development is handed to another
group charged with designing and manufacturing the system.
In addition, HFSs have active roles beyond the four traditional design phases. For example, they may also work closely
with marketing team members to understand market needs, capture the voice of the customer, understand and apply
regulations, and interpret and implement customer directions.
In critical infrastructure applications such as the commercial nuclear industry, homeland security, and air traffic control, they
are extensively involved in safety reviews of proposed designs
and operational readiness reviews. In the post-launch or post-release phase, companies continue to listen to their customers
and many changes – many of which are related to improving
human factors issues – are made.
While HF is represented in all stages of design when industries are considered collectively, the success of implementing HF across all elements and phases of development of
a complex product is not always ideal. Some new systems may
be on a “go fast implementation” and be approved for
implementation just as the supplier designed them. As such, HF involvement can be very late in the design process. Furthermore,
factors such as business and platform priorities and constraints
may drive the involvement of HF, rather than the design itself.
In these situations, a clear and explicit issue, such as safety concerns, is needed to justify the inclusion of HF expertise. Lastly,
as our panelists noted, different HF groups may focus on different aspects of the design process. Continuity is a major concern,
especially to minimize work losses or duplications. Challenges
in using HF effectively and efficiently continue.
Although inclusion of HFSs is already well established
where mandated by regulations, the need for HFSs during all
stages of design is becoming more widely recognized in situations not covered by government rules. Panelists noted that
companies are increasingly sensitive to the advantages HF provides. Products designed to promote a positive user experience
are gaining a competitive edge. As such, changes in policies
and additional training programs are gaining in popularity and
are expected to continue doing so in years to come. For example, Usability
Maturity Models push for the identification of an HFS and the
development of several key tools: corporate qualifications for
HF, internal engineering procedures (including an HF review of all productions), internal HF guidelines, and an HF component on organizational charts.
The panelists agreed that while HF methods have progressed significantly in the last few decades, there are two areas
that still need development: predictive models and design guidance. Models for predicting joint human-system performance
continue to be highly desirable but difficult to achieve. Cognitive modeling is relatively immature when compared to other
modeling domains used in design. At the moment, HFSs have very good,
efficient, and effective physical human models for design but
cognitive models are not yet capable of being employed pragmatically within a design project. The use of such models - human performance models and federations of models (interacting
models) - is becoming widespread in many industries. There
must be an improved means to validate their application. Furthermore, the HF community needs to provide better guidance
on how these models should be incorporated into design. Making key design decisions for actual systems from these model
analyses requires clear direction on their appropriate use.
Additionally, few methods currently exist that can reliably predict
customer satisfaction, which ultimately drives industry success. The panelists also noted the need for improved
models that can assist with determining users’ acceptance rates
regarding the general system, false alarms, and degraded system performance. These newer generation models must also be
capable of providing a systematic and structured approach to error identification and mitigation, and of tying into detailed design and
usability testing. Having such an approach will not only help
the HFS at a higher level of design but also provide rationale
for supporting and delivering user-centered designs and functionality. Lastly, a strong demand exists for multi-level models
that can work at multiple levels of abstraction. Current models
exist in various levels of granularity, but few are able to capture
effects at multiple levels. Those that do are built from the lowest to the highest level, whereas what is needed in systems design
is the opposite. As such, no one cohesive model can be used
throughout the entire design life cycle.
Overall, models are needed during the design cycle to provide guidance for designers. In general, much is known about
user-system interaction issues at very detailed levels, for example, the impact of user interface design on performance: fonts,
color, alignment, etc. As such, HFSs are generally tasked with
addressing these later design issues, contributing to the underutilization of HF in system design. Less is understood, however,
regarding deeper interaction issues such as automation philosophy or function allocation. In fact, automation technology and
implementation in modern systems have greatly outpaced the HF
community’s ability to provide guidance to SDs concerning the
effects on crew performance. Superior HF methods and tools
are needed to support designers in making decisions regarding
function allocation, adaptive vs static automation, and the implementation of operator interfaces for managing (e.g., monitoring, interacting with, and backing-up) automation. In addition, while there are tomes written on automation, practical and
realistic guidelines at the appropriate level of detail to support
actual design remain in short supply.
Q4: What can be done better to integrate human factors into
systems engineering?
A common sentiment shared by the panelists is that the education of SDs and others about the existence and benefits of HF
still plays an important role in better integrating HF into systems
engineering. However, the education needed now differs from
what was previously described in the literature. Education is
needed less on the existence of HF and more on the significance
and use of such expertise. The value of HF/human-system integration needs to be succinctly articulated and clearly communicated across the design organization. The panelists suggested
several ways to achieve this goal.
HFSs must “buy their way onto a design project” - that is, prove
to managers and SDs the benefit of HF. As illustrated in several
industries, HFSs can be added to a project due to regulations,
driven by customer demand, or both. The latter promotes an
environment where HFSs are actively sought for the additional
value associated with a project, rather than just to satisfy requirements. However, in order to demonstrate this benefit, HFSs
must communicate in the language of systems design. Already,
HFSs are creating simplified checklists and using them in an effective way to ensure that HF is considered by SDs who may not
be familiar with how to account for HF. However, they must also
develop human performance metrics that can be traded against
other design metrics. There is no deficiency in human performance metrics, but the metrics must be articulated in a way that
is relevant to the current project. HFSs must, for example, explain
the tradeoffs between system weight and user workload.
Natural forces - such as customer appreciation of usability or regulations - are helping to better incorporate HF within
systems design. Globally, organizations can build on early successes to create an environment where HFSs are sought out rather
than avoided. Such an environment is developed with defined
design roles where HFSs can make clear, meaningful contributions at reasonable costs, demonstrating quantitatively (if possible), the significant benefits of incorporating HF into design.
Organizations can promote a complementary HFS-SD mentality
by structuring the two disciplines as peer functions in a larger
organization.
Locally, however, every HFS can aim to formally incorporate HF within standard system development and reviews,
starting with program planning and work estimates. To do so,
HFSs must actively capitalize on opportunities to demonstrate
how HF should be woven into systems design. HFSs need to embed themselves within design teams and work hand-in-hand
with other engineers and designers throughout the design process. However, these experts also need a way to stay coordinated across the design elements and phases of development, including post-design product test, evaluation, and support. HFSs
must define and capture HF guidelines and processes. These
processes need to be ensconced in, or at minimum linked to,
the overall design process definition.
CONCLUSION
Five panelists, representing the automotive, commercial aircraft, medical devices, and nuclear industries, were asked to
provide their perspectives on the current state of human factors in systems design. In particular, they opined on the role
and representation of today’s human factors specialist throughout the design cycle and the possible factors contributing to the
exclusion or under-utilization of human factors. Additionally,
the panelists noted shortcomings or gaps of knowledge within
the human factors field and provided suggestions on improving
integration of human factors in systems design.
Overall, human factors is a widely recognized field of expertise that many industries, whether to achieve competitive advantage or to satisfy government regulations, are beginning to incorporate across all design phases. This current state of affairs is
a progression from what was described in the literature as
recently as a decade ago. No longer is the human factors community seeking acknowledgment of its existence; rather, the focus
is now more on effectively integrating human factors in systems
design. Misunderstandings regarding the breadth and depth of
human factors still exist, but can be ameliorated with company
organizational structure and increased appreciation for the role
of human factors. This appreciation can be furthered with improved design guidance across detailed and abstract design decisions, which calls for multi-level predictive models that can
bridge the gaps between design and user experience. By addressing these concerns and continuing to provide clear, concise, engineering-oriented design work, human factors specialists can prove the value of human factors, leading to effective,
safe, user-centered designs.
PANELIST BIOGRAPHIES
Chaya Garg
Medtronic, Inc.
Dr. Garg is an Engineering Director with Medtronic and manages the Human Factors and Technical Communication groups
in the Cardiac Rhythm Disease Management business unit. She
has been with Medtronic for 11 years. Her responsibilities include functional leadership, ensuring human factors regulatory
requirements are integrated into the product development process and are met on all products for commercial release. She also
ensures that human factors helps facilitate a consistent user experience across the CRDM product portfolio. Previously, Dr.
Garg was a Staff Scientist at the Honeywell Technology Center.
She has worked on a variety of complex control systems including aerospace, industrial automation control, building control
systems, and medical devices. Her areas of expertise include
front-end knowledge acquisition, requirements gathering, usercentered design, and human performance modeling. She received her Ph. D. in Cognitive Engineering and Human Factors
from Purdue University in 1988.
Alan Jacobsen
The Boeing Company
Dr. Jacobsen is a Technical Fellow in human factors engineering and flight deck design at The Boeing Company. Currently,
his activities include technical oversight of Flight Deck and
Crew Station Research and Development as well as overseeing Human System Integration efforts on Advanced Airplane
Concepts. He also participates on the team responsible for integrating Human Factors and Ergonomics into the Boeing Commercial Airplane design process. Prior to his current position,
he was a technical lead for the Boeing 787 flight deck as well
as team leader for the 787 Human Factors & Ergonomics Team.
He has led numerous efforts to develop and move new technologies from research and development to production. In addition
to his work at Boeing, Dr. Jacobsen has participated in and led
the development of numerous aerospace industry standards related to human interface considerations. He has been actively
involved in human interface design, evaluation, and product development for over 25 years. Dr. Jacobsen received his Ph. D.
in Experimental Psychology from the State University of New
York at Stony Brook in 1984 and his B. A. from Luther College,
Decorah, IA, in 1975.
John O’Hara
Brookhaven National Laboratory
Dr. John O’Hara is a Scientist at Brookhaven National Laboratory. His research programs address human-systems integration
and the effects of technology on performance and safety. He has
conducted design certification reviews for the U.S. NRC related
to the HFE aspects of new nuclear plant designs. In addition to
his work in the nuclear industry, John has conducted research
related to many other industrial systems including: space, aviation, telerobotics, maritime, and homeland security. John is a
Fellow of the Human Factors and Ergonomics Society, a Certified Human Factors Professional, and Past Chair of the American
Nuclear Society’s Division of Human Factors, Instrumentation,
and Control.
William Rogers
Honeywell Aerospace
Dr. Rogers serves as a Technology Fellow for Human Automation Interaction technologies within Honeywell Aerospace.
Prior to his current role, he was Director of an Information and
Decision Technology Center of Excellence. He has 35 years of
human factors experience, including 10 years with the Department of the Navy, 3 years with The Boeing Company, 7 years
with Bolt, Beranek and Newman, and 15 years with Honeywell.
He spent 7 years on-site at NASA Langley Research Center. His
technical areas of expertise are human performance and human
error measurement, human-centered automation and flight deck
design philosophy, function allocation principles and methods,
cognitive task analysis, and human interaction with highly automated systems. Dr. Rogers received his Ph. D. in Experimental Psychology from the University of Connecticut in 1985, and
M. S. and B. S. degrees in Psychology from the State University
College at Brockport, NY in 1975 and 1973, respectively.
John Shutko
Ford Motor Company
Mr. Shutko is a Technical Specialist in Research and Advanced
Engineering at The Ford Motor Company, specializing in Human Factors and HMI, working on advanced projects and conducting research. He serves as chairman of ISO TC 22 SC 13
- Road Vehicle Ergonomics. He has been with the Ford Motor
Company for over 10 years, previously working in the Human
Factors department within Product Development where he specialized in Driver Distraction and Usability. His areas of expertise include system development, system evaluation, usability,
and human performance measurement. Mr. Shutko received an
M. S. in Human Factors and Safety Engineering from Virginia
Tech in 1999, and a B. S. in Industrial Engineering from Virginia
Tech in 1995.
REFERENCES
Bailey, R. (1996). Human Performance Engineering. New Jersey: Prentice Hall
PTR.
Chapanis, A. (1996). Human Factors in Systems Engineering. New York: John
Wiley & Sons, Inc.
Meister, D., & Enderwick, T. P. (2001). Human Factors in System Design, Development, and Testing. CEC Press.
Philips, C. A. (2000). Human Factors Engineering. New York: John Wiley &
Sons, Inc.
Proctor, R. W., & Van Zandt, T. (1994). Human Factors in Simple and Complex
Systems. New York: Allyn and Bacon.
Sanders, M. S., & McCormick, E. J. (1993). Human Factors in Engineering
and Design. New York: McGraw-Hill, Inc.
Stanton, N. A., et al. (2005). Human Factors Methods: A Practical Guide for
Engineering and Design. Burlington, VT: Ashgate Publishing Company.
Wasserman, A. S., (1989). Redesigning Xerox: A Design Strategy Based on
Operability. In E. T. Klemmer (Ed.), Ergonomics: Harness the Power
of Human Factors in Your Business (Chapter 1, pp. 7-44). Norwood, NJ:
Ablex.
Wickens, C. D., Lee, J. D., Liu, Y. and Gordon-Becker, S. (2003). Introduction
to Human Factors Engineering. Prentice Hall.