Cousins but Not Twins: Instructional Design and Human Performance Technology in the Workplace

Wellesley R. Foshay, Steven W. Villachica, and Donald A. Stepich

Abstract
Instructional design (ID) and human performance technology (HPT) stem from a common origin in systems thinking and behavioral psychology, but today the two fields employ different research bases, system foci, and methods. To contrast these fields, this chapter presents an idealized and abstracted discussion that examines the theoretical origins of the two fields, briefly describes their similarities, and focuses on their differences in terms of analytical frameworks and methods. We conclude that contemporary practice in most contexts combines elements of ID and HPT, particularly when working in cross-functional teams seeking to improve organizational performance. Practitioners of ID are likely to encounter HPT in their work, and they may be called upon to serve as part of a cross-functional team using HPT as a common conceptual framework.

Keywords
Educational technology • Human performance technology • Instructional design • Instructional designer • Performance-based training • Training

Introduction
While they share commonalities, instructional design (ID)
and human performance technology (HPT) employ different
research bases, system foci, and methods. Practitioners of ID
are likely to encounter HPT in their work, and they may be
called upon to serve as part of a cross-functional team using
HPT as a common conceptual framework. Some ID profes-
sionals have successfully broadened their careers to include
both training interventions (using ID) and non-training inter-
ventions (using an HPT framework and drawing from other
fields). The relationship between the two is sufficiently close
that some professional preparation programs in ID also offer
HPT electives and concentrations. Other programs focus on
HPT, with additional coursework in ID. For all these reasons,
it is probably a good idea for ID professionals to have at least
some awareness of HPT.
To contrast these fields, this chapter presents an idealized and abstracted discussion that examines the theoretical origins of the two fields, briefly describes their similarities, and
focuses on their differences in terms of frameworks and
methods. The chapter concludes with a description of a
“savvy instructional designer” that combines elements of ID
and HPT. To avoid presenting idiosyncratic comparisons as
generalities, the chapter employs widely cited (“classic”)
references that provide a representative view of each field.
The authors encourage readers to consult these references for
more thorough introductions to HPT.

Origins of ID and HPT
Historical accounts of the origins of systematic ID in the 1960s typically attribute its roots to a combination of the then-dominant behavioral learning psychology and the metaphorical inspiration of general systems theory.
This resulted in the endless variations of the analysis-design-develop-implement-evaluate (ADDIE) model (Molenda, 2003), which IDs came to accept as both
• An analytical framework for systems thinking and design in training and education
• A project management structure for development, implementation, and maintenance
Drawing on educational psychology, educational technology, instructional technology, communications, and related fields, IDs now create instruction for a broad variety of populations in different settings. IDs may support populations in educational settings in K-12 schools, colleges, and universities. IDs may also support workplace populations in business, government, military, and nonprofit settings. Aguinis and Kraiger (2009) define “training” as a systematic approach to learning with the goal of improving individual, team, and organizational effectiveness. They maintain that development refers to the acquisition of new knowledge or skills for purposes of personal growth. As it is difficult to determine where one ends and the other begins, this chapter uses the term “training” to describe any instruction occurring in the workplace. Because there is as yet little documented application of HPT in school settings, this chapter examines ID and HPT in workplace settings and excludes educational settings.
ID and HPT share a common analytical framework drawn from operations research and common origins in behavioral psychology (Definition & Terminology Committee of AECT, 2007). Historical accounts of ID often fail to mention that similar efforts at systems thinking and systematic development were taking place in a wide range of fields over the same time. Of particular interest are American industrial training specialists and industrial psychologists, who found a similar, ADDIE-like framework useful for systematic analysis and intervention to improve human performance in organizations (for an early example, see Gilbert, 1996a, 1996b). According to Rummler (2007), the roots of HPT arose in the 1960s, with publications appearing in the 1970s. These publications include the work of early theorists such as Mager and Pipe (1970), Rummler (1972), Harless (1973), and Gilbert (1996a, 1996b). In time, leaders active in what is now the International Society for Performance Improvement (ISPI) came to refer to the many variations of this framework collectively as HPT (Van Tiem et al., 2004). Practitioners now refer to HPT using a variety of terms, including “workplace learning and performance improvement” (Beich, 2008). In addition to behavioral psychology and disciplines related to ID, HPT draws on additional disciplines ranging from organizational development to process improvement. Unlike the research base supporting ID, empirical research in HPT is largely limited to case studies reporting various performance improvement solutions.
Contrasting ID and HPT
In workplace environments, ID and HPT practitioners can
share a variety of goals, frameworks, methods, and evalua-
tion strategies while differing in subtle but important ways.
Four commonalities are apparent:
• Evidence-based practices emerge from application of relevant research, observation (reflective practice), and other credible sources of evidence.
• Goals, standards, and codes of ethics have been established, associated with respective professional organizations (ASTD, n.d.; ISPI, 2002a). For HPT, two professional organizations have developed formal professional certification programs. The ISPI program produces Certified Performance Technologists (CPTs) based on its standards (ISPI, 2002b). The program from the American Society for Training and Development (ASTD) produces Certified Professionals in Learning and Performance (CPLPs) based on its competency model (ASTD, 2008).
• Systemic and systematic approaches are common to both fields of practice, although they vary in scope, as discussed below.
• Formative, summative, and confirmative evaluation are considered standard practice in both fields, though with some differences in measurement strategy, as discussed below.
While these commonalities are important, there are also
important differences in frameworks and many nuances of
method. Table 4.1 summarizes the major contrasts between
ID and HPT within the context of the workplace. This sec-
tion describes each of these important differences.
Frameworks
Research Base
Molenda (2010) traces the evolution of ID theory from roots in behavioral learning theory and cognitive psychology, beginning with Bruner and continuing through the four successive editions of Gagne’s Conditions of Learning (1985), and on to current cognitive learning theory. He points out that this work rapidly matured from an early focus on programmed instruction and computer-based learning to a more
generalizable framework for a technology of teaching which
could be instantiated effectively in any medium—even class-
rooms using nothing beyond the familiar lesson plan and
standard curriculum materials.

Molenda also traces the development of ID methodology.
He attributes its origins to the application of operations
research to training development in the military, where the
emphasis was on training as part of integrated operational
systems, such as weapons systems. Thus, training was pro-
vided as part of a larger system that defense contractors
delivered to the military. The methodology for this systems
view of training was developed at Florida State University as
the Instructional Systems Development (ISD) model. The
methodology was in itself a systematic method for develop-
ment of training, which embodied both
• Project management principles (such as a workflow using
the ADDIE steps)
• Design processes intended to proceduralize the best avail-
able decision-making principles for application of the
emerging technology of instruction
Thus, ISD was originally defined as both a systems
approach to creating training and a systematic approach to
managing training development projects. ISD also had the
goal of systematic design by incorporating procedures for
design of the training itself. IDs completed phases and activi-
ties that became project deliverables. These deliverables
became inputs for subsequent phases and activities.
As HPT branched from ISD, the development of HPT
theory followed a substantially different course. Rosenberg,
Coscarelli, and Hutchison (1999) state that from ISD, HPT
took the systems analysis framework, but it was substantially
broadened: they attribute to Mager (1988) the point that “In
the HPT suprasystem, instructional technology is a subsystem,
and HPT is a subsystem in the overall management
suprasystem” (Rosenberg et al., p. 25). Thus, while the focus
of ISD was on the training (sub) system within the context of
operational systems, the systems framework took HPT in a
different direction: HPT’s focus is on the entire organiza-
tion’s performance, and within that the performance of work
groups and individuals. Work groups range in size from
small teams to larger departments to global enterprises.
More recently, while learning theory evolved from a behavioral to a cognitive perspective, HPT has retained much more of a behavioral orientation (although the Rosenberg et al. discussion of the field’s origins does include cognitive engineering). HPT practitioners creating custom software solutions that provide on-demand access to information, advice, tools, and learning also draw on cognitive psychology to create performance support systems. Creating systems that help knowledge workers recognize situations, make decisions, and solve problems, HPT practitioners may use cognitive task analysis to ensure that the user interfaces they create match both the mental models (i.e., “thought flow”) and workflow that exemplary performers use to complete their job tasks (cf. Villachica & Stone, 1999; Villachica, Stone, & Endicott, 2006).
The evolution of HPT methodology also diverged from ISD’s systematic methodology. In HPT, attention was at first on the major analytical frameworks. Process models of the problem-solving process or the project management system of the sort contemplated by ISD came later, with the work of Mager and Pipe (1970), Rossett (1987), Rosenberg (1990), and Hutchinson (1990). It is probably fair to say that the defining focus of HPT has remained on the analytical frameworks, rather than on standardization of procedural methodology. For example, the standards which define ISPI’s CPT are performance-based and do not require the use of any particular methodology (ISPI, 2002b).
The evolutionary trends of HPT development are substantially different from those influencing ID. As HPT practitioners view training as only one of many possible interventions to improve performance, the field has been influenced by a wide range of fields and the interventions they create. Rosenberg et al. (1999) include information technology, ergonomics and human factors, psychometrics, behavioral feedback systems, organizational development, and change management. Each of these fields has had its own influence on the evolution of theory and practice of HPT.
Systems View
The concept of a “system” is a cornerstone in both ID and HPT. Brethower (2006) defines a system as “a collection of elements and relationships held together by a purpose in common” (p. 124). As an example of a system, consider an automobile assembly plant. The plant is made up of a collection of elements (people, machinery, processes, etc.) combined to accomplish the purpose of producing finished automobiles that can be shipped to dealers throughout the world.
Starting with this cornerstone concept, a systems view has three essential characteristics (Anderson & Johnson, 1997; Brethower, 2006; Meadows, 2008):
• It is holistic. A systems view attempts to encompass all of the system’s elements—both tangible elements (people, buildings, and machinery) and intangible elements (workflow processes, organizational culture, company policies, and safety regulations). However, the system is seen as more than the simple sum of its parts. For example, in the automobile assembly plant, the people cannot produce automobiles without the machinery and the machinery cannot produce automobiles without the people.
• It focuses primarily on the interactions among the elements rather than on the elements themselves. The tangible and intangible elements of the system interact in ways that are complex, dynamic, and interdependent. Small changes in one element may ripple throughout the entire system, influencing all of the other elements. Elements of the system may interact in ways that produce unexpected consequences. For example, in the automobile assembly plant, a new piece of machinery will often result in changes in workflow processes, organizational culture, and safety regulations.
• It views systems as “nested,” with larger systems made up of smaller ones. For example, one smaller system within the automobile assembly plant is the building, which is in turn made up of smaller systems—lighting, heating, and ventilation. Conversely, the assembly plant itself is part of a larger system of the manufacturer, which is in turn part of an industry that is part of national and global economies.
Both ID and HPT begin with this systems view. However, they apply it to different systems. ID considers an “instructional system” while HPT considers a broader “performance system.” Each system has the same three essential characteristics. However, the purpose and elements of the systems differ. The ID process creates an instructional system, the purpose of which is to promote the acquisition of specified knowledge or skills. Morrison, Ross, and Kemp (2007) present one view of an instructional system that consists of four interdependent elements (see Fig. 4.1):
• Learners—characteristics of the individuals who will participate in the instruction
• Objectives—the knowledge or skills the learners are to acquire
• Methods—the means that will be used to help the learners learn
• Evaluation—the means to be used to determine the extent to which learning has occurred
Other descriptions of instructional systems appear within Dick, Carey, and Carey’s (2009) ID model, Smith and Ragan’s (2005) ID model, Gagne’s nine events of instruction (Gagne, Wager, Golas, & Keller, 2005), and Merrill’s (2002) first principles of instruction. Each instructional system is made up of a different set of elements. Within each system the elements interact to promote the acquisition of specified knowledge or skills.
In contrast to ID’s focus on instructional systems, HPT focuses on producing performance systems that promote the consistent performance of a specified job or task in ways that meet organizational expectations. Gilbert (reprinted in 1996a, 1996b) presents one view of a performance system comprising six interdependent elements (see Table 4.2). According to Gilbert, worthy performance occurs when both environmental supports and a person’s repertory of behavior work together to produce consistent performance that meets organizational expectations. This systemic view stresses multiple elements working at different levels in ways that produce competent human performance.
Other performance systems arising from HPT include Kaufman’s (1983) organizational elements model, Langdon’s (2000) language of work model, Marker’s (2007) synchronized analysis model, and Rummler’s (2006) anatomy of performance model. Each performance system is made up of a different set of elements operating at levels of individuals, work groups, departments, enterprises, and even societies. But in each system the elements interact to promote the consistent performance of a specified job or task towards goals that organizations value.
Methods
Owing to similarities and differences in their frameworks, ID
and HPT use a variety of similar methods but sometimes in
different ways.

Core Processes
As depicted in Fig. 4.2, both ADDIE and HPT are linear, systematic models. Both models help practitioners address complexity inherent in these efforts by ordering their phases and component activities. IDs and HPT practitioners completing these phases and activities produce deliverables that become inputs for subsequent phases and activities. Both models employ aspects of analysis, design, development, implementation, and evaluation. While both ADDIE and HPT embed evaluation throughout all phases, HPT can be heavier on analysis, with phases addressing both performance and cause analysis. HPT also specifies both implementation and change management. Where ADDIE separates Design and Development, HPT combines them. An ID creating training will complete all or part of the ADDIE phases. HPT practitioners creating performance improvement systems will complete all or part of the phases and activities comprising the HPT model, depending on the nature of the project they are working on. In both ID and HPT settings, senior project personnel typically complete aspects of analysis, design, and evaluation. Less experienced personnel often address development and implementation. Unfortunately, both ID and HPT models commonly omit a maintenance phase. This omission makes it impossible to calculate trustworthy life cycle costs and benefits associated with these efforts.
As depicted in their models, both ADDIE and HPT employ linear, “waterfall” core processes, where the completion of one phase leads to the beginning of the next. The exception lies in evaluation, which provides feedback informing all of the phases in the model. ID has seen the emergence of prototyping-based models that employ iterative mock-ups created collaboratively with end users. Baek, Cagiltay, Boling, and Frick (2007) describe how rapid prototyping and participative design overcome the bureaucratic and linear nature of ADDIE, speeding up its otherwise slow design and development processes. Ross et al. (2007) mention the role of prototyping in design research and natural work settings. Aside from a discussion of rapid application development (RAD) in creating performance support systems (Villachica et al., 2006), rapid prototyping and participative design do not appear in the most recent edition of the HPT Handbook.
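To make the core-process contrast above concrete, the following minimal sketch models both core processes as ordered phase lists in which each phase's deliverable becomes the next phase's input. The phase names are abstractions of the models discussed in this section, and the code is an illustration only, not a rendering of either model's full detail; the evaluation feedback that informs every phase is noted but not modeled.

# A minimal sketch of a linear ("waterfall") core process: each phase
# consumes the previous phase's deliverable. Evaluation feedback across
# phases, described in the text, is omitted here for brevity.

ADDIE_PHASES = ["analysis", "design", "development", "implementation", "evaluation"]

HPT_PHASES = [
    "performance analysis",
    "cause analysis",
    "intervention selection, design, and development",
    "implementation and change management",
    "evaluation",
]

def run_linear_process(phases: list[str], initial_request: str) -> dict[str, str]:
    """Hand each phase's deliverable forward as the next phase's input."""
    deliverables: dict[str, str] = {}
    current_input = initial_request
    for phase in phases:
        deliverable = f"{phase} deliverable (based on: {current_input})"
        deliverables[phase] = deliverable
        current_input = deliverable  # waterfall hand-off
    return deliverables

for phase, output in run_linear_process(ADDIE_PHASES, "request for training").items():
    print(f"{phase}: {output}")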
Performance Analysis
One of the major contrasts between ID and HPT lies in the
area of analysis. Typically in response to some formal or
informal request for training, IDs completing the analysis
phase of the ADDIE model specify broad learning goals as
well as learner characteristics and workplace contexts. IDs
may also specify learning hierarchies and job tasks during
the analysis phase. IDs subsequently use the outputs of the
analysis phase to form instructional objectives during the
design phase. In contrast, HPT practitioners begin with a
performance analysis targeted at specifying the nature of the
problem or opportunity. The performance analysis consists
of three different analytical activities: organizational analy-
sis, environmental analysis, and gap analysis. This phase of
the HPT model ensures practitioners align any gap between
actual and desired workplace performance with the organiza-
tion’s missions and business goals at the levels of the organi-
zation, work, and worker. HPT practitioners will use a
statement like this to describe the performance gap itself:
• What we want our (insert target population here) to do is (insert expected behavior here) at (insert expected measurement here)
• What our (insert target population here) are doing now is (insert existing behavior here) at (insert existing measurement here)
Use of this convention tends to clearly specify the performance problem as well as when it will be solved: when others in the organization meet the desired performance. During the performance analysis, HPT practitioners will also make sure the specified performance gap is worth closing.
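As a minimal illustration of the gap-statement convention just described, the statement can be treated as structured data and rendered into the two sentences above. The field names and the example values are hypothetical, supplied here only for the sketch; they are not part of the HPT model.

from dataclasses import dataclass

@dataclass
class PerformanceGap:
    """Structured form of the gap-statement convention described above."""
    population: str           # target population
    expected_behavior: str    # what we want them to do
    expected_measure: str     # the expected level of performance
    actual_behavior: str      # what they are doing now
    actual_measure: str       # the current level of performance

    def statement(self) -> str:
        return (
            f"What we want our {self.population} to do is "
            f"{self.expected_behavior} at {self.expected_measure}. "
            f"What our {self.population} are doing now is "
            f"{self.actual_behavior} at {self.actual_measure}."
        )

# Hypothetical example values, for illustration only.
gap = PerformanceGap(
    population="call-center representatives",
    expected_behavior="resolving billing inquiries on the first call",
    expected_measure="90% first-call resolution",
    actual_behavior="escalating most billing inquiries to a supervisor",
    actual_measure="70% first-call resolution",
)
print(gap.statement())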
Harless (1973) coined the term “front-end analysis” to refer to these activities, and he addressed what is now the performance analysis phase of the HPT model in the first of his 13 “smart questions”:
1. Do we have a problem?
2. Do we have a human performance problem?
3. How will we know when the problem is solved?
4. What is the performance problem?
5. Should we allocate resources to solve it? (p. 231)
To answer these questions and complete the performance analysis, HPT practitioners will partner with clients, sponsors, and other stakeholders.
Cause Analysis
In focusing on knowledge, skills, and attitudes, IDs do not
employ a cause analysis to investigate the sources of a perfor-
mance gap. The closest they may come is to determine whether learners meeting a particular objective might use a job aid (e.g., Mager, 1997; Morrison et al., 2007). In contrast,
HPT uses a solution-neutral troubleshooting approach that
refrains from specifying a treatment—whether it is training,
other changes to environmental support, or other changes to
the personal repertory—until the diagnosis of the perfor-
mance gap is complete. Cause analysis focuses on identifying
all possible environmental and personal sources of the perfor-
mance gap, and HPT practitioners expect to see multiple,
interacting sources of any given performance gap.
In diagnosing the sources of a gap, HPT practitioners will address potential sources arising from inadequate environmental support before those arising from an inadequacy in people’s repertory of behavior. The reason lies in the concept of leverage (Chevalier, 2003, 2006; Gilbert, 1996a, 1996b). Environmental sources of performance gaps tend to be more common, anecdotally accounting (by a common “rule of thumb”) for roughly 75% of all performance gaps (Dean, 1997). HPT practitioners will consider a lack of skills and knowledge as the source of a given performance gap only after ruling out all environmental sources.
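A minimal sketch of this diagnostic ordering follows, assuming the six cause categories commonly associated with Gilbert's behavior engineering model (see Table 4.2); the checking questions and the example findings are illustrative assumptions, not a prescribed HPT procedure.

# Cause-analysis sketch: examine environmental supports before the
# performer's repertory of behavior, reflecting the leverage argument above.
# Category names follow Gilbert's behavior engineering model; the prompts
# and findings are illustrative assumptions only.

ENVIRONMENTAL_CAUSES = [
    ("Data / information", "Do performers receive clear expectations and feedback?"),
    ("Resources / tools", "Do they have the tools, time, and materials they need?"),
    ("Incentives", "Are consequences aligned with the desired performance?"),
]

REPERTORY_CAUSES = [
    ("Knowledge / skills", "Do they know how to perform the task?"),
    ("Capacity", "Are they able to perform it (selection, scheduling, job fit)?"),
    ("Motives", "Do their motives match the incentives offered?"),
]

def diagnose(evidence: dict[str, bool]) -> list[str]:
    """Return the cause categories flagged by the evidence, environmental first.

    `evidence` maps a category name to True when analysis suggests that
    category contributes to the gap; multiple interacting causes are expected.
    """
    ordered = ENVIRONMENTAL_CAUSES + REPERTORY_CAUSES
    return [name for name, _question in ordered if evidence.get(name, False)]

# Hypothetical findings from interviews and observation:
findings = {"Data / information": True, "Incentives": True, "Knowledge / skills": True}
print(diagnose(findings))  # ['Data / information', 'Incentives', 'Knowledge / skills']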
Intervention Selection
In ID, intervention selection focuses on selecting training media, perhaps supplemented with job aids, as the default solution. Like many workplace executives, managers, and supervisors, IDs tend to presume that learning is good, and more learning is even better. This perception leads to the widespread belief that training is the default solution for any gap between actual and desired performance in the workplace. In contrast, HPT practitioners will investigate all potential sources of a performance gap and then use all potential means to close it (Molenda & Pershing, 2007; Rummler & Brache, 1990). In HPT, sources of performance gaps arising from the cause analysis lead to recommended interventions to close the performance gap. HPT practitioners refrain from recommending solutions (or interventions) until they have identified the source(s) of the performance gap. In specifying only those solutions that address corresponding sources of a performance gap, HPT is “solution agnostic.” In selecting interventions associated with multiple causes of performance gaps, HPT practitioners are more likely to create, implement, and maintain solution systems, rather than isolated interventions.
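Continuing the sketch from the cause analysis above, the mapping below pairs each confirmed cause with candidate interventions; the pairings are illustrative assumptions rather than a canonical HPT taxonomy, and a solution system is simply the set of interventions matched to every confirmed cause.

# Illustrative mapping from cause categories (as in the sketch above) to
# candidate interventions. The pairings are assumptions for the sake of
# example, not a canonical HPT taxonomy.
CANDIDATE_INTERVENTIONS = {
    "Data / information": ["clear expectations", "performance feedback", "job aids"],
    "Resources / tools": ["performance support systems", "process redesign", "better tools"],
    "Incentives": ["aligned consequences", "recognition", "incentive redesign"],
    "Knowledge / skills": ["training", "coaching", "structured practice with feedback"],
    "Capacity": ["selection", "scheduling changes", "job redesign"],
    "Motives": ["recruiting for fit", "realistic job previews"],
}

def solution_system(confirmed_causes: list[str]) -> dict[str, list[str]]:
    """Recommend interventions only for causes the cause analysis confirmed."""
    return {cause: CANDIDATE_INTERVENTIONS[cause] for cause in confirmed_causes}

# Continuing the earlier hypothetical findings:
print(solution_system(["Data / information", "Incentives", "Knowledge / skills"]))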
As interventions fixing sources of performance gaps that lie in the environment tend to be faster and less expensive to create, implement, and maintain than those involved in changing behavioral repertories, HPT practitioners tend to view instructional interventions as among the most costly and least desirable of performance solutions. This perception is sharpened by Dean’s (1997) anecdotal observation that only 10.5% of performance gaps arise from a lack of required skills and knowledge, meaning that training that enables learners to acquire such skills is a special case of HPT, appropriate for closing a relatively small number of performance gaps.

Measuring Results
While both ID and HPT emphasize evaluation, the approaches most commonly used differ. In ID, the focus is on training. Perhaps the most commonly used analytical framework is Kirkpatrick’s with Phillips’ extensions (Kirkpatrick & Kirkpatrick, 2006). The purpose of the model is to demonstrate return on investment for training, using a 4- (or 5-) level analytical framework. However, the top levels of the model have been criticized as difficult to develop, and aimed at the wrong target: isolating the effects of training (Watkins, Leigh, Foshay, & Kaufman, 1998)—a goal of interest to trainers, but often not to the business. In fairness, we believe some of this difficulty comes from the experience of practitioners who often implement the model starting with the lowest level, rather than the highest, and thus find themselves focusing on outcomes which are of least importance. When this happens, the measures used can be highly misleading.
By contrast, the HPT approach resolves from the start the challenge of measuring important outcomes: the focus of front-end analysis in HPT is on closing the performance gap with real business consequences to be measured in ways which are meaningful to the client (Moseley & Dessinger, 2010; Winiecki, 2006). There is no intent to isolate the impact of each performance improvement intervention, including training. Thus, development of meaningful business impact measures is not an added, artificial exercise; it is an inherent part of the initial problem definition (Brinkerhoff, 2006; Pershing, 2006). This approach has the added advantage of assuring the sponsorship to gather the data for the measures of results, because they are part of the business’ normal work, and not an added, artificial step.
True Confessions: Limitations of the Preceding Comparisons
Thus far this chapter has presented only an abstracted comparison of the “classical” analytical frameworks used by ID and HPT. Both fields are undergoing constant evolution, drawing both on practitioners’ reflections and advances in underlying theory. IDs and HPT practitioners constantly adapt these fields to meet their own, clients’, and stakeholders’ requirements. Accordingly, in any real-world ID or HPT project, the devil is in the details. The authors would like to explore two such issues of evolution and context here: the rise of a cross-disciplinary approach to design thinking and the emergence of savvy IDs who blend elements of ID and HPT in improving workplace performance.
The Rise of Design Thinking
Conversations about the nature of design in instructional systems development (e.g., Boling & Smith, 2007; Ertmer et al., 2008; Rowland, 1993; Silber, 2010) also involve conversations about the design process and design thinking (e.g., Brown, 2008; d.school, 2010; Lawson, 2006; Myerson, 2001). This broad conception of design cuts across
• Disciplines, including architecture, engineering, and community planning
• Professions such as graphic design, product design, interior design, and textile design (Lawson, 2006)
Jonassen (2004) maintains that design involves ill-structured problem solving in the face of vague goal statements and few constraints. There are multiple, undefined criteria, with no right or wrong way of solving the problem, only better and worse ones.
Elements of design thinking are beginning to make inroads into both ID and HPT. As depicted in Table 4.3, Baek et al. (2007) apply them in their discussion of user-centered design in ID. Villachica and Stone (1998, 2010) have discussed elements of design thinking in creating both instruction and performance support systems based on the use of Martin’s (1991) RAD. Readers wishing additional information on this topic may want to review Susan McKenney and Jan Herrington’s chapter on Design Research appearing in this Handbook.
The Savvy Instructional Designer
In workplace settings, IDs creating training would be wise to adopt a performance-based approach that mixes elements of ID and HPT (Sims & Koszalka, 2007). This approach lends itself to improved transfer of learned skills and knowledge to the workplace. In this setting, training that closes a skill gap removes a barrier to meeting a business goal or enables an organization to meet some aspect of its mission. Training professionals who fail to align their efforts with business goals or consider non-skill sources of performance gaps and non-training solutions proceed at risk, with these factors being the top two reasons contributing to the failure of training and development efforts in the workplace (Phillips & Phillips, 2002). Like their HPT counterparts, savvy IDs align their efforts with meeting the needs of their workplace sponsors in ways that focus on results, take a systems view, add value, and establish partnerships (Addison, Haig, & Kearny, 2009). A savvy instructional designer:
• Collaborates with others in the organization to
  – Identify performance gaps
  – Align them with missions and business goals to focus on valued performance
  – Determine whether the gaps are worth closing
• Identifies all possible causes of given performance gaps and collaborates with others to address them
  – IDs often address knowledge gaps by creating training and guidance gaps by creating job aids
  – IDs collaborate with others to address other sources of gaps, knowing that the training department is often blamed for any unclosed gap
  – IDs may employ strategies associated with either the
performance support or the technology integration lit-
eratures. Readers wishing more information about the
latter may want to review Mark Lee’s chapter on
Technology Integration Work Settings appearing in
this Handbook.
• Collaborates with others to ensure that training transfers to the workplace. This involves asking for executive and management support for transfer before and after the training (Broad & Newstrom, 1992)
• Partners with others in the organization to ensure that the different components of the solution system integrate in
ways that close the gap
• Employs rapid prototyping and participative design to shrink development time while improving quality
• Reports the extent to which the solution system closed the performance gap
• Collaborates with others to conduct needs assessments and evaluations to answer other questions that keep decision-makers up at night
This recommendation for performance-based ID corresponds to Robinson and Robinson’s (1990) concept of training for impact. As depicted in Fig. 4.3, a continuum of
training approaches lies between training for activity and
training for impact. In the former, a requestor typically asks
for some sort of training. IDs create the training. Once deliv-
ered, the activity is finished. This form of topic-focused
instruction often fails to transfer to the workplace. While
training for activity is unfortunately commonplace, this
approach does not prepare people to perform their jobs.
Robinson and Robinson contrast this approach to training
that produces a positive impact in the workplace. Performance-
based training is designed to produce such a favorable orga-
nizational impact.
Robinson and Robinson (2006) later refine this continuum to compare traditional to performance-centered approaches, where the former is characterized by a focus on learning produced in a firefighting mode, largely independent of collaboration with the client group. In this approach, implemented
learning equates with success. In a performance-centered
approach, the focus is on what people need to do in the work-
place, with learning and other solutions being means to this
end. Practitioners of this performance-centered approach are solution-neutral, partnering both proactively and reactively
with client groups to identify causes of gaps and potential
solutions. In the performance-based approach, success means
closing performance gaps. In workplace contexts the practice of ID should be informed by HPT. While the two fields are not twins, they should be cousins in practice.
Conclusion
Clearly, it is conceptually possible to do ID without using an
HPT framework, and it is equally possible to do HPT without
doing ID. The two fields shared common theoretical roots
and methodologies more than a generation ago, but they have
different goals and have evolved in very different ways into
different professions. That said, it is increasingly common
(except perhaps in academic settings) for ID to be done
within an HPT framework which coordinates a broad range
of training and non-training interventions, using cross-func-
tional teams, and to evaluate the overall success of the proj-
ect in terms of improved organizational performance, as HPT
requires. We believe that (at least in nonacademic organiza-
tions), the trend will continue of training departments
rede fi ning their mission in organizational performance (HPT)
terms. Thus, we believe the fi elds will continue to cross-fer-
tilize and evolve their theoretical structures and methodolo-
gies. For example, the emerging interdisciplinary fi eld of
design is an in fl uence on both ID and HPT. However, we
believe ID and HPT will remain distinctly different fi elds of
professional practice. The savvy ID practitioner, therefore,
should develop the conceptual fl exibility to work effectively
within an HPT framework, on a multidisciplinary team.