A241125 Task Identification and Evaluation System (TIES)
Submitted By:

Edward Boyle
Logistics and Human Factors Division
Air Force Human Resources Laboratory
Wright-Patterson AFB, Ohio 45433

Sharon K. Garcia
Manpower and Personnel Division
Air Force Human Resources Laboratory
Brooks AFB, TX 78235-5661

Contract No. F33615-83-C-0030

31 July 1986
NOTICES
Publication of this report does not constitute approval or disapproval of the ideas
or findings. It is published in the interest of scientific and technical information
(STINFO) exchange.
The Office of Public Affairs has reviewed this report, and it is releasable to the
National Technical Information Service, where it will be available to the general
public, including foreign nationals.
Chief, Manpower and Personnel Research Division
REPORT DOCUMENTATION PAGE (OMB No. 0704-0188)
1. AGENCY USE ONLY (Leave blank)
2. REPORT DATE: August 1991
3. REPORT TYPE AND DATES COVERED: Interim - January 1986 - July 1991
4. TITLE AND SUBTITLE: Task Identification and Evaluation System (TIES)
5. FUNDING NUMBERS: C - F33615-83-C-0030; PE - 62205F; PR - 7719; TA - 19; WU - 11
6. AUTHOR(S): Walter E. Driskill, Edward Boyle, Sharon K. Garcia
Five maintenance task data systems in the Air Force provide a variety of task information. Each was
developed for different purposes, but only two of these systems use any of the data from the others. The five
systems are the Occupational Survey Method, the Maintenance Data Collection System, the Logistics
Composite Model, the Logistic Support Analysis Records, and Instructional Systems Development data. Each
of these task data systems and their uses are described in detail. Data elements which would permit their linking
in a Task Identification and Evaluation System (TIES) are identified. While the different systems can be linked,
considerable subject matter specialist assistance will be required to automate the system. Nevertheless, the
potential importance to manpower, personnel, and training integration makes development of a TIES highly
desirable.
14. SUBJECT TERMS (Concluded)
Logistics Composite Model (LCOM)
Logistic Support Analysis Records (LSAR)
Maintenance Data Collection System (MDCS)
Manpower
Occupational Survey Method (OSM)
Personnel
Small Unit Maintenance Manpower Analysis (SUMMA)
Task Identification and Evaluation System (TIES)
Training (MPT)
iii
TABLE OF CONTENTS

I. INTRODUCTION ...................................................... 1
   Background ........................................................ 6
   Development and Administration of Occupational Surveys ............ 7
   Background Information Section .................................... 8
   Duty-Task List Section ............................................ 8
   Administering USAF Job Inventories ................................ 20
   Analyzing Occupational Survey Data ................................ 20
   Task Difficulty Ratings ........................................... 28
   Training Emphasis ................................................. 28
   PRTMOD ............................................................ 28
   Storage and Retrieval of OSM Data ................................. 29
   Reliability of Occupational Analysis Data ......................... 29
   Uses of Occupational Analysis Data ................................ 31
   Critique of the Occupational Analysis Program ..................... 37
   Origin of MDC ..................................................... 39
   Development of MDC Data Base ...................................... 40
   Processing and Analyzing MDC Data ................................. 51
   Reliability of MDC Data ........................................... 57
   Uses of MDC Data .................................................. 67
   Critique of MDC ................................................... 9
   Origin of LCOM .................................................... 69
   Origin of LCOM Task Data .......................................... 70
   The LCOM Model .................................................... 75
   Analysis of LCOM Model Data ....................................... 80
   Uses of LCOM ...................................................... 82
   Critique .......................................................... 83
   Origin ............................................................ 84
   Developing LSA .................................................... 87
   LSA Data Storage and Retrieval .................................... 101
   Reliability and Validity of LSA Data .............................. 109
   Uses of LSA Data .................................................. 110
   Critique of the LSA ............................................... 111
VII. INSTRUCTIONAL SYSTEMS DEVELOPMENT ................................. 112
List of Figures

Figure                                                               Page
  5  Dictionary of Variables ......................................... 23

List of Tables

Table                                                                Page
Preface
vii
SUMMARY
Five task data systems supporting key Air Force manpower, personnel, and
implementation, and uses. These five systems were the Occupational Survey
Model (LCOM) data, Logistic Support Analysis Records (LSAR), and Instructional
Systems Development data (ISD). The purpose was to determine if these varied
informed, and comprehensive. Also, each system was assessed in terms of its
present purposes and how well it serves these purposes. The basic features of
The conclusion was that while there are formidable barriers to cross-walking
the data, task matching could be done. While much of this matching
potential for linking occupational analysis data with other data systems.
Further, no single system at present fully serves all MPT uses; hence, MPT
analysis integration. A TIES would provide much of the MPT data needed for new
weapons systems acquisition and for force management decisions, especially for
and greater discipline in the generation of task data is needed and should be
implemented immediately.
viii
TASK IDENTIFICATION AND EVALUATION SYSTEM (TIES)

I. INTRODUCTION
Given the present situation in which the different systems provide data
about work accomplished in maintenance specialties and the lack of any means of
interfacing the data among systems, several important questions emerge:
1. Is there overlap in the data they provide and, if so, is it justifiable?
3. Are the data adequate for the various MPT uses to which they are put?
2
very detailed, showing the action taken at the component level. This detail is
essential for identifying reliability and maintainability of equipment and
systems. LSA data, for the same reasons, are available at the same level of
detail. In addition, LSA can provide the component steps or elements of each
action taken as well. LCOM uses MDC action taken data as an input for building
task networks and estimating maintenance probabilities and manhours. As
employed for manpower determinations, MDC actions taken are compressed. Thus,
LCOM tasks are slightly broader than MDC or LSA tasks (actions taken).
Occupational analysis tasks are more generic, infrequently providing any
actions taken at the component level. These tasks are most frequently used for
personnel and training purposes. ISD tasks take a variety of forms. Since
they are directly used in developing training courses and materials, the level
of description varies according to the particular situation. Sometimes they
are broadly stated, incorporating several occupational analysis tasks. Other
times, they are described in more detail than the MDC or LSA actions taken. In
any case, ISD uses require that each task, however described, must be analyzed
into its component steps. Table 1 briefly displays the levels of task
identification by the five systems.
Table 1. Levels of Task Identification by the Five Systems

MDC    x   x   x   x
LSA    x   x   x   x   x
LCOM   x   x   x   x
OSM    x   x   x   *   *
ISD    x   x   x   x   x

* Infrequent
The word task and the terms task identification and task analysis have
different meanings depending upon the user, as is the case in the five systems.
Each is based on "task" performance, but those actions referred to as a "task"
are generic at best.
3
The problem for any task identification system is to employ a definition that
is most powerful for producing the data desired of the system. As will be
noted in the discussion of the systems that follow, that which may be
identified as a "task" is consistent with the use to which data are employed.
Perhaps it should be stated at this point that the concept task is not explicit
in the MDC, LSA, or LCOM data. Rather, the term action taken is used.

While task definitions vary along two major dimensions (Fleishman &
Quaintance, 1984), the tasks of the five systems fit under only one dimension.
They state that ". . . task definitions vary greatly with respect to their breadth
of coverage" (p. 48). At one end of this dimension are definitions that view a
task as an integral part of, and indistinguishable from, the larger work
situation. A task is the totality of the situation imposed upon the performer.
Such a definition is consistent with the practice in ISD.

At the other end of this dimension is the definition that treats a task as
a specific performance. MDC usage, in the form of action taken on a specific
component, is consistent with this definition. LSA usage, which parallels MDC
except that it is developed for new weapons systems, also is consistent with
this definition.

Ranging between these two extremes are the definitions employed by the OSM
and LCOM. These will be described in more detail later. Briefly, the OSM
tasks are more generic (Driskill & Gentner, 1978), LCOM less so, but not so
specific as MDC and LSA. LCOM tasks (actions taken) are derived from
combinations of action taken entries in MDC (DPSS working paper, 1986). Each
of the five systems is in fact a task identification system, not a task
analysis system, except for LSA when task analysis is also procured.
The term task analysis, too, has a variety of definitions (Fleishman &
Quaintance, 1984) and provides an even wider variety of information. In
practice the term has sometimes referred to data that are more appropriately
task identification, as in the frequent reference to OSM and LCOM data as task
analysis. These data have never been purported to be task analysis. More
correctly, the term applies to situations ranging from the analysis of tasks
into their component steps to more detailed analysis of such items as knowledge
and skill requirements and cognitive processes involved in task performance
(Horton, 1985). The scope of task analysis information is extensive; Reed
(1967), for example, cited 45 different types of such information.

In regard to task analysis, each of the five systems provides, first, the
source of the analysis--task identification. Second, each provides
information, such as equipment used, relevant to task analysis. Of the five
systems, only the LSA may provide the component steps of a task. None provide
all of the information needed for task analysis, however one might define the
term.
3. Use of the data from any of the systems is limited by its level of
specificity.
5
Background
Research conducted by the Air Force Human Resources Laboratory (AFHRL) led
to the establishment of the OSM operationally. According to Christal (1985),
research began as a result of an Air Staff requirement for a job analysis
system that would provide detailed task data collected from a large number of
job incumbents. Initial efforts were directed at devising methods for
describing the tasks of an Air Force Specialty (AFS) and administering these
task lists to job incumbents to collect job data. Later efforts developed the
Comprehensive Occupational Data Analysis Programs (CODAP) to analyze data
collected from the incumbents. Although the OSM was implemented in July 1967,
research on job analysis methods and CODAP extensions have continued, extending
the technology greatly. The present OSM is an outgrowth of this research and
the application of the processes to most of the AFSs and to more than one and
one-half million Air Force enlisted, officer, and civilian job incumbents. In
addition, the procedures were adopted by the other U.S. Services and the
Canadian Armed Forces as operational programs, as well as by other allied
nations and civilian businesses.

In the beginning, the objective of the OSM was to provide data for use in
personnel classification, specifically for AFS structure and description in AFM
39-1. The importance of task data to training decisions soon became apparent,
especially for use in ISD, and was followed by the emergence of other
applications. These added applications caused subtle but significant changes
in task description methods and in CODAP. Still, the current program clearly
and closely resembles the one originally implemented in 1967.
6
5. Relative training emphasis recommended for each task.

The numbers and kinds of data reports available are almost unlimited and can be
overwhelming. It must be clearly realized, however, that these reports provide
the six kinds of information shown.

Initially, AFSs were scheduled for survey on a periodic basis. Now, the
schedule is the result of a priorities working group consisting of
representatives of the Air Force Military Personnel Center (AFMPC), the Deputy
Chief of Staff for Technical Training (ATC), the Air Force Human Resources
Laboratory, the ATC Surgeon General, the Air Staff (DPMPT), and the USAFOMC.
This group considers requests for surveys three times a year. It also reviews
AFSs for which surveys are dated (four to five years old). Requirements are
assigned priorities and scheduled in Section 8, Volume II, Program of Technical
Training (PTT). The PTT provides for a three-year schedule.

The PTT is the source document to determine what AFSCs are to be surveyed
and when the completion dates are projected. In addition, the PTT provides the
date the last survey of an AFSC was completed.
The basis of data produced by the OSM is the USAF Job Inventory for an AFS
being surveyed. A Job Inventory consists of two parts: a background
information section and a list of duties and tasks comprising the AFSC. The
Job Inventory task list is the instrument for collecting the percent members
performing and relative ratings of time spent performing tasks, task
difficulty, and task training emphasis. The background information section
provides the data from which computer-generated job description reports are
designed and the basis for summaries of work environment information. In
addition, the background information contains items to capture job incumbent
job satisfaction, sense of accomplishment, and reenlistment intentions (for
enlisted personnel).
7
Background Information Section
Two examples of data-gathering items from job inventories are shown in

Data collected from the Background Information Section are essential for
later analysis. These data serve as the basis for generating job descriptions.
Inadequacies in the content of this section limit data analysis, because
information essential for identifying job incumbents for whom job descriptions
are desired is not available.
8
[Figure 2. Background Information section of a USAF Job Inventory, AFS 423X0: form items covering name, Social Security account number, date, grade, highest education level completed, major command of assignment, and present job title. Form reproduction; content not machine-readable.]
[Figure 2 (Continued). Background Information section, AFS 423X0: checklist of test equipment used (e.g., ammeters, battery analyzers, generators, linear actuator test stands, load banks, multimeters, oscilloscopes, reflectometers, resistance decade boxes). Form reproduction; content not machine-readable.]

Figure 2 (Continued)
11
[Figure 2 (Continued). Background Information section, AFS 423X0: checklist of aircraft electrical systems maintained in the present assignment (transport, helicopter, and fighter aircraft, e.g., C-130, C-141, CH/HH-53C, CT-39, F-4C, F-4E, F-5B, F-15, F-16). Form reproduction; content not machine-readable.]

Figure 2 (Continued)
13
[Figure 2 (Continued). Background Information section, AFS 423X0. Form reproduction; content not machine-readable.]

14
[Figure 2 (Continued). Background Information section, AFS 423X0: item asking the respondent to select the single aircraft maintenance function (e.g., avionics) that best describes the present job assignment. Form reproduction; content not machine-readable.]
15
be most effective. Tasks are listed alphabetically within a duty, research
again revealing this organization to be most effective.
16
a. Each task should be time-ratable--that is, the job incumbent can
reasonably estimate the relative amount of time he or she spends on each task.
This criterion normally eliminates tasks that begin with such words as
"insure," "have responsibility for," and "understand," which make it
difficult or impossible to determine the relative time devoted to the
activity.
17
[Rating instructions from the Job Inventory, AFS 326X0C/D: (1) check the tasks you perform now; (2) if you don't do it, don't check it; (3) in the "Time Spent" column, rate all checked tasks on time spent in the present job, on a 9-point scale running from 1 (very small amount) through 5 (about average) to 9 (very large amount). Form reproduction; remainder not machine-readable.]
Figure 3a. Job Inventory Task List, AFS 326X0C/D
18
[Figure 3b. Job Inventory Task List, AFS 423X0: the same checking and 9-point time spent rating instructions, followed by sample electrical systems tasks (e.g., isolating malfunctions on AC generator systems and on aircraft power distribution circuits). Form reproduction; content not machine-readable.]
Job incumbents in the AFSC being surveyed respond directly in the Job
Inventory. They complete the standard background items and the work
environment information appropriate to each of them. They then complete the
duty-task list section. Each respondent, first, goes through the entire list,
checking those tasks performed in the present job. Next, the respondent rates
each task on a 9-point relative time spent scale indicating the relative amount
of time spent performing each task. The scale and instructions to the
respondent can be seen in Figures 3a and 3b. It should be noted that the
relative time spent means relative to all other tasks; it is not an absolute
task time estimate.
Background information section and task list data are entered into the
AFHRL Sperry 1180 for processing. Initial processing provides further quality
control of responses. Final input must be at least 99 percent of the initial
input.
20
Of special relevance to TIES are several CODAP computer reports (briefly
described in the Appendix). The first of these is a duty-task list (SK XX,
see Figure 4). This report is a complete listing of the tasks and duties in
alphanumeric order as they appear in the inventory. As the figure shows, each
task has a unique alphanumeric identifier (Q636, for example). This listing
would appear to be the primary source of OSM tasks for linking with tasks of
other systems.

Another important initial report is the dictionary of variables (DICTXX,
Figure 5). This catalogues by number each of the background information
section items. Those numbers preceded by V indicate items in the background
section (for example, V0011, Duty AFSC, in Figure 5). Those numbers preceded
by C are computed variables. In these computed variables, values of two or
more sets of items may be combined, or may represent variables generated from
the data. This report provides the format for each of the background
variables, and is essential in designing subsequent analyses.

Also at this time, a job description for each survey respondent is
computed and stored for later retrieval as a job description for a single
individual, or for combination with other respondents to form group job
descriptions. These combinations of individual job descriptions are based on
combinations of variables selected from the background section items and
presented in the dictionary of variables.
Computation of individual job descriptions is straightforward. The time
ratings a worker provides for the tasks performed in his or her present job
are each divided by the sum of all of that worker's ratings and multiplied by
100 to yield a relative percent time spent rating for each task. The job
description for any individual can be reported, along with the background
section items to which the worker responded.

Computation of group job descriptions is similarly uncomplicated. For
whatever job description is designed, based on one or more background or
computed variables, the job descriptions for the individuals who responded to
these variables are combined. The relative percentage of time of all in the
group for each task is summed and divided by the number in the group to yield
an average percent time spent for each task.
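The two computations above can be sketched in a few lines. This is an illustrative reconstruction, not the operational CODAP code; the task identifiers and ratings are hypothetical.

```python
# Sketch of CODAP-style relative time spent computation (hypothetical data).
# Each incumbent's per-task ratings are converted to percentages of that
# incumbent's total rating; group descriptions average these percentages.

def individual_description(ratings):
    """ratings: {task_id: 9-point rating for each task performed}."""
    total = sum(ratings.values())
    return {task: r / total * 100 for task, r in ratings.items()}

def group_description(members, all_tasks):
    """Average percent time per task across members (0 for tasks not performed)."""
    n = len(members)
    descs = [individual_description(m) for m in members]
    return {t: sum(d.get(t, 0.0) for d in descs) / n for t in all_tasks}

incumbent_a = {"Q636": 5, "Q637": 3, "Q640": 2}   # task IDs are illustrative
incumbent_b = {"Q636": 9, "Q641": 1}
group = group_description([incumbent_a, incumbent_b],
                          ["Q636", "Q637", "Q640", "Q641"])
```

Each individual description sums to 100 percent, which is why the figures rank tasks relative to one another rather than estimating absolute time.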
The time spent performing tasks is not only a relative figure, but it is
expressed as a percentage. Since it is a relative figure, it is questionable
that any attempt to convert to absolute or real time spent should be made.
The purpose of the relative time spent totals is to rank order the tasks
performed by any group from the most to least time-consuming. Since
incumbents may work shifts of differing lengths, or may only be working part
time on some or all of the tasks, conversion to real time spent is difficult.

There is, however, evidence that the ordinal relationship between
relative time spent and actual time spent is high. Garcia (1984) reported a
correlation of .81 between relative time spent on the 9-point scale used in
OSM and actual observed time. Earlier, McFarland (1974), in a study of the
use of occupational analysis data (OSM) by management engineering teams (MET)
21
[Figure 4. Duty-task listing (SK XX). Computer printout; content not machine-readable.]

[Figure 5. Dictionary of Variables (DICTXX). Computer printout; content not machine-readable.]
and the measured actual time provided by the MET. In the civilian community,
Page (1976) found a similar relationship between relative and actual percent
time spent. Since the ordinal relationship is quite good, it may be that some
linear transformation of relative time spent to actual time spent is possible.
If such a transformation is possible, use of OSM for manpower determinations
could be valuable. A relative-to-actual time spent transformation method
would be especially important for LCOM modelling of non-maintenance AFSs.
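The kind of linear transformation suggested above can be fit by ordinary least squares. The sketch below uses invented data; the report does not specify a transformation method, only that the high ordinal relationship makes one plausible.

```python
# Illustrative least-squares fit of actual time against relative percent
# time spent. All data points are hypothetical.

def linear_fit(x, y):
    """Ordinary least squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

relative = [2.0, 5.0, 10.0, 20.0]   # percent relative time (hypothetical)
actual = [0.5, 1.2, 2.4, 4.9]       # observed hours (hypothetical)
slope, intercept = linear_fit(relative, actual)
predicted = [slope * r + intercept for r in relative]
```

Validating such a fit would still require observed-time data of the kind Garcia (1984) and McFarland (1974) collected.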
The analyst determines, from the diagram and from actual job descriptions
for groups selected from the diagram, the structure of the specialty. This
structure is expressed by the job types--those people who are doing the same
tasks--and job clusters, which are groups of highly related job types. These
data are important for classification purposes--deciding whether a career
ladder should be subdivided or merged with another--and for training
decisions--whether to channelize or not. The data also are useful for multiple
other purposes, such as in the benchmarking and performance measurement areas.
Of considerable importance is the existence of different jobs in every
specialty (Driskill & Mitchell, 1979). The similarity, or lack of similarity,
dictates classification structure and training programs. A significant
problem with the grouping process is the lack of an empirical basis for
determining clusters and job types or for defining the level of homogeneity or
sameness of the work performed by the members in a given group (for a
discussion of this issue, see Harvey, 1986). The product of the hierarchical
grouping process is provided in summary form in the occupational survey report
prepared for each specialty.
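The grouping of incumbents into job types rests on a similarity measure between job descriptions. A common index of this kind, sketched below with hypothetical data, is the overlap of two relative-time profiles: the sum, across tasks, of the smaller of the two percentages. This is an illustration of the idea, not a reproduction of the operational grouping program.

```python
# Overlap similarity between two job descriptions ({task: percent time}).
# Identical profiles overlap 100 percent; disjoint profiles overlap 0.

def overlap(desc_a, desc_b):
    tasks = set(desc_a) | set(desc_b)
    return sum(min(desc_a.get(t, 0.0), desc_b.get(t, 0.0)) for t in tasks)

crew_chief = {"inspect": 40.0, "service": 35.0, "document": 25.0}
scheduler = {"document": 60.0, "plan": 40.0}
similarity = overlap(crew_chief, scheduler)  # shared time on "document" only
```

A hierarchical grouping program merges the most similar descriptions first; as the text notes, where to stop merging (how homogeneous a "job type" must be) lacks an agreed empirical basis.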
25
[Computer-printout tables summarizing job group data. Content not machine-readable.]
In addition to the job descriptions and background data for each of the
job groups (described above), a number of other analyses are available.
Typically, job descriptions for each skill level, for each using command, and
for each CONUS-overseas group are reported. Each of these is accompanied by a
summary of the background information section variables for each group.

When the background section does not include appropriate items for
computing a desired description, another option exists if the task list is
equipment specific. One or more tasks can be combined into a computed variable
to use to identify persons performing the key tasks. This computed variable
then can be used to print a job description for those personnel.
Using the USAF Job Inventory task list, the USAFOMC obtains responses from
a small number of senior technicians (30 to 50) about the relative difficulty
of each task, defined in terms of how long it takes a person to learn to do the
task relative to all other tasks in the inventory. The raters use a 9-point
relative difficulty scale. Results are averaged for each task across raters
and summarized, and an intraclass correlation of the responses is computed to
determine interrater agreement. A .90 level of agreement is the criterion,
and over the years few instances of agreement falling below .90 have occurred.
The task difficulty ratings can be reported for each task alone or along with
other task data, such as percentage performing. An important point to remember
is that the difficulty of tasks associated with specific items is not available
if the task list specificity is not at the end item level.
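The interrater-agreement check described above can be sketched in modern notation. The rating values below are invented for illustration, and the particular ICC form chosen (the reliability of the averaged rating across raters) is an assumption; the report does not state which ICC variant the USAFOMC computes.

```python
# Hypothetical sketch: average 9-point relative-difficulty ratings per task
# and estimate an intraclass correlation (ICC) for the k-rater average
# (two-way layout, consistency form), from analysis-of-variance mean squares.

def icc_of_averages(ratings):
    """ratings[t][r] = rating of task t by rater r (equal raters per task)."""
    n = len(ratings)           # number of tasks
    k = len(ratings[0])        # number of raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    task_means = [sum(row) / k for row in ratings]
    rater_means = [sum(ratings[t][r] for t in range(n)) / n for r in range(k)]
    ss_tasks = k * sum((m - grand) ** 2 for m in task_means)
    ss_raters = n * sum((m - grand) ** 2 for m in rater_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_error = ss_total - ss_tasks - ss_raters
    ms_tasks = ss_tasks / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_tasks - ms_error) / ms_tasks  # reliability of the rater average

# Four invented tasks rated by three invented raters on the 9-point scale.
ratings = [[2, 3, 2], [7, 8, 8], [5, 5, 6], [9, 8, 9]]
print(round(icc_of_averages(ratings), 3))  # prints 0.985
```

In this illustration the averaged ratings would clear the .90 criterion mentioned above.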
Training Emphasis
PRTMOD
While numerous data displays are available from CODAP, the most important
for TIES may be the product of the program with the acronym PRTMOD. The run
stream producing this display permits the matching of survey tasks and
associated data items (such as percentage of incumbents performing and ratings
of task difficulty and training emphasis) with data items from other sources.
Most occupational survey reports include several of these displays--at least
one matching tasks and Specialty Training Standard (STS) elements and another
matching tasks to Plan of Instruction (POI) criterion objectives.
1. It shows how information and data from sources outside CODAP files can
be matched and displayed with survey task data.
4. ASCII CODAP, developed by MAXIMA for the AFHRL, will permit various
statistical analyses of data from the different sources and display of these
analyses.
For TIES, the PRTMOD process will permit the display and, potentially, the
analysis of data from the various task identification systems. These data
would be grouped into modules that may be displayed individually or in
combination with other modules.
The members within each of these ladders were randomly divided in halves.
Consolidated job descriptions were computed for each of these half samples.
The percent performing and percent time spent values for each pair of job
descriptions were correlated. Table 2 is reproduced from the Christal report.
The correlations, as the table shows, are very high. In addition, Christal
reported that the high values were applicable to job descriptions for as few
as 15 members.
The second evidence of stability of the data over time is that reported
by Driskill and Bower (1978). They reported that of 76 career ladders
surveyed between 1 January 1977 and 30 June 1978, 71 represented
resurveys. Of these 71, 59 remained stable over the time since the previous
survey. They specifically cite two surveys as examples. No differences of
structure were found for either the Dental Laboratory or Recruiter
specialties. Further, for the Recruiter specialty, none of the participants
in the second survey were the same as in the first: inventory development
specialist, analyst, or respondents. Since specialty assignments are
controlled tours, none of the personnel with the AFSC who were in the initial
survey were still assigned the AFSC five years later when the resurvey
occurred. Yet, the two surveys yielded highly comparable results. Other
evidence concerning the consistency and stability can be found in Christal
(1969).
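The split-half procedure just described can be illustrated with a short sketch. All values here are invented; the sample sizes, task counts, and performance rates are assumptions chosen only to show the mechanics of correlating two half-sample job descriptions.

```python
# Illustrative sketch of the Christal split-half check: randomly split a
# career ladder's members in half, compute percent-performing values for
# each half, and correlate the two columns.
import random

def pearson(x, y):
    """Plain Pearson product-moment correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def percent_performing(members, n_tasks):
    """members[m][t] = 1 if member m performs task t; returns one value per task."""
    return [100 * sum(m[t] for m in members) / len(members) for t in range(n_tasks)]

random.seed(1)
n_tasks = 200
base = [random.random() for _ in range(n_tasks)]  # invented true performance rates
members = [[1 if random.random() < p else 0 for p in base] for _ in range(300)]
random.shuffle(members)
half_a, half_b = members[:150], members[150:]
r = pearson(percent_performing(half_a, n_tasks),
            percent_performing(half_b, n_tasks))
# With half samples of this size, r typically falls above .9, as in Table 2.
```

The design choice mirrors the report: agreement between independent half samples is the evidence of stability, not agreement between raters.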
Table 2

                                              Percent       Percent Time Spent
Air Force Specialty             1/2 N(a)      Performing    by Total Group

Helicopter Mechanic
  43130                             39          .965            .955
  43150                            256          .996            .986
  43170                             81          .961            .828
  43190                             26          .957            .965
Medical Administrative
  90630                             76          .936            .894
  90650                            347          .985            .969
  90670                            189          .981            .951
  90690                             56          .968            .961
Management Engineering
  73331                             42          .971            .958
  73370                            101          .984            .957
  73371                            180          .994            .973
  73391                            133          .952            .889
Outside Wire/Antenna
  36150                            199          .987            .965
  36170                             92          .963            .919
  36190                             22          .958            .895
Electrical Power Production
  54330                             70          .953            .952
  54350                            457          .987            .986
  54370                            143          .981            .970
Radiology
  90350                            180          .997            .996
  90370                             78          .986            .975
Education and Training
  75132                            146          .983            .972
  75150                             45          .974            .918
  75170                             30          .960            .935
  75172                            381          .995            .992
  75190                             28          .891            .895
Medical Materiel
  91530                             63          .888            .813
  91550                            292          .978            .959
  91570                            137          .949            .908
  91590                             21          .931            .890
Preventive Medicine
  90750                            113          .979            .955
  90770                             63          .966            .917
Jet Engine Mechanic
  43230                             76          .979            .961
  43250                            473          .997            .992
  43270                            241          .985            .968
  43290                             35          .982            .974

(a) These values indicate the number of cases in each of the two subsamples.
Total number of cases entering into the computations reported in this table
was 9,822.
Second, two or more career ladders are sometimes surveyed jointly to
determine the degree to which members of one AFSC perform the tasks of another
AFSC. These data impact decisions to merge career ladders. The same
difficulty exists as for decisions for dividing an AFSC--the data do not
address similarity of knowledge and skills involved in task performance. The
question of whether members of one AFSC can effectively transfer to work in
another AFSC is not normally addressed by survey data. There have been a few
occasions when the question has been addressed in multi-ladder surveys by
employing concurrent interviews with members of each of the AFSCs being
surveyed. During these interviews, each task for each AFSC is reviewed to
determine similarity of knowledge and skills with the tasks of another AFSC.
This process is much more time-consuming and resource-intensive than the
normal inventory development process. The process does, however, yield
greater information on which to base AFS merger decisions.
A further use for classification has been to use task difficulty data to
determine aptitude requirements for AFS (see, for example, Garcia, Ruck, &
Weeks, 1985). As a result of research by the AFHRL, difficulty ratings for the
tasks of a specialty are benchmarked so that they may be compared with the
ratings of other specialties. Based on this benchmarked difficulty, aptitude
requirements for an AFS are estimated. While the feasibility of operational
implementation was established, such action has not occurred.
The importance of the technology lies in its being, at present, the best
method of relating aptitude to work requirements. For TIES, it offers, in
combination with other data from the OSM, LSA, and MDC, a means of
benchmarking the difficulty of tasks for new equipment and estimating aptitude
requirements for the tasks for this equipment.
Survey data also were the basis for categorizing career ladders according
to their mechanical, administrative, general, and electronic requirements
(MAGE) (Driskill, Keeth, & Gentner, 1981; Bell & Thomasson, 1984). Each of the
four categories was broken down into smaller, more meaningful categories. For
example, for the administrative area, three types of administrative tasks were
discovered: clerical, computational, and office equipment operation. Subject
matter specialists from various administrative fields then identified a set of
tasks representative of each of these types.
More recently, the data have been a primary source for Utilization and
Training (U&T) Workshops. Representatives from the Air Staff, using commands,
and the training community review the job data to determine how the workforce
is being utilized. Once utilization issues are resolved, they then sort the
tasks, using ATCR 52-22 guidelines and task difficulty and training emphasis
ratings, into those tasks to be trained in the initial resident training
courses and those tasks to be trained on the job or in advanced and lateral
courses. This process is being modeled in some current research to develop a
Training Decisions System (TDS), which will be addressed briefly below.
It is important to note here that the data are employed for decisionmaking.
Interviews with training development personnel reveal that the task data
(percent performing, difficulty, and training emphasis) are the initial source
for ISD. They indicate that for actual development of training, task
specificity is an issue. In some cases, tasks are combined to represent what
more appropriately might be entitled jobs. In others, tasks are further
defined. In every case, detailed task analysis is required.
Also, in the training area, occupational data have been the basis for
forecasting training requirements for developing systems for which real
data are not yet available. In one instance, a scenario-based approach was
used (Tartell, 1979). Highly qualified personnel were presented an extensive
list of tasks and equipment that might be employed to accomplish an
accompanying scenario. These personnel indicated which of the tasks and
equipment they believed would be involved. Interrater agreement was
exceedingly high (in excess of .90), and the results were used to establish
training.
In another instance, using occupational data that described comparable
systems, subject matter specialists for the newer system indicated tasks they
believed appropriate for the newer system. Again,
agreement among the specialists was quite high, and their estimates of
requirements were used to develop training programs. While the accuracy of
their estimates could not be assessed, their high level of agreement provided
the best basis for training decisions in the absence of any other source.
The task modules are developed initially from occupational analysis data
by hierarchical clustering of the tasks to show how they are co-performed--
that is, tasks that tend to be performed together by the same job incumbents.
These modules are modified by subject matter specialist judgments. Since the
USAF Job Inventory tasks are the basis of the clustering, there are instances
(see, for example, the tasks in Figure 3b) when the tasks are very generic.
The array of systems or equipment on which job incumbents work is not
enumerated. Addition of MDC data, which are weapon system and even end item
specific, would add a level of detail that can be expected to enhance decisions
about the task modules.
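The co-performance grouping described above can be sketched as follows. The task names, incumbent sets, similarity measure (Jaccard overlap), and threshold are all invented assumptions; the operational CODAP clustering is a full hierarchical procedure, whereas this sketch uses a simple single-linkage merge to show the idea.

```python
# Minimal sketch of grouping tasks into modules by co-performance:
# tasks whose sets of performing incumbents overlap strongly are merged.

def jaccard(a, b):
    """Overlap of two incumbent sets, 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster(task_incumbents, threshold=0.5):
    """Single-linkage merge: union tasks whose incumbent overlap meets the
    threshold; returns a list of task-name sets (the 'modules')."""
    parent = {t: t for t in task_incumbents}
    def find(t):                       # union-find with path compression
        while parent[t] != t:
            parent[t] = parent[parent[t]]
            t = parent[t]
        return t
    tasks = list(task_incumbents)
    for i, a in enumerate(tasks):
        for b in tasks[i + 1:]:
            if jaccard(task_incumbents[a], task_incumbents[b]) >= threshold:
                parent[find(a)] = find(b)
    modules = {}
    for t in tasks:
        modules.setdefault(find(t), set()).add(t)
    return sorted(modules.values(), key=min)

incumbents = {                         # invented tasks and incumbent IDs
    "remove engine": {1, 2, 3, 4},
    "install engine": {1, 2, 3},
    "troubleshoot fuel system": {5, 6, 7},
    "repair fuel pump": {5, 6},
}
print(cluster(incumbents))  # two modules: engine tasks and fuel-system tasks
```

Adding MDC work unit codes as extra features, as the text suggests, would let the same grouping distinguish which equipment a generic task is performed on.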
1. Task statements
6. User identification codes
7. Training materials identification and locations
8. Task certification-before-performance requirement codes
9. Recertification requirement codes and frequencies
10. Common task requirement codes
11. Position task requirement codes
12. Task factors
13. Weapon system or equipment that AFS supports
14. Support equipment required to perform tasks
15. Training/evaluation/development priority codes
16. Task steps/performance sequence
Survey tasks, since they are not limited to equipment-specific tasks, can
provide some of these data. Depending upon the comprehensiveness of the
inventory, some tasks may not be included. Thus, additional local task
development effort can be anticipated. Availability of MDC and LSA data in a
system that would permit aggregation of similar task items could materially
improve the development and quality of the master and local task lists.
A third effort, Small Unit Maintenance Manpower Analysis (SUMMA), is
developing an F-16 data base oriented toward job specialty restructure. This
data base will permit redefinition of specialties without regard to existing
specialty structure (Moore & Boyle, 1986). Special purpose task analysis
efforts have been required (e.g., to assess similarity of knowledge and skill
requirements). OSM data tend to be general (in most cases) and not always
directly relatable to specific equipment, making cross-AFS and cross Mission
Design Series analysis difficult. Specific equipment tasks can be identified,
however, in OSM by using combinations of background items to generate job
descriptions. The inability of existing task identification methods and data
bases to handle this urgent MPT problem (Boyle, Goralski, & Meyer, 1985) was
one of the reasons for the development of a TIES (Ruck & Boyle, 1983).
Critique of the Occupational Analysis Program
Occupational analysis data have several important features that need
summarizing:
1. The survey program extends to all AFS, providing data for a variety
of personnel, training, and research uses. In contrast to LSA, MDC, and LCOM,
which are maintenance oriented, the data are AFS oriented.
2. Tasks in the survey data are at the highest level of generality of the
four data systems. While they have been extremely useful for classification
and training decisions, development of training materials depends on further
task analysis. All uses of the data could be enhanced if task descriptions
were more consistently and specifically related to equipment maintained or
operated. In the LSA and MDC systems, a limited set of actions is related to
equipment items (by work unit codes).
3. Tasks are written in the language of the worker and are intended to
differentiate among workers. Thus, in the maintenance AFSCs, there is not a
close relationship between the action verbs and those used in MDC, LCOM, and
LSA. Where task writing guidelines permit, closer correspondence of survey
task language and MDC, LCOM, and LSA tasks or actions taken is desirable.
5. Data analysis using CODAP is highly flexible and provides a wide array
of powerful analysis strategies.
6. Data that have been analyzed, as well as the raw data, are fairly easily
accessed. Since the raw data tapes are maintained by the AFHRL, further
analysis can be made.
7. The CODAP capability to display and analyze data from other sources
along with task data is an important asset directly usable in a TIES.
c. The generic nature of the task statements also poses problems for
some applications. Foley (1980), while recognizing the power of the
occupational analysis methodology, nevertheless criticized the task statements
for their lack of hardware specificity and for their use of nonstandard action
verbs and functions.
10. All the same, survey data, besides the task difficulty and training
emphasis information, provide information not available in other systems and
which is exceedingly important for MPT. First, tasks can be related to skill
level. Second, the job typing analysis provides work structure information.
The power of these data would be enhanced if they could be related to MDC and
LSA data. It would be particularly attractive if job types could be assessed
with regard to the detailed MDC or LSA action taken tasks comprising them.
The advantage would lie in the ability to assess whether common knowledge and
skill requirements exist--or whether there are sufficient differences to
warrant different training or classification and assignment vehicles.
The MDC is the oldest of the task identification and data systems
discussed in this report. It is an equipment or weapons system oriented data
base that does not easily lend itself to analyses of task performance by AFS.
No data are recorded for work performed by nonmaintenance AFS. Data are
collected for aircraft, missiles, and communications equipment.
Origin of MDC
MDC is described in AFR 66-1, TO 00-20-2, and AFM 66-267. Its purpose is
to collect, at base level, detailed maintenance data, including work center,
work unit code, action taken, how malfunctioned, maintenance time, crew size,
and
worker identification. From these data, a variety of analyses providing
information about manhours, the reliability and maintainability of equipment
and weapons systems, product performance, weapons system readiness, product
improvement, and- support requirements and costs may be obtained (Quick
Reference Guide, undated).
The MDC data base consists of detailed data about on-equipment, off-
equipment, and depot maintenance work. The data originate at base level (or
depot), the input originating on AFTO Form 349 (Figure 9), with the input flow
depending upon whether the maintenance is scheduled, unscheduled, or phase.
Unscheduled maintenance is a term used to indicate that something is broken or
malfunctioning. At that time an AFTO Form 349 is prepared for the reported
failure and designated for flightline action in the case of an aircraft. If
the failure can be corrected on-aircraft, the work is accomplished. If off-
equipment maintenance is required, an AFTO Form 350 (Figure 10) is prepared
and routed along with the item to the appropriate shop. In some cases, the
maintenance work requires activity of more than one shop, in which case the
receiving shop prepares another AFTO Form 350 to route the item to the other
shops. The flow for scheduled and phase maintenance is essentially the same,
except the initial AFTO Form 349 does not result from a reported failure.
Figure 9 shows an AFTO Form 349. All data elements are supposed to be
completed. The sources for the entries are AFR 66-1, AFR 66-267, and the
00-20 series technical orders. Many of the codes to be entered are standard,
but there are subsystem and component entries required that must be obtained
from the 00-20 series technical order for the aircraft or the appropriate
technical order series for missiles or communications equipment on which work
is being performed. The standard codes significant for a TIES are shown in
Table 3. Each of them can be related to the item number of the AFTO Form 349:
Figure 9. AFTO Form 349.
Figure 10. AFTO Form 350.
Table 3. Data Elements Utilized in the Maintenance
Data Collection System
JOB CONTROL NUMBER (JCN) - A UNIQUE SEVEN-CHARACTER NUMBER USED TO CONTROL AND
IDENTIFY MAINTENANCE JOBS, AS WELL AS TO IMPROVE ANALYSIS CAPABILITY.
EXAMPLE: 0410001; 041 IS THE JULIAN DATE AND 0001 IS THE FIRST JOB OF THE DAY.
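The JCN structure in the example above (three-digit Julian date plus four-digit job sequence) can be expressed as a small parsing sketch; the function name and the returned field names are illustrative, not part of the MDC system.

```python
# Sketch of splitting a seven-character job control number (JCN) into the
# Julian date and the job-of-day sequence, per the example 0410001.

def parse_jcn(jcn):
    """Return the two parts of a JCN as integers."""
    assert len(jcn) == 7 and jcn.isdigit(), "a JCN is seven digits"
    return {"julian_day": int(jcn[:3]), "job_of_day": int(jcn[3:])}

assert parse_jcn("0410001") == {"julian_day": 41, "job_of_day": 1}
```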
TYPE MAINTENANCE CODES (TM) - A CHARACTER USED TO IDENTIFY THE TYPE OF WORK
ACCOMPLISHED. TYPE MAINTENANCE CODES ARE OBTAINED FROM THE APPLICABLE WORK
UNIT CODE MANUALS FOR THE TYPE OF EQUIPMENT WORK IS BEING PERFORMED ON.
AIRCRAFT TYPE MAINTENANCE CODES ARE LISTED BELOW.
A - SERVICING
B - UNSCHEDULED MAINTENANCE
C - BASIC POSTFLIGHT OR THRUFLIGHT INSPECTION
D - PREFLIGHT INSPECTION
E - HOURLY POSTFLIGHT OR MINOR INSPECTION
H - HOME STATION CHECK
J - CALIBRATION OF OPERATIONAL EQUIPMENT
M - INTERIOR REFURBISHMENT
P - PERIODIC, PHASE OR MAJOR INSPECTION
Q - FORWARD SUPPORT SPARES
R - DEPOT MAINTENANCE
S - SPECIAL INSPECTIONS
T - TIME COMPLIANCE TECHNICAL ORDERS
Y - AIRCRAFT TRANSIENT MAINTENANCE
WORK UNIT CODE (WUC) - FIVE CHARACTERS USED TO IDENTIFY THE SYSTEM, SUBSYSTEM,
AND COMPONENT ON WHICH WORK IS REQUIRED OR PERFORMED. THE FOLLOWING SHOWS THE
BREAKDOWN OF A COMMON AIRCRAFT WUC:
LISTED BELOW ARE THE BASIC STANDARD AIRCRAFT SYSTEMS AS INDICATED BY THE FIRST
TWO POSITIONS OF THE WUC:
09 - SHOP SUPPORT
10 - NOT USED
11 - AIRFRAME
12 - COCKPIT AND FUSELAGE COMPARTMENTS
13 - LANDING GEAR
14 - FLIGHT CONTROLS
17 - AERIAL RECOVERY
22 - TURBOPROP POWER PLANT
23 - TURBO-JET ENGINE
24 - AUXILIARY POWER PLANT
32 - HYDRAULIC PROPELLER
41 - AIR CONDITIONING, PRESSURIZATION, AND SURFACE ICE CONTROL
42 - ELECTRICAL POWER SUPPLY
44 - LIGHTING
45 - HYDRAULIC AND PNEUMATIC POWER SUPPLY
46 - FUELS
47 - OXYGEN
49 - MISCELLANEOUS UTILITIES
51 - INSTRUMENTS
52 - AUTOPILOT
55 - MALFUNCTION ANALYSIS AND RECORDING EQUIPMENT
56 - AUTOMATIC ALL WEATHER LANDING
61 - HF COMMUNICATIONS
62 - VHF COMMUNICATIONS
63 - UHF COMMUNICATIONS
64 - INTERPHONE
65 - IDENTIFICATION FRIEND OR FOE
66 - EMERGENCY COMMUNICATIONS
68 - AIR FORCE SATELLITE COMMUNICATIONS
69 - MISCELLANEOUS COMMUNICATIONS EQUIPMENT
71 - RADIO NAVIGATION
72 - RADAR NAVIGATION
73 - BOMBING NAVIGATION
74 - FIRE CONTROL
75 - WEAPONS DELIVERY
76 - ELECTRONIC COUNTERMEASURE
77 - PHOTOGRAPHIC/RECONNAISSANCE
82 - COMPUTER AND DATA DISPLAY
89 - AIRBORNE BATTLEFIELD COMMAND CONTROL CENTER
91 - EMERGENCY EQUIPMENT
94 - METEOROLOGICAL EQUIPMENT
96 - PERSONNEL AND MISCELLANEOUS EQUIPMENT
97 - EXPLOSIVE DEVICES AND COMPONENTS
98 - ATMOSPHERIC RESEARCH EQUIPMENT
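The rule stated above (first two WUC positions identify the system) can be sketched as a simple lookup. The dictionary below holds only a few entries drawn from the list; the function name and fallback value are illustrative assumptions.

```python
# Sketch: map the first two positions of a five-character work unit code
# (WUC) to the standard aircraft system, using a subset of the list above.

AIRCRAFT_SYSTEMS = {
    "11": "AIRFRAME",
    "13": "LANDING GEAR",
    "42": "ELECTRICAL POWER SUPPLY",
    "72": "RADAR NAVIGATION",
}

def system_of(wuc):
    """Return the system name indicated by the first two WUC positions."""
    assert len(wuc) == 5, "a WUC has five characters"
    return AIRCRAFT_SYSTEMS.get(wuc[:2], "UNKNOWN")

assert system_of("13ABC") == "LANDING GEAR"
```

The remaining three positions, as noted later in the text, identify the subsystem and component and must come from the 00-20 series technical order for the MDS.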
M - DISASSEMBLE
N - ASSEMBLE
P - REMOVED
Q - INSTALLED
R - REMOVE AND REPLACE
S - REMOVE AND REINSTALL
T - REMOVED FOR CANNIBALIZATION
U - REPLACED AFTER CANNIBALIZATION
V - CLEAN
X - TEST-INSPECT-SERVICE
Y - TROUBLESHOOT
Z - CORROSION REPAIR
NOT REPAIRABLE THIS STATION CODES
1 - REPAIR NOT AUTHORIZED BY SHOP
2 - LACK OF EQUIPMENT, TOOLS, OR FACILITIES
3 - LACK OF TECHNICAL SKILLS
4 - LACK OF PARTS
5 - SHOP BACKLOG
6 - LACK OF TECHNICAL DATA
7 - LACK OF EQUIPMENT, TOOLS, FACILITIES, SKILLS, PARTS OR TECHNICAL DATA
(REPAIR IS AUTHORIZED, BUT THE ABOVE ARE NOT AVAILABLE)
8 - RETURNED TO DEPOT
9 - CONDEMNED
HOW MALFUNCTION CODE (HM) - THIS CODE CONSISTS OF THREE CHARACTERS AND IS USED
TO IDENTIFY THE NATURE OF THE EQUIPMENT DEFECT, OR THE STATUS OF THE ACTION
BEING ACCOMPLISHED. ONLY THOSE CODES THAT ARE APPLICABLE WILL BE LISTED IN
EACH WORK UNIT CODE MANUAL. DUE TO THE NATURE OF SUPPORT GENERAL TYPE WORK,
THE RECORDING OF ACTION TAKEN, WHEN DISCOVERED, AND HOW MALFUNCTION CODES IS
NOT REQUIRED WITH SUPPORT GENERAL WORK UNIT CODES. A COMPLETE LIST OF
AUTHORIZED CODES IS
CATEGORY OF LABOR (CLB) - THIS DATA ELEMENT IS USED TO DIFFERENTIATE THE TYPE
OF MAN-HOURS EXPENDED, AS LISTED BELOW.
MISSION DESIGN SERIES (MDS) - THIS 7-DIGIT ELEMENT IS THE COMPLETE DESIGNATION
FOR AIRCRAFT, MISSILES AND C-E EQUIPMENT.
EXAMPLE: NKC135A
SERIAL NUMBER (SN) - THE 8-DIGIT SERIAL NUMBER ASSIGNED TO THE ITEM. FOR
ENGINES AND RELATED PARTS THIS NUMBER IS CONTROLLED BY AFM 400-1.
ESTIMATED TIME IN COMMISSION (ETIC) - YEAR, DAY AND HOUR OF ESTIMATED TIME AN
ITEM WILL BE RETURNED TO OPERATIONAL STATUS.
STATION LOCATION CODE (SLC) - THIS IS A 4-DIGIT CODE LISTED WITHIN AFM 300-4
FOR THE BASE, OPERATING LOCATION, OR SITE AT WHICH THE WORK WAS PERFORMED.
TAG NUMBER (TAG) - THE LAST THREE DIGITS OF THE AFTO FORM 350 TAG NUMBER THAT
IS PREPARED AND IS TO BE ATTACHED TO THE REMOVED ITEM WHICH WAS IDENTIFIED
WITH AN ASTERISK IN THE WORK UNIT CODE MANUAL.
FEDERAL SUPPLY CLASS (FSC) - THE FIRST FOUR DIGITS OF THE NATIONAL STOCK
NUMBER OF THE ITEM BEING REMOVED.
PART/LOT NUMBER (P/N) - THE PART NUMBER OF THE ITEM BEING MODIFIED OR REMOVED,
INCLUDING SLASHES AND DASHES BETWEEN NUMERICS ONLY. FOR CONVENTIONAL MUNITIONS
ITEMS THIS WILL BE THE LOT NUMBER OF THE ITEM. FOR ITEMS THAT DO NOT HAVE
PART/LOT NUMBERS, ENTER THE NATIONAL ITEM IDENTIFICATION NUMBER (NIIN) WHICH
IS THE LAST NINE CHARACTERS OF THE NATIONAL STOCK NUMBER (NSN).
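The NIIN fallback rule quoted above can be expressed as a short sketch. The function name is illustrative; the 13-character length of an NSN (4-digit FSC plus 9-digit NIIN) follows from the FSC and NIIN definitions in the table.

```python
# Sketch of the rule above: when an item has no part/lot number, the NIIN --
# the last nine characters of the national stock number (NSN) -- is entered.

def niin_from_nsn(nsn):
    """Strip the dashes from an NSN and return its final nine characters."""
    digits = nsn.replace("-", "")
    assert len(digits) == 13, "an NSN is 13 characters: 4-digit FSC + 9-digit NIIN"
    return digits[-9:]

assert niin_from_nsn("1560-01-234-5678") == "012345678"
```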
Table 3. (Concluded)
Several points should be made about some of the data items.
2. Column J elicits crew size for the action reported. This information,
coupled with the work unit code and action, reflects the number of personnel
required to perform the work. This number, however, is subject to
misinterpretation, because trainees are often included in the number of
personnel performing a job.
Example: Radar Navigation System, Column C, 1st and 2nd digits (WUC).
The reader should recall that the information in the third through fifth
digits of the WUC must be obtained from the 00-20 series technical order for
the MDS. Codes, however, are accessible in the MDC B4 master file.
The WUC, along with the MDS, is the source for matching actions taken with
AFS. Normally, a work center is manned by a single AFS. The exception is in
COMO flightline maintenance, where the WUC would match up only to the lead
worker.
The same process as described for aircraft also applies to communications-
electronic equipment and missile maintenance.
As indicated above, MDC data originate with the employee accomplishing the
work. The raw data are either keypunched or entered via a terminal at base
level. Data are aggregated at base level, and listings appropriate to the
base maintenance activity are provided daily.
Base level data are further aggregated at major command level. Each 30
days, major commands forward MDC data tapes to AFLC/MME-2 at Wright-Patterson
AFB. There the tape files are added to the Maintenance and Operational Data
Access System (MODAS), which is an interactive retrieval system for data for
the past two years. Provision is made for operational units to have on-line
access to MODAS.
The flow of the data from base level to AFLC is illustrated in Figure 11,
which is reproduced from the Quick Reference Guide to Maintenance Data
Collection (MDC). Notice that the data are aggregated in the D056A file, from
which MODAS output is generated.
The input for MODAS, the G063 file, is built from the D056A file. MODAS
design is shown in Figure 12. Notice that MMICS data also are input to the
G063 file. MDC data are also maintained by the contractor who developed the
weapon system.
Figure 11. Flow of MDC data from base level to AFLC.
Figure 12. MODAS design.
Figure 15. Menu for selecting aircraft system for display of data for
option 1, Figure 14.
Figure 16. Worst case menu for option 7, Figure 15.
Figure 17. Reliability report for option 1, Figure 16; these data have
implications for deciding where training emphasis may be required.
Figure 22. Important menu for obtaining work unit codes; displays the
master B4 file.
The assertion is sometimes heard that workers may not have ready access to
the codes when they are completing the AFTO 349, and this situation reduces
accuracy of the data. Sometimes, interviews revealed, the worker logs the
work against some code recalled or logs the work against the vaguely defined
"support-general" code. A large number of support-general entries was
reported during interviews, and at least one command is testing the deletion
of the code.
In summary, the question of how reliable MDC data are is best answered by
stating that reliability is suspect in the minds of many maintenance
personnel. No empirical evidence of its unreliability was discovered in this
study. Perhaps the time has come to resolve the question once and for all.
Such a study, for example, could determine variances in manhours among units
and investigate the sources of any variance that was uncovered. But there is
considerable opportunity for AFTO 349 entries to be inaccurate, for the
reasons cited above.
In the MPT arena, the data are typically used in base level maintenance
shops to assess overall productivity, to track maintenance actions, and to
determine manpower impacts of shifting workloads. According to interviews
with maintenance managers, the data are especially important for forecasting
manning requirements. On-the-job training is tracked through MMICS, which
documents the experience that unit personnel acquire and records their
training and certification.
Above base level there is little evidence of the use of the data, except
as a basis for input to the LCOM model, to be discussed below. In the 1977-78
time frame there was an abortive effort to use MDC data for designing resident
training, especially in determining requirements for electronic principles
training. The volume of material, as well as the lack of a means of aggregating
and analyzing the data, made its use for training unsuccessful.
That MDC are not being used to any noticeable extent for training or
classification and related personnel uses does not mean that the data cannot be
used for such purposes. Indeed, a review of the MDC such as reported here
reveals the data base to be a rich source of information for training and
personnel decisions.
Critique of MDC
2. MDC are weapons system specific. AFSC analysis is not possible with
the present computer analysis capability--e.g., MODAS.
3. It is not possible to infer with any confidence who is doing the work,
since neither skill level nor grade is reported on AFTO Form 349.
6. While MDC data are not presently being used for personnel or training
purposes, the utility of the occupational analysis data (described in another
section) that are presently used for decisionmaking in these areas could be
considerably enhanced if MDC data were linked with the occupational analysis
data. For example, data about reliability of specific subsystems or components
could augment occupational analysis training emphasis data for making decisions
about what to train in initial resident training courses. Also, in the
personnel field the occupational analysis data are based on more generally
defined tasks than MDC. If MDC tasks were linked with these more general
tasks, a much clearer picture of the scope of work of an AFS would emerge,
giving more substance on which to base decisions to merge or shred career
ladders. In addition, programs that are now in the research stage that are
discussed under the OSM section (i.e., the Training Decisions System, Advanced
On-the-Job Training System) would be facilitated by the availability of a
system for organizing and analyzing an exhaustive and detailed task
identification and data system. Certainly, the efforts in SUMMA would be
enhanced by a TIES.
V. LOGISTICS COMPOSITE MODEL DATA
As indicated in the previous section, MDC data are a primary input to the
LCOM modelling process, either in the form of maintenance tasks and manhours
for operational systems or in the form of comparability analysis for a new
weapons system. One of the inputs to LCOM is a listing of tasks with
associated manhours. These tasks are weapon system specific, but the tasks are
identifiable with their appropriate AFS. While task identification data are an
input of LCOM, it is not the modelling process that produces them. The tasks
and associated data are produced from MDC and operational audits and task
networking created for input into the LCOM model.
Origin of LCOM
Because the human resources required to support and maintain a weapons
system are a major element in the total cost of the system, a need exists to
identify and develop estimates of these resources early in the development of a
new system. A pilot project was initiated in 1971 to develop, test, and employ
a simulation model that would effectively predict manpower requirements for a
new weapon system. The AX (now A-10) Close Air Support Weapon System was
selected as the simulation test bed (Maher & York, 1974). The intent was to
transition the model to the operational command for use in determining manpower
authorizations. The LCOM, developed earlier by RAND, was the model chosen for
the pilot project. This transition occurred, and at present several commands
are employing LCOM to simulate manpower requirements.
In terms of task identification, an input to the LCOM process today yields
tasks performed on operational systems as well as tasks to be required of new
systems. The task identification and associated data are generated by the
Data Preparation Subsystem (DPSS) as one input into LCOM. For the new
systems, the tasks are the result of comparability analysis, although Cronk
(1985) reported an instance in which Logistics Support Analysis data (see
Section VI) were employed as input in LCOM modelling of a new system.
It should be reiterated that the LCOM model is a simulation process. The
task identification data are simply input data, which serve as one of the
parameters for modelling manpower and other factors.
The office of primary responsibility for the LCOM model for manpower is
Hq AFMEA/MEXL, Randolph AFB, Texas. Their brochure (AFMEA, 1986) describes
LCOM as a multi-functional computer modelling system designed to determine
through simulation the resource requirements of a system. The major user of
LCOM for modelling manpower is the Tactical Air Command. AFMEA and interviews
with TAC personnel are the principal sources of the information on LCOM in this
report.
of detail that can be input and in the range of resources and "systems" that
can be studied. LCOM is merely a resource counter, and it runs on a queuing
concept. At present, the Army Research Institute is evaluating LCOM, along
with TSAR, for use in analyzing manpower and unit performance. The AFMEA
Primer lists such uses as vehicle maintenance operations, space shuttle
operations, and aerial port operations. Interviews at Tactical Air Command
(TAC) headquarters in June 1986 revealed that manpower requirements for
supply and hospital operations were modelled in LCOM.
For operational weapons systems, LCOM task data originate from historical
data from the ". . . most current computerized maintenance records . . ." (AFMEA,
1986), i.e., MDC, and from operational audits of these data. The task data for
emerging weapons systems, except for the instance cited by Cronk (1986), are
derived from comparability analysis.
The Air Force Systems Program Office (SPO) determines the comparability of
the hardware developed for the new system with other systems currently in the
inventory. Engineers compare the designs of similar aircraft, drawing on the
experience of associates who have worked on various programs, contractor data,
and Air Force Technical Orders, as necessary. The results are then written up
by subsystem work unit code, and include: identification of comparable
aircraft and subsystem work unit code(s); any additional Line Replaceable Units
(LRU) in the new subsystem, or LRUs by work unit code in the comparable system
that are not applicable; any factors that are applied to the comparable
subsystem failure rates or task times in estimating for the new subsystem; and
narrative analysis specifying the criteria used and supporting rationale for
choosing the comparable subsystem and factors. Any scheduled maintenance
considerations should also be mentioned. In some cases an item is so new or so
changed that there is nothing reasonably comparable. In that case, the best
source of data (e.g., contractor) should be identified, and appropriate factors
and degree of confidence discussed. Study results should be reviewed in
conjunction with experienced maintenance personnel to be sure no maintenance
considerations are missed.
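The factor-based estimating step described above can be sketched as follows. The subsystem figures and factors are hypothetical; in practice the analysis is documented in narrative form by work unit code.

```python
# Illustrative sketch of comparability estimating: a new subsystem's
# failure rate and task time are projected from a comparable subsystem,
# scaled by analyst-supplied factors. All names and numbers are invented.
def estimate_new_subsystem(comparable, failure_factor=1.0, time_factor=1.0):
    return {
        "failure_rate": comparable["failure_rate"] * failure_factor,
        "task_time": comparable["task_time"] * time_factor,
    }

# Comparable subsystem drawn from an existing aircraft's MDC history.
comparable = {"failure_rate": 0.02, "task_time": 45.0}  # failures/hr, minutes

# The new design is judged 20% more reliable but slightly slower to repair.
estimate = estimate_new_subsystem(comparable, failure_factor=0.8,
                                  time_factor=1.1)
print(estimate)
```

The narrative analysis the text calls for would record why these factors were chosen and the degree of confidence in them.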
The next step is to obtain MDC data tapes on the aircraft with comparable
subsystems. For operational systems, acquisition of the MDC data tapes is also
one of the first steps. Data from these files are processed through the DPSS,
which alters the form of the tasks and generates task sequence networks.
Processing of MDC is shown in Figure 25. This figure shows the input of MDC
data and the use of the B-4 master file WUC dictionary.
[Figure 25. Processing of MDC data through the DPSS. The flow diagram is
largely illegible in the scan; discernible elements include the B-4 master
file WUC dictionary, MDC data input, and extraction, audit, analysis/update,
selection, and cannibalization analysis processes.]
Table 4. MDC Action Taken Code Conversion to LCOM Action Taken Codes
(Source: DPSS Manual)

[Table body illegible in scan.]
Table 5. Definitions Assumed for LCOM Action Codes
(Source: DPSS Manual)
Table 6. MDC Action Taken Code Definitions
(from AFM 300-4, Vol. XI)
As a result of the conversion process, every maintenance task performed on
the aircraft is identified. Accompanying data for each of them are crew size,
AFSC, and task time. These items are a part of an output vital to a TIES.
To build a model (AFMEA, 1986), the analyst, through the input module
software, characterizes the operation being simulated. This activity includes
the data and rules describing each aspect of the operation under study. Three
kinds of data (Figure 27) are entered in the input module:
3. Supply data are the final input used to build the model. Supply data
identify resource type, cost, authorization, valid substitutes, failure rates,
stock levels, and other factors.
[Figures illustrating the LCOM input module and program structure, including
Figure 27, are illegible in the scan.]
After the analyst has prepared the data using the Input Module software,
the analyst builds a model which then is simulated in the Main Module of the
program software (see Figure 27). The Main Module takes the model scenario and
actually simulates the operation that was described. Individual aircraft are
preflighted, loaded with munitions, taxied, flown, recovered, and maintained.
Maintenance is divided into scheduled, unscheduled, and phase (periodic
inspection). The simulation tracks the number of personnel and physical
resources required to run the operation, as each aircraft is flown and turned
according to the operation scenario. With large simulation models with many
maintenance tasks, the simulation can take hours of computer CPU time.
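The queuing concept on which LCOM runs can be illustrated in miniature. The following sketch is not LCOM itself; the task list and crew pool are invented. It shows how maintenance tasks competing for a limited pool of crews generate waiting time, the basic quantity a resource-counting simulation tracks.

```python
import heapq

# Minimal sketch of a queuing simulation: maintenance tasks compete for
# a limited pool of crews, and the simulation accumulates how long work
# waits for a free crew. The real model also handles sorties, supply,
# and scheduled maintenance.
def simulate(tasks, crews):
    """tasks: list of (arrival_time, duration); returns total wait time."""
    # Each heap entry is the time at which a crew becomes free.
    free_at = [0.0] * crews
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrival, duration in sorted(tasks):
        crew_free = heapq.heappop(free_at)
        start = max(arrival, crew_free)      # wait if no crew is free yet
        total_wait += start - arrival
        heapq.heappush(free_at, start + duration)
    return total_wait

tasks = [(0, 5), (1, 3), (2, 4)]  # three unscheduled maintenance actions
print(simulate(tasks, crews=1))   # 10.0 time units of waiting
print(simulate(tasks, crews=2))   # 2.0 time units of waiting
```

Varying the crew pool and rerunning, as here, mirrors how a resource counter reveals the manpower level at which queuing delay becomes acceptable.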
"If a new weapons system is being designed with stated mission, logistic,
and maintenance requirements, what are the life cycle costs?"
Table 7. LCOM Extended Form 11 Listing

Column titles: Line Number, Prior Node, Task, Next Node, Selection
Parameter, Subsystem, Time to Repair (Mean, Variance, in minutes), Crew
Size, AFSC

Note: Column titles were supplied by TAC LCOM specialists, and data are
extracted from a TAC-generated Extended Form 11. [Table body illegible in
scan.]
The subsystem column reflects the aircraft subsystem (82--computer and
display). Under time to repair, the mean time for the install action on the
component is 5.6 minutes, with a variance of 2 minutes. The last two columns
reflect that two personnel with AFSC 304X4 are required. Contrary to the usual
convention, the X, the fourth character of the AFSC, does not stand for skill
level. Rather, the X reflects that the task is performed on the flight line.
Typically in the Extended Form 11 document, the fourth character is used to
designate where the task is performed. For a TIES, the key data elements are
the action taken, work unit code, and the AFS. A significant problem is the
lack of a central data base.
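A minimal sketch of such a task record, with the key TIES elements pulled out, might look like the following. The field names and sample values are illustrative only, not the actual Extended Form 11 layout.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the Extended Form 11 fields discussed
# above; field names and sample values are invented for illustration.
@dataclass
class Form11Task:
    subsystem: str      # aircraft subsystem code, e.g. "82"
    action_taken: str   # action taken code
    wuc: str            # work unit code
    afsc: str           # e.g. "304X4"; 4th character marks work location
    mean_time: float    # minutes
    variance: float     # minutes
    crew_size: int

def ties_key_elements(task):
    """Extract the three elements the text identifies as key for a TIES."""
    return (task.action_taken, task.wuc, task.afsc)

task = Form11Task("82", "R", "82ABC", "304X4", 5.6, 2.0, 2)
print(ties_key_elements(task))  # ('R', '82ABC', '304X4')
assert task.afsc[3] == "X"      # location indicator, not skill level
```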
At present, the most up-to-date LCOM analyses are available from the
organization accomplishing the LCOM process. Mr. Charles Begin, ASD/ENSCC,
Wright-Patterson AFB, indicates, however, that he maintains a collection of
LCOM data bases. The data are provided voluntarily, but Mr. Begin surveys LCOM
users semiannually to obtain new data files.
Uses of LCOM
TAC reported that all major TAC weapons systems have had LCOM modelling:
F-15, F-16, A-10, F-4E, RF-4C, F-111A, EF-111A, T-38, and A-7. Studies of
surge, sustained, peacetime, reserve, and guard operations have been completed.
Studies are reaccomplished every 4 to 5 years.
affecting manpower, reliability and maintainability needed in a component
subsystem, reallocation of tasks among work centers, utilization of personnel
(which is a manpower, personnel, and user issue), and in an availability-
readiness model.
Critique
taken items, resulting in a more concise but less detailed listing of tasks.
An important feature of the associated data that differs from available MDC
output from MODAS is the indication of the AFS required to perform the actions
taken and crew size.
4. The key data elements for a TIES are found on the listing of the
"Extended Form 11" and consist of LATC, WUC, and AFSC.
Origin
While data required for LSA have been developed over the past two decades,
it was not until the publication of MIL-STD-1388-2A, DOD Requirements for a
Logistics Support Analysis Record, 20 July 1984, that Department of Defense
standard requirements, data element definitions, data field lengths, and data
entry requirements for Logistic Support Analysis Record (LSAR) data were
defined. The standard allows delivery of LSAR data in a manual, automated, or
combination manual and automated format. The standard Joint-Service LSAR
Automated Data Processing (ADP) software may be used, or industry-developed
LSAR ADP systems may be used. These industry ADP systems, however, must comply
with minimum standards in MIL-STD-1388-2A.
Developing LSA
Since there are a number of DID that may be scheduled for delivery with
relevance to MPT, a list of them with a short description of their purpose,
LSAR application, and record source is shown in Table 8. These DID are not
available from LSA data tapes and are not automated.
During full scale development, LSAR data records are completed to the
hardware indenture level identified in Figure 32. The data are used to develop
logistic support requirements for testing, operation, and deployment. The
iterative nature of the LSAR process culminates at the end of this phase. A
completed LSAR data base is now in place.
[Table 8, spanning several pages, is illegible in the scan.]
LSAR data established during the development phase are retained for use
during the production and deployment phase. They are used to support logistics
analyses and as a basis for design change and requirements for succeeding
generations of the materiel acquisition.
In the succeeding paragraphs, LSAR data records significant for MPT are
listed. Specific instructions for completing these records are in Appendix A,
MIL-STD-1388-2A.
4. Data Record E, Support Equipment or Training Material Description and
Justification (Figure 36), is structured to consolidate information related to
new support or test equipment and training material.
LSAR reports with potential significance for MPT are given below:
2. LSA-002, Personnel and Skill Summary (Figure 39), reports the manhours
by skill specialty code, expended on each task.
5. LSA-014, Training Task List (Figure 42), provides the rationale for
training requirements and training location requirements.
As indicated earlier, the LSA data are developed by the contractor. While
MIL-STD-1388-2A requires specific formats, data may be in a manual, automated,
or combined mode. Most large development efforts should be recorded in the
automated mode.
[Figures reproducing LSAR data record and report formats, including Figures
36, 39, and 42, are illegible in the scan.]
present time. The most readily available access at the present time is through
accessing contractor records.
The Unified Data Base (UDB) project, programmed for completion in 1987,
should provide easy access to LSAR. Plans call for the UDB to provide an
on-line system featuring comprehensive editing and help capabilities,
user-defined function keys, narrative text editing, illustrated parts
breakdown, and data indexing, among many other capabilities. The plan calls
for a dial-up capability to access current and historical LSAR. Observation of
a demonstration of the development efforts of the UDB in January 1986 revealed
that menu-driven access to the data should make LSAR data easily and quickly
accessible.
Two factors affect the stability and accuracy of LSA data. First, the
data are supposed to be developed iteratively, with provisions made for
updating the various records as development indicates changes to be required.
Second, estimates of such factors as skill specialty codes and manhours are
merely subjective estimates. The technician supplying skill specialty codes
may or may not, for a variety of reasons, select the appropriate code.
Further, equipment reliability and maintainability entries have little basis
in experience until late in the life cycle. Interviews revealed, for example,
that reliability estimates for some equipment items were grossly incorrect.
Also, interviews indicated that the AFS codes frequently must be changed. In
regard to the frequent changes reported as needed in AFS designation, it
should be noted that AFS decisions for a new weapons system are the result of
negotiations between the personnel classification functions in the Air Force
Military Personnel Center (AFMPC) and the major command gaining the new
system. The contractor has every incentive to minimize requirements for new
MPT resources. On the other hand, the gaining command has incentives to
capture resources by advocating new AFSs or shreds of existing AFSs. The
AFMPC role is to reconcile the requirements consistent with such guidelines as
similarity of task and job requirements, assignment equity, and promotion.
While the contractor may be entirely correct in the designation of the AFS to
perform a task, these major command and AFMPC considerations may change the
designation.
Mulligan and Bird (1980) make this statement about LSAR data in their
publication of guidelines for maintenance task identification and analysis:
"Analysts may be aware that the quality and technical integrity of LSAR data
varies considerably from one program to another" (p. 37). Of most use are the
LSAR data bases that are required in full by contract and are implemented
completely by the contractor with a continuously updated data base. Many
times, however, a limited LSA is procured. In some cases the LSA is performed
during the conceptual and developmental phases, in which case the LSA data are
limited and do not remain current. In other cases, LSA data relevant to the
TIES problem are not procured. Also, data sheets E, F, and G are not usually
entered into the computer data base.
Uses of LSA Data
Aside from use of the LSA data for design changes, as a basis for future
design considerations, and for development of technical manuals and
publications, use of the data varies. There obviously is a wealth of data for
manpower purposes. For each task identified, manhour and elapsed time
estimates are provided. Summaries of these requirements by skill specialty
code, organizational level, and annual maintenance manhour requirements are
available. There have been efforts to utilize these data for LCOM modelling
of manpower. Cronk (1986) reported that an LCOM conversion program for
inputting data provided by the LSA preprocessor is being developed.
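The skill-specialty-code summaries described above amount to a roll-up of task-level manhour estimates. The sketch below illustrates the idea with invented records; it is not the LSA-002 format.

```python
from collections import defaultdict

# Hypothetical task-level LSA records; task names, skill specialty
# codes, and manhours are invented for illustration.
tasks = [
    {"task": "remove/replace pump", "ssc": "423X1", "manhours": 1.5},
    {"task": "bench check pump",    "ssc": "423X1", "manhours": 0.75},
    {"task": "align radar",         "ssc": "328X0", "manhours": 2.0},
]

def manhours_by_ssc(records):
    """Sum estimated manhours for each skill specialty code."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["ssc"]] += rec["manhours"]
    return dict(totals)

print(manhours_by_ssc(tasks))  # {'423X1': 2.25, '328X0': 2.0}
```

Summaries of this kind, extended by organizational level and annualized, are what the manpower community would draw from the LSAR.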
The Aeronautical Systems Division (ASD) of the Air Force Systems Command
has accomplished LCOM manpower studies using LSAR for the T-46. Cronk's
conclusion is that LSAR summary reports are inadequate for use in LCOM
modelling. The C and D data records, however, contained approximately 90
percent of the data required by LCOM, at least for unscheduled maintenance. He
stated that the LSA master file from the contractor, if processed by the joint
ADP system, would probably satisfy all of the data requirements, in which case
an interface program could be developed.
1. Blank data fields
This organization is charged with developing training materials and
providing the initial cadre of trainers who are assigned at the first site
implementing a new system operationally. The training development process
should begin at least three years before the implementation for such systems
as the B-1. In every case, the development process must have a long lead time.
Using specialists from various AFSCs who are T-prefix qualified, the 3306th
builds course training standards, plans of instruction, and materials. The ISD
process is employed, and requires detailed system analysis as its first step.
The task analysis reported on LSA-015 satisfies much of the system analysis
requirement. Other LSA reports also provide valuable input.
While the data developed during this TDY are available for training
development for initial resident technical training, their use for this purpose
is uncertain. One inhibiting reason is the volume of the hardcopy data.
Personnel in the Training Development Service, USAFOMC, were unaware of the
scope and content of the data.
Because of their immediate need for the LSA data, the 3306th T and E
Squadron is acquiring a terminal for access to the UDB. While data may not be
complete, UDB access will provide these personnel with the data available, its
updating, and relatively easy access for limited analysis. Any such analysis
that could be accomplished appears to be limited to a given data record.
Analysis across data records is not available.
4. MPT uses have been limited. Some agencies whose functions could be
enhanced by usage of the data were unaware of data scope and content.
5. Criticisms have to do less with the LSA and LSAR per se than with data
availability, completeness, and awareness in the MPT community that they exist.
Certainly, MIL-STD-1388-2A provides the mechanism and guidance for generation
and display of data relevant to MPT.
6. One may get a false sense of security about LSAR, because it has an
"engineering" veneer. In fact, LSA data relevant to MPT are quite subjective
and not dealt with as rigorously as in the case of OSM data or in the
operational audits employed to develop input into LCOM.
The first step of the Air Force five step ISD process is to analyze system
requirements. This analysis requires a needs assessment (Goldstein, 1978)
which includes task analysis, organizational analysis, and person analysis.
Results of these analyses are used in the second step of the model to define
education and training requirements. It is in this step that decisions about
what is to be trained are made. Thus, it seems apparent that prerequisite to
the first two steps are the requirements to identify tasks that will be
performed by those who are trained as well as information to decide the
emphasis to be placed on training each of the tasks and how training should be
developed.
Origin of ISD
AFR 50-8, AFMs 50-2 and 50-62, and AFP 50-58 specify the requirements and
methods for developing instructional programs according to the five step ISD
model. While policy requires the ISD process be used in developing training
programs, the five step model is most frequently applied to new courses or
programs. Interviews revealed that while most of the steps of the ISD model
are applied to some degree in every development effort, there is no audit trail
of the development efforts. In terms of the first two steps of the model,
occupational analysis data as they are available serve as the task
identification source. When these data are unavailable, training personnel
must generate their own task identification data or find some other source.
Information from the occupational analysis data base, such as task difficulty
and training emphasis, is also extremely valuable for deciding which tasks are
to be trained in resident courses. The OSM data are also useful for other
technical training (e.g., advanced courses), but not nearly so much as for
resident training for which ATCR 52-22 provides guidelines for applying the OSM
data to decisions about initial resident technical training.
The use of Utilization and Training (U&T) Workshops grew out of a Hasty
Grad initiative in 1977. U&T workshops are made up of users, trainers,
personnel representatives, and occupational analysts. This group, using OSM
data and subject matter expert guidance, defines training requirements,
satisfying the first two steps of the ISD model.
More recently, AFR 50-58 (6 Aug 1984) in paragraph 3b directs that
"Anticipated training requirements for each stage in the life cycle of a weapon
or support system or for Air Force personnel career qualification are defined
by a multicommand Training Planning Team (TPT). The results of TPT efforts are
documented in a Training Development Plan (TDP)." Members of a TPT include Hq
ATC, using command, and AFMPC representatives.
A TPT can be initiated by the Air Staff (functional manager or other major
staff element) through issuance of any of the normal change directives in the
Planning, Programming, and Budgeting System (Driskill, Mitchell, & Ballentine,
1985). When such a directive is issued, Hq USAF/DPPT will direct Hq ATC to
form a TPT to develop career ladder TDPs. A TPT can be required for a variety
of reasons:
ISD Data
1. Task identification
2. Task criticality
3. Percentage performing
4. Number performing
5. Frequency of performance
6. Learning difficulty
8. Task analysis (action taken, item acted on, steps of the task,
knowledges, and skills)
Data relevant to ISD are recorded on a variety of forms. These data are
not automated. Furthermore, interviews revealed that documentation is
inadequate. The one exception is the task analysis data being developed by the
USAFOMC/OMT where the efforts are systematically documented and accessible
through the USAFOMC IBM 4341 system. Data developed by the 3306th TES are
available, but they must be manually accessed.
When data from the other task identification systems are employed, the
reliability of the data of those systems applies. For ISD specialist-developed
data, reliability is not assessed. Reliability would depend upon the
replicability of the data through successive SME interviews or panels.
Across the various task data bases is a wide variety of information that,
if a method of aggregation can be developed, could be highly useful for MPT
purposes. Table 9 displays these various kinds of information, identifying
each type of data and the task system or systems that produce them. For each
information type, the table shows which MPT function can make use of the data.
When an M, P, or T is underlined, the function is at least partially making use
of the kind of data provided.
Given the variety of information and uses, can the task data systems be
linked so that all of the information would be available? The answer is yes.
Data elements and literal translations identifying tasks exist in each of the
data bases that will permit linking tasks and associated data among systems.
Table 9. Illustration of Task Data
Systems and Data Provided
[Table body garbled in source. Recoverable row labels: Component,
Percent Performing, Relative Difficulty, Consequences, Support Equipment,
Job Satisfaction. Cell entries of M, P, and T indicate which MPT functions
can make use of each kind of data from each task data system.]
But, because of the different ways tasks are described among the systems, some
manual methods of linking will inevitably be required to supplement an
automated linking methodology.
There are several precedents for linking data from different systems. The
first of these precedents occurs in the linking of specialty training standard
elements and plan of instruction criterion objectives with occupational
analysis program tasks and related data. Using the PRTMOD option of CODAP, the
USAFOMC provides a matching of survey tasks and the training documents in each
survey report. The matching is accomplished by subject matter specialists in
the career ladder to which the data apply. Recently an automated matching
process based on the semantic content of the different sets of data to be
matched was developed to enhance the task and training document matching
activity. Subject matter specialist input to fine-tune the matching is still
required; it is not entirely automated.
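The operational CODAP matching algorithm is not reproduced in this report. As a rough sketch of the idea (the word-overlap scoring rule, the 0.5 threshold, and all task statements below are illustrative assumptions, not the actual system), a matcher might score candidate pairs by vocabulary overlap and flag low-scoring pairs for subject matter specialist review rather than auto-accepting them:

```python
def token_overlap(a: str, b: str) -> float:
    """Jaccard overlap of the words in two task statements."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def match_tasks(survey_tasks, sts_elements, auto_threshold=0.5):
    """For each survey task, propose the best-scoring training standard
    element; pairs scoring below the threshold are flagged for subject
    matter specialist review instead of being accepted automatically."""
    results = []
    for task in survey_tasks:
        best = max(sts_elements, key=lambda e: token_overlap(task, e))
        accepted = token_overlap(task, best) >= auto_threshold
        results.append((task, best, accepted))
    return results

# Hypothetical task statements, for illustration only.
survey = ["remove and replace fuel pump", "inspect landing gear struts"]
sts = ["remove or replace fuel pump assembly",
       "perform landing gear inspection"]
for task, element, auto in match_tasks(survey, sts):
    print(task, "->", element, "(auto)" if auto else "(review)")
```

The design point mirrors the text: automation proposes matches from semantic content, but the human specialist remains in the loop to fine-tune them.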
The matching of the three task data sources consisted of three steps.
First, subject matter specialists mapped the maintenance tasks onto
occupational analysis tasks. Next, the specialists linked the occupational
analysis tasks to training standard tasks. Finally, the results of the mapping
procedure were validated by additional subject matter specialists.
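Under the assumption that each of the first two steps produces a task-to-task lookup table, the procedure amounts to composing two mappings; the third step then reviews the composite. The task identifiers below are invented for illustration:

```python
# Step 1 (hypothetical identifiers): maintenance (MDC) tasks mapped to
# occupational analysis (OSM) tasks by subject matter specialists.
mdc_to_osm = {"MDC-001": "OSM-A", "MDC-002": "OSM-A", "MDC-003": "OSM-B"}

# Step 2 (hypothetical): OSM tasks linked to specialty training
# standard (STS) elements.
osm_to_sts = {"OSM-A": "STS-10", "OSM-B": "STS-11"}

# Composing the two tables links maintenance tasks directly to training
# standard elements; step 3 (validation) would review this result.
mdc_to_sts = {mdc: osm_to_sts[osm] for mdc, osm in mdc_to_osm.items()}
print(mdc_to_sts)
```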
Specialty training standards contain elements that are not task specific,
such as understanding of some principle of operation, or safety training.
Monson et al. eliminated this type of task. In the PRTMOD match of OSM tasks
and training standard elements, however, the subject matter specialists link
the non-task elements with task statements.
To identify the AFSC of incumbents who performed work on the F-16, Monson
et al. established an identification code for each participant. Such a code is
required if the data are extracted from AFTO 349, since AFSC is not recorded.
The result of the Monson et al. study was the production of job
description reports. These reports could be generated in terms of the
maintenance, occupational analysis, or STS tasks. What is not clear in their
report is whether there was a one-for-one matching of the tasks, or whether
more than one OSM task or specialty training standard element was mapped to a
single MDC task. Experience with PRTMOD matches and the difference in the
level of detail of MDC as opposed to the OSM tasks suggest that a one-to-one
match is unrealistic.
While members of MPT organizations speak clearly about their missions and
organizational objectives, they do not so clearly articulate the needs for data
to support their decisions. Most of the organizations rely to varying degrees
on data from one of the task data systems. Only one part of the MPT structure,
however, has any written guidelines for applying data to the decisionmaking
process. ATCR 52-22 establishes rules for applying occupational analysis data
in making decisions about what tasks are to be trained and where they are to be
trained. Other aspects of the training establishment as well as the manpower
and personnel functions are free to use, or not to use, products of the OSM,
MDC (except as it is required in LCOM), LCOM, and LSA in the manner they
choose.
study. A job inventory was administered to personnel in the AFSC 51XXL
workcenters to obtain the relative time spent estimates. At the same time, the
MET performed its time measurements. As noted earlier, the correlation between
the two time measures was .79. More importantly for using OSM with LCOM for
manpower determinations were these findings:
1. Hierarchical grouping of the OSM data became the basis for a MET
recommendation to restructure workcenters.
McFarland cautioned that OSM data are more useful for large workcenters and the
more homogeneous AFSs than for smaller workcenters and heterogeneous AFSs.
assigning tasks to AFS structures on the basis of similarity of knowledge and
skills.
1. A statement of each task.
3. Information on which to base decisions about when and where each task
should be trained.
For new equipment or systems, the ISD personnel in the 3306th Test and
Evaluation Squadron use LSA data when they are available early enough in the
development cycle. Otherwise, the 3306th personnel develop necessary
information from other sources, usually through on-site interviews and
observation with contractor personnel developing the new equipment. They need
access to the LSA data as early as possible.
In summary, each of the MPT functions utilizes data from at least one of
the task identification systems. They each need timely, accurate, and complete
data at the most specific level possible.
X. TIES APPLICATIONS
The benefits to be derived from a TIES originate in two related sources.
First, a TIES will provide task identification from generic to specific levels
of detail, including the analysis of tasks into component steps. Any MPT
system whose decisionmaking function is based on task information would
certainly be enhanced by the more detailed data from TIES. Second, a TIES
would provide for a system of detailed audit trails of personnel and training
activities and a mapping of their interrelationships.
AOTS and TDS would also be served by more detailed task information. AOTS
especially stands to gain from a TIES, because of the requirement for master
and local task lists. The first could be generated by a combination of OSM and
MDC data as a first-cut master task list in maintenance AFSs. Local job
descriptions could be the product of the base-level generated MDC combined with
other task identification processes at base level.
experienced expert is not necessarily easy for someone who is just beginning to
learn the work of a specialty. There is a need for a methodology of estimating
the degree of difficulty of performing tasks of one kind when the job incumbent
has experience with other kinds of tasks, or no experience at all.
There also is considerable promise of a TIES for new weapons system MPT
issues, especially where comparability analysis is used to arrive at manpower
and reliability and maintainability estimates. If the MDC task data of the
comparable systems have been mapped to OSM and LSA tasks, then job type,
training emphasis, task difficulty, specialty training standard information,
and the task analysis results are available for sorting out questions of AFS
designations and definitions of aptitude, educational, strength and stamina,
and training requirements.
While Armstrong and Moore clearly illustrate the formal structure, the
variety of perspectives each of the communities has on MPT issues is not shown.
MPT integration will require accommodation of these perspectives.
Specifically, each of the MPT functions views the issue of accomplishing work
in different terms, and, indeed, there are other players, namely the user and
the worker, who are crucial. The user generally will have three variations of
the same perspective. At the highest level, the desire is to acquire and train
personnel to perform whatever job must be accomplished in the shortest time
possible (although the present initiative in SUMMA suggests some softening of
this position). At intermediate levels, the perspective is more limited,
incorporating the need for job incumbents who can accomplish the work required
by the intermediate level and to retain them as long as possible. At the
lowest level, the concern is much more specific: the training and retention of
personnel to do the specific jobs to which they are assigned. This
perspective, while varying in degree, has strong implications for their view
of occupational classification structures.
Workers have different perspectives. They are concerned with their sense
of accomplishment and job satisfaction, the meaningfulness of the work they
perform, promotion potential, career progression, and assignment desirability.
For example, one has only to look at numerous occupational surveys of AFSs,
especially in the maintenance specialties, to see how job incumbent
satisfaction has varied within and among specialties as a function of changes
in classification structure. This perspective, too, has strong occupational
classification implications.
The manpower function must address two basic considerations. First, jobs
must be identified and described so that they can be matched to an appropriate
AFR 36-1 or 39-1 classification code. Second, the manpower resources necessary
to accomplish workload must be determined. As SUMMA shows, as does the TSAR
modelling of combat maintenance capabilities reported by Dunigan et al.
(1985), occupational classification structure has a major impact on manpower
requirements.
From the personnel perspective, aside from the selection issue, there are
several important criteria. Occupational classification optimally would
combine into a single structure those jobs in which knowledges and skills are
highly similar and where transfer of them from one job to another is at a
maximum. There are additional considerations, however, in structuring.
Assignment equity, especially overseas rotation, career progression, promotion
potential, and meaningfulness of work (all of essential concern to workers)
must be accommodated in occupational classification.
But, what does this discussion have to do with a TIES? TIES will not
solve the issue of MPT integration. A TIES will, however, bring into a single
data base a number of sources of information that should facilitate
integration. At present, none of M, P, or T uses more than one data base--and
those data bases are different. A "task" to manpower, or to maintenance
personnel, is not the same to personnel or to training in deciding what to
train. A TIES would not necessarily change their respective definitions of a
task, but it would match the different "tasks." Further, a TIES would
aggregate a variety of information relevant to MPT issues in a single source,
i.e., such factors as tasks, job satisfaction data, job structure, task
difficulty, etc. It also is possible that over time the values of the various
criteria can be developed, so that they can be weighted according to various
situations. If these values and weights can be determined, modelling of
occupational classification structure to optimize MPT, user, and worker
criteria can become a reality.
Building a TIES is not the most difficult of the problems that lie ahead.
Typically, MPT personnel are not data oriented, and time and effort will be
necessary, first, to acquaint them with a TIES and, second, to equip them to
make use of the data. It is not clear how the various factors should be
weighted and integrated.
The idea of a TIES for use in MPT is seductive, but its creation and
implementation are not without problems and issues of some concern.
1. Historical data, i.e., data presently in the OSM, MDC, LCOM, LSA
files, are difficult to access and were not originally developed to interface
with any other data system. For these data, whose interfacing will be resource
intensive, it is suggested that a TIES be developed on request only. A TIES
initiative seems better directed at the actions necessary to align the
different systems for interfacing than at the creation of new data.
3. As with any new or expanded data system, initial use will be limited.
Time and training will be required for potential users to become aware of the
data system.
Some actions that can be taken in the near term to facilitate future
interface are suggested below:
1. Increase compatibility of USAF Job Inventory verb and noun task
statements with MDC, LCOM, and LSA tasks. This effort can be approached in two
ways. First, action taken statements from the MDC, LSA, and LCOM bases can be
adapted to the survey tasks during the inventory development process. Here the
USAFOMC may need research assistance to aggregate the different data sources
and develop a methodology. Under the pressure to produce inventories and
survey reports, the USAFOMC has little resource for developmental efforts.
Second, inventory format for data collection should be studied from a research
perspective. Foley (1980) suggested a revised format that potentially could
extend the range of equipment-specific data collected in occupational surveys
as well as resolve the incompatibility of action verbs. The importance of
Foley's suggestion is that it illustrates an alternative approach to data
collection format. What Foley's work is not is a job analysis system devised
from research findings. It is a suggestion for another approach to collecting
maintenance task data through surveys. As such it should not be employed
operationally until it is subjected to the same kinds of tests of accuracy and
stability of data as the present program underwent. For the success of a TIES,
however, reduction of incompatibility of action taken verbs between the OSM and
the MDC, LSA, and LCOM must be achieved.
5. Improve discipline in completing LSA data records and MDC AFTO 349s to
assure that correct data codes, especially WUC, are entered.
If data from the four primary systems are successfully related, output
which displays the matched data will inevitably raise several crucial
questions. First, given the array of equipment included under a certain
occupational analysis task, a logical question is "How similar are the
knowledges and skills involved in the maintenance action taken among the
different equipment?" A related question concerns how a job incumbent,
especially a first-termer, can effectively move from performing one set of
tasks in an AFS to another set--on different equipment or at different
locations. These two questions are at the very heart of the classification
problem. They, unfortunately, are not easily resolved, but research which
looks at the ability of subject matter specialists to estimate the similarity
of knowledges and skills required of tasks among job types or specialties is
urgently needed.
there is sufficient reason to suggest that a methodology could be developed for
obtaining estimates useful for determining similarity of jobs.
There also is some evidence that subject matter experts can
provide certain detailed information that, when aggregated, yields acceptable
estimates of a factor about tasks. A case in point is a study reported by
Rose et al. (1984) in which the factor to be developed was the complexity of
tasks. Rather than obtain estimates of the complexity of each task, Rose et
al. obtained specific kinds of information about the characteristics of the
tasks from the subject matter specialists -- e.g., the quality of the technical
manual describing a task; the logical sequencing of steps of the tasks. An
algorithm for combining these data into a single complexity rating was
developed. As a result, Rose et al. were able to rank a set of tasks from
most to least complex.
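Rose et al.'s actual algorithm is not reproduced in this report. A minimal sketch of the general approach (the two characteristics, their weights, the rating scale, and the task names are all assumptions for illustration) combines per-characteristic ratings into one score per task and ranks the tasks:

```python
# Hypothetical characteristic ratings from subject matter specialists
# (1 = favorable, 5 = unfavorable); the weights are assumed, not Rose
# et al.'s actual values.
ratings = {
    "repair hydraulic actuator": {"manual_quality": 4, "step_sequencing": 5},
    "replace indicator lamp":    {"manual_quality": 1, "step_sequencing": 1},
    "adjust fuel control unit":  {"manual_quality": 3, "step_sequencing": 4},
}
weights = {"manual_quality": 0.6, "step_sequencing": 0.4}

def complexity(chars):
    """Combine characteristic ratings into a single complexity score."""
    return sum(weights[c] * value for c, value in chars.items())

# Rank tasks from most to least complex, as in the Rose et al. study.
ranked = sorted(ratings, key=lambda t: complexity(ratings[t]), reverse=True)
print(ranked)
```

The point of the sketch is the indirection: specialists rate observable task characteristics, and the algorithm, not the rater, produces the complexity ordering.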
The point of this detour is that there is evidence developing to show that
in-depth knowledge of two or more AFSs may not be necessary to estimate how
efficient movement of a worker from one job to another, consisting of different
tasks, may be. If research could develop an estimation methodology, the
importance of a TIES for MPT would multiply dramatically.
Since the OSM task statements tend to be fairly general, they provide a
high-level summary for cross-referencing MDC, LCOM, and LSA items across data
systems. It is anticipated that many MDC, LCOM, and LSA data elements will map
into each OSM task. Any mapping of MDC, LCOM, and LSA data to OSM task
statements will have to be accomplished by a methodology to be developed.
evaluation and correction of the automated matching and completion of mapping
which requires substantive expertise. One method of obtaining a preliminary
mapping is to require, for all future occupational surveys, inventory
development specialists to obtain subject matter expert input about MDC actions
taken as they relate to tasks during inventory development interviews. This
requirement should be expected to improve task list consistency and coverage,
as well as provide the initial and crucial step in interfacing the systems.
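Because many detailed MDC, LCOM, and LSA elements are expected to map into each general OSM task, the cross-reference naturally takes the form of a one-to-many lookup. A sketch of that aggregation follows; every identifier and task statement is invented for illustration:

```python
from collections import defaultdict

# Hypothetical (system, task_id, OSM task) triples, as might result
# from the mapping methodology discussed above.
matches = [
    ("MDC",  "23ABC-R",  "repair engine fuel system components"),
    ("MDC",  "23ABD-T",  "repair engine fuel system components"),
    ("LSA",  "LSA-0147", "repair engine fuel system components"),
    ("LCOM", "UM-23A",   "inspect engine fuel system"),
]

# Aggregate the detailed records under each general OSM task statement,
# so a global description can point back to its specific sources.
cross_ref = defaultdict(list)
for system, task_id, osm_task in matches:
    cross_ref[osm_task].append((system, task_id))

for osm_task, sources in cross_ref.items():
    print(osm_task, "<-", sources)
```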
Questions to be answered involve LCOM and ISD tasks. LCOM tasks, because
action taken codes in MDC are aggregated in the DPSS, are more general than MDC
tasks. This situation suggests that LCOM tasks would precede a listing of
appropriate MDC actions taken at the component level. Also, in LCOM data
preparation, the first three digits of the WUC are often used, the last two
digits reflecting component-level maintenance being omitted. A second issue is
the placement of ISD tasks. They range from a consolidation of OSM tasks,
making them very general in nature, to the MDC actions taken at the component
level.
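The WUC truncation mentioned above can be sketched concretely. Assuming five-character work unit codes on MDC records (the codes and manhour figures below are invented), dropping the last two characters aggregates component-level maintenance up to the subsystem level used in LCOM data preparation:

```python
from collections import defaultdict

# Hypothetical MDC records: (five-character WUC, maintenance manhours).
mdc_records = [("23ABC", 1.5), ("23ABD", 0.8), ("23ACA", 2.1), ("41AAA", 0.5)]

# Truncating each WUC to its first three characters discards the
# component-level detail, as is often done for LCOM.
lcom_hours = defaultdict(float)
for wuc, manhours in mdc_records:
    lcom_hours[wuc[:3]] += manhours

print(dict(lcom_hours))
```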
Since the individual task data systems address the pressing needs of their
various users, what is the value of a TIES? TIES would integrate the separate
systems into a full system capable of accommodating and summarizing very
detailed information into more global descriptions and, when necessary,
identify the specific sources contributing to each global description.
Four of the five task identification systems (OSM, MDC, LCOM, and LSA)
provide the MPT community with a variety of task data. These
data vary in their content, currency, ease of access, accuracy, and
reliability. While each system generally satisfies the uses for which it was
designed, a combination of the data from two or more of the data systems would
increase the decision data base for MPT issues.
The LSA is an excellent source of task analysis information for use in
technical publication development and in designing training. Unfortunately,
some trainers report that the data are completed too late in the life cycle to
be useful for developing the training materials to be used at implementation of
a new weapons system. Also, accessibility of the data has been a problem,
although the development of the Unified Data Base (UDB) promises to relieve the
accessibility issue. In this regard, the 3306 TES is presently embarked on a
trial use of the UDB for ISD in concert with the AFHRL. Preliminary
indications are that the data handling capabilities of the UDB can contribute
greatly to ISD for new systems as practiced by the 3306 TES. Even so, the LSAR
must still be bought and still be made available earlier than they usually are
for the UDB (or any other data base management system) to be useful in ISD
for emerging weapons systems (or in any other MPT application of LSAR data).
While a linking of the task data systems can be expected to create a more
comprehensive data base for MPT purposes, there will yet be missing information
vital to these purposes. None of the systems provides for determining the
similarity of the knowledges and skills requirements among tasks. This
information is crucial to personnel and training activities, especially those
like RIVET WORKFORCE (Boyle, 1985) and SUMMA (Boyle, 1986) in which structuring
jobs into specialties is the issue.
XII. REFERENCES
Armstrong, B., & Moore, S.C. (1980). Air Force manpower, personnel, and
training: Roles and interactions. R-2429-AF, Santa Monica, CA, The Rand
Corporation.
Bell, J.M., & Thomasson, M.C. (1984). Special Report, Job Categorization
Project. (Available from the U.S. Air Force Occupational Measurement
Center, Randolph AFB, TX).
DPSS Working Paper (1986, May). From LCOM Steering Committee Meeting.
Randolph AFB, TX.
Driskill, W.E., & Bowers, F. (1978). The stability over time of Air Force
enlisted career ladders as observed in occupational survey reports.
Proceedings of the 20th Annual Conference of the Military Testing
Association, Oklahoma City, OK.
Driskill, W.E., Mitchell, J.L., & Ballentine, R.D. (1985). Using job
performance as a criterion for evaluating training effectiveness. Draft
Technical Report prepared under contract number F33615-83-C-0030, Brooks
AFB, TX, Air Force Human Resources Laboratory.
Driskill, W.E., & Gentner, F.C. (1978). Four fundamental criteria for
describing the tasks of an occupational specialty. Proceedings of the
20th Annual Conference of the Military Testing Association, Oklahoma
City, OK.
Driskill, W.E., Keeth, J.B., & Gentner, F.C. (1981). Work characteristics: A
task-based, benchmark approach. In Proceedings of the 23rd Annual
Conference of the Military Testing Association, Arlington, VA.
Dunigan, J.M., Dickey, G.E., Borst, M.B., Navin, D., Parham, D.P., Weimer,
R.E., & Miller, T.M. (1985). Combat maintenance capability: Executive summary.
AFHRL-TR-85-35, Brooks AFB, TX, Air Force Human Resources Laboratory.
Falle, I., & Knight, J.R. (1983). The assessment of common skills among
similar career ladders. In Proceedings of the 25th Annual Conference of
the Military Testing Association, Gulf Shores, AL.
Garcia, S.K., Ruck, H.W., & Weeks, J. (1985). Benchmarking learning difficulty
technology: Feasibility of operational implementation. AFHRL-TR-85-33,
Brooks AFB, TX, Air Force Human Resources Laboratory.
Horton, T. (1985, May). Using cognitive task analysis to examine
troubleshooting tasks. Proceedings of the 5th International Occupational
Measurement Workshop. Randolph AFB, TX, USAF Occupational Measurement
Center.
Mayer, F.A., & York, M.L. (1974). Simulating maintenance manning for new
weapon system development. AFHRL-TR-74-97(I), Wright-Patterson AFB, OH, Air
Force Human Resources Laboratory.
Mulligan, J.F., & Bird, J.B. (1980). Guidance for maintenance task
identification and analysis: Organizational and intermediate maintenance.
AFHRL-TR-80-21, Brooks AFB, TX, Air Force Human Resources Laboratory.
Reed, L.E. (1967). Advances in the use of computers for handling human factors
data. AMRL-TR-67-16, Wright-Patterson AFB, OH, Aerospace Medical Research
Laboratories.
Rose, A.M., Czarnolewski, M.Y., Gragg, F.H., Austin, S.H., Ford, P., Doyle, J.,
& Hagman, J. (1984). Acquisition and retention of soldiering skills.
Report prepared for U.S. Army under contract MDA 903-81-C-A01.
Ruck, H., & Boyle, E. (1985). Task identification and evaluation system: A
proposal for AFHRL Interdivisional Research (Unpublished manuscript).
XIII. BIBLIOGRAPHY
Maintenance Data Collection System (MDC) User's Manual. Air Force Manual
6-267.
THE APPENDIX
Descriptions of ASCII CODAP Programs
OVRLAP: Creates a similarity matrix for either cases or tasks that can be
used by the GROUP program to group together individuals into a
totally nested hierarchical clustering solution.
PRTJOB: Prints a job description for any desired group of job incumbents.
The job description is an ordered list of task or duty statements.
These descriptions are referred to as either a task or duty job
description, respectively. The percentage of the members in the
group who perform each task or duty, and the average percentage of an
incumbent's time that is spent performing the task or duty, is listed
with each statement.
PRTMOD: Reports task-level data along with summary values for identified sets
of tasks called modules.
TASKXX: Lists the tasks contained in the job inventory for the AFSC
specified. The task statements are grouped together in the listing
into modules or duties.
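The internal algorithms of OVRLAP and GROUP are not documented here. As an illustrative sketch only (the percent-time data are invented, and the summed-minimum similarity rule is an assumption about how case overlap might be computed), a case-by-case similarity matrix of the kind OVRLAP produces, and the first merge GROUP's hierarchical clustering would make from it, can be built as follows:

```python
# Percent time spent on each of four tasks by three hypothetical
# job incumbents (each row sums to 100).
cases = {
    "A": [50, 30, 20, 0],
    "B": [45, 35, 10, 10],
    "C": [0, 10, 40, 50],
}

def overlap(p, q):
    """Assumed similarity rule: summed minimum percent time per task,
    so identical job descriptions score 100 and disjoint ones score 0."""
    return sum(min(a, b) for a, b in zip(p, q))

# Build the case-by-case similarity matrix that GROUP would cluster.
names = sorted(cases)
matrix = {(i, j): overlap(cases[i], cases[j]) for i in names for j in names}

# The most similar pair forms the first merge of the nested hierarchy.
pairs = [(i, j) for i in names for j in names if i < j]
first_merge = max(pairs, key=lambda p: matrix[p])
print(first_merge, matrix[first_merge])
```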