Measuring BIM performance: Five metrics
Bilal Succar, Willy Sher and Anthony Williams
Abstract
The term Building Information Modelling (BIM) refers to an expansive knowledge domain within the design,
construction and operation (DCO) industry. The voluminous possibilities attributed to BIM represent an array
of challenges that can be met through a systematic research and delivery framework spawning a set of
performance assessment and improvement metrics. This article identifies five complementary components
specifically developed to enable such assessment: (i) BIM capability stages representing transformational
milestones along the implementation continuum; (ii) BIM maturity levels representing the quality,
predictability and variability within BIM stages; (iii) BIM competencies representing incremental
progressions towards and improvements within BIM stages; (iv) Organizational Scales representing the
diversity of markets, disciplines and company sizes; and (v) Granularity Levels enabling highly targeted yet
flexible performance analyses ranging from informal self-assessment to high-detail, formal organizational
audits. This article explores these complementary components and positions them as a systematic method
to understand BIM performance and to enable its assessment and improvement. A flowchart of the contents
of this article is provided.
Keywords – Building Information Modelling; capability and maturity models; performance assessment and improvement
ISSUES ARISING FROM THE PROLIFERATION OF BIM
Notwithstanding the much-touted benefits of BIM as a means of increasing productivity, there are currently few metrics that measure such improvements. Furthermore, little guidance is available for organizations wishing to generate new or enhance their existing BIM deliverables. Those wishing to adopt BIM or identify and/or prioritize their requirements are thus left to their own devices. The implementation of any new technology is fraught with challenges and BIM is no exception. In addition, those implementing BIM frequently expect to be able to realize significant benefits and productivity gains while they are still inexperienced users. Successful implementation of these systems requires an appreciation of how BIM resources (including hardware, software as well as the technical and management skills of staff) need to evolve in harmony with each other. The multiple and varied understandings that practitioners have of BIM further compound the difficulties they experience. When the unforeseen happens, the risks, costs and difficulties associated with implementing BIM increase. In such circumstances compromises are likely to be made leading, in turn, to users' expectations not being met.
THE NEED FOR BIM PERFORMANCE METRICS
BIM use needs to be assessable if the productivity improvements that result from its implementation are to be made apparent. Without such metrics, teams and organizations are unable to consistently measure their own successes and/or failures. Performance metrics enable teams and organizations to assess their own competencies in using BIM and, potentially, to benchmark their progress against that of other practitioners. Furthermore, robust sets of BIM metrics lay the foundations for formal certification systems, which could be used by those procuring construction projects to pre-select BIM service providers.

DEVELOPING BIM METRICS AND BENCHMARKS
Although it is important to develop metrics and benchmarks for BIM performance assessment, it is equally important that these metrics are accurate and able to be adapted to different industry sectors and organizations. Considerable insight can be gained from the performance measurement tools developed for other industries but it would be foolhardy to rely on any tool which is not designed for the specific requirements of the task in question. Those required to measure key BIM deliverables/requirements across the construction supply chain are no exception.
This article describes a set of metrics purposefully developed to measure the specifics of BIM performance. To increase their reliability, adoptability and usability for different stakeholders, the first-named author identified the following performance criteria. The metrics should be:

• Accurate: Well-defined and able to measure performance at high levels of precision.
• Applicable: Able to be utilized by all stakeholders across all phases of a project's lifecycle.
• Attainable: Achievable if defined actions are undertaken.
• Consistent: Yield the same results when conducted by different assessors.
• Cumulative: Set as logical progressions; deliverables from one act as prerequisites for another.
• Flexible: Able to be performed across markets, Organizational Scales and their subdivisions.
• Informative: Provide 'feedback for improvement' and 'guidance for next steps' (Nightingale & Mize, 2002, p. 19).
• Neutral: Not prejudice proprietary, non-proprietary, closed, open, free or commercial solutions or schemata.
• Specific: Serve the specific requirements of the construction industry.
• Universal: Apply equally across markets and geographies.
• Usable: Intuitive and able to be easily employed to assess BIM performance.

This article describes the development of a set of BIM performance metrics based on these guiding principles. It introduces a set of complementary knowledge components that enable BIM performance assessment and facilitate its improvement.

RESEARCH DESIGN
The investigations described in this article are part of a larger PhD study which addresses the question of how to represent BIM knowledge structures and provide models that facilitate the implementation of BIM in academic and industrial settings. It is grounded in a set of paradigms, theories, concepts and experiences which combine to form the view of the BIM domain reported here.

CONCEPTUAL BACKGROUND
According to Maxwell (2005), the conceptual background underpinning a study such as this is typically based on several sources including previous research and existing theories, the researcher's own experiential knowledge and thought experiments. Various theories (including systems theory (Ackoff, 1971; Chun, Sohn, Arling, & Granados, 2008), systems thinking (Chun et al., 2008), diffusion of innovation theory (Fox & Hietanen, 2007; Mutai, 2009; Rogers, 1995), technology acceptance models (Davis, 1989; Venkatesh & Davis, 2000) and complexity theory (Froese, 2010; Homer-Dixon, 2001)) assisted in analysing the BIM domain and enriched the study's conceptual background. Constraints identified in these theories led to the development of a new theoretical framework based on an inductive approach '[more suitable for researchers who are more concerned about] the correspondence of their findings to the real world than their coherence with existing theories or laws' (Meredith, Raturi, Amoako-Gyampah, & Kaplan, 1989, p. 307).

METHODOLOGY AND VALIDATION
The five components of BIM performance measurement are some of the deliverables of the BIM framework developed after assessing numerous publicly available international guidelines (Succar, 2009). The framework itself is composed of a number of high-level concepts that interact to generate a set of guides and tools necessary to (i) facilitate BIM implementations; (ii) conduct BIM performance assessments; and (iii) generate multi-tiered educational curricula.
The theoretical underpinnings of the BIM framework have been generated through a process of inductive inference (Michalski, 1987), conceptual clustering (Michalski & Stepp, 1987) and reflective learning (Van der Heijden & Eden, 1998; Walker, Bourne, & Shelley, 2008). Framework components were then represented visually through a series of 'knowledge models' to reduce topic complexity (Tergan, 2003) and facilitate knowledge transfer to others (Eppler & Burkhard, 2005).
Many of the BIM framework's components – fields, stages, lenses, steps, competencies and several visual knowledge models – have been subjected to a process of validation through a series of international focus groups employing a mixed-model approach (Tashakkori & Teddlie, 1998). The results from these focus groups and their impact on the development of the five components of BIM performance measurement will be published separately.

THE FIVE COMPONENTS OF BIM PERFORMANCE MEASUREMENT
The first-named author identified five BIM framework components as those required to enable accurate and consistent BIM performance measurement (Succar, 2010b). These include BIM capability stages, BIM maturity levels, BIM competency sets, Organizational Scales and Granularity Levels. The following sections provide brief introductions to each component. They are followed by a step-by-step workflow which allows BIM capability and maturity assessments to be conducted.

BIM CAPABILITY STAGES
BIM capability is defined here as the basic ability to perform a task or deliver a BIM service/product. BIM capability stages (or BIM stages) define the minimum BIM requirements – the major milestones that need to be reached by teams or organizations as they implement BIM technologies and concepts. Three BIM stages separate 'pre-BIM', a fixed starting point representing industry status before BIM implementation, from 'post-BIM', a variable end-point representing the continually evolving goal of employing virtually integrated design, construction and operation (viDCO) tools and concepts. (The term viDCO is used in preference to integrated project delivery (IPD) as representing the ultimate goal of implementing BIM (AIA, 2007) to prevent any confusion with the term's evolving contractual connotations within the United States.) The stages are:

• BIM stage 1: object-based modelling;
• BIM stage 2: model-based collaboration;
• BIM stage 3: network-based integration.

BIM stages are defined by their minimum requirements. For example, to be considered as having achieved BIM capability stage 1, an organization needs to have deployed an object-based modelling software tool similar to ArchiCAD, Revit, Tekla or Vico. Similarly, for BIM capability stage 2, an organization needs to be engaged in a multidisciplinary 'model-based' collaborative project. To be considered at BIM capability stage 3, an organization needs to be using a network-based solution which links to external databases and shares object-based models with at least two other disciplines – a solution similar to a model server or BIMSaaS solution (BIMserver, 2011; Onuma, 2011; Wilkinson, 2008).
Each of these three capability stages may be further subdivided into competency steps. What differentiates stages from steps is that stages are transformational or radical changes, while steps are incremental ones (Henderson & Clark, 1990; Taylor & Levitt, 2005). The collection of steps involved in working towards or within a BIM stage (i.e. across the continuum from pre-BIM to post-BIM) is driven by different prerequisites for, challenges within and deliverables of each BIM stage. In addition to their type (the competency set they belong to – refer to Section BIM competency sets), the following BIM steps can also be identified according to their location on the continuum shown in Figure 3, as illustrated in the sketch below:

• A steps: from pre-BIM status leading to BIM stage 1;
• B steps: from BIM stage 1 leading towards BIM stage 2;
• C steps: from BIM stage 2 leading towards BIM stage 3;
• D steps: from BIM stage 3 leading towards post-BIM.
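To make the cumulative logic of the stages concrete, the following is a minimal sketch encoding the minimum requirements quoted above as a simple classifier. It is an illustration only: the profile type and its field names (e.g. `OrganizationProfile`, `object_based_modelling_tool`) are assumptions, not published criteria names from the framework.

```python
# A sketch (not part of the framework itself) of the three capability stages
# and their minimum requirements, encoded as a cumulative classifier.
from dataclasses import dataclass
from enum import IntEnum


class BIMStage(IntEnum):
    PRE_BIM = 0   # fixed starting point: industry status before BIM
    STAGE_1 = 1   # object-based modelling
    STAGE_2 = 2   # model-based collaboration
    STAGE_3 = 3   # network-based integration


@dataclass
class OrganizationProfile:
    object_based_modelling_tool: bool  # e.g. ArchiCAD, Revit, Tekla or Vico deployed
    model_based_collaboration: bool    # engaged in a multidisciplinary model-based project
    network_based_integration: bool    # model server/BIMSaaS shared with >= 2 other disciplines


def capability_stage(profile: OrganizationProfile) -> BIMStage:
    """Return the highest stage whose minimum requirements are met.
    Stages are cumulative: each stage presupposes the one before it."""
    stage = BIMStage.PRE_BIM
    if profile.object_based_modelling_tool:
        stage = BIMStage.STAGE_1
        if profile.model_based_collaboration:
            stage = BIMStage.STAGE_2
            if profile.network_based_integration:
                stage = BIMStage.STAGE_3
    return stage
```

The nested conditions mirror the article's point that stage requirements act as prerequisites for one another; the A–D steps then describe incremental movement between the stage thresholds this classifier detects.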
BIM MATURITY LEVELS
The term 'BIM maturity' refers to the quality, repeatability and degree of excellence within a BIM capability. Although 'capability' denotes a minimum ability (refer to Section BIM capability stages), 'maturity' denotes the extent of that ability in performing a task or delivering a BIM service/product. BIM maturity's benchmarks are performance improvement milestones (or levels) that teams and organizations aspire to or work towards. In general, the progression from lower to higher levels of maturity indicates (i) improved control resulting from fewer variations between performance targets and actual results; (ii) enhanced predictability and forecasting of reaching cost, time and performance objectives; and (iii) greater effectiveness in reaching defined goals and setting new more ambitious ones (Lockamy III & McCormack, 2004; McCormack, Ladeira, & Oliveira, 2008).
The concept of BIM maturity has been adopted from the Software Engineering Institute's (SEI) capability maturity model (CMM) (SEI, 2008a), a process improvement framework initially intended as a tool to evaluate the ability of government contractors to deliver software projects. CMM originated in the field of quality management (Crosby, 1979) and was later developed for the benefit of the US Department of Defence (Hutchinson & Finnemore, 1999). Its successor, the more comprehensive capability maturity model integration (CMMI) (SEI, 2006a, 2006b, 2008c), continues to be developed and extended by the SEI, Carnegie Mellon University. Several CMM variants exist for other industries (Succar, 2010a) but they are all, in essence, specialized frameworks that assist stakeholders to improve their capabilities (Jaco, 2004) and benefit from process improvements. Example benefits include increased productivity and return on investment as well as reduced costs and post-delivery defects (Hutchinson & Finnemore, 1999).
Maturity models are typically composed of multiple maturity levels, or process improvement 'building blocks' or 'components' (Paulk, Weber, Garcia, Chrissis, & Bush, 1993). When the requirements of each level are satisfied, implementers can then build on established components to attempt 'higher' maturity. Although CMMs are not without their detractors (e.g. Bach, 1994; Jones, 1994; Weinberg, 1993), research conducted in other industries has already identified a correlation between improved process maturity and business performance (Lockamy III & McCormack, 2004).
The 'original' software industry CMM, however, is not applicable to the construction industry. It does not address supply chain issues, and its maturity levels do not account for the different phases of the lifecycle of a construction project (Sarshar et al., 2000). Although other efforts derived from CMM focus on the construction industry (refer to Table 1), there is no comprehensive maturity model/index that can be applied to BIM, its implementation stages, players, deliverables or its effect on project lifecycle phases.
The CMMs listed in Table 1 are similar in structure and objectives but differ in conceptual depth, industrial focus, terminology and target audience. A common theme is how CMMs employ simple experience-based classifications and benchmarks to facilitate continuous improvement within organizations. In analysing their suitability for developing a BIM-specific maturity index, most are broad in approach and can collectively form a basis for a range of BIM processes, technologies and policies. However, none easily accommodates the size of organizations being monitored. Also, from a terminology standpoint, there is insufficient differentiation between the notion of capability (an ability to perform a task) and that of maturity (the degrees of excellence in performing a task). This differentiation is critical when catering for staged BIM implementation as it responds to the disruptive and expansive nature of BIM.
TABLE 1 Selected capability/maturity models analysed for their suitability to BIM

CMMI, Capability maturity model integration – Software Engineering Institute (SEI)/Carnegie Mellon
CMMI helps organizations to 'integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes' (SEI, 2006b, 2006c, 2008a, 2008b, 2008c)
CMMI has five maturity levels (for staged representation, six capability levels for continuous representation), 16 core process areas (22 for CMMI-DEV and 24 for CMMI-SVC) and one to four goals for each process area
The five maturity levels are: initial, managed, defined, quantitatively managed and optimizing

iBIM, BIM maturity model – UK Department for Business Innovation & Skills (BIS)
The iBIM model identifies specific capability targets (not performance milestones) for the UK construction industry covering technology, standards, guides, classifications and delivery (total number of topics not defined). Targets for each topic are organized under one or more loosely defined maturity levels (0–3)
(BIS, 2011)

I-CMM, Interactive capability maturity model – National Institute for Building Sciences (NIBS) Facility Information Council (FIC)
The I-CMM is closely coupled with the NBIMS effort (version 1, part 1) and establishes 'a tool to determine the level of maturity of an individual BIM as measured against a set of weighted criteria agreed to be desirable in a Building Information Model' (Suermann et al., 2008, p. 2; NIST, 2007; NIBS, 2007)
The I-CMM has 11 'areas of interest' measured against 10 maturity levels

LESAT, Lean enterprise self-assessment tool – Lean Aerospace Initiative (LAI) at the Massachusetts Institute of Technology (MIT)
LESAT is focused on 'assessing the degree of maturity of an enterprise in its use of "lean" principles and practices to achieve the best value for the enterprise and its stakeholders' (Nightingale & Mize, 2002, p. 17)
LESAT has 54 lean practices organized within three assessment sections (lean transformation/leadership, life cycle processes and enabling infrastructure) and five maturity levels: some awareness/sporadic, general awareness/informal, systemic approach, ongoing refinement and exceptional/innovative

P3M3, Portfolio, programme and project management maturity model – Office of Government Commerce (OGC)
The P3M3 provides 'a framework with which organizations can assess their current performance and put in place improvement plans with measurable outcomes based on industry best practice' (OGC, 2008, p. 8)
The P3M3 has five maturity levels: awareness, repeatable, defined, managed and optimized
(OGC, 2008)

P-CMM, People capability maturity model v2 – Software Engineering Institute/Carnegie Mellon
P-CMM is an 'organizational change model' and a 'roadmap for implementing workforce practices that continuously improve the capability of an organization's workforce' (SEI, 2008d, pp. 3 and 15)
P-CMM has five maturity levels: initial, managed, defined, predictable and optimizing
(SEI, 2008d)

(PM)2, Project management process maturity model (Kwak & Ibbs, 2002)
The project management process maturity (PM)2 model 'determines and positions an organization's relative project management level with other organizations'. It also aims to integrate PM 'practices, processes, and maturity models to improve PM effectiveness in the organization' (Kwak & Ibbs, 2002, p. 150)
(PM)2 has five maturity levels: initial, planned, managed at project level, managed at corporate level and continuous learning

SPICE, Standardized process improvement for construction enterprises – Research Centre for the Built and Human Environment, The University of Salford
SPICE is a project which developed a framework for continuous process improvement for the construction industry. SPICE is an 'evolutionary step-wise model utilizing experience from other sectors, such as manufacturing and IT' (Hutchinson & Finnemore, 1999, p. 576; Sarshar et al., 2000)
SPICE has five stages: initial/chaotic, planned & tracked, well defined, quantitatively controlled and continuously improving

Supply chain management (SCM) process maturity model and business process orientation (BPO) maturity model
The model conceptualizes the relation between process maturity and supply chain operations as based on the supply-chain operations reference model (Stephens, 2001). The model's maturity describes the 'progression of activities toward effective SCM and process maturity. Each level contains characteristics associated with process maturity such as predictability, capability, control, effectiveness and efficiency' (Lockamy III & McCormack, 2004, p. 275; McCormack, 2001)
The five maturity levels are: ad hoc, defined, linked, integrated and extended
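For side-by-side comparison, the five-level vocabularies in Table 1 can be restated as data. The sketch below simply transcribes them from the table; models with other level counts, such as iBIM's levels 0–3 and the I-CMM's ten levels, are omitted.

```python
# The five-level vocabularies from Table 1 restated as data (transcribed
# from the table above; non-five-level models omitted).
MATURITY_LEVEL_NAMES = {
    "CMMI": ["initial", "managed", "defined", "quantitatively managed", "optimizing"],
    "LESAT": ["some awareness/sporadic", "general awareness/informal",
              "systemic approach", "ongoing refinement", "exceptional/innovative"],
    "P3M3": ["awareness", "repeatable", "defined", "managed", "optimized"],
    "P-CMM": ["initial", "managed", "defined", "predictable", "optimizing"],
    "(PM)2": ["initial", "planned", "managed at project level",
              "managed at corporate level", "continuous learning"],
    "SPICE": ["initial/chaotic", "planned & tracked", "well defined",
              "quantitatively controlled", "continuously improving"],
    "SCM/BPO": ["ad hoc", "defined", "linked", "integrated", "extended"],
}

# Every model above uses exactly five levels, which eases alignment against
# the five BIMMI levels introduced below.
assert all(len(levels) == 5 for levels in MATURITY_LEVEL_NAMES.values())
```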
To address the aforementioned shortcomings, the BIM maturity index (BIMMI) has been developed by analysing and then integrating these and other maturity models used across different industries. The BIMMI has been customized to reflect the specifics of BIM capability, implementation requirements, performance targets and quality management. It has five distinct levels: (a) initial/ad hoc, (b) defined, (c) managed, (d) integrated and (e) optimized (Figure 4). Level names were chosen to reflect the terminology used in many maturity models, to be easily understandable by DCO stakeholders and to reflect increasing BIM maturity from ad hoc to continuous improvement (Table 2).

BIM COMPETENCY SETS
A BIM competency set is a hierarchical collection of individual competencies identified for the purposes of implementing and assessing BIM. In this context, the term competency reflects a generic set of BIM abilities suitable for implementing as well as assessing BIM capability and/or maturity. Figure 5 illustrates how the BIM framework generates BIM competency sets out of multiple fields, stages and lenses (Succar, 2009).
BIM competencies are a direct reflection of BIM requirements and deliverables and can be grouped into three sets, namely technology, process and policy:

• Technology sets in software, hardware and data/networks. For example, the availability of a BIM tool allows the migration from drafting-based to object-based workflow (a requirement of BIM stage 1).
• Process sets in resources, activities/workflows, products/services, and leadership/management. For example, collaboration processes and database-sharing skills are necessary to allow model-based collaboration (BIM stage 2).
• Policy sets in benchmarks/controls, contracts/agreements and guidance/supervision. For example, alliance-based or risk-sharing contractual agreements are prerequisites for network-based integration (BIM stage 3).

Figure 6 provides a partial mind-map of BIM competency sets shown at Granularity Level 2 (for an explanation of Granularity Levels, please refer to Section BIM granularity levels).

BIM ORGANIZATIONAL SCALES
To allow BIM performance assessments to respect the diversity of markets, disciplines and company sizes, an Organizational Scale (OScale) has been developed. The scale can be used to customize assessment efforts and is depicted in Table 3.

BIM GRANULARITY LEVELS
Competency sets include a large number of individual competencies grouped under numerous headings (shown in Figure 6). To enhance BIM capability and maturity assessments and to increase their flexibility, a granularity 'filter' with four Granularity Levels (GLevels) has been developed. Progression from lower to higher levels of granularity indicates an increase in (i) assessment breadth, (ii) scoring detail, (iii) formality and (iv) assessor specialization.
Using higher Granularity Levels (GLevel 3 or 4) exposes more detailed competency areas than lower Granularity Levels (GLevel 1 or 2). This variability enables the preparation of several BIM performance measurement tools ranging from low-detail, informal and self-administered assessments to high-detail, formal and specialist-led appraisals. Table 4 provides more information about the four Granularity Levels.
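A minimal sketch of how the granularity 'filter' might operate over a competency taxonomy follows: each competency area carries the lowest GLevel at which it is exposed, and an assessment at GLevel n considers all areas tagged n or lower, scored here on the five BIMMI levels. The area names are illustrative stand-ins for the Figure 6/7 mind-maps, not the framework's actual taxonomy.

```python
# A sketch of the granularity filter; all competency-area entries below are
# hypothetical placeholders for the Figure 6/7 mind-maps.
from enum import IntEnum


class BIMMILevel(IntEnum):
    """The five BIMMI maturity levels, (a) to (e)."""
    INITIAL_AD_HOC = 1
    DEFINED = 2
    MANAGED = 3
    INTEGRATED = 4
    OPTIMIZED = 5


# (competency set, competency area) -> lowest GLevel exposing that area.
COMPETENCY_AREAS = {
    ("technology", "software"): 1,
    ("technology", "data/networks"): 2,
    ("technology", "data types"): 3,       # appears at GLevel 3 (cf. Figure 7)
    ("technology", "structured data"): 4,  # appears at GLevel 4 (cf. Figure 7)
    ("process", "activities/workflows"): 2,
    ("policy", "contracts/agreements"): 2,
}


def areas_at_glevel(glevel: int) -> list[tuple[str, str]]:
    """Competency areas assessed at a given Granularity Level (1-4):
    higher GLevels expose strictly more areas, never fewer."""
    return [area for area, min_gl in COMPETENCY_AREAS.items() if min_gl <= glevel]


def assess(glevel: int, score) -> dict[tuple[str, str], BIMMILevel]:
    """Score every exposed area on the five-level BIMMI scale."""
    return {area: score(area) for area in areas_at_glevel(glevel)}


# e.g. an informal GLevel-1 self-assessment scores only the coarsest areas:
print(assess(1, score=lambda area: BIMMILevel.MANAGED))
```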
Granularity Levels increase or decrease the number of competency areas used for performance assessment. For example, the mind map provided in Figure 6 reveals 10 competency areas at GLevel 1 and 41 competency areas at GLevel 2. Also, at GLevels 3 and 4, the number of competency areas available for performance assessment increases dramatically, as shown in Figure 7.
The partial mind-map shown in Figure 7 reveals many additional competency areas under GLevel 3, such as data types and data structures. At GLevel 4, the map reveals even more detailed competency areas including structured and unstructured data, which in turn branch into computable and non-computable components (Fallon & Palmer, 2007; Kong et al., 2005; Mathes, 2004).

APPLYING THE FIVE ASSESSMENT COMPONENTS
The aforementioned five complementary BIM framework components (capability stages, maturity levels, competency sets, Organizational Scales and Granularity Levels) allow performance assessments to be conducted involving combinations of these components. The guiding principles discussed in Section Developing BIM metrics and benchmarks all apply. To manage all possible configurations, a simple assessment and reporting workflow has been developed (Figure 8).
The workflow shown in Figure 8 identifies the five steps needed to conduct a BIM performance assessment. Starting with an extensive pool of generic BIM competencies – applicable across DCO disciplines and organizational sizes – assessors can first filter out non-applicable competency sets, conduct a series of assessments based on the competencies remaining and then generate appropriate assessment reports.
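This workflow lends itself to a small functional sketch: filter the generic competency pool, score what remains, and emit a report. The step names and signatures below paraphrase the prose (Figure 8 itself is not reproduced here), so treat them as assumptions rather than the framework's published interfaces.

```python
# A functional sketch of the filter-assess-report workflow described above.
from typing import Callable


def run_assessment(
    competency_pool: list[str],
    applicable: Callable[[str], bool],  # scoping by OScale/discipline (filter step)
    score: Callable[[str], int],        # assessment step, e.g. a 1-5 maturity score
) -> dict[str, int]:
    """Filter out non-applicable competencies, assess the remainder and
    return a report mapping each competency to its score."""
    scoped = [c for c in competency_pool if applicable(c)]
    return {c: score(c) for c in scoped}


# Example: a design-only organization drops a construction-site competency
# and self-assesses the rest informally (GLevel 1).
report = run_assessment(
    ["object-based modelling", "site logistics planning", "model-based collaboration"],
    applicable=lambda c: c != "site logistics planning",
    score=lambda c: 3,
)
print(report)  # {'object-based modelling': 3, 'model-based collaboration': 3}
```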
FIGURE 7 Technology competency areas at Granularity Level 4 – partial mind map v3.0
FIGURE 8 BIM capability and maturity assessment and reporting workflow diagram – v2.0

A FINAL NOTE
The five BIM framework components, briefly discussed in this article, provide a range of opportunities for DCO stakeholders to measure and improve their BIM performance. The components complement each other and enable highly targeted yet flexible performance analyses to be conducted. These range from informal self-assessments to highly detailed and formal organizational audits. Such a system of assessment can be used to standardize BIM implementation and assessment efforts, enable a structured approach to BIM education and training as well as establish a solid base for a formal BIM certification process.
After scrutiny of a significant part of the BIM framework through peer-reviewed publications and a series of international focus groups, the five components and other related assessment metrics are currently being extended and field tested. Sample online tools (focusing on selected disciplines, at different granularities) are currently being formulated. All these form part of an ongoing effort to promote the establishment of an independent BIM certification body responsible for assessing and accrediting individuals, organizations and collaborative project teams. Subject to additional field testing and tool calibration, the five components may be well placed to consistently assess, and by extension improve, BIM performance.

ACKNOWLEDGEMENTS
This article draws on Bilal Succar's PhD research at the University of Newcastle, School of Architecture and Built Environment (Australia). Bilal Succar wishes to acknowledge his supervisors Willy Sher, Guillermo Aranda-Mena and Anthony Williams for their continuous support.

REFERENCES
Ackoff, R.L. (1971). Towards a system of systems concepts. Management Science, 17(11), 661–671.
AIA. (2007). Integrated project delivery: A guide. AIA California Council.
Arif, M., Egbu, C., Alom, O., & Khalfan, M.M.A. (2009). Measuring knowledge retention: A case study of a construction consultancy in the UAE. Engineering, Construction and Architectural Management, 16(1), 92–108.
Bach, J. (1994). The immaturity of the CMM. American Programmer, 7(9), 13–18.
Bew, M., Underwood, J., Wix, J., & Storer, G. (2008). Going BIM in a commercial world. Paper presented at EWork and EBusiness in Architecture, Engineering and Construction: European Conferences on Product and Process Modelling (ECPPM 2008), Sophia Antipolis, France.
BIMserver. (2011, 20 October). Open source building information modelserver. Retrieved from http://bimserver.org/
BIS. (2011). A report for the Government Construction Client Group, Building Information Modelling (BIM) working party strategy. Department for Business Innovation & Skills (BIS). Retrieved from http://www.cita.ie/images/assets/uk%20bim%20strategy%20(summary).pdf
Chun, M., Sohn, K., Arling, P., & Granados, N.F. (2008). Systems theory and knowledge management systems: The case of Pratt-Whitney Rocketdyne. Paper presented at the Proceedings of the 41st Hawaii International Conference on System Sciences, Hawaii.
Crawford, J.K. (2006). The project management maturity model. Information Systems Management, 23(4), 50–58.
Crosby, P.B. (1979). Quality is free: The art of making quality certain. New York: New American Library.
Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Doss, D.A., Chen, I.C.L., & Holland, L.D. (2008). A proposed variation of the capability maturity model framework among financial management settings. Paper presented at the Allied Academies International Conference, Tunica.
Eppler, M., & Burkhard, R.A. (2005). Knowledge visualization. In D.G. Schwartz (Ed.), Encyclopedia of knowledge management (pp. 551–560). Covent Garden, London: Idea Group Reference.
Eppler, M.J., & Platts, K.W. (2009). Visual strategizing: The systematic use of visualization in the strategic-planning process. Long Range Planning, 42, 42–74.
Fallon, K.K., & Palmer, M.E. (2007). General buildings information handover guide: Principles, methodology and case studies. Washington, DC: US Department of Commerce.
Fox, S., & Hietanen, J. (2007). Interorganizational use of building information models: Potential for automational, informational and transformational effects. Construction Management and Economics, 25(3), 289–296.
Froese, T.M. (2010). The impact of emerging information technology on project management for construction. Automation in Construction, 19(5), 531–538.
Gillies, A., & Howard, J. (2003). Managing change in process and people: Combining a maturity model with a competency-based approach. Total Quality Management & Business Excellence, 14(7), 779–787.
Hardgrave, B.C., & Armstrong, D.J. (2005). Software process improvement: It's a journey, not a destination. Communications of the ACM, 48(11), 93–96.
Henderson, R.M., & Clark, K.B. (1990). Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Administrative Science Quarterly, 35(1), 9–30.
Homer-Dixon, T. (2001). The ingenuity gap. Canada: Vintage.
Hutchinson, A., & Finnemore, M. (1999). Standardized process improvement for construction enterprises. Total Quality Management, 10, 576–583.
IU. (2009a). BIM design & construction requirements, follow-up seminar (PowerPoint presentation). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Rollout%20Presentation%209-10-2009.pdf
IU. (2009b). IU BIM proficiency matrix (multi-tab Excel workbook). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Proficiency%20Matrix.xls
Jaco, R. (2004). Developing an IS/ICT management capability maturity framework. Paper presented at the Proceedings of the 2004 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Stellenbosch, Western Cape, South Africa.
Jones, C. (1994). Assessment and control of software risks. New Jersey: Prentice-Hall.
Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1), 43–65.
Kong, S.C.W., Li, H., Liang, Y., Hung, T., Anumba, C., & Chen, Z. (2005). Web services enhanced interoperable construction products catalogue. Automation in Construction, 14(3), 343–352.
Kwak, Y.H., & Ibbs, W.C. (2002). Project management process maturity (PM)2 model. ASCE Journal of Management in Engineering, 18(3), 150–155.
Lainhart IV, J.W. (2000). COBIT: A methodology for managing and controlling information and information technology risks and vulnerabilities. Journal of Information Systems, 14(s-1), 21–25.
Lockamy III, A., & McCormack, K. (2004). The development of a supply chain management process maturity model using the concepts of business process orientation. Supply Chain Management: An International Journal, 9(4), 272–278.
Mathes, A. (2004). Folksonomies – cooperative classification and communication through shared metadata. Paper presented at Computer Mediated Communication, LIS590CMC (doctoral seminar), Graduate School of Library and Information Science. Retrieved from http://www.adammathes.com/academic/computer-mediatedcommunication/folksonomies.html
Maxwell, J.A. (2005). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage Publications, Inc.
McCormack, K. (2001). Supply chain maturity assessment: A roadmap for building the extended supply chain. Supply Chain Practice, 3, 4–21.
McCormack, K., Ladeira, M.B., & de Oliveira, M.P.V. (2008). Supply chain maturity and performance in Brazil. Supply Chain Management: An International Journal, 13(4), 272–282.
McGraw-Hill. (2009). The business value of BIM: Getting Building Information Modeling to the bottom line. McGraw-Hill Construction Analytics. Retrieved from http://construction.com/
Meredith, J.R., Raturi, A., Amoako-Gyampah, K., & Kaplan, B. (1989). Alternative research paradigms in operations. Journal of Operations Management, 8(4), 297–326.
Michalski, R.S. (1987). Concept learning. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 185–194). New York: Wiley.
Michalski, R.S., & Stepp, R.E. (1987). Clustering. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 103–111). New York: Wiley.
Mutai, A. (2009). Factors influencing the use of Building Information Modeling (BIM) within leading construction firms in the United States of America (Unpublished doctoral dissertation). Indiana State University, Terre Haute.
NIBS. (2007). BIM capability maturity model. National Institute for Building Sciences (NIBS) Facility Information Council (FIC). Retrieved October 11, 2008, from www.buildingsmartalliance.org/client/assets/files/bsa/BIM_CMM_v1.9.xls
Nightingale, D.J., & Mize, J.H. (2002). Development of a lean enterprise transformation maturity model. Information Knowledge Systems Management, 3(1), 15–30.
NIST. (2007). National Building Information Modeling Standard – Version 1.0 – Part 1: Overview, principles and methodologies. Washington, DC: US Department of Commerce.
OGC. (2008). Portfolio, programme, and project management maturity model (P3M3). England: Office of Government Commerce.
OGC. (2009, 13 February). Information Technology Infrastructure Library (ITIL) – Office of Government Commerce. Retrieved from http://www.itil-officialsite.com/home/home.asp
Ollerenshaw, A., Aidman, E., & Kidd, G. (1997). Is an illustration always worth ten thousand words? Effects of prior knowledge, learning style and multimedia illustrations on text comprehension. International Journal of Instructional Media, 24(3), 227–238.
Onuma. (2011, 20 October). Onuma model server. Retrieved from http://onuma.com/products/BimDataApi.php
Paulk, M.C., Weber, C.V., Garcia, S.M., Chrissis, M.B., & Bush, M. (1993). Key practices of the capability maturity model – version 1.1 (Technical report). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Pederiva, A. (2003). The COBIT maturity model in a vendor evaluation case. Information Systems Control Journal, 3, 26–29.
Penttilä, H. (2006). Describing the changes in architectural information technology to understand design complexity and free-form architectural expression. ITcon, 11 (Special Issue: The Effects of CAD on Building Form and Design Quality), 395–408.
Rogers, E.M. (1995). Diffusion of innovation. New York: Free Press.
Sahibudin, S., Sharifi, M., & Ayat, M. (2008). Combining ITIL, COBIT and ISO/IEC 27002 in order to design a comprehensive IT framework in organizations. Paper presented at Modeling & Simulation, AICMS 08, Second Asia International Conference, Kuala Lumpur, Malaysia.
Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P., Baldry, D., & Sexton, M. (2000). SPICE: A business process diagnostics tool for construction projects. Engineering Construction & Architectural Management, 7(3), 241–250.
Sebastian, R., & Van Berlo, L. (2010). Tool for benchmarking BIM performance of design, engineering and construction firms in the Netherlands. Architectural Engineering and Design Management (Special Issue: Integrated Design and Delivery Solutions), 6, 254–263.
SEI. (2006a). Capability Maturity Model Integration for Development (CMMI-DEV), improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006b). Capability Maturity Model Integration Standard (CMMI) appraisal method for process improvement (SCAMPI) A, version 1.2 – method definition document. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006c). CMMI for development, improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008a, 11 October). Capability maturity model integration. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmmi/index.html
SEI. (2008b). Capability Maturity Model Integration for Services (CMMI-SVC), partner and piloting draft, V0.9c. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008c, 24 December). CMMI for services. Retrieved from http://www.sei.cmu.edu/cmmi/models/CMMI-Services-status.html
SEI. (2008d, 11 October). People Capability Maturity Model – version 2. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmm-p/version2/index.html
Stephens, S. (2001). Supply chain operations reference model version 5.0: A new tool to improve supply chain efficiency and achieve best practice. Information Systems Frontiers, 3(4), 471–476.
Succar, B. (2009). Building information modelling framework: A research and delivery foundation for industry stakeholders. Automation in Construction, 18(3), 357–375.
Succar, B. (2010a). Building information modelling maturity matrix. In J. Underwood & U. Isikdag (Eds.), Handbook of research on building information modelling and construction informatics: Concepts and technologies (pp. 65–103). Information Science Reference, IGI Publishing. doi:10.4018/978-1-60566-928-1.ch004
Succar, B. (2010b). The five components of BIM performance measurement. Paper presented at the CIB World Congress.
Suermann, P.C., Issa, R.R.A., & McCuen, T.L. (2008, 16–18 October). Validation of the U.S. National Building Information Modeling Standard Interactive Capability Maturity Model. Paper presented at the 12th International Conference on Computing in Civil and Building Engineering, Beijing, China.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
Taylor, J., & Levitt, R.E. (2005). Inter-organizational knowledge flow and innovation diffusion in project-based industries. Paper presented at the 38th International Conference on System Sciences, Hawaii, USA.
Tergan, S.O. (2003). Knowledge with computer-based mapping tools. Paper presented at the ED-Media 2003 World Conference on Educational Multimedia, Hypermedia & Telecommunication, Honolulu, HI.
TNO. (2010). BIM QuickScan – a TNO initiative (sample QuickScan report). Retrieved from http://www.bimladder.nl/wp-content/uploads/2010/01/voorbeeld-quickscan-pdf.pdf
UKCO. (2011). Government construction strategy. London: United Kingdom Cabinet Office.
Vaidyanathan, K., & Howell, G. (2007). Construction supply chain maturity model – conceptual framework. Paper presented at the International Group for Lean Construction (IGLC-15), Michigan, USA.
Van der Heijden, K., & Eden, C. (1998). The theory and praxis of reflective learning in strategy making. In C. Eden & J.-C. Spender (Eds.), Managerial and organizational cognition: Theory, methods and research (pp. 58–75). London: Sage.
Venkatesh, V., & Davis, F.D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204.
Walker, D.H.T., Bourne, L.M., & Shelley, A. (2008). Influence, stakeholder mapping and visualization. Construction Management and Economics, 26(6), 645–658.
Weinberg, G.M. (1993). Quality software management (Vol. 2): First-order measurement. New York: Dorset House Publishing Co., Inc.
Widergren, S., Levinson, A., Mater, J., & Drummond, R. (2010, 25–29 July). Smart grid interoperability maturity model. Paper presented at the Power and Energy Society General Meeting, 2010 IEEE, Minnesota, USA.
Wilkinson, P. (2008, 12 July). SaaS-based BIM. Extranet evolution – construction collaboration technologies. Retrieved from http://www.extranetevolution.com/extranet_evolution/2008/04/saas-based-bim.html