Decision Support Systems 64 (2014) 130–141
A unified foundation for business analytics
Clyde Holsapple a,⁎, Anita Lee-Post b, Ram Pakath c
a Gatton College of Business and Economics, University of Kentucky, Lexington, KY 40506, USA
b Department of Marketing and Supply Chain Management, Gatton College of Business and Economics, University of Kentucky, Lexington, KY 40506, USA
c Department of Finance and Quantitative Methods, Gatton College of Business and Economics, University of Kentucky, Lexington, KY 40506, USA
Article info
Article history:
Received 23 August 2013
Received in revised form 24 May 2014
Accepted 28 May 2014
Available online 6 June 2014
Keywords:
Analytics
Business analytics
Business intelligence
Decision making
Decision support
Evidence-based
Abstract
Synthesizing prior research, this paper designs a relatively comprehensive and holistic characterization of
business analytics – one that serves as a foundation on which researchers, practitioners, and educators can
base their studies of business analytics. As such, it serves as an initial ontology for business analytics as a field
of study. The foundation has three main parts dealing with the whence and whither of business analytics:
identification of dimensions along which business analytics possibilities can be examined, derivation of a six-class taxonomy that covers business analytics perspectives in the literature, and design of an inclusive framework
for the field of business analytics. In addition to unifying the literature, a major contribution of the designed
framework is that it can stimulate thinking about the nature, roles, and future of business analytics initiatives.
We show how this is done by deducing a host of unresolved issues for consideration by researchers, practitioners,
and educators. We find that business analytics involves issues quite aside from data management, number
crunching, technology use, systematic reasoning, and so forth.
© 2014 Elsevier B.V. All rights reserved.
1. Introduction
According to a study by Gartner, the technology category of “analytics and business intelligence” is the top priority of chief information
officers, and comprises a $12.2B market [1]. It is seen as a higher priority
than such categories as mobile technology, cloud computing, and collaboration technology. Further, Gartner finds that the top technology
priority of chief financial officers is analytics [2]. Similarly, in studies
involving interviews with thousands of chief information officers,
worldwide, IBM asked, “which visionary plan do you have to increase
competitiveness over the next 3 to 5 years?” In both 2011 and 2009,
83% of respondents identify “Business Intelligence and Analytics” as
their number-one approach for achieving greater competitiveness.
Among all types of plans, this is the top percentage for both years. To
put this in perspective, consider 2011 results, in which business intelligence and analytics exceeds such other competitiveness plans as mobility solutions (ranked 2nd at 74%), cloud computing (ranked 4th at 60%),
and social networking (ranked 8th at 55%) [3]. IDC reports that the business analytics software market grew by 13.8% during 2011 to $32B, and
predicts it to be at $50.7B in revenue by 2016 [4,5].
It appears that a driver for this growth is the perception or realization that such investments yield value. Across a set of ROI cases, Nucleus Research finds a $10.66 payoff for every $1.00 spent on analytics applications – suggesting that such applications can be very attractive investment routes for chief financial officers [6]. Although business analytics is hardly all and everything for managers dealing with modern complexities, it is a big deal for them – becoming increasingly adopted in practice and emerging as an urgent challenge when it comes to improving business processes and outcomes [7,8].

⁎ Corresponding author. Tel.: +1 859 257 5236.
E-mail addresses: cwhols@uky.edu (C. Holsapple), dsianita@uky.edu (A. Lee-Post), pakath@uky.edu (R. Pakath).
http://dx.doi.org/10.1016/j.dss.2014.05.013
On the academic front, the literature shows that analytics,1 when
narrowly viewed as an application of mathematical and statistical
techniques, has long been studied in business schools under such titles
as operations research/management science, simulation analysis,
econometrics, and financial analysis. From this same relatively narrow
viewpoint, it has been practiced to varying degrees by business organizations, resulting in substantial practical benefits. However, the literature also reveals that there has been relatively little introspective
investigation of business analytics as a field of study. As a rough gauge,
for instance, in journals of the business administration fields, articles
with the term “analytics” in the title have been relatively rare, although
a marked upswing can be observed over the last few years. Given the
rise of analytics degree programs in business schools and the very extensive resources being poured into business analytics (BA), it seems
prudent and timely to step back to investigate the big picture – as a foundation for organizing, positioning, guiding, and propelling future BA
studies and applications. As a step in this direction, we develop a BA
foundation that is a relatively unifying and inclusive characterization
for understanding the nature, impacts, and potential of business analytics from both theory and practice standpoints.
1 When this paper uses the term “analytics” it should be understood as referring to business analytics, unless indicated otherwise.
The BA foundation scopes out three dimensions and six perspectives.
Further, it introduces a framework that identifies and links building
blocks that are fundamental to study, research, and application of business analytics. As such, the business analytics foundation advanced in
this paper functions as an initial ontology for the emergent BA field –
a language that people can share for discourse about the business analytics universe, and a mental model for organizing one’s understanding
of business analytics phenomena and possibilities. The ontology is, of
course, subject to modification as the BA field grows and matures.
We begin by briefly outlining the heritage of business analytics.
Against this historical background, we ask where BA is going. BA phenomena are seen as occurring in a three-dimensional space of domain,
orientation, and technique. Within this space, there are diverse perspectives about the nature of BA as a field of study. While each is compelling,
they are not unified into a single, broad, encompassing view that offers
systematic guidance for shaping the future of BA as a field of study. We
contend that until there is a cohesive unifying view, BA is likely to remain fragmented into developments within respective perspectives,
and may become dominated by one or another of them without recognizing value inherent in the other views. Accordingly, we advance a
conceptual framework that is a synthesis of six diverse classes of BA perspectives that appear in the literature. Initial indications of the framework’s value are demonstrated by using it to generate examples of
analytics issues worthy of further study by practitioners, researchers,
and educators. Finally, a brief application of the framework is provided
to illustrate its components and suggest its relevance to BA practice.
2. The business analytics heritage
Insight into the nature of business analytics can come from an examination of its heritage [9]. Generally, today’s conventional views of BA
are concerned in some way with operating on data, with an aim of
supporting business activities (e.g., decision making). The operations
may involve examination, calculation, or inference. From a technological
perspective, such operations date as far back as the “dawn of the computer age” in the 1940s and 1950s (www.fico.com/analytics) with efforts
such as the Kerrison Predictor (a computerized, fully-automated, anti-aircraft fire control system; 1940), the Manhattan Project (the use of
Monte Carlo simulation to predict the course of nuclear chain reactions;
1944), and the first computerized weather forecast models (1950). They
also include such 1950s developments as the Logic Theorist, using heuristic methods to infer problem solutions [10], and the mechanization
of solving optimization problems [11].
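The Monte Carlo idea mentioned above (estimating a quantity by repeated random sampling) can be sketched in a few lines. The example below estimates π; it is chosen purely for illustration and is unrelated to the original nuclear-chain-reaction application.

```python
# Sketch of Monte Carlo estimation: approximate pi by sampling random
# points in the unit square and counting the fraction that fall inside
# the quarter circle of radius 1. Illustrative only.
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(
        1
        for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi(100_000))  # close to 3.1416; accuracy improves with samples
```

The same sampling pattern underlies what-if and risk simulations discussed later in the paper: replace the random points with draws from a model of the business situation, and the averaged outcome becomes the estimate of interest.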
Subsequent commercialization of related technology included
such milestone developments (www.fico.com/analytics) as solving
the shortest-path problem which transformed routing and logistics
(1956), FICO’s use of predictive modeling to assess credit risk (1958),
the Black–Scholes model for optimal stock options pricing (1973), and
the release of Visicalc, the first commercial tool for model-based decision support system (DSS) development (1979). Since then, the use of
computers to operate on data to aid decision making has been a central
aspect of DSS research and practice for over 30 years [12,13] resulting in
a variety of tools that support business intelligence and business analytics initiatives. Today, DSSs that use data, procedures, and reasoning to
solve problems have become so commonplace on the Web that they
are largely taken for granted by their users (e.g., consumers).
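The shortest-path computation credited above with transforming routing and logistics can be illustrated with a minimal sketch of Dijkstra's algorithm, a standard approach to the problem; the road network below is invented for the example.

```python
# Minimal Dijkstra shortest-path sketch over an invented road network,
# illustrating the routing/logistics technique mentioned above.
import heapq

def shortest_path_cost(graph, start, goal):
    """Return the minimum total edge cost from start to goal."""
    dist = {start: 0}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(queue, (nd, neighbor))
    return float("inf")

roads = {  # invented distances between depots
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
}
print(shortest_path_cost(roads, "A", "D"))  # 8  (A -> C -> B -> D)
```

Production routing systems add real road-network data, heuristics, and preprocessing, but this cost-propagation step remains the core of the technique.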
Although the term “Business Intelligence” is often credited to
H. Dresner, an analyst at Gartner (e.g., see [14]), in fact the term
was first coined in 1958 by IBM computer scientist H. P. Luhn [15].
Modern-day business intelligence (BI) and its technologies evolved from
DSS concepts and advances [16], most notably, the Executive Information
System (EIS) paradigm. The typical setup for a BI system centers on a
large data store holding highly structured data – typically in the form of
data warehouses and/or data marts. BI software allows various kinds of
operations with these data, ranging from “simple reporting, to slice-and-dice, drill down, answering ad hoc queries, real-time analysis, and
forecasting…perhaps the most useful of these is the dashboard,” plus
such facilities as data mining, scenario analysis, business performance
measurement, and business activity monitoring [16]. Such operations are
undertaken at the discretion of decision makers, “with the objective of improving the timeliness and the quality of the input to the decision process.”
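The reporting operations quoted above can be made concrete with a small sketch: a roll-up of structured records along one dimension, followed by a drill-down into one slice along a second dimension. The records and field names are invented for illustration.

```python
# Hypothetical sketch of BI-style "slice and dice" operations:
# aggregate structured records by one dimension (region), then
# drill down into one slice by a second dimension (product).
from collections import defaultdict

rows = [  # invented data-mart records
    {"region": "East", "product": "A", "revenue": 120},
    {"region": "East", "product": "B", "revenue": 80},
    {"region": "West", "product": "A", "revenue": 200},
    {"region": "West", "product": "B", "revenue": 50},
]

# Roll-up: total revenue per region.
by_region = defaultdict(int)
for r in rows:
    by_region[r["region"]] += r["revenue"]

# Drill-down: within the West slice, break revenue out by product.
west_by_product = defaultdict(int)
for r in rows:
    if r["region"] == "West":
        west_by_product[r["product"]] += r["revenue"]

print(dict(by_region))        # {'East': 200, 'West': 250}
print(dict(west_by_product))  # {'A': 200, 'B': 50}
```

A data warehouse or OLAP engine performs exactly these group-and-aggregate steps, only over far larger stores and with indexing that makes ad hoc queries interactive.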
Expanding on this, there is the move toward dealing with massive collections of relatively unstructured data such as audio, video, clickstream,
and text. This takes BI (and its analytics aspects) into creation and maintenance of new kinds of data sources/repositories and new ways for
dealing with them (e.g., Hadoop, MapReduce) – with the ultimate aim
of harvesting value from them in the sense of supporting knowledge
acquisition, insight generation, problem finding, and problem solving
to assist decision making. Accordingly, technological aspects of analytics
are rooted in the decision support capabilities provided by business
intelligence. But, business analytics has other roots. It is not all about
technology. For instance, there is operations research. Along with
applied statistics, this is where quantitative aspects of analytics are
grounded – methods for building quantitative representations of situations (i.e., mathematical and statistical models), calculating solutions to
such models (typically, technology assisted), and interpreting the results. However, there is far more to analytics than quantitative methods
and mathematics. The world presents an abundance of unruly data,
complex messes, and wicked decision problems [17,7]. Problems have
qualitative aspects. Their data may be non-numeric and their solutions
may rely on logic, reasoning, inference, and collaboration.
The importance of qualitative analysis has long been recognized. For
instance, Ackoff [18] points out that quantification “depends on qualification. What is qualified at one stage may be quantified at another; but
at any stage some qualitative judgments are required” and that progress
depends on improving one's abilities to qualify. Kilmann and Mitroff [19]
point out that there is "an important class of variables that have been
slighted in the literature of OR/MS, qualitative variables.” Mingers [20]
summarizes a variety of qualitative methods for transforming data
into decisions, arguing that they are especially applicable as candidates
for coping with situations where insufficient quantitative data are available, decision problems are unstructured or semi-structured, or active
participation of stakeholders is advisable. These methods include cognitive mapping, strategic choice analysis, decision conferencing, soft systems methodology, robustness analysis, and nominal group technique.
In sum, modern-day BA is rooted in the ongoing advances of systems
to support decision making. These advances include increasingly powerful mechanisms for acquiring, generating, assimilating, selecting, and
emitting knowledge relevant to making decisions. Given its decision support heritage, business analytics necessarily partakes of and exploits these
mechanisms. The knowledge that must be processed ranges from qualitative to quantitative and BA is concerned with operating on both knowledge types, as appropriate for the decision at hand. The situation in
which decision support occurs may be well-structured or, at an opposite
extreme, complex and outright wicked. Following suit, we can expect
BA to be applied in both kinds of situations. Today’s importance of BA reflects the complex situations in which organizations find themselves: decisions need to be made in the face of change that is relentless and rapid,
knowledge that can be massive and eclectic, some variables that are largely unknown or little understood, and competition that is fierce and global.
3. Whither business analytics?
Watson [9] observes “We all know that analytics is a hot topic. It
would be hard to miss the large number of books, articles, research
reports, Webinars, and survey findings that suggest its importance. Despite the recent attention, I feel that analytics is not fully understood.
There are many incorrect, imprecise, and incomplete understandings.”
He goes on to share his thoughts about business analytics, concluding
that it is not simply a buzzword or hype. Subsequent sections of this
paper lend strong support to Watson’s conclusion and contribute to better understanding analytics as a legitimate field of study and practice.
Accepting that BA is not ephemeral, it is sensible to ask where this
emergent movement is heading. Davenport et al. [21] offer several
specific predictions about this, including big data analytics changing
the information technology landscape, fast growth of cloud-based
predictive analytics, rise of analytics asset management as a major
challenge, methods of text and social media analytics entering the
mainstream, and continued growth in demand for analytics talent.
Betser and Belanger [22] suggest that technology factors for BA’s future include data stream management, cloud, mobile, bandwidth,
non-SQL databases, and new forms of data. As another take, Forrester
expects important unfolding issues to include self-service analytics,
pervasive analytics, social analytics, scalable analytics, and real-time
analytics [23].
Broadly, analytics appears to be pervading all business administration disciplines. Each can apply analytics practices/technologies in the
context of transforming evidence into insights and decisions. While BA
may increasingly become ingrained in this or that discipline, we argue
that it can retain an identity of its own with respect to education, practice, and research. For instance, degree programs in analytics exist, job
announcements for BA positions are commonplace, and conferences
devoted to BA have appeared.
Beyond speculating on the future, we advocate that a valuable prelude is the creation of an inclusive foundation on which analytics developments, prognostications, and visions can be built. This foundation
is not a listing of some software packages, or mathematical/statistical
techniques, or applications. It must be fundamental, recognize business
analytics dimensions (domains, orientations, and techniques, for example), appreciate the complementary diversity of perceptions about
business analytics, and provide a framework to provoke and position
thinking about BA. It must offer means for describing what we need to
investigate and understand about the conduct of analytics, the resources
that enable/facilitate it, the manipulations of those resources, the influences that constrain and guide it, and its impacts (both actual and potential). Existence of such a foundation furnishes a basis for systematic
advances in research, education, and practice pertaining to BA. Here,
we articulate a foundation in terms of three dimensions, six classes of
perspectives, and a unifying framework that relates those perspectives
to each other.
4. Some dimensions of analytics
An examination of the literature suggests that analytics can be studied along several distinct, albeit complementary, dimensions. A summary of some of these is indicative of the richness of analytics phenomena.
In brief, consider the three dimensions of domain, orientation, and
technique.
4.1. Domain
This dimension refers to subject fields in which aspects of analytics
are being applied. Domains and sub-domains include traditional business administration disciplines: marketing, human resources, business
strategy, organization behavior, operations, supply chain systems, information systems, and finance. At a more detailed level, analytics may be
applied to a topic within any of these disciplines. For instance, one
domain is marketing analytics, while retail analytics is a sub-domain.
Business analytics domains include:
• Web Analytics [24,25]
• Google Analytics [26]
• Software Analytics [27,28]
• Crisis Analytics [29,30]
• Knowledge Analytics [31]
• Marketing Analytics [32,33]
• Customer Analytics [34,35]
• Service Analytics [36]
• Human Resource Analytics [37,38]
• Talent Analytics [39]
• Process Analytics [40]
• Supply Chain Analytics [41,42]
• Risk Analytics [43]
• Financial Analytics [44]
Generally, those who work in one of these domains tend not to
reference works of researchers in others, although it should be possible
for researchers across domains to learn from one another. This principle
may be broadened to include findings from non-business domains such
as learning analytics or medical analytics. It would be useful to investigate the extent to which there is a core of analytics concepts common
to all domains. The framework devised later is offered as a unifying
core for understanding the full scope of BA.
4.2. Orientation
This dimension refers to a direction of thought. It is an example of
what could be considered as part of a business analytics core. Its components are not peculiar to one or another business domain. Perhaps the
most frequently discussed orientation is predictive analytics. This involves
means for predicting what may or will occur (e.g., [45,46]). Predictive
analytics has been identified as part of a three-fold taxonomy introduced
in 2010 by the global consulting house, Capgemini, the other two orientations being descriptive analytics and prescriptive analytics [47–50]. Each
of the three aspects in this orientation taxonomy is concerned with
what analytics does; that is, we think about analytics in terms of description, prediction, or prescription.
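The distinction among the three orientations can be made concrete with a small, entirely hypothetical sketch: given a toy monthly sales series, a descriptive step summarizes what happened, a predictive step fits a least-squares trend to estimate next month, and a prescriptive step converts that estimate into a recommended action (here, an invented stocking rule).

```python
# Toy illustration of the descriptive / predictive / prescriptive
# orientation taxonomy on a hypothetical monthly sales series.
from statistics import mean

sales = [100, 104, 110, 115, 121, 128]  # hypothetical units sold per month

# Descriptive: summarize what has happened.
descriptive = {"mean": mean(sales), "growth": sales[-1] - sales[0]}

# Predictive: fit a least-squares trend line and extrapolate one month.
n = len(sales)
xs = list(range(n))
x_bar, y_bar = mean(xs), mean(sales)
slope = (
    sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, sales))
    / sum((x - x_bar) ** 2 for x in xs)
)
forecast = y_bar + slope * (n - x_bar)  # predicted sales for month n

# Prescriptive: turn the prediction into a recommended action
# (a naive stocking rule with a 10% safety margin, invented here).
recommended_stock = round(forecast * 1.10)

print(descriptive, round(forecast, 1), recommended_stock)
```

The point of the sketch is the division of labor, not the particular methods: any summary statistic, forecasting model, or decision rule could fill the three roles.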
There are other orientation taxonomies. Consider, for instance, an alternative four-fold orientation. Think about analytics in terms of what is
made by an analytics effort. Such effort can make Sense of a situation,
make Predictions, make Evaluations, or make Decisions. We call this the
SPED taxonomy and contend that each of its four kinds of products is distinct from the others – having value in its own right. For instance, SWOT
analysis makes an evaluation of strengths–weaknesses–opportunities–
threats, but does not make predictions, interpretations, or decisions.
Using analytics for benchmarking is another example of making evaluations. Note that one SPED component can also be an ingredient in
the production of another. For instance, a decision may be based on
sense (i.e., perceived meaning, interpretation) of a situation, predictions relative to possible actions/changes in the situation, and evaluations of possible outcomes [51]. Or, sense of a situation may depend
on decisions about what factors to consider, predictions about future
events, and evaluations of potential relationships among factors or
events.
Taking a third kind of orientation, we might think about analytics
benefits. Drawing on knowledge management theory, an understanding
of analytics orientation may be susceptible to the PAIR model, which
holds that there are four primary avenues a knowledge management
initiative can traverse enroute to competitiveness: Productivity,
Agility, Innovation, and Reputation [52]. If we accept that analytics
is a knowledge-intensive activity, then the PAIR model suggests that
analytics can be oriented toward enhancing an entity’s productivity,
agility, innovation, and/or reputation.
Orientation taxonomies, such as those suggested here, can be complementary. Each gives us a way for structuring the examination, investigation, and implementation of analytics – in whatever subject domain
is of interest.
4.3. Technique
This dimension refers to the way in which an analytics task is
performed and, it too, can be viewed from multiple perspectives. For instance, some analytics techniques are technology-based, while others
are practice-based. One may also differentiate between the use of
quantitative techniques, qualitative techniques, and hybrids [53,54]. Yet
another view differentiates between techniques for dealing with structured, semi-structured, and unstructured situations [55].
Techniques may also be differentiated based on specific mechanisms
used for analytics, such as approaches to data mining [56,57], text mining [58,59], audio mining [60,61], online analytical processing [62,63],
data warehousing [64,65], query-based analysis [66,67], dashboard
analytics [68,69], visual mining [70–72], and so forth.
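As a deliberately minimal illustration of one such mechanism, a first text-mining step often reduces documents to term frequencies before any modeling; the two documents below are invented for the example.

```python
# Bare-bones sketch of a text-mining mechanism: tokenize documents and
# count term frequencies, a common first step in text-analytics pipelines.
import re
from collections import Counter

docs = [  # invented documents
    "Analytics turns data into decisions.",
    "Business analytics supports decision making with data.",
]

def term_frequencies(texts):
    """Lowercase, extract alphabetic tokens, and count occurrences."""
    tokens = []
    for text in texts:
        tokens.extend(re.findall(r"[a-z]+", text.lower()))
    return Counter(tokens)

tf = term_frequencies(docs)
print(tf.most_common(3))
```

Real text-mining systems layer stemming, stop-word removal, weighting (e.g., tf-idf), and statistical models on top, but the tokenize-and-count step is where unstructured text becomes analyzable data.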
5. The core of business analytics
While dimensions, such as the three above, characterize the broad
landscape of BA possibilities, there is a more fundamental aspect of
the analytics core. This central focal point involves answers to: What
is business analytics? What is the rationale/purpose for undertaking
business analytics initiatives?
To begin to answer these questions, we have assembled the published views of a set of business scholars, paying particular attention
to inclusion of diverse vantage points that are representative of what
exists in the business literature. We consider, here, views that are
broadly applicable, rather than those concerned with specific applications and domains. The accumulated definitions span the past 10
years, with the preponderance being more recent – following the
2007 book that popularized business analytics [73]. As previously
noted, we do not look at usage of the term in non-business fields.
Table 1 displays perceptions about what BA is, along with rationales
authors give for adopting BA in an organization. The definitions are organized into several classes based on their similarity. Before considering
each cluster and synthesizing a framework from them, we observe a
dual theme that pervades all of the definitions: the phrases “fact-based” and “decision” (or “decision making”) appear repeatedly. As an
alternative to “fact” some definitions allude to “data.” Thus, in spite of
Table 1
Business analytics: definitions of BA, rationales for BA, and sources. The definitions fall into six classes: A Movement; A Collection of Practices & Technologies; A Transformation Process; A Capability Set; Specific Activities; A Decisional Paradigm.

Definitions of BA:
1. "…culture, where fact-based decision-making is encouraged" and rewarded
2. "management philosophy" through which "insights can be gained and decision making improved" based on a "rich set of data"
3. "movement …. driven by technically literate executives who make fact-based decisions, the availability of good data, a process orientation to running an enterprise, and improved software for data capture, processing, and analysis"
4. "a subset of what has come to be called business intelligence: a set of technologies and processes that use data to understand and analyze business performance"
5. "a group of tools that are used in combination with one another to gain information, analyze that information, and predict outcomes of the problem solutions"
6. "a wide range of techniques and technologies" that "make it easier to get value, meaning, from data"
7. "more than just analytical methodologies or techniques used in logical analysis. It is a process of transforming data into actions through analysis and insights in the context of organizational decision making and problem solving"
8. "process of transforming data into actions through analysis and insights in the context of organizational decision making and problem solving"
9. "the scientific process of transforming data into insight for making better decisions"
10. "extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions"
11. "the use of analysis, data, and systematic reasoning to make decisions … the most analytical firms and managers employ a combination of techniques, both quantitative and qualitative"; "an organizational capability that can be measured and improved"
12. "a company's use of its databases, explicative and predictive models and fact-based management to drive its decisions and actions"
13. "The use of data and related insights developed through applied analytics disciplines (for example, statistical, contextual, quantitative, predictive, cognitive and other models) to drive fact-based planning, decisions, execution, management, measurement and learning. Analytics may be descriptive, predictive or prescriptive."
14. "discipline of applying advanced analytical methods ranging from descriptive to predictive to prescriptive modeling"
15. "accessing, aggregating, and analyzing large amounts of data from diverse sources to understand historical performance or behavior, or to predict—or manage—outcomes"
16. "examine and manipulate data to drive positive business actions"
17. the "part of decision management" that involves "logical analysis based on data to make better decisions"
18. "data-driven decision making" rather than "ignore the evidence and make our strategic decisions based on anecdotes and guesses… simply because harvesting the right data to inform our decisions is more complex than it might first appear"

Rationales for BA, with sources where the table pairs them:
• successful deployment of data warehouses, so they are "more easily diffused in an environment"
• "to support and optimize the capability that is the main driver of a company's value and competitive advantage"
• "improve the overall process or decisions associated with process roles" [89]
• "compete on making the best decisions" [73]
• "allow for informed decision making" [92]
• "simplify data to amplify its meaning" [93]
• "improve the overall process or decisions associated with process roles"
• "decisions and insights obtained from analytics are implemented through changes within enterprise systems"
• "better decision making" [91]
• "better decisions" [82]
• "compete on making the best decisions" [73]
• "to make better decisions and take right actions"
• "help your managers and employees make better decisions, and help your organization perform better" [54]
• "…better guide the exclusively human decisions and provide automated decisions in some tasks" [42]
• "create competitive advantage" [88]
• "better decision making" [83]
• "rapid implementation of new ideas, products, and services, which result in greater profits and shareholder value" [94]
• "users can make well-informed, fact-based decisions to support their organizations' tactical and strategic goals"
• "create more value for customers and more profit for companies"
• "improve decision making"
• "true value … is the result of knowing which data elements to combine and compare in order to produce new knowledge … we must not assume that simply exposing raw data results in good decision making" [95]

Sources: [90], [91], [48], [96], [97]
notable differences in perceptions about the nature of business analytics, there is near consensus that whatever definition is adopted, it involves the notion of fact-based decision making.
This shared element of business analytic definitions is subject to
tweaking. The first tweak involves introduction of “evidence” as an element of these definitions. For instance, Pfeffer and Sutton [74] have
advanced the idea of what they call “evidence-based management”
(also see [75]). Although not using the term “analytics,” they examine
and advocate initiating evidence-based management movements in organizations – adopting a culture or philosophy in which a proposed
change is met with requests for (1) evidence about its need and efficacy,
(2) clear description of logic, and (3) explanation of sources behind the
evidence; this allows it to be examined for relevance, faulty reasoning,
and unreliability. Further, they advocate openness to experimentation and encouragement for continuous learning as important aspects
of an evidence-based culture. In a similar vein, Rycroft-Malone et al.
[76] hold “that ‘evidence’ in evidence-based practice should be considered to be ‘knowledge derived from a variety of sources that has
been subjected to testing and has found to be [relevant] credible’
([77], p. 311).”
Because the “evidence-based” notion is more fully developed and
encompassing than “data-based” or “fact-based” terms, we suggest
that the Table 1 definitions can be enhanced by substituting “evidence”
for “fact” or “data” terms, where “evidence” includes hard facts, reliable
measurements, justified estimates, well-reasoned approximations,
unbiased observations, credible explanations, authoritative advice, and
the like. It does not include arbitrary or unfounded beliefs, guesses,
opinions, speculations, conjectures, suspicions, or hearsay. A body of evidence is not driven by desires, emotions, politics, or ideology; nor is it a
captive of preconceptions. This evidentiary enhancement to the concept
of BA allows analytics to draw on inputs about which there is not absolute certitude – as happens, for instance, when hard facts are simply
unavailable (e.g., with what-if analysis, simulation modeling, when
using sales-force estimates, or drawing on expert opinions). Further,
as noted by Simon [78], problem recognition and solution often occur
within the confines of bounded rationality, limiting completeness and
precision of “fact” gathering a priori. That is, in the real world, obtaining
and demonstrating absolute facts about all relevant factors can be too
costly, too late, or infeasible.
A second tweak involves the “decision making” term. While decision
making is certainly a key facet of business, a manager is also concerned
with solving more than decision problems. This suggests there are other
kinds of management problems that may benefit from analytics.
From the SPED taxonomy, we expect that a manager can also produce
evidence-based solutions to problems of making sense, making predictions, and making evaluations. For instance, there are sense-making
problems, whose solutions are not decisions, but rather meaning/understanding. “Problem solving” is a richer, more flexible term, as it
also recognizes types of problems that are not decisional. Accordingly,
we contend that analytics definitions, such as those in Table 1, can be
enhanced by substituting “problem solving” for “decision making.”
In addition to solving a problem, there is also the essential act of recognizing a problem – recognizing occasions for making sense, predictions, evaluations, and decisions. As with problem solving, problem
recognition may be more, or less, evidence-based. Accordingly, we suggest that analytics definitions can be enhanced by including the concept
of “problem recognition” as well as “problem solving.”
Going forward, we adopt a general core characterization of business
analytics as being concerned with evidence-based problem recognition
and solving that happen within the context of business situations. While
consistent with, and underlying, the varied definitional perspectives in
Table 1, this broader working definition of BA allows a more inclusive
view of diversity in kinds of evidence considered (beyond just hard
facts), diversity in kinds of problems managers address (aside from decisional), and diversity in aspects that are addressed (both recognizing
and solving).
5.1. Core perspectives on evidence-based problem recognition and solving
Table 1 shows a structuring of the core analytics concept into six
classes, each reflecting a particular definitional perspective. The six
classes result from our interpretive “factor analysis” of the various definitions based on the emphasis found in each. While there may be some
cross loading, the classes are conceptually distinct. Although related and
complementary (as subsequently illustrated in a BA framework), no one
of them is an instance of another and no one of them is a simple
amalgam of others.
5.1.1. A movement
From the first perspective (exemplified by definitions 1–3), BA is
seen as being a movement. It involves a mind-set in which evidence-based problem recognition and solving govern an entity’s strategies,
operations, and tactics. The entity can be individual, organizational,
inter-organizational (e.g., supply chain), or societal. When an analytics
movement is in place, problem recognition and solving are fed by evidence, within the scope of bounded rationality. Evidence is the driver
in dealing with problems. It is not ignored, handled in a haphazard
way, or treated as an afterthought. When an analytics movement is in
place, evidence is not cherry-picked to support a preconception or
tossed away if it does not. In an analytics movement – that is, in an analytics philosophy or culture – evidence is deemed more important than dogma, politics, fashionability, or personal agendas. We might speculate that, by freeing an
organization from such (oftentimes dysfunctional) forces, a BA movement fosters creativity, encourages collaboration and knowledge sharing, and focuses efforts within a realistic, not fictitious, portrayal of
situations.
When an organization commits to a BA movement, it lives and
breathes evidence-based problem solving (and recognizing) as a foundation for its actions and existence. As definitions 1 and 2 suggest, this
commitment means that an entity’s culture and philosophy are imbued
with a mind-set of evidence-based problem solving. For such an organization, adoption of an analytics philosophy and cultivation of an analytics culture are essential to implementing a BA movement; this mindset is central to pursuing its vision and achieving its mission.
From this analytics-movement perspective, a variety of issues
confront practitioners, researchers, and educators. They need to
study such questions as how to define and instill a BA philosophy,
how to create and sustain BA culture, how to determine the antecedent factors and competitive impacts of a successful BA movement,
and what are ways to measure progress and success when engaging
in a BA movement.
5.1.2. A collection of practices and technologies
In the second class (exemplified by definitions 4–6), BA is seen as
being a collection of practices and technologies. This is a very different
focus than that of the first class. The collection can exist without an
imperative of a BA movement, although its deployment may be less
effective in the absence of such imperative. Here, “practices” refers
to how things get done when it comes to operating on evidence –
with goals of increasing understanding, making predictions, generating new valuable knowledge, and so forth. We take these practices
to include an organization’s acquisition, assimilation, selection, generation, and emission of knowledge (i.e., evidence, in the present
context) in the face of various managerial, resource, and environmental influences [79].
In the modern world, operations on evidence may be digital – in
whole or in part. However, technological solutions do not exist for all
problems and, for others (e.g., involving relatively “small” data or “simple” analysis), can be “overkill.” Recalling our earlier discussions on the
technique dimension of analytics (Section 4), BA techniques may be
technology-based and/or practice-based. More often than not, today,
the two are combined.
Moreover, BA practices do not always deal with number crunching.
Indeed, most – as much as 80% [80] – of an organization’s data or knowledge base is not at all numeric. Yet, such evidence is still subject to BA
practices and technologies. Therefore, with regard to the technique
dimension of analytics, BA techniques may be quantitative, qualitative,
or combinations. The technique dimension also differentiates between
techniques for dealing with structured, semi-structured, and unstructured situations. Number crunching helps with some, but not all,
situations.
The practices-and-technologies perspective is perhaps the most
commonplace view of BA. It is held by technology vendors and by
those who regard analytics as something (e.g., number crunching)
they have been doing for decades. Yet, there are still important, open-ended issues to consider. Among the issues that confront practitioners,
researchers, and educators are: what should exist in a particular
practice-and-technology collection, whether there is value in coordinating the collection with a BA movement approach (and how so), how to
encourage and manage adoption, ways to measure deployment effects,
what factors are antecedents of a successful deployment, how the
collection is affected by advances in BA technologies, and what is the
relationship between massive data and the selection of appropriate
techniques – especially statistical analysis techniques [81].
5.1.3. A transformational process
In the third class (exemplified by definitions 7–9), BA is seen as
being a process of transformation. Evidence is transformed via some process into insight or action. Presumably, a transformation process employs some mix of practices and technologies operating on some body
of evidence and is influenced by some culture. In this perspective, the
focus is on the process that drives, coordinates, controls, and measures
the transformation. What is of interest from this perspective is the
what, why, when, and how of the transformation. However, there is no mention of handling evidence acquisition, storage, and maintenance, or of managing the processors (human and digital) required for performing transformations and using results. We suggest that requisite evidence and processor management should be recognized as an important aspect in the design and execution of a transformation process.
In a survey of INFORMS membership (over 1800 respondents),
Liberatore and Luo [48] find that the transformation-process view is
tied with the capability-set view (discussed below) in terms of the
extent of match with individual members’ views on what analytics is.
The other four views identified in Table 1 were not studied in the survey.
Further, a 2012 poll by INFORMS finds that 72% of voting members largely
concur with the definition: “the scientific process of transforming data
into insight for making better decisions” [82].
From the transformational-process perspective, practitioners,
researchers, and educators face a variety of issues. A central issue is
how to design a process that transforms evidence into insights that
make sense of a situation or decisions that drive actions taken in the
context of a situation. How are the SPED elements incorporated and
coordinated within the process? What are the design requirements
and criteria for evaluating the process? The PAIR model may give
some guidance on this point. Those who study BA as a transformational process also face questions about how to muster sufficient resources to make process execution practical, how to measure process
performance, what steps to take for ensuring process control, and
how to infuse creativity into an evidence-based transformational
process.
5.1.4. A capability set
In the fourth class (exemplified by definitions 10–14), BA is seen
as being a set of capabilities. These are competencies possessed by
an organization and its processors. They determine what can be
done in the way of evidence-based problem recognition and solving.
Definitions belonging to the capability-set perspective suggest inclusion of processors’ skills for managing evidence, using models, and logical reasoning. Collectively, definitions 10–14 indicate that BA capabilities can include:

• Using techniques that are quantitative, qualitative, and combinations
• Using statistical techniques
• Using systematic reasoning
• Working effectively with models that are:
  o Descriptive/Explanatory,
  o Predictive, or
  o Prescriptive
• Working effectively with evidence (e.g., databases, click-streams, documents, sensors, maps)
The extent of BA capabilities can vary, and there is no guarantee that
existence of a particular capability will result in the full use of that capability. For instance, an organization may possess the capability to execute certain technologies, but lack those technologies (conversely,
possessing a technology says nothing about the degree of a firm’s capability for effectively applying that technology).
The definitions are explicitly concerned with capabilities for executing collections of practices and technologies. However, implicitly, BA
capabilities are not limited to an organization’s potential for executing
its repertoire of practices and technologies: there is also some degree
of capability for effectively coordinating the use of such skills in the
course of problem finding and solving. They include capabilities of innovation, improvisation, and imagination involved in building a culture,
designing processes, or conducting specific activities such as evidence
acquisition. The framework introduced later suggests that possessing
processors with such capabilities is necessary for a BA movement to
flourish and BA transformations to happen. It is a capability set that influences what BA transformational processes will be built, and that
functions as both means and constraints on what a BA movement can
accomplish.
From the capability-set perspective, central issues confronting practitioners, researchers, and educators are what a capabilities portfolio should look like, what the relative weightings of its constituent competencies should be, and how the desired capability profile should be built and refreshed [7]. One also faces questions about the measurement, shelf-life, and costs of alternative capabilities, and the impacts
of a capability portfolio on competitiveness. Investigation of this last
point should be cognizant of possible relationships of BA with dynamic
capabilities, operational capabilities, and improvisational capabilities –
all of which are recognized as important competitive factors [83,84].
5.1.5. An activity type set
From the fifth perspective (exemplified by definitions 15–16), BA is
seen as being a set of specific activity types for operating on available evidence, rather than being a set of broad competencies. According to the
definitions, these four activities are accessing, examining, aggregating,
and analyzing evidence. The framework introduced later suggests that
specific types of evidence-manipulation activities are important to consider in design of a transformation process. Various patterns of such
activities, along with execution of various practices and technologies,
are what enable transformations from evidence to insight or decision.
Beyond the four above-mentioned manipulations, there may well be
other activities that are relevant, or other ways of thinking about or
classifying manipulation activities. For instance, the knowledge chain
theory identifies five first-order activities for manipulating evidence
(i.e., acquiring, generating, assimilating, selecting, emitting knowledge)
and four second-order activities of measuring, controlling, coordinating,
and leading manipulation actions [52]. From the specific-activities perspective, issues confronting practitioners, researchers, and educators pertain to how such activities are arranged within the design of transformational processes,
ways to measure and control efficacy of the activities, how to ensure
security, integrity, and regulatory compliance during the performance
of analytics activities, and what factors are antecedents of successful
activity execution.
5.1.6. A decisional paradigm
In the sixth class (exemplified by definitions 17–18), BA is seen as
being a decisional paradigm. It is said to be an approach to decision making – distinct from other approaches, such as naturalistic decision making [85–87] – or is perceived as part of decision making. It is possible
that one paradigm supports another and that paradigms may mutually
conflict. The “analytics” decisional paradigm may be seen as an umbrella
concept that covers a combination of definitions 1–16.
From the decisional-paradigm perspective, a variety of issues confront practitioners, researchers, and educators. A central issue is whether such a paradigm is appropriate, in light of an organization’s analytics
movement maturity and its analytics capabilities. If so, then design of a
transformational process incorporating attendant specific actions, implemented via selected practices and technologies, can proceed. There is also
a need to study such questions as what are the situational traits that
make BA suitable for a given organization, how to attain and sustain
buy-in for this paradigm, ways to integrate BA with other decisional
paradigms, avoiding misuse of analytics, and so forth.
5.1.7. Rationale for analytics
The “Rationale” column in Table 1 identifies several reasons for pursuing BA. These can be consolidated and summarized to give the following rationales for business analytics:
• Achieving a competitive advantage
• Supporting an organization’s strategic and tactical goals
• Attaining better organizational performance
• Attaining better decision outcomes
• Enabling better or more informed decision processes
• Producing knowledge
• Obtaining value from data
In principle, these same factors can be seen as being endogenous
variables whereby outcomes of an organization’s BA efforts can be measured and as a basis for gauging efficacy realized from:
1. Nurturing an analytics movement
2. Adopting a decisional paradigm that stresses analytics
3. Building and growing a managed set of analytics capabilities
4. Designing analytics processes for transforming evidence into insights, decisions, and actions
5. Conducting specific evidence-manipulation activities needed within an analytics initiative
6. Executing analytics practices and technologies
Above, we have identified a variety of unresolved, and perhaps heretofore unidentified, issues for each of these six perspectives. Based on
that discussion, it appears that, when considering BA, all six perspectives deserve attention by researchers, practitioners, educators, and
students. Ignoring any of them renders an organization vulnerable to
sub-par, even unacceptable, outcomes from BA initiatives. However,
there are additional considerations beyond those indicated in any of
the definitions in Table 1. These arise because of relationships that
exist among the six perspectives.
5.2. A unifying foundation for understanding business analytics
Building on the six definitional perspectives of BA, we introduce a
foundational framework that encompasses all of them and their relationships. This framework is a super-perspective that both covers and
unifies the rich collection of analytics definitions. It is designed to be
consistent with, and inclusively supersede, the specialized characterizations of business analytics. As such, the business analytics
framework (BAF) serves as an organizing device for practitioners to
use in planning and evaluating their analytics initiatives, a generative mechanism for researchers to use in identifying and designing
their analytics investigations, and a guiding template for educators
to use for positioning topics and ensuring full coverage in their analytics curricula.

[Fig. 1. Business analytics framework (BAF). The figure arranges six building blocks: a Movement (cultivating an analytics culture) at the base; a Capability Set (evidence, model, quantitative, qualitative, and logic competences) built upon it; a Transformational Process designed from the capability set, yielding insights and decisions; Specific Activities conducted and Practices & Technologies executed within that process; all under a Decisional Paradigm.]

Fig. 1 portrays the BAF, showing a particular arrangement among the six building blocks. To avoid BA attempts
that are insular, piecemeal, out-of-sync, or defective, certain alignments
between blocks are essential.
At the base, we have a BA movement grounded in a rationale, such as
implementing a strategy, achieving a mission, or fulfilling a purpose.
The movement is powered by a philosophy and culture in which
evidence-based problem solving (and recognition) is of high priority.
Adopting an analytics philosophy and cultivating an analytics culture
become core values. Along with other values such as transparency,
integrity, excellence, and accountability, they ultimately shape how
things get done. The resultant conduct and execution of BA determine what things get done (e.g., what decisions get made, what
evaluations are produced, what predictions are devised, what sense
is made of a situation).
A movement will not ‘move’ without a suitable capability set. Without requisite competencies, BA philosophy and culture are to no avail.
Conversely, a strong set of capabilities to use for BA will be relatively ineffective when there is no movement to apply these competences in a
consistent, committed, coordinated, purposeful fashion. It follows that
a strong alignment between an organization’s BA movement and its
set of BA capabilities is necessary. As the BAF portrayal in Fig. 1 indicates,
intentions inherent in the movement need to fit realities inherent in the
capability set. Thus, the BAF generates such questions as how to effect
and sustain not only a BA movement, but also an alignment between
the movement and the capability set. For instance, one might ask
which comes first, the movement or the capability set, what are the antecedents of alignment, what constructs are able to explain this kind of
BA alignment, and who is involved in overseeing the alignment of BA capabilities with a BA movement. Such questions of analytics alignment
deserve careful study, because correct answers may well impact whether
an analytics-centered decisional paradigm can be effective, or even feasible. Each of these questions can be studied from a descriptive or prescriptive standpoint.
Now, looking at the capability set itself, the BAF indicates that an
organization possesses some mix of competences for handling both
quantitative and qualitative evidence in the course of problem solving
(and recognition). Further, skill at both quantitative and qualitative
modeling is necessary. Then, there is the need for competence at logical
reasoning in the treatment of evidence and models. These categories of
competencies are suggested by Table 1 definitions for the capability-set
perspective.
Enumerating competencies within each category is far beyond the
present scope, but a few examples include interpretation of competitor
actions, distilling the message buried in vast data about behaviors of
web site visitors, reasoning about governmental regulations in order
to make recommendations, solving optimization or other modeling
problems, gathering knowledge of supplier development strategies,
adeptness at identifying novel sources of evidence, and expertise in
understanding the factors relevant to facilities siting. It is up to an organization to build a capability set suitable for its own situation. This construction is guided by answers to such questions as what analytics-support capabilities are needed/desired, in what proportions, for how
long, at what cost, and when? The construction, which is ongoing, is
performed in alignment with the BA movement – shaping, buttressing,
and/or emanating from the movement.
The BAF in Fig. 1 shows that the building of a capability set is also
performed in alignment with a transformational process(es). This can
work in two directions. A BA process for transforming evidence into insights or decisions may be designed subject to BA capabilities on hand.
Conversely, the transformational design can inform the building of capabilities, by identifying capability requirements. When there is a
mismatch between capabilities and process, the BA movement is unlikely to bear fruit. In the interest of avoiding a mismatch, there is a
need to study such questions as what are the antecedents of alignment between building BA capabilities and designing evidence-based transformational processes, what constructs are able to explain
this kind of BA alignment, who participates in ensuring the alignment
of BA capabilities with BA transformations, and so on. Again, each
of these questions can be studied from descriptive or prescriptive
standpoints.
The transformational-process component of the BAF may be left to
serendipitous, ad hoc exercise of capabilities. However, this may be
less effective than conscious design and sustained application of
processes for transforming evidence into insights (interpretations, predictions, evaluations) and decisions. In the BAF, a transformational-process design specifies procedures and/or rules that operate on evidence and are performed with analytics capabilities. Such a design is a
mechanism for coordinating, controlling, and measuring analytics
work. Procedures and rules not only harness analytics capabilities, but
also are designed in terms of conducting specific kinds of activities
and executing practices and technologies. In Fig. 1, this is indicated by
the building blocks protruding from the transformational process component. Here, again, alignment is important.
In this case, the nature and degree of alignment is inherent in
transformational-process designs. These indicate how to handle
dependencies among specific evidence-manipulation activities
(e.g., knowledge acquisition, knowledge assimilation) and within
portfolios of practices (e.g., variants of lessons-learned techniques,
Delphi-based methods) and technologies (e.g., data warehousing
system, data mining software). As with other alignments within
the BAF, there are questions that deserve study, including whether requisite BA practices/technologies are possessed for accomplishing a transformation, whether the capability set furnishes competences for executing
them, what patterns of evidence-handling activities are effective for
accomplishing a transformation, and which practices/technologies to
invoke within these activities.
Overall, successful adoption of BA as a decisional paradigm depends
on the BAF alignments. Moreover, in the BAF, application of capabilities is neither opportunistic nor ad hoc. Rather, in keeping with an
organization-wide BA philosophy and culture, such capabilities are
applied as a matter of course and routinely whenever and wherever
necessary.
5.3. A brief BAF application
Here, we apply the BAF to illustrate its concepts and to demonstrate
its relevance for practitioners in planning and evaluating their analytics
initiatives. The illustration involves a study whose findings are based on
anecdotes (e.g., McKesson, BAE Systems, Pfizer) and a large-scale survey
conducted by MIT and IBM researchers [88]. The survey involved 4500
managers and executives from organizations in over 120 countries
and 30 industries. Its goal was to understand how organizations could
use BA to achieve greater competitiveness. One finding is that organizations can take one of two paths to become highly sophisticated or advanced users of analytics: specialized or collaborative. The specialized
path uses a wide range of analytics skills and tools to improve operations within individual business units or functional areas. This usually
leads to pockets of analytical prowess within an organization. On the
other hand, the collaborative path uses analytics broadly across business
units and functional areas to bring the entire organization to the same
level of excellence in analytics sophistication. To demonstrate the versatility of the BAF, we describe the key features of these paths using the six
BAF building blocks.
Both paths are described as being anchored in a data-oriented culture that underscores a pattern of behaviors, practices, and beliefs that
are consistent with the principles of analytical decision making. This is
in line with the BAF’s movement building block, which we have positioned as the base of any BA initiative. Additionally, both paths follow
a decisional paradigm, being presented as pervading every business
decision, from strategy formulation to day-to-day operations. As organizations strive toward analytic sophistication, they are seen as equipping
themselves to manage, understand, and act on data. This is what the BAF
refers to as a capability set. Along a specialized path, for example, capabilities discussed include those of staying abreast of new advances in
analytics and applying them to meet the newest data challenges, such
as capabilities for mining real-time data from the Internet or from unstructured email content.
Depending on the capability set that is built, Kiron et al. [88] differentiate between two kinds of processes that can move organizations forward with analytics. In BAF parlance, these are alternative
approaches for transforming evidence into insights or decisions. A
propensity toward understanding data to explain what is happening,
and why, leads an organization to design a specialized transformation process. In contrast, a collaborative transformation process is
designed by organizations to enable and facilitate flows of suitable
evidence to individuals throughout the enterprise for exercising their
capabilities for making decisions analytically. Each kind of transformation process relies on conducting specific activities by executing various
practices and technologies.
For organizations on a specialized transformational path, Kiron et al.
[88] identify the specific activities of analyzing patterns, identifying
trends, and discovering anomalies, albeit within silos of functional
areas or organizational units. They indicate that performing such activities involves advanced analytics technologies (e.g., predictive modeling) and proven practices (e.g., Six Sigma). In BAF terms, a specialized
transformation process is designed with the flexibility to accommodate
various configurations of these kinds of activities and adoption of these
kinds of practices/technologies. Further, the BAF implies that among the
organization’s many analytics-related capabilities there are some appreciable degrees of competences in conducting these kinds of activities
and in adopting these kinds of practices/technologies.
In contrast, the collaborative path of transformation is designed so
that evidence is readily available and accessible to meet specific needs
of individuals in the organization. Practices include techniques for
integrating data and insights distributed across the organization.
Technologies, such as common data platforms, data visualization tools,
and dashboards, allow individuals to perform such specific activities as
generating snapshots of enterprise performance or developing “what-if” scenarios to manage daily operations and shape future strategies. In
BAF terms, a collaborative transformation process is designed with the
flexibility to accommodate various configurations of the specific activities and the adoption of these kinds of practices/technologies. Further,
the BAF implies that among the organization’s many analytics-related
capabilities there are some appreciable degrees of competences in
conducting the activities and in adopting the practices/technologies to
accomplish a collaborative transformation consistent with enterprise
objectives.
The BAF is derived from multiple academic perspectives on what BA
is – perhaps themselves derived from what other academics have had to
say or from what their authors observed in practice. The result is a unification and subsumption of these perspectives. But, to what extent
does the BAF connect with the world of practitioners? The survey-based and anecdote-based study by Kiron et al. [88] examines behaviors
and experiences of firms engaged in successful business analytics initiatives. By applying the BAF to their study, we have some practice-based
evidence that the BAF constructs are consistent with the real world.
Via actual examples provided by independent researchers, we show
that all of the BAF constructs are illustrated in practice. Our examination
of the Kiron study does not reveal any substantial concepts that are
missing from the BAF. It does, however, reveal that many of the issues
and questions generated from the BAF in Sections 5.1 and 5.2 have not
been identified, addressed, or answered. Accordingly, we suggest that
the BAF serves not only as a theoretical foundation for understanding
business analytics, but also as a guide that can benefit practitioners as
they grapple with the launch and operation of their own analytics
initiatives.
We contend that the BAF is sufficiently promising to warrant further
investigation and application. One starting point for this would be
identifying extensions, modifications, and refinements to the BAF by
examining its capacity to characterize BA initiatives in specific firms.
Although beyond the scope here, the real-world examples highlighted
by Kiron et al. [88] could be examined through the BAF lens. A series
of case studies can be performed, using the BAF to structure interviews
or focus groups with practitioners regarding their own BA experiences
and views. The objectives would be to refine or more fully develop the
framework, to detail ways in which it can be applied in practice, and
to uncover empirically-grounded propositions about the BAF for theory development and hypothesis testing.
6. Discussion
The business analytics framework can be adapted in several ways.
A few examples are discussed here. For the most part, they involve
restricting the framework in various ways or collapsing aspects of the
framework to eliminate their distinctions. Such adaptations are at the
discretion of the BAF user, although we caution that they reduce the resolution and richness relative to full-scale adoption.
6.1. The evidence tweak
Consider the tweak in Section 5, where we use the term “evidence”
instead of commonplace “data” or “facts.” There is nothing in the BAF, or
overall BA foundation, that prevents a reader from replacing the concept
of evidence with “data” or “facts.” However, this results in a restricted
appreciation of business analytics.
The term “data” frequently carries a connotation of referring
to quantitative evidence. By adopting the term “evidence,” we are more
neutral, intentionally avoiding that connotation and allowing that
qualitative evidence can also be used in problem solving. Further, unlike
the “fact” term, “evidence” allows a BAF user to set his/her own threshold for what qualifies as evidence, in contrast to the absolute certainty
required for considering only facts. Another rationale for adopting “evidence” is to be consistent with the evidence-based management movement emanating from Stanford researchers. This connection positions
future research in which analytics can learn from evidence-based management, and vice versa.
The notion of data, whatever meaning is ascribed to it, is subsumed
in the notion of evidence. In the interest of greater generality and
completeness, we choose to use the “evidence” term. While one
may substitute the “data” term, the foundation designed here would
need to be interpreted in a more restrictive manner.
6.2. The problem tweak
A second tweak brings problem recognition and problem solving
to the fore, overshadowing decision making. We observe that much
of the BA heritage is rooted in efforts to support the making of decisions. Further, most of the definitions in Table 1 seem oriented specifically toward decisional problems, although other orientations
(e.g., sense-making problems) are identified in some definitions.
Nevertheless, because our design is driven by the literature, a unifying
foundation necessarily accommodates the multiple kinds of orientations. This leads to a working definition of BA that does not dismiss
additional non-decisional viewpoints about the bounds and nature of
BA. We advocate (and construct) a more inclusive/expansive appreciation of BA as a field of research, study, and practice. The BAF, thus,
advances a richer and more-detailed system of core perspectives and
considerations for the BA field – while still accommodating the support
of decisional problems.
As a related point, explained in Section 5, examination of literature
characterizing BA shows that it is sometimes regarded as being a
kind of decisional paradigm. This is distinct from a notion of defining
BA by following a decisional paradigm. Recall that there are other
(non-BA) paradigms of decision making, such as the political or ideological. Moreover, the literature shows that BA is regarded in ways
other than serving as a decisional paradigm. These views are not concerned exclusively with decisional problems; they recognize that other kinds of
problems exist (e.g., sense-making, evaluation, and prediction) and
that BA can be helpful in dealing with those problems. This suggests
that the BAF may be extended to recognize BA as also being a
sense-making paradigm, an evaluation paradigm, or a prediction
paradigm.
Thus, in the interests of a unifying foundation, one cannot restrict
the characterization of BA to a decision-making heritage or paradigm. Any such restriction would result in characterizing a special
case of BA.
6.3. Other definitions
As another example of adapting the BAF, a reader may prefer some
other definition of BA. Examples are shown in Table 1. For instance,
one may conceive of analytics as a transformation from structured or
unstructured data into data that are in some way quantified for analysis.
This paper develops a more general core characterization of business
analytics: evidence-based problem recognition and solving that happen
within the context of business situations. Notice that this is not the result
of expanding “the definition” of BA. A crucial observation, on which the
paper is based, is that “the definition” of analytics does not exist. The
fact that many definitions prevail is one motivator for designing a unified foundation that encompasses diverse characterizations in a way
that unifies what they have to offer. One of the key contributions of
this paper is to break out of a narrow conception of analytics. We
adopt an inclusive view that takes advantage of the varying conceptions
and emphases found in the literature (including transformation views).
This view accommodates specialized conceptions that may be preferred
or advocated by others, albeit at the sacrifice of a more complete picture
provided by the BAF.
C. Holsapple et al. / Decision Support Systems 64 (2014) 130–141
6.4. Practitioner fit
The inclusive BA foundation we develop is also able to encompass
multiple perspectives of BA practitioners and multiple kinds of BA initiatives. We do not confine ourselves to particular mindsets about BA practice or initiatives (e.g., number crunching). Our research suggests that
such mindsets may well exclude actual or potential activities
and issues involving the treatment of evidence – activities and issues
that are important, useful, or even essential to successful analytics initiatives.
We suggest that what people are doing in the field can be viewed
through the lens of the unified foundation. That is, any instance of BA
in the field fits into the foundation and is subsumed by it. Examination
of how it fits may well reveal unforeseen ways in which a field instance
can be extended or refined, allowing practitioners to go beyond what is
currently being done by relating it to other analytics facets.
We expect that the BAF is applicable to many domains, both within
and outside of business. It is beyond the present scope to compare BA
with such other fields of study, but we contend a careful investigation
would reveal that the foundation advanced here is not a replication of
other fields, and may well contribute to a better understanding of
those fields.
6.5. Component collapse
Each component of the BAF is amenable to further research and development. On the other hand, some readers may be inclined to collapse
multiple extant components into a single component. This can be done,
but very likely at the expense of clarity. For instance, one may prefer to
collapse the collection of practices and technologies into the capability
set component, arguing that the latter subsumes specific practices and
technologies. The literature notes, however, that these are distinct notions. The distinction is acknowledged in the BAF introduced here
(and explained using the example described in Section 5.3).
We cannot claim that a capability set subsumes the specific practices
and technologies any more than we can claim that the movement component, say, subsumes a capability set. Capabilities are not limited to a
potential for executing a firm’s repertoire of practices and technologies.
For instance, a BA capability set may include capabilities of innovation
or improvisation involved in building a culture, designing transformation processes, or conducting specific activities such as evidence
acquisition. Setting aside this point, we must recognize that a firm
may possess capabilities to execute certain technologies, but lack
those technologies. Conversely, possessing a technology says nothing
about the degree of a firm’s capability for effectively applying that technology; it may at any point in time be non-existent, modest, or without
parallel.
As the paper contends, alignments among elements of the BAF are
important considerations. Misalignments of a capability set with a set
of technologies/practices should be of particular note to management,
as they can suggest opportunities for, and obstacles to, improving a
firm’s BA initiatives/results. We cannot just assume that such alignment
exists, much less that one subsumes the other.
7. Conclusion
This paper builds on prior research to design a relatively comprehensive and holistic characterization of business analytics as a real-world
phenomenon, a subject for investigation, an approach to decision making, and an area for business education. It examines roots, progress,
needs, and issues pertaining to the current state of analytics. We identify
three kinds of analytics dimensions and six distinct classes of analytics
perspectives. We use a synthesizing approach to design the Business
Analytics Framework, which accounts for the six classes being complementary and points out important alignments that are needed. One
contribution of the BAF, along with the three attendant dimensions,
is the provision of a unifying, inclusive foundation for building an
understanding of what is involved in the study of BA – effectively
amounting to an ontology that researchers, practitioners, and educators
can use to communicate about, and frame their ideas about, business
analytics. Another benefit of this foundation is its provocation of a
host of un- or under-investigated BA issues for researchers to explore
and for practitioners to take into account. Many of these are brought
out in Sections 5.1 and 5.2.
For BA to actually work in an organization, there are issues quite
aside from data management, number crunching, technology, systematic
reasoning, and so on. These added issues pertain to whether an organization is ready to adopt BA as a decisional paradigm, or whether such an
attempt is likely to languish. Key factors suggested are awareness of,
and commitment to, the organization’s vision, mission, and strategy; an
analytics-friendly culture; a management philosophy that understands
and supports the use of business analytics; and techniques for preventing
evidence from being trumped by anti-analytics factors such as apathy, apprehension, coercion, envy, fashion, or ideology. In sum, this paper takes an
initial step toward building a foundation for the study, operation, and investigation of analytics-intensive organizations.
Acknowledgement
Authors are listed in alphabetical order.
References
[1] R. Kalakota, Gartner says – BI and Analytics a $12.2 Bln market, Posted April 24, 2011, http://practicalanalytics.wordpress.com/2011/04/24/gartner-says-bi-and-analytics-a-10-5-bln-market/, 2011.
[2] T. Elliot, 2012: The year analytics means business, Posted February 10, 2012 http://
smartdatacollective.com/timoelliott/45868/2012-year-analytics-means-business
2012.
[3] R. Kalakota, IBM CIO study: BI and analytics are #1 priority for 2012, Posted November 2, 2011, http://practicalanalytics.wordpress.com/2011/11/02/ibm-cio-study-bi-and-analytics-are-1-priority-for-2012/, 2011.
[4] D. Harris, IDC: Analytics a $51B business by 2016 thanks to big data, Accessed Jul 11, 2012, http://gigaom.com/cloud/idc-analytics-a-51b-business-by-2016-thanks-to-big-data/, 2012.
[5] S. Swoyer, Demand for analytics keeps growing, The Data Warehouse Institute,
2012. (Posted September 18, 2012. http://tdwi.org/articles/2012/09/18/Demand%
20Analytics%20Grows.aspx ).
[6] Nucleus Research, Analytics pays back $10.66 for every dollar spent, Report L122,
December 2011.
[7] J. LaCugna, Foreword, in: J. Liebowitz (Ed.), Big Data and Business Analytics, CRC
Press, Boca Raton, FL, 2013, pp. vii–xiii.
[8] J. Liebowitz (Ed.), Big Data and Business Analytics, CRC Press, Boca Raton, FL, 2013.
[9] H. Watson, Business analytics insight: hype or here to stay? Business Intelligence
Journal 16 (1) (2011) 4–8.
[10] A. Newell, H.A. Simon, The logic theory machine, Rand Corporation Report P-868,
June 1956.
[11] G.B. Dantzig, Thoughts on linear programming and automation, Management
Science 3 (2) (1957) 131–139.
[12] C.W. Holsapple, A. Whinston, Decision Support Systems: A Knowledge-Based
Approach, West, St. Paul, MN, 1996.
[13] D.J. Power, R. Sharda, Decision support systems, in: S.Y. Nof (Ed.), Handbook of
Automation, Part I, 2009, pp. 1539–1548.
[14] H. Watson, B. Wixom, The current state of business intelligence, IEEE Computer 40
(9) (2007) 96–99.
[15] H.P. Luhn, A Business Intelligence System, IBM Journal of Research and Development 2 (4) (1958) 314–319.
[16] S. Negash, P. Gray, Business intelligence, in: F. Burstein, C. Holsapple (Eds.), Handbook on Decision Support Systems 2, Springer, Heidelberg, 2008, pp. 176–193.
[17] A. Bennet, D. Bennet, The decision-making process in a complex situation, in: F.
Burstein, C. Holsapple (Eds.), Handbook on Decision Support Systems 1, Springer,
Heidelberg, 2008, pp. 1–20.
[18] R.L. Ackoff, The Design of Social Research, University of Chicago Press, Chicago, 1953.
[19] R.H. Kilmann, I.I. Mitroff, Qualitative versus quantitative analysis for management
science: different forms for different psychological types, Interfaces 6 (2) (1976)
17–27.
[20] J. Mingers, Soft OR comes of age-but not everywhere! Omega 39 (6) (2011)
729–741.
[21] T. Davenport, R. Kalakota, J. Taylor, M. Lampa, B. Franks, J. Shapiro, G. Cokins, R. Way,
J. King, L. Schafer, C. Renfrow, D. Sittig, Predictions for Analytics in 2012, International Institute for Analytics Research Brief, December 15, 2011.
[22] J. Betser, D. Belanger, Architecting the enterprise with big data analytics, in: J.
Liebowitz (Ed.), Big Data and Business Analytics, CRC Press, Boca Raton, FL, 2013,
pp. 1–20.
[23] J. Kobielus, Predictions and plans for business analytics in 2011, Posted on January 6,
2011 http://blogs.forrester.com/james_kobielus/11-01-06-predictions_and_plans_
for_business_analytics_in_2011 2011.
[24] J. Burby, S. Atchison, Actionable Web Analytics: Using Data to Make Smart Business
Decisions, Sybex, Indianapolis, 2007.
[25] A. Kaushik, Web Analytics 2.0: The Art of Online Accountability and Science of
Customer Centricity, Sybex, Indianapolis, 2009.
[26] L. Hasan, A. Morris, S. Probets, Using Google Analytics to evaluate the usability of
e-commerce sites, in: M. Kurosu (Ed.), Human Centered Design, HCII Lecture
Notes in Computer Science, 5619, 2009, pp. 697–706.
[27] R.P.L. Buse, T. Zimmermann, Analytics for software development, FSE/SDP Workshop on the Future of Software Engineering Research, ACM, Santa Fe, November
2010.
[28] R.P.L. Buse, T. Zimmermann, Information needs for software development analytics,
International Conference on Software Engineering, Zurich, June 2012.
[29] B.M. Tomaszewski, A.C. Robinson, C. Weaver, M. Stryker, A.M. MacEachren,
Geovisual analytics and crisis management, International ISCRAM Conference,
Delft, The Netherlands, May 2007.
[30] M. Jern, M. Brezzi, P. Lundblad, Geovisual analytics tools for communicating emergency and early warning, in: M. Konecny, T.L. Bandrova, S. Zlatanova (Eds.), Geographic Information and Cartography for Risk and Crisis Management, Springer,
Heidelberg, 2010, pp. 379–394.
[31] R.W. Maule, G. Schacher, S.P. Gallup, B. McClain, Applied knowledge analytics for
military experimentation, Proceedings of IEEE International Conference on Information Reuse and Integration, 2004, pp. 91–96.
[32] G.S. Spais, C. Veloutsou, Marketing analytics: managing incomplete information in
consumer markets and the contribution of mathematics to the accountability of
marketing decisions, HERCMA Conference, September 2005, Athens, 2005.
[33] W.J. Hauser, Marketing analytics: the evolution of marketing research in the twenty-first century, Direct Marketing: An International Journal 1 (1) (2007) 38–54.
[34] T. Davenport, The dark side of customer analytics, Harvard Business Review 85 (5)
(2007) 37–48.
[35] S.D. Lichtenstein, H.B. Bednall, S. Adam, Marketing research and customer analytics: interfunctional knowledge integration, International Journal of Technology Marketing 3 (1) (2008) 81–96.
[36] C.H. Tian, R.Z. Cao, H. Zhang, F. Li, W. Ding, B. Ray, Service analytics framework for web-delivered services, International Journal of Services Operations and Informatics 4 (4) (2009) 317–332.
[37] A. Levenson, Harnessing the power of HR analytics, Strategic HR Review 4 (3)
(2005) 28–31.
[38] C. Royal, L. O'Donnell, Emerging human capital analytics for investment processes,
Journal of Intellectual Capital 9 (3) (2008) 367–379.
[39] T. Davenport, J. Harris, J. Shapiro, Competing on talent analytics, Harvard Business
Review 88 (10) (2010) 52–58.
[40] J. Eicher, D. Ruder, Business process analytics: a new approach to risk, Journal of
Alternative Investments 10 (2) (2007) 76–84.
[41] P. Trkman, K. McCormack, M. de Oliveira, M. Ladeira, The impact of business analytics
on supply chain performance, Decision Support Systems 49 (3) (2010) 318–327.
[42] J. O'Dwyer, R. Renner, The promise of advanced supply chain analytics, Supply Chain Management Review, January 2011, 32–37.
[43] B. Ray, C. Apte, K. McAuliffe, L. Deleris, E. Cope, Harnessing Uncertainty: The Future
of Risk Analytics, IBM Research Report RC24534, 2008.
[44] M. Smelyanskiy, Challenges of mapping financial analytics to many-core architecture,
Workshop on High Performance Computational Finance, Austin November 2008, 2008.
[45] S.P. Singh, T.G. Sawhney, Predictive analytics and the new world of retail healthcare,
Health Management Technology 27 (1) (2006) 46–50.
[46] J.F. Hair, Knowledge creation in marketing: the role of predictive analytic, European
Business Review 19 (4) (2007) 303–315.
[47] A. Robinson, J. Levis, G. Bennett, INFORMS News: INFORMS to Officially Join Analytics Movement, OR/MS Today 37 (5) (2010).
[48] M. Liberatore, W. Luo, INFORMS and the analytics movement: the view of the membership, Interfaces 41 (6) (2011) 578–589.
[49] D. Delen, H. Demirkan, Data, information and analytics as services, Decision Support
Systems 55 (1) (2013) 359–363.
[50] J.R. Evans, Business Analytics: Methods, Models, and Decisions, Prentice-Hall, Englewood Cliffs, NJ, 2013.
[51] H.A. Simon, The New Science of Management Decision, Prentice-Hall, Englewood
Cliffs, NJ, 1960.
[52] C.W. Holsapple, M. Singh, The knowledge chain model: activities for competitiveness, Expert Systems with Applications 20 (1) (2001) 77–98.
[53] W.F. Cody, J.T. Kreulen, V. Krishna, W.S. Spangler, The integration of business intelligence and knowledge management, IBM Systems Journal 41 (4) (2002) 697–713.
[54] T. Davenport, J.G. Harris, R. Morison, Analytics at Work: Smarter Decisions, Better
Results, HBR Press, Boston, 2010.
[55] W.H. Inmon, A. Nesavich, Tapping into Unstructured Data: Integrating Unstructured
Data and Textual Analytics into Business Intelligence, Pearson Education, Boston,
2008.
[56] A.C.M. Fong, S.C. Hui, G. Jha, Data mining for decision support, IT Professional 4 (2)
(2002) 9–17.
[57] C. McCue, Data mining and predictive analytics in public safety and security, IT Professional 8 (4) (2006) 12–18.
[58] W.R. King, Text analytics: boon to knowledge management? Information Systems
Management 26 (1) (2009) 87.
[59] A. Groß-Klußmann, N. Hautsch, When machines read the news: using automated
text analytics to quantify high frequency news-implied market reactions, Journal
of Empirical Finance 18 (2) (2011) 321–340.
[60] H. Takeuchi, A conversation-mining system for gathering insights to improve agent
productivity, IEEE International Conference on Enterprise Computing, E-Commerce,
and E-Services, July 2007, Tokyo, 2007.
[61] J. Wax, Speech analytics: providing unparalleled levels of business intelligence, Journal of Customer & Contact Centre Management 1 (2) (2011) 195–204.
[62] H. Hasan, P. Hyland, Using OLAP and multidimensional data for decision making, IT
Professional 3 (5) (2001) 44–50.
[63] N. Jukic, B. Jukic, M. Malliaris, Online Analytical Processing (OLAP) for Decision Support, in: F. Burstein, C. Holsapple (Eds.), Handbook on Decision Support Systems 1,
Springer, Heidelberg, 2008, pp. 259–276.
[64] J.N. Hallick, Analytics and the data warehouse, Health Management Technology 22
(6) (2001) 24–25.
[65] D. Taniar, Progressive Methods in Data Warehousing and Business Intelligence:
Concepts and Competitive Analytics, Information Science Reference, Hershey, PA,
2009.
[66] J. Celko, Analytics and OLAP in SQL, Morgan Kaufman, San Francisco, 2006.
[67] M. Hsu, Q. Chen, Scalable data-intensive analytics, Lecture Notes in Business Information Processing 27 (2009) 97–107.
[68] R.W. Selby, Analytics-driven dashboards enable leading indicators for requirements
and designs of large-scale systems, IEEE Software 26 (1) (2009) 41–49.
[69] U. Olsson, J. Fourie, Analytics-the truth is in there, Ericsson Review 88 (1) (2011)
28–33.
[70] D. Keim, G. Andrienko, J.-D. Fekete, C. Görg, J. Kohlhammer, G. Melançon, Visual analytics: definition, process, and challenges, Lecture Notes in Computer Science 4950
(2008) 154–175.
[71] C. Chabot, Demystifying visual analytics, IEEE Computer Graphics and Applications
29 (2) (2009) 84–87.
[72] J. Kohlhammer, D. Keim, M. Pohl, G. Santucci, G. Andrienko, Solving problems with
visual analytics, Procedia Computer Science 7 (2011) 117–120.
[73] T. Davenport, J.G. Harris, Competing on Analytics, HBR Press, Boston, 2007.
[74] J. Pfeffer, R.I. Sutton, Evidence-based management, Harvard Business Review 84 (1)
(2006) 62–75.
[75] J. Pfeffer, Evidence-Based Management for Entrepreneurial Environments: Faster
and Better Decisions with Less Risk, Stanford GSB Research Paper No. 2051, 2010.
[76] J. Rycroft‐Malone, K. Seers, A. Titchen, G. Harvey, A. Kitson, B. McCormack, What
counts as evidence in evidence‐based practice? Journal of Advanced Nursing 47
(1) (2004) 81–90.
[77] J. Higgs, M. Jones, Will evidence-based practice take the reasoning out of practice?
in: J. Higgs, M. Jones (Eds.), Clinical Reasoning in the Health Professionals, 2nd edition, Butterworth Heineman, Oxford, 2000, pp. 307–315.
[78] H.A. Simon, Bounded rationality and organizational learning, Organization Science 2
(1) (1991) 125–134.
[79] C.W. Holsapple, K.D. Joshi, A formal knowledge management ontology: conduct, activities, resources, and influences, Journal of the American Society for Information
Science and Technology 55 (7) (2004) 593–612.
[80] R.T. Herschel, N.E. Jones, Knowledge management and business intelligence: the importance of integration, Journal of Knowledge Management 9 (4) (2005) 45–55.
[81] R. Bapna, P. Goes, R. Gopal, J.R. Marsden, Moving from data-constrained to dataenabled research: experiences and challenges in collecting, validating and analyzing
large-scale e-commerce data, Statistical Science 21 (2) (2006) 116–130.
[82] INFORMS Online, INFORMS Insta-Poll results: INFORMS (almost) official definition of analytics? Posted June 25, 2012, http://www.informs.org/Blogs/E-News-Blog/INFORMS-Insta-Poll-Results-INFORMS-Almost-Official-Definition-of-Analytics, 2012.
[83] D. Teece, G. Pisano, A. Shuen, Dynamic capabilities and strategic management, Strategic Management Journal 18 (7) (1997) 509–533.
[84] P.A. Pavlou, O.A. El Sawy, The “Third Hand”: IT-enabled competitive advantage in
turbulence through improvisational capabilities, Information Systems Research 21
(3) (2010) 443–471.
[85] V.L. Patel, D.R. Kaufman, J.F. Arocha, Emerging paradigms of cognition in medical
decision-making, Journal of Biomedical Informatics 35 (1) (2002) 52–75.
[86] D.J. Bryant, D.G. Webb, C. McCann, Synthesizing two approaches to decision making
in command and control, Canadian Military Journal (Spring 2003) 29–34.
[87] E. Salas, M.A. Rosen, D. DiazGranados, Expertise-based intuition and decision making in organizations, Journal of Management 36 (4) (2010) 941–973.
[88] D. Kiron, R. Shockley, N. Kruschwitz, G. Finch, M. Haydock, Analytics: the widening
divide, Sloan Management Review 53 (3) (2011) 1–22.
[89] K. Ramamurthy, A. Sen, A.P. Sinha, Data warehousing infusion and organizational
effectiveness, IEEE Transactions on Systems, Man, and Cybernetics 38 (4) (2008)
976–994.
[90] T. Larsson, R. Lundgren, The Power of Knowing: A Case Study on Data Driven Management, Lund Institute of Technology, Lund, Sweden, 2009.
[91] M. Liberatore, W. Luo, The analytics movement: implications for operations research, Interfaces 40 (4) (2010) 313–324.
[92] R. Bose, Advanced analytics: opportunities and challenges, Industrial Management
and Data Systems 109 (2) (2009) 155–172.
[93] J. Taylor, Intelligent automated processes: embedding analytics in decisions, in: L. Fischer (Ed.), Business Process Management and Workflow Handbook, Future Strategies Inc., Lighthouse Point, FL, 2010, pp. 71–78.
[94] S. Tyagi, Using data analytics for greater profits, Journal of Business Strategy 24 (3)
(2003) 12–14.
[95] TDWI, Portal of the Data Warehousing Institute, http://tdwi.org/portals/business-analytics.aspx, 2012, accessed June 2013.
[96] L.E. Rosenberger, J. Nash, The Deciding Factor: The Power of Analytics to Make Every
Decision a Winner, Jossey-Bass, San Francisco, 2009.
[97] G. Ravishanker, Doing Academic Analytics Right: Intelligent Answers to Simple
Questions, Research Bulletin, EDUCAUSE Center for Applied Research, 2011.
Dr. Clyde W. Holsapple, a Fellow of the Decision Sciences Institute and Senior Member of
the ACM, holds the Rosenthal Endowed Chair in the University of Kentucky's Gatton College of Business. He has authored more than 150 articles in journals including Decision
Support Systems, Decision Sciences, Operations Research, Journal of Operations Management,
Organization Science, Entrepreneurship Theory and Practice, Journal of the American Society
for Information Science and Technology, ACM Transactions on Management Information Systems, Communications of the ACM, Journal of Management Information Systems, Information
& Management, Information Systems Journal, Information Systems Research, Journal of
Knowledge Management, Journal of Strategic Information Systems, Expert Systems with Applications, IEEE Intelligent Systems, International Journal of Electronic Commerce, and the IEEE
Transactions on Systems, Man, and Cybernetics. Professor Holsapple’s books include Foundations of Decision Support Systems, Decision Support Systems: A Knowledge-based Approach,
Handbook on Knowledge Management, and Handbook on Decision Support Systems. In all,
his publication impact exceeds the 10,000-citation level, with an h-index of 50. Thomson Essential
Science Indicators has recognized his research as being among the top 1%; he is the recipient of the AIS-SIGDSS Award for the year’s Best Journal Research Paper, IACIS Computer
Educator of the Year, the UK Chancellor’s Award for Outstanding Teaching, and university-wide recognition for Teaching Excellence across several years at the University of Illinois.
Dr. Holsapple has chaired 30 doctoral dissertations and serves as Editor-in-Chief of the
Journal of Organizational Computing and Electronic Commerce.
Dr. Anita Lee-Post is Associate Professor in the Department of Marketing and Supply
Chain Management at the University of Kentucky. She received her Ph.D. in Business Administration
from the University of Iowa. Her research interests include sustainable supply chains, supply chain relationship management, e-learning, artificial intelligence, machine learning,
knowledge-based systems, computer integrated manufacturing, and group technology.
She has published in such journals as OMEGA, Decision Sciences Journal of Innovative
Education, International Journal of Production Research, Computer and Industrial Engineering,
International Journal of Operations and Production Management, Journal of the Operational
Research Society, Annals of Operations Research, Journal of Information Technology, Information & Management, Expert Systems, Expert Systems with Applications, IEEE Expert, Journal of
Intelligent Manufacturing, and OM Review. She is the author of Knowledge-based FMS
Scheduling: An Artificial Intelligence Perspective. Dr. Lee-Post serves on editorial review
boards of the International Journal of Data Mining, Modeling and Management, Journal of
Managerial Issues, Production Planning and Control, and Business Systems Research.
Dr. Ramakrishnan (Ram) Pakath is Professor of Finance and Quantitative Methods at the
Gatton College of Business & Economics, University of Kentucky. He holds a bachelor’s degree in Mechanical Engineering from Bangalore University, a master’s degree in Business
Administration from the University of Madras, a master’s degree in Operations Research
and Industrial Engineering from The University of Texas at Austin, and a doctorate in Management (MIS) from Purdue University. Ram's current research interests lie in the areas of
Evolutionary Computation and Data Mining. His research articles have appeared in such
refereed forums as Behaviour and Information Technology, Computational Economics,
Decision Sciences, Decision Support Systems, European Journal of Operational Research, IEEE
Transactions on Systems, Man, and Cybernetics, Information & Management, Information
Systems Research, Journal of Computer Information Systems, and Journal of Electronic
Commerce Research. He is author of the book Business Support Systems: An Introduction
published by Copley. Ram has contributed refereed material to the following books: Cases
on Information Technology Management in Modern Organizations, Decision Support Systems:
A Knowledge-based Approach, Handbook on Decision Support Systems 1 — Basic Themes,
Handbook of Industrial Engineering, Management Impacts of Information Technology:
Perspectives on Organizational Change and Growth, Multimedia Technology and Applications,
Security, Trust, and Regulatory Aspects of Cloud Computing in Business Environments, and
Operations Research and Artificial Intelligence. He is an Associate Editor of Decision Support
Systems and an Editorial Board Member of the Journal of End User Computing, Management,
and The Open Artificial Intelligence Journal. His research has been funded by IBM, Ashland
Oil, the Gatton College, the University of Kentucky, and the Kentucky Science and Engineering Foundation.