Analytics 3.0
Those of us who have spent years studying “data smart” companies
believe we’ve already lived through two eras in the use of analytics.
We might call them BBD and ABD—before big data and after big data.
Or, to use a naming convention matched to the topic, we might say that
Analytics 1.0 was followed by Analytics 2.0.
Generally speaking, 2.0 releases don’t just add some bells and whistles
or make minor performance tweaks. In contrast to, say, a 1.1 version, a
2.0 product is a more substantial overhaul based on new priorities and
technical possibilities.
When large numbers of companies began capitalizing on vast new
sources of unstructured, fast-moving information—big data—that was
surely the case.
The use of data to make decisions is, of course, not a new idea; it is as
old as decision making itself.
But the field of business analytics was born in the mid-1950s, with the
advent of tools that could produce and capture a larger quantity of
information and discern patterns in it far more quickly than the
unassisted human mind ever could.
The Evolution of Analytics
Analytics 1.0—the era of “business intelligence.” What we are here
calling Analytics 1.0 was a time of real progress in gaining an objective,
deep understanding of important business phenomena and giving
managers the fact-based comprehension to go beyond intuition when
making decisions.
For the first time, data about production processes, sales, customer
interactions, and more were recorded, aggregated, and analyzed. New
computing technologies were key. Information systems were at first
custom-built by companies whose large scale justified the investment;
later, they were commercialized by outside vendors in more-generic
forms.
This was the era of the enterprise data warehouse, used to capture
information, and of business intelligence software, used to query and
report it.
Data sets were small enough in volume and static enough in velocity to
be segregated in warehouses for analysis. However, readying a data
set for inclusion in a warehouse was difficult. Analysts spent much of
their time preparing data for analysis and relatively little time on the
analysis itself.
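To make the 1.0 workflow concrete, here is a minimal sketch of the kind of query-and-report step a business intelligence tool performed against a warehouse; the sales table, its columns, and the figures are illustrative stand-ins rather than details from any company discussed here.

```python
import sqlite3

# In-memory stand-in for an enterprise data warehouse; the sales table and
# its columns are illustrative assumptions, not drawn from the article.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, sale_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EMEA", "1999-01-15", 1200.0), ("EMEA", "1999-02-03", 950.0),
     ("AMER", "1999-01-20", 2100.0)],
)

# The reporting query itself was simple; most analyst time in this era went
# into getting data into a clean, queryable shape in the first place.
report = conn.execute(
    """
    SELECT region, substr(sale_date, 1, 7) AS month, SUM(amount) AS revenue
    FROM sales
    GROUP BY region, month
    ORDER BY region, month
    """
).fetchall()

for region, month, revenue in report:
    print(f"{region}  {month}  {revenue:10.2f}")
```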
More than anything else, it was vital to figure out the right few questions
on which to focus, because analysis was painstaking and slow, often
taking weeks or months to perform.
The era of big data, which we call Analytics 2.0, began in the mid-2000s,
when internet and social network firms started to amass and analyze vast
quantities of new kinds of information. Although the term “big data”
wasn’t coined immediately, the new reality it signified very quickly
changed the role of data and analytics in those firms.
Big data also came to be distinguished from small data because it was
not generated purely by a firm’s internal transaction systems.
It was externally sourced as well, coming from the internet, sensors of
various types, public data initiatives such as the Human Genome
Project, and captures of audio and video recordings.
As analytics entered the 2.0 phase, the need for powerful new tools—
and the opportunity to profit by providing them—quickly became
apparent.
Soon the data scientists who emerged in this era were not content to
remain in the back office; they wanted to work on new product offerings
and help shape the business.
Analytics 3.0—the era of data-enriched offerings. During 2.0, a
sharp-eyed observer could have seen the beginnings of analytics’ next
big era.
This new era is not confined to the online pioneers; it involves every firm
in every industry. If your company makes things, moves
things, consumes things, or works with customers, you have increasing
amounts of data on those activities.
Every device, shipment, and consumer leaves a trail. You have the
ability to analyze those sets of data for the benefit of customers and
markets. You also have the ability to embed analytics and optimization
into every business decision made at the front lines of your operations.
Like the first two eras of analytics, this one brings new challenges and
opportunities, both for the companies that want to compete on analytics
and for the vendors that supply the data and tools with which to do so.
First, however, let’s consider what Analytics 3.0 looks like in some well-
known firms—all of which were decidedly offline businesses for most of
their many decades in operation.
The Next Big Thing, in Beta
The Bosch Group, based in Germany, is 127 years old, but it’s hardly
last-century in its application of analytics. The company has embarked
on a series of initiatives across business units that make use of data
and analytics to provide so-called intelligent customer offerings.
UPS, another long-established company, offers a parallel example in route
optimization: its system relies heavily on online map data and
optimization algorithms and will eventually be able to reconfigure a
driver’s pickups and deliveries in real time. In 2011 this analytical work
cut 85 million miles out of drivers’ routes, thereby saving more than 8.4
million gallons of fuel.
The common thread in these examples is the resolve by a company’s
management to compete on analytics not only in the traditional sense
(by improving internal business decisions) but also by creating more-
valuable products and services. This is the essence of Analytics 3.0.
Ten Requirements for Capitalizing on Analytics 3.0
A new set of data management options. In the 1.0 era, firms used
data warehouses as the basis for analysis. In the 2.0 era, they focused
on Hadoop clusters and NoSQL databases.
Today the technology answer is “all of the above”: data warehouses,
database and big data appliances, environments that combine
traditional data query approaches with Hadoop (these are sometimes
called Hadoop 2.0), vertical and graph databases, and more.
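As a sketch of that “all of the above” posture, the example below assumes a PySpark environment with hypothetical connection details; it shows a single SQL interface spanning a traditional warehouse table, read over JDBC, and raw files landed in Hadoop-style storage.

```python
# A minimal sketch, assuming PySpark is available; URLs, tables, paths, and
# credentials are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hybrid-analytics").getOrCreate()

# Structured data from the legacy warehouse (hypothetical JDBC endpoint).
orders = spark.read.jdbc(
    url="jdbc:postgresql://warehouse.example.com/sales",
    table="orders",
    properties={"user": "analyst", "password": "..."},
)

# Semi-structured clickstream data landed on HDFS or object storage.
clicks = spark.read.json("hdfs:///raw/clickstream/2013/*.json")

orders.createOrReplaceTempView("orders")
clicks.createOrReplaceTempView("clicks")

# One query spanning both sources through the same SQL interface.
spark.sql("""
    SELECT o.customer_id,
           COUNT(c.page) AS pages_viewed,
           SUM(o.amount) AS spend
    FROM orders o JOIN clicks c ON o.customer_id = c.customer_id
    GROUP BY o.customer_id
""").show()
```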
Chief analytics officers. When analytics are this important, they need
senior management oversight. Companies are beginning to create
“chief analytics officer” roles to superintend the building and use of
analytical capabilities.
Organizations with C-level analytics leaders include AIG, FICO, USAA,
the University of Pittsburgh Medical Center, the Obama reelection
campaign, Wells Fargo, and Bank of America. The list will undoubtedly
grow.
IBM, for instance, formerly used 150 models in its annual “demand
generation” process, which assesses which customer accounts are
worth greater investments of salesperson time and energy.
Working with a small company, Modern Analytics, and using a “model
factory” and “data assembly line” approach, IBM now creates and
maintains 5,000 such models a year—and needs just four people to do
so. Its new systems can build 95% of its models without any human
intervention, and another 3% require only minimal tuning from an
analyst.
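The sketch below is not IBM’s or Modern Analytics’ actual system; it is a minimal illustration of the model-factory idea, built with scikit-learn on synthetic data, in which a propensity model is fitted for each segment automatically and only the weak ones are flagged for an analyst.

```python
# Model-factory sketch: fit one model per segment automatically, score it,
# and route only low-scoring models to a human. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fake_segment_data(n=500, n_features=8):
    X = rng.normal(size=(n, n_features))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=n)) > 0
    return X, y.astype(int)

models, needs_review = {}, []
# Stand-in for the thousands of account segments a real factory would cover.
for segment in [f"segment_{i:03d}" for i in range(50)]:
    X, y = fake_segment_data()
    model = LogisticRegression(max_iter=1000).fit(X, y)
    score = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    models[segment] = model
    if score < 0.65:                  # arbitrary quality gate
        needs_review.append(segment)  # the small fraction needing an analyst

print(f"built {len(models)} models; {len(needs_review)} flagged for review")
```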
Many of these analytical approaches will give you greater certainty
before taking action. Managers need to become comfortable with
data-driven experimentation. They should demand that any important
initiative be preceded by small-scale but systematic experimentation,
with rigorous controls to
permit the determination of cause and effect. Imagine, for example, if
Ron Johnson’s tenure as CEO of J.C. Penney had involved limited
experiments rather than wholesale changes, most of which turned out
badly.
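Here is a minimal sketch of such a small-scale, controlled test, using simulated conversion data; the libraries, sample sizes, and significance threshold are illustrative assumptions rather than anything prescribed above.

```python
# Compare a treatment group against a control before any wide rollout.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

control   = rng.binomial(1, 0.050, size=5000)   # existing experience
treatment = rng.binomial(1, 0.056, size=5000)   # proposed change, small pilot

# Two-sample comparison of conversion rates; at this sample size a
# chi-squared or z-test would give essentially the same answer.
t_stat, p_value = stats.ttest_ind(treatment, control)

lift = treatment.mean() - control.mean()
print(f"observed lift: {lift:.3%}, p-value: {p_value:.3f}")
if p_value < 0.05:
    print("evidence the change helps; consider a wider rollout")
else:
    print("no reliable effect yet; keep experimenting before a full rollout")
```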
Paradoxically, some of the changes prompted by the widespread
availability of big data will not yield much certainty. Big data flows
continuously—consider the analysis of brand sentiment derived from
social media sources—and so metrics will inevitably rise and fall over
time.
Such “digital smoke signals,” as they have been called, can serve as an
early warning system for budding problems. But they are indicative, not
confirmatory. Managers will have to establish guidelines for when early
warnings should cue decisions and action.
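One way such a guideline might be operationalized is sketched below with simulated daily sentiment scores: smooth the incoming metric and raise a warning only when it crosses a threshold agreed on in advance. The series, window, and threshold are all assumptions made for illustration.

```python
# "Digital smoke signal" sketch: indicative early warning, not confirmation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
days = pd.date_range("2013-01-01", periods=120, freq="D")
sentiment = pd.Series(0.2 + rng.normal(scale=0.08, size=len(days)), index=days)
sentiment.iloc[90:] -= 0.15                     # simulated dip after day 90

smoothed = sentiment.rolling(window=7).mean()   # dampens day-to-day noise
baseline = smoothed.iloc[:60].mean()
threshold = baseline - 0.10                     # guideline agreed in advance

alerts = smoothed[smoothed < threshold]
if not alerts.empty:
    print(f"early warning on {alerts.index[0].date()}: "
          f"smoothed sentiment {alerts.iloc[0]:.2f} vs baseline {baseline:.2f}")
```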
Additional uncertainty arises from the nature of big data relationships.
Unless they are derived from formal testing, the results from big data
generally involve correlation, not causation, and sometimes they occur
by chance (although having greater amounts of data increases the
likelihood that weak results will be statistically significant).
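The sketch below illustrates that statistical point with simulated data: a weak correlation that is indistinguishable from noise in a small sample yields a vanishingly small p-value once the sample grows large, without becoming any more meaningful.

```python
# A weak effect becomes "statistically significant" purely through scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def weak_relationship(n, r=0.03):
    x = rng.normal(size=n)
    y = r * x + rng.normal(size=n)   # true correlation of roughly 0.03
    return stats.pearsonr(x, y)

for n in (1_000, 100_000, 1_000_000):
    corr, p = weak_relationship(n)
    print(f"n={n:>10,}  r={corr:+.3f}  p={p:.4f}")
# With n in the millions the p-value collapses toward zero even though the
# effect stays tiny: correlation, not causation, and barely that.
```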
The online companies that unleashed big data on the world were built
around it from the beginning. They didn’t need to reconcile or integrate
big data with traditional sources of information and the analytics
performed on them, because for the most part, they didn’t have those
traditional sources.
Creating Value in the Data Economy
They didn’t need to merge big data technologies with traditional IT
infrastructures; in their companies, those infrastructures didn’t exist. Big
data could stand alone, big data analytics could be the only analytics,
and big data technology architectures could be the only IT
architectures. But each of these companies now has its own version of
Analytics 3.0.