Executive's Guide To Prescriptive Analytics - ZDNet
SPECIAL FEATURE
By Brandon Vigliarolo | June 3, 2019 -- 13:15 GMT (14:15 BST) | Topic: How to Win with Prescriptive Analytics
Special Report: How to Win with Prescriptive Analytics (free PDF) (https://www.techrepublic.com/resource-library/whitepapers/special-report-how-to-win-with-prescriptive-analytics-free-pdf/)
This ebook, based on the latest ZDNet / TechRepublic special feature, explores how to set up an analytics
infrastructure that sees around corners and gives you options to avoid a head-on crash.
The modern world is inundated with data. It comes to businesses in the form of location data, personal
information, cookies, online behavior, buying habits … almost everything internet users do on the web
and with their mobile devices ends up generating data of some kind.
One of the biggest ways that businesses can use data is to predict the future through a process called
business analytics, in which companies build models of the past, present, and future based on the
data they've collected. But knowing what is potentially to come isn't worth a whole lot unless you can
figure out what to do with that knowledge, which is where the final stage of business analytics, known
as prescriptive analytics, comes in.
Far from being just a forecasting tool, prescriptive analytics aims to go beyond simply describing one
potential future: It tugs on the various threads that make up data models to prescribe the best
possible course of action.
If your organization gathers data and has yet to commit to a prescriptive analytics program, it's time to
consider it. Proper prescriptive programs can save time and money, maximize profit, and ensure that
your business is running as smoothly as possible.
How does prescriptive analytics work? Starting with the predictive model, data scientists make adjustments to
various aspects of the model and use multiple types of tech to see how their changes affect the model.
Alternatively, they can tweak the model with a specific goal in mind in an attempt to meet a desired conclusion
in the most efficient way possible.
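To make that idea concrete, here is a minimal, hypothetical sketch in Python: a simple predictive model is trained on historical data, a handful of candidate adjustments are run through it, and the scenario with the best predicted outcome becomes the "prescription." The columns, figures, and model choice are illustrative assumptions, not a description of any particular platform.

```python
# Minimal what-if sketch: perturb decision variables in a trained predictive
# model and compare predicted outcomes. Column names, data, and the model
# choice are hypothetical, not taken from any specific vendor platform.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical historical data: price and ad spend as decision variables,
# weekly revenue as the outcome we want to influence.
history = pd.DataFrame({
    "price":    [9.99, 10.99, 11.49, 9.49, 10.49, 11.99],
    "ad_spend": [500, 450, 400, 650, 550, 350],
    "revenue":  [12000, 11500, 10800, 13500, 12300, 9900],
})

model = GradientBoostingRegressor(random_state=0)
model.fit(history[["price", "ad_spend"]], history["revenue"])

# Candidate actions: each row is one adjustment we could make next week.
scenarios = pd.DataFrame({
    "price":    [9.99, 10.49, 10.99],
    "ad_spend": [600, 500, 400],
})
scenarios["predicted_revenue"] = model.predict(scenarios)

# "Prescription": the scenario with the best predicted outcome.
best = scenarios.loc[scenarios["predicted_revenue"].idxmax()]
print(scenarios)
print("Suggested action:\n", best)
```

Real prescriptive platforms search far larger decision spaces and fold in constraints, but the perturb-and-compare loop is the same basic idea.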
How can prescriptive analytics help businesses succeed? With prescriptive analytics, the goal is always the
same: optimal performance for minimal cost. Think of a way your organization could improve, then think about
what it would take to get there. Prescriptive analytics follows the same logic, but grounds it in data and models,
giving a far higher degree of certainty that the chosen course will succeed.
How does a business get started with prescriptive analytics? Businesses that are already performing data
analytics have a leg up on getting to prescriptive analysis. Those that aren't are in for a long, but important,
journey that includes hiring new analysts, finding the right prescriptive analytics platform, and building up the
descriptive and predictive models that are essential to a good prescription.
The first stage of business analytics, known as descriptive analytics, involves analyzing data to determine the
current state of an organization.
Any data that creates a picture of the present can be used to create a descriptive model. Common
types of data are customer feedback, budget reports, sales numbers, and other information that
allows an analyst to paint a picture of the present using data about the past.
A thorough and complete descriptive model can then be used in predictive analysis to build a model
of what's likely to happen in the future if the organization's current course is maintained without any
change.
Predictive models are useful, but they aren't designed to do anything outside of predicting current
trends into the future. That's where prescriptive analytics comes in.
A good prescriptive model will account for all potential data points that can alter the course of
business, make changes to those variables, and build a model of what's likely to happen if those
changes are made.
Prescriptions, like those made by doctors, try to solve a problem in the most efficient way possible.
Exercise and a good diet may be enough to resolve a medical problem, but for businesses the best
way forward may be adjustments to their supply chain, a streamlined decision-making process, more
aggressive sales tactics, or anything else that, in combination with other data points, creates a model
of success.
It's worth noting that prescriptive analytics isn't a new concept (http://analytics-magazine.org/the-analytics-
journey/) -- at least according to the IBM team that coined the term in a 2010 article. Prescriptive models
were used long before the modern era, but they were limited to the ability of humans to crunch
numbers and account for variables.
"Many problems simply involve too many choices or alternatives for a human decision-maker to
effectively consider, weigh and trade off," the article said. With the advent of modern machine learning
and supercomputing, it's far more practical to build models that are highly accurate and account for
practically everything.
As mentioned above, prescriptive analysis is all about optimization, so data points that affect optimal
business practices are what need to be included. Data used can be either structured or unstructured,
so let's take a look at potential variables broken into those two categories.
STRUCTURED DATA
Customer data
Sales records
Shipping records
Inventory databases
Billing records
Timecard records
Structured data is essentially anything that is already in a relational database where there are clearly
defined fields into which the data is sorted.
UNSTRUCTURED DATA
Customer feedback
Web traffic
Emails
Invoices
Paper records
Documents, both digital and physical
Photographs
Audio recordings
Videos
Raw data from IoT devices
Unstructured data can essentially be anything that doesn't fall under the structured data category.
Chances are your organization has a ton of unstructured data, and lots of it may be important for
building a prescriptive model.
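As a rough sketch of how structured and unstructured data might be brought together, the snippet below joins hypothetical sales records with a crude sentiment signal extracted from free-text feedback; every table, column, and word list here is invented for illustration.

```python
# Sketch of combining structured and unstructured data into one feature table.
# Table names, columns, and the sentiment wordlist are illustrative only.
import pandas as pd

# Structured: sales records, already in tabular form.
sales = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "orders_last_90d": [4, 1, 7],
    "total_spend": [320.0, 45.0, 890.0],
})

# Unstructured: free-text feedback, reduced here to a crude negativity score.
feedback = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "text": [
        "Great service, fast shipping",
        "Package arrived late and damaged",
        "Happy with the product, will order again",
    ],
})
negative_words = {"late", "damaged", "broken", "refund"}
feedback["negative_hits"] = feedback["text"].apply(
    lambda t: sum(w in t.lower() for w in negative_words)
)

# One row per customer, mixing both kinds of signal.
features = sales.merge(feedback[["customer_id", "negative_hits"]], on="customer_id")
print(features)
```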
In the article, the IBM team defines prescriptive analytics as, "A set of mathematical techniques that
computationally determine a set of high-value alternative actions or decisions given a complex set of
objectives, requirements, and constraints, with the goal of improving business performance."
If that all sounds like a lot, it's because prescriptive analytics, while simple in concept, requires a lot of
work to pull off.
Some of the tools that go into prescriptive analysis include graph analysis, simulations, complex event
processing that accounts for additional data, neural networks that combine multiple forms of machine
learning, and recommendation engines to model user/customer response.
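Simulation in particular is easy to illustrate. The sketch below runs a tiny Monte Carlo experiment over uncertain demand to compare candidate stocking levels; the demand distribution, prices, and costs are assumptions made up for the example.

```python
# Minimal Monte Carlo simulation: evaluate candidate stocking levels against
# uncertain demand. The demand distribution and costs are made-up assumptions.
import numpy as np

rng = np.random.default_rng(42)
unit_cost, unit_price = 6.0, 10.0
demand_samples = rng.poisson(lam=120, size=10_000)  # simulated daily demand

def expected_profit(stock_level: int) -> float:
    sold = np.minimum(demand_samples, stock_level)
    profit = sold * unit_price - stock_level * unit_cost
    return float(profit.mean())

candidates = range(100, 161, 10)
results = {s: expected_profit(s) for s in candidates}
best = max(results, key=results.get)
print(results)
print(f"Best stocking level under these assumptions: {best}")
```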
Additionally, prescriptive analytics has to account for uncertainty in data through the use of heuristics,
which is practically a whole other science in and of itself.
Heuristic problem solving involves finding a practical solution that, while not necessarily the most
efficient or ideal, is the best available option when uncertainty makes the truly optimal solution
impossible to identify.
Given the size and scope of prescriptive models, it makes sense that heuristics are a fundamental part
of a good prescription. In many cases, it's simply not possible to find the mathematically optimal
solution, so the most practical one that still delivers an improvement becomes the target.
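As a toy example of the heuristic mindset, the greedy routine below picks projects by value-per-cost until a budget is exhausted. It is not guaranteed to find the mathematical optimum, but it returns a workable answer quickly; all of the project data is invented.

```python
# A tiny greedy heuristic: pick projects by value-per-cost until the budget
# runs out. Not guaranteed optimal, but fast and usually "good enough" when
# the exact search space is too large. Project data are invented.
projects = [
    {"name": "supply chain tweak", "cost": 40, "value": 110},
    {"name": "sales training",     "cost": 25, "value": 60},
    {"name": "new CRM module",     "cost": 55, "value": 120},
    {"name": "web redesign",       "cost": 30, "value": 80},
]
budget = 90

chosen, spent = [], 0
for p in sorted(projects, key=lambda p: p["value"] / p["cost"], reverse=True):
    if spent + p["cost"] <= budget:
        chosen.append(p["name"])
        spent += p["cost"]

print("Chosen under budget:", chosen, "| spent:", spent)
```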
Once all variables are accounted for, a model is built and a problem to solve for is determined, a
prescriptive analysis can be undertaken. Modern prescriptive analytics tools are often capable of
returning results in near real time, making them ideal for changing course on the fly to account for the
rapid pace of the modern business world.
If you can think of a business need, have found something in your other forms of data analysis that
raises questions, or are simply faced with a high degree of uncertainty about the future, there are
ways prescriptive analysis can help.
DirectBuy, a membership-based consumer buying club (similar to Sam's Club or Costco), used
regression techniques and decision tree models (https://ibm.cioreview.com/cioviewpoint/developing-predictive-and-
prescriptive-business-analytics-a-case-study-nid-10680-cid-117.html) to help its member services team develop more
effective member retention models and even eliminate several retention programs that were found to
be ineffective.
BondIT, a digital investment platform, used IBM's prescriptive platform to build fixed-income
portfolios for its clients (https://www.ibm.com/case-studies/bondit). Machine learning models have enabled BondIT
to custom tailor portfolios to each individual in minutes rather than days and have reduced risk by 30
percent while retaining similar earnings for clients.
FleetPride, which delivers spare parts for commercial vehicles, used prescriptive analytics to
streamline its supply chain (https://www.ibm.com/case-studies/fleetpride). Prescriptive models improved
inventory management, reduced labor costs, increased profits, and eliminated nearly all errors in
shipment packing.
Other potential applications include:
Maximizing airline profits by eliminating over-fueling and creating more efficient flight scheduling
Improving the cost/benefit ratio of healthcare procedures by finding the best course of action for particular
client cases
Finding the best pricing scheme for items in a retail store
Optimizing product layout on shelves in a store
Determining the most efficient way to schedule shifts in a 24-hour factory
Finding the optimum source for raw materials (a minimal code sketch of this item follows the list)
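To show what one of these problems looks like in practice, here is a minimal linear program for the raw-materials sourcing item above, solved with SciPy; supplier prices, capacities, and the demand figure are made-up assumptions.

```python
# A minimal linear program for sourcing raw materials from two suppliers.
# Prices, capacities, and demand are invented for illustration.
from scipy.optimize import linprog

# Decision variables: tonnes to buy from supplier A and supplier B.
cost = [210.0, 185.0]              # price per tonne
# Constraints: A can ship at most 60 t, B at most 80 t, and we need 100 t total.
A_ub = [[1, 0], [0, 1], [-1, -1]]  # x_A <= 60, x_B <= 80, x_A + x_B >= 100
b_ub = [60, 80, -100]

result = linprog(c=cost, A_ub=A_ub, b_ub=b_ub,
                 bounds=[(0, None), (0, None)], method="highs")
print("Tonnes from A, B:", result.x, "| total cost:", result.fun)
```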
The ways in which prescriptive analysis can benefit an organization are incredibly varied, and it's hard
to say precisely how a prescriptive analytics program could help a particular business succeed.
From a general standpoint, as mentioned above, prescriptive analytics aims to find the optimal course
of action to meet a specific business goal. Whether that goal is something general, like improving
profits, or something specific, like determining the optimal spot to drill a well, a good prescriptive
analytics program has the potential to transform a business and make the future easier to see.
Organizations that are new to business data analytics, on the other hand, have their work cut out for
them (https://www.zdnet.com/article/business-analytics-the-essentials-of-data-driven-decision-making/).
STEP 0: DATA
If your organization doesn't have a good handle on its data, or if you haven't been collecting it in any
meaningful way, you'll need to start here.
All the data you think may be relevant should be gathered and organized. This includes both
structured and unstructured data.
STEP 1: STATE A GOAL
Like any business transformation initiative, it's nearly impossible to start working toward prescriptive
analytics without sitting down and stating a goal. As mentioned before, this can be a particular goal or
something broader with a larger impact on the entire organization.
Regardless of how you approach a prescriptive program, it's essential to set a goal so you know what
you're working toward.
STEP 2: OUTLINE
Once you know where you want to get to with a prescriptive analytics program, it's time to figure out
what you need to do to get there. This step is going to be the one you spend the most time in, at least
from a planning perspective, so make sure you don't skip anything essential.
Analytics software comes in a lot of flavors, and the one you choose can have a huge impact on what
your team is going to need to do. Different types of platforms are available, and they largely fall into
three categories: prepackaged software, modeling platforms, and solvers.
Prepackaged software is typically built to support a particular kind of problem and industry.
The other two forms of software, modeling platforms and solvers, tend to be used in tandem.
Modeling platforms, according to River Logic (https://www.riverlogic.com/technology/prescriptive-analytics/), are
used to build mathematical models that define problems. Solvers, on the other hand, are used to
develop solutions based on the problem defined in a modeling platform.
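To illustrate the split between modeling and solving, here is a small sketch using the open-source PuLP library (chosen purely for illustration, not as one of the platforms discussed in this article): the problem is first described declaratively, then handed to a solver. The product-mix numbers are invented.

```python
# Illustration of the "model, then solve" split using the open-source PuLP
# library. Commercial platforms ship their own modeling languages and
# solvers; the numbers below are invented.
from pulp import LpMaximize, LpProblem, LpVariable, value

# --- Modeling step: define the problem declaratively ---
model = LpProblem("product_mix", LpMaximize)
x = LpVariable("units_product_x", lowBound=0)
y = LpVariable("units_product_y", lowBound=0)
model += 40 * x + 30 * y            # objective: profit
model += 2 * x + 1 * y <= 100       # machine hours available
model += 1 * x + 2 * y <= 80        # labour hours available

# --- Solving step: hand the model to a solver ---
model.solve()
print("x =", value(x), "y =", value(y), "profit =", value(model.objective))
```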
There are dozens of both types of platforms. Some of the leaders in these types of software include
IBM (https://www.ibm.com/analytics/prescriptive-analytics), NGDATA (https://www.ngdata.com/dictionary/prescriptive-
analytics/), River Logic (https://www.riverlogic.com/), FICO (https://www.fico.com/en/products?category=644), and SAS
(https://www.sas.com/en_us/insights/analytics/predictive-analytics.html).
Modeling and solving software tends to be more adaptable, but it requires a lot more analytics
proficiency to use. If you're going to go this route, be prepared to spend more money, but you should
end up with a solution better tailored to your organization's unique needs.
STEP 3: TEST
Once you have a team in place, have selected a platform, and know what you want to accomplish, it's
time to start the real analytics work.
The first thing your analytics team should do is build a proof-of-concept that will help you understand if
your project is feasible.
Once the proof has been worked out, you can use the same basic model to build a small-scale
implementation with real data. Perform small-scale tests to see what kind of results you get, make
adjustments, and then test at larger and larger scales.
At this point, it's advisable to scale up to full strength, focusing first on building a solid predictive model
you can use as the baseline for your prescriptive experiments.
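A baseline of that kind might look something like the sketch below, which trains and scores a simple churn classifier with scikit-learn; the synthetic dataset, features, and label rule are placeholders, not a recipe.

```python
# Sketch of a predictive baseline to measure prescriptive experiments against.
# The churn-style dataset and features are placeholders, not a real pipeline.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
data = pd.DataFrame({
    "tenure_months": rng.integers(1, 60, n),
    "support_tickets": rng.poisson(2, n),
    "monthly_spend": rng.normal(50, 15, n),
})
# Synthetic, noisy label just to make the sketch runnable.
data["churned"] = (data["support_tickets"] + rng.normal(0, 1, n) > 3).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    data.drop(columns="churned"), data["churned"], test_size=0.3, random_state=0
)
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, baseline.predict_proba(X_test)[:, 1])
print(f"Baseline AUC: {auc:.2f}")  # the yardstick for later what-if changes
```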
Once all applicable data has been added, prescriptive formulae and algorithms have been built, and
everything is running reliably, it's time for the final step: actual prescriptive analysis.
Feedback from a working prescriptive system can arrive quickly, so be sure your organization is ready to
start implementing what your models suggest. Prescriptive models that aren't acted upon are just as
useless as ignored doctor's orders, so don't let the recommendations languish.
This whole process can take time. Months or even years can be spent getting things right before
receiving actionable prescriptive analysis. Don't let that stop you from investing, though. Prescriptive
analytics is fast becoming a major contributor to success in the modern business world.
Why guess, make mistakes, and start again at square one when you can use the advanced machine
learning and big data (https://www.zdnet.com/topic/big-data/) crunching software available on the market
today? There may be big startup costs involved, but the well-known adage "You have to spend money
to make money" applies perfectly to prescriptive analytics.
Also see
Prescriptive analytics: A cheat sheet (https://www.techrepublic.com/article/prescriptive-analytics-a-cheat-sheet/)
(TechRepublic)
Prescriptive analytics: An insider's guide (free PDF) (https://www.techrepublic.com/resource-library/whitepapers/prescriptive-
analytics-an-insider-s-guide-free-pdf/) (TechRepublic)
Feature comparison: Data analytics software and services (http://www.techproresearch.com/downloads/feature-comparison-
data-analytics-software-and-services/) (Tech Pro Research)
Getting your corporate data ready for prescriptive analytics: data quantity
and quality in equal measures
Good news: there's nothing special about getting your data ready for prescriptive analytics.
Bad news: you need to do what's needed to get your data ready for any type of analytics -
and that's hard work.
By George Anadiotis for Big on Data | June 21, 2019 -- 19:11 GMT (20:11 BST) | Topic: How to Win with Prescriptive Analytics
Prescriptive analytics is nothing short of automating your business. This was the silver lining as we
explored the complexities of prescriptive analytics in our guide (https://www.zdnet.com/article/a-guide-for-
prescriptive-analytics-the-art-and-science-of-choosing-and-applying-the-right-techniques/). While a lot of that complexity
is something line of business and expert data scientists will have to deal with, IT is not out of the
equation either.
Prescriptive analytics is hard, and there's no silver bullet that can get you there without having gone
through the evolutionary chain of analytics. You have to get the data collection and storage
infrastructure right, the data modeling right, and the state classification and prediction right.
This is the prescriptive analytics bottom line, and IT has to make sure the data collection and storage
infrastructure parts are in place for business and data science to do their parts. The data cleaning and
organization necessary for success with prescriptive analytics can be thought of along two
dimensions: quantity and quality of the data that will be used to feed the analytics.
Data quantity
To begin with, IT needs to make sure all the data pertinent to the organization are accounted for and
accessible. This really is a sine qua non of any analytics effort, but it may be more complicated than it
sounds.
Consider all the applications an organization may be using: custom built, off the shelf, on premises, in
the cloud, legacy. Each of those may have its own format, storage, and API. IT needs to make sure
they are all accessible, without disrupting the operation of applications. A data lake approach
(https://www.zdnet.com/article/a-standard-for-storing-big-data-apache-spark-creators-release-open-source-delta-lake/) may be
useful in that respect.
And it gets worse. Data may also live beyond applications. Consider all the internal documents and
emails, for example. More often than not, a wealth of data lives in unstructured format and
undocumented sources. And many applications are also undocumented, inaccessible, and lack APIs
to export data. For those, you will have to either get resourceful or fail fast.
Even where you succeed, however, this is not a one-off exercise. Applications evolve, and with them
so do their data. APIs change, schemas change, new data is added. New applications get thrown in
the mix, and old ones become deprecated. Staying on top of data collection requires constant effort,
and this is a cost you need to factor in when embarking on your prescriptive analytics journey. Adding
semantics to your data lake (https://www.zdnet.com/article/semantic-data-lakes-architecture-in-healthcare-and-beyond/)
may help.
How much data is enough? As much as possible.
Speaking of cost: of course, the usual IT provisioning discourse applies here, too. Do you plan ahead,
make this a project with predetermined budget for infrastructure and personnel costs, and get it
through the organizational budget approval process? Or do you take a more agile, pay-as-you-go
approach?
The former is theoretically safer, and more in line with organizational processes. Here's the problem:
Unless your data sources are relatively limited and well understood, and you are very thorough in
keeping track and provisioning for them, this approach may be impossible in practice.
The latter is more flexible, but can also lead to budget overrun and shadow IT issues. Without some
sort of method to the madness, you may end up spending beyond control, and having your data
stored all over the place. Although this is not a 100 percent strict rule, the budgeting ahead approach
makes more sense when going for on-premises storage, while cloud storage and development
(https://www.zdnet.com/article/data-driven-software-development-in-the-cloud-trends-opportunities-and-threats/) lends itself
well to the pay-as-you-go approach.
Finally, data freshness is one more consideration to take into account. If you want your analytics to
reflect the real world in real time, the data that feeds it should come in real time, too. This means you
should consider streaming data infrastructure (https://www.zdnet.com/pictures/streaming-becomes-mainstream/).
While there are benefits in adopting streaming, it's a new paradigm that comes with its own learning
curve and software/hardware/people investment.
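For orientation, here is what a minimal consumer for such a stream could look like using the kafka-python client; the broker address, topic name, and message fields are all assumptions for the sake of the example, and a reachable Kafka broker is required.

```python
# Minimal sketch of consuming a stream of events with the kafka-python
# client (pip install kafka-python). Broker address, topic name, and message
# schema are assumptions; a running Kafka broker is required.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",         # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this would update features or trigger re-scoring;
    # here we just print the freshly arrived record.
    print(event.get("order_id"), event.get("amount"))
```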
Data quality
Garbage in, garbage out is a golden rule when it comes to using data to get analytics insights. So
while getting all the data you can get a hold of is an absolute must, it won't help much if you just dump
it in a data lake and consider your work done. That's precisely the reason data lakes have gotten a
bad name - data swamp, anyone?
To pick up on the "data evolves" theme - this really should be your number one priority. Data
governance (https://www.zdnet.com/article/moving-fast-without-breaking-data-governance-for-managing-risk-in-machine-
learning-and-beyond/), that is. Yes, this does sound abstract, but it's just as important as building a pipeline
to channel your data to your data lake. Each dataset should come with metadata on its lineage (where
the data comes from), its acquisition date, its access rights, and its processing history.
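A governance record covering those fields could be as simple as the sketch below; the class, field names, and example values are illustrative assumptions rather than a prescribed schema.

```python
# A minimal, illustrative metadata record for a governed dataset, covering
# the fields mentioned above. Field names and values are assumptions.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DatasetMetadata:
    name: str
    lineage: str                          # where the data comes from
    acquired_on: date                     # acquisition date
    access_roles: List[str]               # who may read the data
    processing_history: List[str] = field(default_factory=list)

crm_export = DatasetMetadata(
    name="crm_customers_2019_06",
    lineage="nightly export from the CRM system",
    acquired_on=date(2019, 6, 21),
    access_roles=["data-science", "marketing-analytics"],
)
crm_export.processing_history.append("deduplicated on customer e-mail")
print(crm_export)
```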
Access rights and processing history in particular have become increasingly important in today's GDPR
world. Of course, not all organizations deal with user data, and even for those that do, not all data will be related to users.
Still, most organizations at least touch upon some personal data. For those, GDPR provisions
(https://www.zdnet.com/article/gdpr-in-real-life-transparency-innovation-and-adoption-across-borders-and-organizations/) need
to apply.
So the question becomes: what's more efficient - dividing and conquering, or giving all data the GDPR-
ready treatment? In many cases, if the infrastructure to apply full circle data governance is there
anyway, it makes sense to apply it to all data. This may make dataset processing a less
lightweight process. But the benefits of metadata for downstream applications can very well make up
for this.
Only quality data can lead to better analytics. And you have to learn to listen to the data, too.
To boot, master data management is something that can benefit from metadata. When collecting data from
many sources, the same entity may well exist more than once. For example, references to customer X will
probably exist in the CRM system, in a number of emails and documents, and in the ERP system.
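The sketch below shows, in miniature, why that matters for master data management: the "same" customer surfaces under different spellings in different systems, so records have to be matched fuzzily. The names and the similarity threshold are invented.

```python
# Toy illustration of entity matching for master data management: the same
# customer appears with different spellings across systems. Records and the
# similarity threshold are invented.
from difflib import SequenceMatcher

crm_names = ["Acme Corporation", "Globex Ltd.", "Initech"]
erp_names = ["ACME Corp", "Globex Limited", "Umbrella Inc."]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for crm in crm_names:
    best = max(erp_names, key=lambda erp: similarity(crm, erp))
    score = similarity(crm, best)
    status = "likely the same entity" if score > 0.6 else "no confident match"
    print(f"{crm!r} ~ {best!r} ({score:.2f}): {status}")
```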
Often, the gist of data quality comes down to truly mundane issues (https://www.zdnet.com/article/artificial-
intelligence-in-the-real-world-what-can-it-actually-do/): unit conversions, data formats, and the infamous "address
issue". To quote Mark Bishop, Tungsten TCIDA director:
"My team and myself were hired to work with Tungsten to add more intelligence in their SaaS offering. The
idea was that our expertise would help get the most out of data collected from Tungsten's invoicing
solution. We would help them with transaction analysis, fraud detection, customer churn, and all sorts of
advanced applications.
But we were dumbfounded to realize there was an array of real-world problems we had to address before
embarking on such endeavors, like matching addresses. We never bothered with such things before -- it's
mundane, somebody must have addressed the address issue already, right? Well, no. It's actually a thorny
issue that was not solved, so we had to address it."
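To see why addresses are thornier than they look, here is a deliberately naive normalization sketch; the abbreviation table and sample addresses are made up, and real address matching typically needs reference data or a dedicated service.

```python
# A deliberately crude sketch of the "address issue": normalize two address
# strings before comparing them. The abbreviation table and addresses are
# invented; real address matching is far harder than this.
import re

ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue", "ste": "suite"}

def normalize(address: str) -> str:
    tokens = re.findall(r"[a-z0-9]+", address.lower())
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

a = "221B Baker St., Suite 4"
b = "221b baker street ste 4"
print(normalize(a) == normalize(b))  # True once both are normalized
```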
Conclusion
Addresses are a good example of how data quantity and quality need to coalesce if you want a
dataset that can feed prescriptive analytics efforts. Your data scientists won't be able to do the feature
engineering that capitalizes on business expertise if your data is not abundant and clean.
But that already implies the most important thing, which is often left out of the equation: culture. No
organization can benefit from prescriptive analytics without a change in attitude to become data driven
(https://www.zdnet.com/article/dataops-changing-the-world-one-organization-at-a-time/). And that is perhaps the most
important by-product of going through the evolutionary path of analytics.
If you do that, what you'll find is that you will no longer be thinking in terms of IT and business: data is
business, and it's everyone's job to produce and consume it. IT is just the facilitator.