
3 Ways to Reduce the Cost of AI

POV
You want to see ROI on AI initiatives, and who doesn't? After all, the potential is huge.

ROI on leveraging AI techniques ranges from about 20% to more than 800%.
Source: Gartner, What Is Artificial Intelligence? Ignore the Hype; Here's Where to Start, 15 March 2022

FACT
Maximizing ROI requires wide deployment across the enterprise.

Scale, dimension, and reach across the enterprise are the real returns on investment in AI.
Source: Gartner, What Is the True Return on AI Investment?, 17 February 2022
The Problem?
Your first AI use case(s) are likely low-hanging fruit and have more value than the 10th, 50th, or 100th use cases, so the marginal value of use cases is decreasing overall.

However, if you're like most organizations today, the cost of maintenance and the cost of executing each use case is likely increasing.

At some point, profit from AI initiatives is decreasing due to increased costs and stagnation of revenue.

[Chart: revenue, cost, and profit ($$) plotted against the number of use cases]

The key to driving massive ROI from AI after some of the initial high-value use cases, then, is largely about deploying it massively while controlling costs. But how?
3 Steps to Reducing Costs Associated With AI
1. REUSE AND RECYCLE AI PROJECT COMPONENTS
Reuse is the simple concept of avoiding rework in AI projects, from small
details like code snippets to the macro-level, like the finding and cleaning
of data. Common sense and economics tell us not to start from scratch
every time, and that is exactly the principle behind reuse in AI projects.

Let’s dig into what’s perhaps the most costly aspect of AI projects: data
cleaning and preparation. It’s a hefty, often tedious, and time-consuming
task. That said, data cleaning and preparation are critical parts of an AI
project, and if not executed well, can translate into poor quality models as
well as increased risk through the entire model lifecycle.
So reducing this cost is not necessarily about simply discouraging time spent or outsourcing the work, but rather about finding smarter, more efficient ways to ensure people across the organization aren't wasting time finding data or cleaning data that has already been prepared by someone else.

For example, what if you could provide a built-in, centralized, and structured catalog of data treatments (from data sources to data preparation, algorithms, and more) for easy consumption?

Enter: AI tools. Platforms such as Dataiku help teams and individuals alike
systemize processes, using common elements to get to business value
faster.

For example, in Dataiku, data can be prepared once and used across
multiple projects, code snippets can be packaged for reuse by other data
scientists, and plugins or applications can be leveraged even by non-
technical business users to promote reuse and prevent costly chaos.
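
Outside of any particular platform, the reuse principle can be as simple as publishing shared preparation steps as an importable function instead of rewriting them in every project. Here is a minimal, hypothetical sketch in Python with pandas; the module name prep_catalog and the column names are illustrative, not part of any product.

```python
# A hypothetical shared data preparation step, written once and imported
# everywhere, rather than re-implemented in each project.
import pandas as pd

def clean_customer_data(df: pd.DataFrame) -> pd.DataFrame:
    """Canonical cleaning steps for customer records."""
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()  # normalize identifiers
    out = out.dropna(subset=["customer_id"])             # drop unusable rows
    out = out.drop_duplicates(subset=["customer_id"])    # one row per customer
    return out

# Any project can then consume the same, already-vetted preparation:
# from prep_catalog import clean_customer_data
# ready = clean_customer_data(pd.read_csv("customers.csv"))
```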
2. FACILITATE MORE USE CASES FOR THE PRICE OF ONE

We talked about reusing and recycling AI project components, but let's take that to the next level. Making real money with AI requires massively increasing the number of use cases being addressed across the organization.

How can you empower anyone (not just people on a data team) to leverage the work done on existing AI projects to spin up new ones, potentially uncovering previously untapped use cases that bring a lot more value than expected?
Sharing the cost incurred from an initial AI project results in many use cases for the price of one, so to speak.
However, being able to leverage one project to spur another requires:

Radical transparency. For example, how can someone from marketing build off of a use case developed in the customer service department if neither knows what AI projects the other is working on, much less can access and leverage those components?

The right tools, like Dataiku, that make AI and data accessible to anyone across the organization, from data scientist to analyst to business people with only simple spreadsheet capabilities.

The surfacing of these hidden use cases often comes from the work of analysts or business users. It is one of the keys to data democratization
and eventually to Everyday AI, where it’s not just data scientists that are bringing value from data, but the business itself.
3. INTRODUCE EFFICIENCY ACROSS THE AI LIFECYCLE
The AI project lifecycle is rarely linear, and there are different people
involved at every stage, which means lots of potential rework and
inefficiencies along the way. Here are three main areas where introducing
efficiency — for example, through a centralized AI platform like Dataiku —
can help control costs.
Operationalization, or Pushing to Production

Packaging, release, and operationalization of data, analytics, and AI projects is complex, and without any way to do it consistently, it can be extremely time consuming.

This is a massive cost not only in person hours, but also in lost revenue for the amount of time the machine learning model is not in production and able to benefit the business. Multiply this not by one model but by hundreds, and the cost is debilitating.

Dataiku has robust support for deployment of models into production (including one-click deployment on the cloud with Kubernetes), easing the operationalization of AI projects.

[Chart: How long does it take to release a first model in production? We don't release in production: 5%; less than 3 months: 19%; between 3 and 6 months: 55%; more than 6 months: 21%]
Source: From a Dataiku survey of more than 200 IT professionals
https://pages.dataiku.com/trends-in-enterprise-data-architecture-model-deplyment
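
As a generic illustration of what "pushing to production" involves (not Dataiku's one-click mechanism), the sketch below wraps a trained model in a small HTTP service. The file name model.joblib and the endpoint are assumptions made for the example.

```python
# Generic model-serving sketch: expose a trained model over HTTP so other
# systems can request predictions. Containerizing this service is the usual
# first step toward a Kubernetes-based deployment.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed: a scikit-learn model saved earlier

class Features(BaseModel):
    values: list[float]  # one flat feature vector per request

@app.post("/predict")
def predict(features: Features) -> dict:
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}

# Run locally with: uvicorn service:app --port 8000
```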
Model Maintenance (MLOps)

Putting a model into production is an important milestone, but it's far from the end of the journey. Once a model is developed and deployed, the challenge becomes regularly monitoring and refreshing it to ensure it continues to perform well as conditions or data change.

That means continual AI project maintenance cannot be ignored (or at least not without an effect on profit). Depending on the use case, in the best case scenario the model simply becomes less and less effective; in the worst case, it becomes harmful to and costly for the business.

MLOps has emerged as a way of controlling the cost of maintenance, shifting it from a one-off task handled by a different person for each model (usually the original data scientist who worked on the project) to a systematized, centralized task.

Dataiku has robust MLOps capabilities and makes it easy not only to
deploy, but to monitor and manage AI projects in production.
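
To make the monitoring half of MLOps concrete, here is a minimal sketch of one common check, input data drift detection with a two-sample Kolmogorov-Smirnov test. It illustrates the general technique rather than Dataiku's implementation.

```python
# Flag features whose live distribution has drifted away from the
# distribution the model was trained on, a common retraining trigger.
import numpy as np
from scipy.stats import ks_2samp

def feature_drifted(train_values, live_values, alpha=0.05) -> bool:
    """Two-sample KS test: True if the distributions likely differ."""
    _, p_value = ks_2samp(train_values, live_values)
    return p_value < alpha

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 5000)  # feature as seen at training time
live = rng.normal(0.5, 1.0, 5000)   # shifted feature in production

if feature_drifted(train, live):
    print("Drift detected: review the model or schedule retraining.")
```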
Changes in Underlying Architecture

It's not just models that need to be maintained, but architecture as well, especially as technologies in the AI space are seemingly moving at the speed of light. That means switching from one technology to another happens often, and when it does, it can be costly.

For example, even though the cloud is growing in popularity, most companies will take a hybrid approach, investing in AI platforms like Dataiku that sit on top of the underlying architecture to provide a consistent user experience for working with data no matter where it is stored.
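
The decoupling idea can be sketched with a filesystem abstraction such as the fsspec library: application code addresses data by URL scheme, so the storage backend can change without rewriting the logic. The paths below are illustrative.

```python
# Read tabular data through a storage abstraction instead of hardcoding
# one backend, so moving between on-premises and cloud storage needs no
# changes to the analysis code itself.
import fsspec
import pandas as pd

def load_table(path: str) -> pd.DataFrame:
    """Load a CSV from local disk, S3, GCS, etc., based on the URL scheme."""
    with fsspec.open(path, mode="rt") as f:
        return pd.read_csv(f)

# The same function works wherever the data lives:
# df = load_table("file:///data/sales.csv")
# df = load_table("s3://company-bucket/sales.csv")  # requires s3fs installed
```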

In addition, as organizations' data teams or centers of excellence grow and as more staff outside of those data professionals start working with data, having a modern approach to architecture that allows for scaling up and down of resources is critical to reducing overall costs associated with AI.
Controlling Costs Is Just the Tip of the Iceberg
We’ve seen here why reducing costs is a critically important component of successful AI initiatives. But how can organizations do it?
By ensuring — via investments in the right technology, including AI platforms like Dataiku — that:

1. Anyone at the organization can easily access information, including who is working on which AI projects with what data, how that data is being transformed, what models are being built based on that data, etc.

2. Data experts can create and share assets to be used across the organization, including things like feature stores, a portfolio of data treatments, or even entire AI projects packaged as easy-to-use applications.

3. Anyone at the organization can take, reuse, and adapt AI project work (whether micro, like data preparation, or macro, like AI applications) done by those data experts.

4. Leaders at the organization can ensure the quality of AI projects via AI Governance.

In other words, controlling costs requires removing friction — with that, you’re well on your way to successfully realizing AI at scale.
Drive 423% ROI With Dataiku
Forrester: The Total Economic Impact™ Of Dataiku reveals that organizations save 75% of data
scientists’ time and reduce 90% of manual, repeated reporting tasks with the platform.

“By having these reusable data pipelines and data products, [we have] streamlined our
operational side of development. We’re talking about savings in the range of $4 million plus.”
—Team Lead, Analytics Innovation | Pharmaceutical Company

READ THE FORRESTER STUDY
