3 Ways To Reduce The Cost of AI
You want to see ROI on AI initiatives — who doesn’t? After all, the potential is huge. But maximizing ROI requires wide deployment across the enterprise.
The key to driving massive ROI from AI after some of the initial high-value use cases, then, is largely about deploying it massively while controlling costs. But how?
3 Steps to Reducing Costs Associated With AI
1. REUSE AND RECYCLE AI PROJECT COMPONENTS
Reuse is the simple concept of avoiding rework in AI projects, from small
details like code snippets to the macro-level, like the finding and cleaning
of data. Common sense and economics tell us not to start from scratch
every time, and that is exactly the principle behind reuse in AI projects.
Let’s dig into what’s perhaps the most costly aspect of AI projects: data
cleaning and preparation. It’s a hefty, often tedious, and time-consuming
task. That said, data cleaning and preparation are critical parts of an AI project, and if not executed well, they can translate into poor-quality models and increased risk across the entire model lifecycle.
So reducing this cost is not necessarily about simply discouraging time
spent or outsourcing the work, but rather finding smarter, more efficient
ways to ensure people across the organization aren’t wasting time finding
data or cleaning data that has already been prepared by someone else.
Enter: AI tools. Platforms such as Dataiku help teams and individuals alike
systemize processes, using common elements to get to business value
faster.
For example, in Dataiku, data can be prepared once and used across
multiple projects, code snippets can be packaged for reuse by other data
scientists, and plugins or applications can be leveraged even by non-
technical business users to promote reuse and prevent costly chaos.
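Outside of any particular platform, the reuse principle itself can be sketched in a few lines: a cleaning step written once and shared, rather than rewritten by every team that touches the same data. The function and field names below are purely illustrative assumptions, not Dataiku APIs.

```python
# Illustrative sketch of the reuse principle: one shared cleaning step,
# written once and applied by any number of projects.
# All names here are hypothetical, not Dataiku APIs.

def clean_customer_records(rows):
    """Normalize emails and deduplicate raw customer records."""
    seen = set()
    cleaned = []
    for row in rows:
        email = row.get("email", "").strip().lower()
        if not email or email in seen:
            continue  # drop blanks and duplicates
        seen.add(email)
        cleaned.append({**row, "email": email})
    return cleaned

# Project A (say, marketing) and Project B (say, customer service)
# both import the same step instead of re-cleaning from scratch.
raw = [
    {"email": "Ada@Example.com", "plan": "pro"},
    {"email": "ada@example.com ", "plan": "pro"},  # duplicate after cleanup
    {"email": "", "plan": "free"},                 # blank, dropped
]
print(len(clean_customer_records(raw)))  # → 1
```

The point is not the cleaning logic itself but where it lives: as a shared, versioned asset, any fix or improvement propagates to every project that reuses it.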
2. FACILITATE MORE USE CASES FOR THE PRICE OF ONE
Radical transparency. For example, how can someone from marketing build off of a use case developed in the customer service department if neither knows what AI projects the other is working on, much less can access and leverage those components?
The right tools — like Dataiku — that make AI and data accessible to anyone across the organization, from data scientist to analyst to business people with only simple spreadsheet capabilities.
The surfacing of these hidden use cases often comes from the work of analysts or business users. It is one of the keys to data democratization and, eventually, to Everyday AI, where it’s not just data scientists who bring value from data, but the business itself.
3. INTRODUCE EFFICIENCY ACROSS THE AI LIFECYCLE
The AI project lifecycle is rarely linear, and there are different people
involved at every stage, which means lots of potential rework and
inefficiencies along the way. Here are three main areas where introducing
efficiency — for example, through a centralized AI platform like Dataiku —
can help control costs.
Operationalization, or Pushing to Production
Packaging, release, and operationalization of data, analytics, and AI projects is complex, and without any way to do it consistently, it can be extremely time consuming.
This is a massive cost not only in person hours, but also in lost revenue for the amount of time the machine learning model is not in production and able to benefit the business. Multiply this not by one model but by hundreds, and the cost is debilitating.
Dataiku has robust support for deployment of models into production (including one-click deployment on the cloud with Kubernetes), easing the operationalization of AI projects.
[Chart: How long does it take to release a first model in production? Responses ranged from “we don’t release in production” and “less than 3 months” to “between 3 and 6 months” and “more than 6 months.”]
Source: From a Dataiku survey of more than 200 IT professionals
https://pages.dataiku.com/trends-in-enterprise-data-architecture-model-deplyment
Model Maintenance (MLOps)
Dataiku has robust MLOps capabilities and makes it easy not only to
deploy, but to monitor and manage AI projects in production.
Changes in Underlying Architecture
It’s not just models that need to be maintained, but the underlying architecture as well, especially as technologies in the AI space are moving at seemingly the speed of light. That means switching from one technology to another happens often, and when it does, it can be costly.
A centralized AI platform like Dataiku helps on all of these fronts:
1. Anyone at the organization can easily access information, including who is working on which AI projects with what data, how that data is being transformed, what models are being built based on that data, etc.
2. Data experts can create and share assets to be used across the organization, including things like feature stores, a portfolio of data treatments, or even entire AI projects packaged as easy-to-use applications.
3. Anyone at the organization can take, reuse, and adapt AI project work (whether micro, like data preparation, or macro, like AI applications) done by those data experts.
4. Leaders at the organization can ensure the quality of AI projects via AI Governance.
In other words, controlling costs requires removing friction — with that, you’re well on your way to successfully realizing AI at scale.
Drive 423% ROI With Dataiku
Forrester’s The Total Economic Impact™ Of Dataiku reveals that organizations save 75% of data scientists’ time and reduce manual, repeated reporting tasks by 90% with the platform.
“By having these reusable data pipelines and data products, [we have] streamlined our
operational side of development. We’re talking about savings in the range of $4 million plus.”
—Team Lead, Analytics Innovation | Pharmaceutical Company