
Set Your Data in Motion with Apache Kafka using Confluent on AWS

Your business needs access to data in motion

Whether you're preparing to modernize, designing from the ground up for a new service yet to be launched, or sitting anywhere in between, one thing should be clear: meeting the expectations of a modern customer depends on delivering an end-to-end real-time experience. Influenced by the apps on their phones and the unlimited options on their TVs, today's customers hold high expectations for experiences that satisfy their every need—often before that need is even fully understood.

The companies most successful in delivering against this expectation are running on top of data constantly in motion. They're totally connected and fueled at all times by a ubiquitous supply of real-time event streams and continuous real-time processing. Isn't it time you provided your business with the same?

The good news is that you can. Confluent has taken the Apache Kafka® event streaming platform and created an enterprise distribution that makes Apache Kafka easier to use and manage. On top of that, Confluent's integration with Amazon Web Services (AWS) means you can scale that solution across hybrid and multicloud environments—wherever your data exists.

What is
Apache Kafka?
First developed at LinkedIn by the founders of
Confluent, Apache Kafka is a community-distributed
event streaming platform capable of handling
trillions of events a day. Initially conceived as a
messaging queue, Kafka is based on an abstraction
of a distributed commit log. Since being created and
open sourced in 2011, Kafka has quickly evolved from
a messaging queue to a full-fledged event streaming
platform for data in motion.
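
To make the pub/sub model concrete, here is a minimal sketch using the confluent-kafka Python client; the broker address, the "orders" topic, and the sample record are placeholder assumptions for illustration, not part of any particular deployment.

# A minimal publish/subscribe sketch with the confluent-kafka Python client.
# The broker address and the "orders" topic are illustrative placeholders.
from confluent_kafka import Consumer, Producer

BOOTSTRAP = "localhost:9092"  # replace with your cluster's bootstrap servers

# Publishing appends a key/value record to the topic's distributed commit log.
producer = Producer({"bootstrap.servers": BOOTSTRAP})
producer.produce("orders", key="order-1001", value='{"item": "tv", "qty": 1}')
producer.flush()  # block until the broker acknowledges the write

# Any number of consumer groups can independently read the log back in order.
consumer = Consumer({
    "bootstrap.servers": BOOTSTRAP,
    "group.id": "orders-reader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])
msg = consumer.poll(10.0)  # wait up to 10 seconds for a record
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()

Because events are appended to a durable log rather than removed on delivery, many independent applications can replay the same stream—one reason Kafka was able to grow from a messaging queue into a platform for data in motion.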

The vision:
Establish event streaming as the central nervous system
When getting started with data in motion and Kafka, it helps to have a clear understanding of where it can take your business—perhaps somewhere similar to where the successful companies we mentioned earlier have landed. At Confluent, we think about Kafka adoption in terms of an event streaming maturity model that ultimately results in the establishment of data in motion as the central nervous system to your entire business. This means all data in the organization—covering every team, every use case, on premises and in the cloud, etc.—is managed through a single event streaming platform. It's the realization of enterprise data served as a product with a globally scaled and mission-critical event-driven architecture.

With Kafka fully established as the central nervous system to your business, it's easy to imagine delivering the real-time products and services that would delight your customers the most. But value can be found right out of the gate, by individual teams or with an initial use case, when making the first pivot toward an event-driven architecture and operations based on data in motion, not data at rest. All you have to do is choose a place to start.

The event streaming maturity model

Value grows with investment and time as adoption moves from individual projects to an enterprise-wide platform, through five stages:

1. Early interest: a developer signs up for Confluent and experiments; pilot(s); pub/sub or integration.
2. Scalable pipeline pub/sub, single solution: identify a project and start to set up a pipeline; 1-3 use cases quickly moved into production, but fragmented.
3. Clusters of reuse, real-time analytics: mission-critical but disparate LOBs; small teams experimenting; multiple mission-critical use cases in production, with scale, disaster recovery, and SLAs; event streaming clearly delivering business value, with C-suite visibility.
4. Mission-critical, connected LOBs: platform effect (reuse of data, efficiencies of scale); a data in motion platform managing the majority of mission-critical data processes, globally, with multi-datacenter replication across on-prem and hybrid clouds, in parallel with other big data infrastructure.
5. Central nervous system: all data in the organization managed through a single data in motion platform; enterprise data as a product with an event-driven architecture; digital natives and digital pure players, probably using machine learning and artificial intelligence (relational databases: redundant).

Start small:
Choose your first use case
As with any new mission, understanding where to begin and how to get started can often be one of the most challenging steps. Fortunately, Confluent is here to help.
Teams that are most successful in adopting data in motion and realizing a central nervous system built around Kafka typically focus on three fundamental use cases:
event-driven microservices, streaming ETL, and customer 360.

Let’s take a look at these most commonly adopted and impactful use cases:

Event-driven microservices
Build efficiently and avoid complexity with reusable components

The popularity of modern architectures based on event-driven microservices comes from an industry move toward decoupling services into small, reusable components in order to simplify the development and management of data systems. Instead of being tangled within a monolith of code, the different functions within an application can operate independently and asynchronously, which reduces complexity and risk. Kafka serves as the single platform to effectively manage communication between these different components. Additionally, with microservices, organizations are able to improve collaboration and increase productivity by designing systems that reflect their different teams.

Streaming ETL
Integrate and process data in real time across all of your sources

Businesses are in constant need of data integration with ETL (extract, transform, and load). However, modern requirements call for data integration to run across distributed platforms, with support for more data sources and types and fast, real-time processing. Kafka Connect allows for integration across a wide variety of sources and sinks, and Kafka Streams/ksqlDB allow for real-time processing and enrichment of data in motion (a brief sketch of this pattern follows these use case descriptions).

Customer 360
Deliver a complete customer view with maximum insights

As businesses grow, they develop more and more teams holding siloed data sets—each containing valuable customer information. While somewhat useful on their own, these disparate data sets become exponentially more valuable to the business when aggregated and jointly analyzed. Kafka is a great fit for customer 360 use cases because it allows previously siloed systems to communicate through a single central platform in order to deliver the highest-value profile of your customers.
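
Kafka Streams and ksqlDB are the tools the streaming ETL use case above points to; purely as a language-neutral illustration of the underlying consume-transform-produce pattern, here is a hedged sketch in Python with the confluent-kafka client. The topic names, the message format, and the customer lookup table are hypothetical.

# Illustrative consume-transform-produce loop for streaming enrichment.
# Topic names and the in-memory lookup table are hypothetical placeholders;
# in practice Kafka Streams or ksqlDB would typically handle this pattern.
import json
from confluent_kafka import Consumer, Producer

BOOTSTRAP = "localhost:9092"  # replace with your cluster's bootstrap servers
CUSTOMER_REGION = {"c-42": "EMEA", "c-77": "APAC"}  # stand-in reference data

consumer = Consumer({
    "bootstrap.servers": BOOTSTRAP,
    "group.id": "order-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["raw_orders"])
producer = Producer({"bootstrap.servers": BOOTSTRAP})

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        order = json.loads(msg.value())
        # Enrich each incoming event with reference data, then write it onward.
        order["region"] = CUSTOMER_REGION.get(order.get("customer_id"), "UNKNOWN")
        producer.produce("enriched_orders", key=msg.key(), value=json.dumps(order))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()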

Confluent provides the industry's only fully managed, cloud-native service for Kafka:

Fully managed (cloud-native): automated product capabilities with no operational overhead (Confluent Cloud)
Partially managed (cloud-hosted): manual operations with tools and/or support (other managed services for Kafka)
Self-managed (DIY): development and maintenance without support


Build and launch faster with a fully managed, cloud-native service for Kafka

Unless you're in the business of managing data infrastructure, there's no reason to consider open source when it comes to launching your first apps with data in motion. Leverage a cloud service and maintain focus on what matters most: your business. However, not all cloud services for Kafka are created equal. Confluent provides a service that is cloud-native, complete, and everywhere.

Confluent on Amazon Web Services (AWS) enhances the experience even further, enabling you to start working with Confluent Cloud immediately and conveniently using your existing AWS credentials. Even better, if you're an existing AWS customer, you can use 50% of your Confluent Cloud spend toward your annual EDP commit drawdown.

Cloud-native

With Confluent Cloud on AWS, you can provision serverless Kafka clusters on demand that scale elastically between 0 and 100 MBps, or reach GBps+ scale with just a few clicks. Instant elasticity like this means you can easily scale up to meet unexpected demand and just as quickly scale back down to manage cost. You can store unlimited amounts of data on your Kafka clusters, without any upfront capacity planning or provisioning. Kafka becomes a system of record so you can do more with your real-time events. Run your solution on AWS: our highly available clusters are backed by a 99.95% SLA and are always on the latest stable version of Kafka, upgraded and patched in the background like a true cloud-native service (a minimal client connection sketch appears at the end of this section).

Complete

Confluent does way more than just manage Kafka for you. You get a complete platform built around it in order to quickly execute projects. We provide out-of-the-box, fully managed connectors for the most popular data sources and sinks in the Kafka ecosystem, Schema Registry and validation to maintain data quality, stream processing with an event streaming database, and more—all fully managed in the same cloud UI.

There's no cutting corners when it comes to data safety and compliance. Confluent secures your Kafka environment with encryption both at rest and in transit, role-based access controls, Kafka ACLs, and SAML/SSO for authentication. More granular control with private networking and BYOK for your Kafka environments is available as well. Finally, we design with compliance and privacy in mind, with SOC 1/2/3 certifications, ISO 27001 certification, HIPAA/GDPR/CCPA readiness, and more.

Everywhere

A Kafka deployment that only works in a single environment undercuts your ability to leverage data in motion across your entire business and every customer experience. Confluent, however, lets you link Kafka clusters that sync in real time, so your events are available anywhere—across multiple public or private clouds.

Confluent and AWS make it incredibly easy to acquire and use Confluent Cloud. With deep source and sink integration across AWS services, using fully managed, pre-built connectors like Amazon Simple Storage Service (Amazon S3), Amazon Kinesis, and Amazon Redshift, customers can easily bring data into AWS and move it to other destinations in the cloud or on premises.

Confluent's collaboration with AWS also provides you with different purchasing options—pay as you go monthly or an annual commit—and a streamlined experience from the moment you get started with Confluent in the cloud through to unified management, security, and billing.
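
As a rough illustration of how little client code is involved once the cluster itself is fully managed, here is a producer configured for a Confluent Cloud cluster over SASL_SSL; the bootstrap endpoint, API key, API secret, and "clickstream" topic are placeholders you would replace with values from your own cluster settings.

# Connecting a producer to a Confluent Cloud cluster over SASL_SSL.
# The endpoint, API key, API secret, and topic below are placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CLUSTER_API_KEY>",
    "sasl.password": "<CLUSTER_API_SECRET>",
}

producer = Producer(conf)
producer.produce("clickstream", value='{"page": "/checkout", "user": "u-7"}')
producer.flush()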


No better way to work than with Confluent

When it comes to data in motion, there's no better way to work than with Confluent. You'll be supported all along your event streaming journey with the tools and assistance needed to keep you moving at top speed. Work in the language of your choice: you don't have time for systems that can't support the clients for your preferred programming language or aren't available in the cloud where your project is located. With Confluent, you can stay agile and work in the language of your choice with our non-Java clients like C/C++, Go, .NET, Python, and our REST Proxy (a brief REST Proxy sketch appears at the end of this section). And because Confluent is in AWS Marketplace, you can use your existing AWS credentials to leverage Kafka and quickly scale it up or down. Event streaming is unpredictable by nature: the volume and velocity of incoming data are difficult to forecast. AWS flexibility means more uptime and reliability—so you can focus on the application and not the infrastructure behind it.

Learn Kafka with Confluent

Fear not, all are welcome. Whether brand new to Kafka or already experienced with building event streaming apps, developers serious about establishing themselves as data-in-motion experts come to Confluent. Within Confluent Cloud, we take a hands-on, learn-while-you-build approach to event streaming education, with tutorials taking you from early development to advanced, real-time solutions. You'll move quickly with step-by-step guidance for everything from cluster creation and Schema Registry setup through to connector integrations and creation of your first stream processing app. You can also access these learning opportunities in the AWS Marketplace:

Confluent Education Self Paced Subscription - 1 Named User for 1 Year
Confluent Public Instructor-Led Three Day Training - Single Student

Leverage expert resources and support

Confluent Developer will be your one-stop shop for demos, guides, and code samples to quickly continue growing your skills and launching apps. From basic concepts to advanced patterns, you'll have access to everything needed to get started with Kafka using Confluent on AWS. Finally, if you ever need further assistance, our teams of data-in-motion and Apache Kafka committer-led experts within Support and Professional Services are always right around the corner and ready to help!

Step-by-step, in-product tutorials for everything from cluster creation through to creation of your first stream processing app.
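
For teams that would rather use plain HTTP than a native client, here is a hedged sketch of producing a record through the REST Proxy's v2 API using only the Python standard library; the proxy URL and the "orders" topic are placeholders for illustration.

# Producing a record over Confluent REST Proxy (v2 API) instead of a native client.
# The proxy URL and topic are illustrative placeholders.
import json
import urllib.request

url = "https://rest-proxy.example.com/topics/orders"  # placeholder endpoint
payload = {"records": [{"key": "order-1001", "value": {"item": "tv", "qty": 1}}]}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode())  # offsets of the produced records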


Customer story

How AO.com increased personalization—even during the pandemic—with Confluent

"Confluent Cloud is a critical enabler for us, allowing us to treat each moment as a one-on-one opportunity to provide a great customer experience. Running Confluent Cloud on AWS enables us to take advantage of the scalability of cloud-native approaches as we build our applications."

Jon Vines
Engineering Lead, AO

With the COVID-19 pandemic, AO saw a dramatic shift in consumer shopping habits, which led
to a sharp increase in growth. To support this surge, AO turned to a real-time event streaming
platform based on Confluent and Apache Kafka. They were able to deliver hyper-personalized
online experiences by combining historical customer data with clickstream data and other real-
time digital signals from across the business.

The company's personalization efforts paid off with a 30% increase in customer conversion rates. The Confluent platform also enabled AO development teams to accelerate the rollout of new business capabilities, and it has helped improve operational efficiencies, such as deliveries, through better integration of information across the entire company.


ABOUT CONFLUENT

Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent's cloud-native offering is the foundational platform for data in motion—designed to be the intelligent connective tissue enabling real-time data from multiple sources to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven back-end operations. To learn more, please visit www.confluent.io

Get started today and fuel your business with data in motion

Ready to set your data in motion? Subscribe to Confluent Cloud in one of two ways—pay as you go or an annual commit—in the AWS Marketplace and see what you can accomplish with a fully managed, cloud-native service for Kafka. Or, check out a demo of Confluent Cloud.

To learn more, please visit Confluent on AWS in the AWS Marketplace.

Copyright © Confluent, Inc. 2022
