Set Your Data in Motion
What is Apache Kafka?
First developed at LinkedIn by the founders of Confluent, Apache Kafka is an open-source, distributed event streaming platform capable of handling trillions of events a day. Initially conceived as a messaging queue, Kafka is based on the abstraction of a distributed commit log. Since being open sourced in 2011, Kafka has quickly evolved from a messaging queue into a full-fledged event streaming platform for data in motion.
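The "distributed commit log" abstraction mentioned above can be illustrated with a toy, in-memory sketch. Everything here (the `ToyLog` class and its methods) is invented for illustration and is not the Kafka API: producers append records to a partitioned, append-only log, and each consumer tracks its own offset, so many consumers can replay the same events independently.

```python
class ToyLog:
    """A toy, in-memory stand-in for Kafka's partitioned commit log.

    Real Kafka is distributed, durable, and replicated across brokers;
    this sketch only illustrates the append-only log and per-consumer
    offset ideas the abstraction is built on.
    """

    def __init__(self, num_partitions=2):
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Records with the same key land in the same partition, which is
        # what preserves per-key ordering (as Kafka's default partitioner does).
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # Reading never removes records: each consumer remembers its own
        # offset, so independent consumers can replay the same events.
        return self.partitions[partition][offset:]


log = ToyLog()
p1, _ = log.produce("user-42", "page_view")
p2, _ = log.produce("user-42", "add_to_cart")
# Same key, same partition; a consumer starting at offset 0 replays
# both events in order.
events = [value for _, value in log.consume(p1, 0)]
```

Because reads are non-destructive, the same log can feed many independent consumers, which is what lets Kafka serve as a shared source of events rather than a point-to-point queue.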
marketplace
The vision:
Establish event streaming as the central nervous system
When getting started with data in motion and Kafka, it helps to have a clear understanding of where it can take your business: perhaps somewhere similar to where the successful companies we mentioned earlier have landed. At Confluent, we think about Kafka adoption in terms of an event streaming maturity model that ultimately results in the establishment of data in motion as the central nervous system of your entire business. This means all data in the organization, covering every team and every use case, on premises and in the cloud, is managed through a single event streaming platform. It is the realization of enterprise data served as a product, with a globally scaled and mission-critical event-driven architecture.
With Kafka fully established as the central nervous system of your business, it's easy to imagine delivering the real-time products and services that would delight your customers the most. But value can be found right out of the gate, by individual teams or with an initial use case, when making the first pivot toward an event-driven architecture and operations based on data in motion, not data at rest. All you have to do is choose a place to start.
[Figure: the event streaming maturity model, in which value grows as adoption progresses from individual projects to platforms]
Start small: Choose your first use case
As with any new mission, understanding where to begin and how to get started can often be one of the most challenging steps. Fortunately, Confluent is here to help.
Teams that are most successful at adopting data in motion and realizing a central nervous system built around Kafka typically focus on three fundamental use cases: event-driven microservices, streaming ETL, and customer 360.
Let's take a look at these most commonly adopted and impactful use cases:
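To make the second of these, streaming ETL, concrete, here is a minimal framework-free sketch. The function name, event shapes, and lookup table are all invented for illustration (a real pipeline would read from and write to Kafka topics): events are extracted one at a time as they arrive, transformed (filtered and enriched), and loaded to a sink continuously rather than in nightly batches.

```python
def streaming_etl(events, customer_names, sink):
    """Toy streaming ETL: filter, enrich, and load one event at a time.

    `events` is any iterable (a stand-in for a source topic); `sink` is a
    plain list here, but could be a database writer or another topic.
    """
    for event in events:                 # extract: consume events as they arrive
        if event["type"] != "purchase":  # transform: keep only purchase events
            continue
        enriched = {
            **event,                     # transform: enrich via a reference lookup
            "customer_name": customer_names.get(event["customer_id"], "unknown"),
        }
        sink.append(enriched)            # load: write downstream immediately


# Illustrative input; the field names are made up for this sketch.
clicks = [
    {"type": "page_view", "customer_id": 1},
    {"type": "purchase", "customer_id": 1, "amount": 99.0},
    {"type": "purchase", "customer_id": 2, "amount": 15.0},
]
names = {1: "Ada", 2: "Grace"}
out = []
streaming_etl(clicks, names, out)
# out now holds only the two purchase events, each enriched with customer_name
```

The same filter-and-enrich shape underlies the other two use cases as well: event-driven microservices react to the enriched events, and customer 360 merges them with historical records.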
Confluent on Amazon Web Services (AWS) enhances the experience even further, enabling you to start working with Confluent Cloud immediately and conveniently using your existing AWS credentials. Even better, if you're an existing AWS customer, you can apply 50% of your Confluent Cloud spend toward your annual Enterprise Discount Program (EDP) commitment drawdown.
Step-by-step, in-product tutorials for everything from cluster creation through to building your first stream processing app
Customer story
Jon Vines
Engineering Lead, AO
With the COVID-19 pandemic, AO saw a dramatic shift in consumer shopping habits, which led to a sharp increase in growth. To support this surge, AO turned to a real-time event streaming platform based on Confluent and Apache Kafka. They were able to deliver hyper-personalized online experiences by combining historical customer data with clickstream data and other real-time digital signals from across the business.
The company's personalization efforts paid off with a 30% increase in customer conversion rates. The Confluent platform also enabled AO development teams to accelerate the rollout of new business capabilities. And it has helped improve operational efficiencies, such as deliveries, through better integration of information across the entire company.
ABOUT CONFLUENT
AWS Marketplace