How to Subscribe to a Topic in Apache Kafka from a Java Application?
Last Updated: 30 May, 2024
Subscribing to a Kafka topic from a Java application requires setting up a Kafka consumer that reads messages from a specific topic. This is a key part of many microservice architectures where services must process messages asynchronously. Apache Kafka provides a robust and scalable platform for building such message-driven applications.
Kafka consumers are responsible for reading messages from Kafka topics. The consumer subscribes to a topic and continuously polls for new messages. The key components involved in setting up a Kafka consumer include:
- Consumer Configuration: Define properties such as the Kafka server address, group ID, key deserializer, and value deserializer.
- Kafka Listener: The @KafkaListener annotation provided by Spring Kafka marks a method as a listener for a specific topic (a minimal sketch follows this list).
- Consumer Group: A group of consumers that work together to consume messages from a topic. Each message is delivered to only one consumer in the group.
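For reference, a minimal annotation-driven consumer using @KafkaListener could look like the sketch below. It is shown only for comparison; the class name KafkaTopicListener is illustrative, and it assumes Spring Boot's auto-configured Kafka consumer (settings supplied via application.properties) rather than the manual configuration built later in this article.
Java
package org.example.kafkademo;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Illustrative annotation-driven consumer; Spring Kafka creates and manages
// the underlying KafkaConsumer and invokes this method for every new record.
@Component
public class KafkaTopicListener {

    @KafkaListener(topics = "kafka-topic", groupId = "my-group")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}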
Implementation to Subscribe to a Topic in Apache Kafka from a Java Application
Below are the steps to subscribe to a Kafka topic from a Java application.
Step 1: Set up Apache Kafka
Confirm that Kafka is installed and running on your local system. The broker is assumed to be listening on localhost:9092, the default.
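As a quick check that the broker is reachable, you can list the existing topics (Windows script shown, matching the commands used later in this article):
.\bin\windows\kafka-topics.bat --list --bootstrap-server localhost:9092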
Step 2: Create the Spring Boot Project
Create the Spring Boot project using Spring Initializr and add the required dependencies listed below.
- Spring Web
- Spring For Apache Kafka
- Lombok
- Spring DevTools
After completing this step, the generated project contains the standard Maven layout, with the application code under src/main/java/org/example/kafkademo.
Step 3: Configure the application properties
Open application.properties and add the following properties to set the application name and server port.
spring.application.name=kafka-demo
server.port=8080
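Optionally, if you use the annotation-driven listener sketched earlier (or otherwise let Spring Boot auto-configure the consumer), the settings built manually in the next step can instead be expressed here with the standard Spring Kafka property keys:
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.auto-offset-reset=earliest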
Step 4: Kafka Consumer Configuration
Now, we will create the KafkaConsumerConfig class that holds the configuration for the consumer.
Go to src > main > java > org.example.kafkademo, create KafkaConsumerConfig, and add the code below.
Java
package org.example.kafkademo;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.Properties;

public class KafkaConsumerConfig {

    // Builds the properties required to create a KafkaConsumer.
    public static Properties getConsumerProperties() {
        Properties props = new Properties();
        // Address of the local Kafka broker.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Consumer group id; each message is delivered to one consumer per group.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        // Keys and values are plain strings.
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Start from the earliest offset when the group has no committed offset yet.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }
}
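A note on the last property: auto.offset.reset only takes effect when the consumer group has no committed offset for a partition. With earliest, a brand-new group starts reading from the beginning of the topic; with the default latest, it would only see messages produced after the consumer joins, which is less convenient for this demo.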
Step 5: Main Class
Open the main class and subscribe to the Kafka topic from the Spring application.
Go to src > main > java > org.example.kafkademo > KafkaDemoApplication and put the below code.
Java
package org.example.kafkademo;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

import java.time.Duration;
import java.util.Collections;

@SpringBootApplication
public class KafkaDemoApplication {

    public static void main(String[] args) {
        // Start the Spring Boot application (embedded server on port 8080).
        SpringApplication.run(KafkaDemoApplication.class, args);

        // Create the consumer from the properties defined in KafkaConsumerConfig.
        KafkaConsumer<String, String> consumer =
                new KafkaConsumer<>(KafkaConsumerConfig.getConsumerProperties());

        // Subscribe to the topic.
        consumer.subscribe(Collections.singletonList("kafka-topic"));

        try {
            // Poll continuously for new records and print each one.
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Consumed record with key %s and value %s%n",
                            record.key(), record.value());
                }
            }
        } finally {
            consumer.close();
        }
    }
}
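The loop above blocks the main thread and only closes the consumer if an exception escapes the loop. As an optional refinement (a sketch, not part of the original example), the standard Kafka pattern for a clean shutdown is to call consumer.wakeup() from a JVM shutdown hook, which makes the blocked poll() throw a WakeupException that ends the loop:
Java
// Sketch: same poll loop as above, placed in main(), with a shutdown hook for a clean exit.
// Additional import needed: org.apache.kafka.common.errors.WakeupException
KafkaConsumer<String, String> consumer =
        new KafkaConsumer<>(KafkaConsumerConfig.getConsumerProperties());
consumer.subscribe(Collections.singletonList("kafka-topic"));

Thread mainThread = Thread.currentThread();
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    consumer.wakeup();        // makes the blocked poll() throw WakeupException
    try {
        mainThread.join();    // wait for the loop below to close the consumer
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}));

try {
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord<String, String> record : records) {
            System.out.printf("Consumed record with key %s and value %s%n",
                    record.key(), record.value());
        }
    }
} catch (WakeupException e) {
    // Expected during shutdown; simply fall through to close().
} finally {
    consumer.close();         // leaves the group and releases resources
}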
pom.xml:
XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.5</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>org.example</groupId>
    <artifactId>kafka-demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>kafka-demo</name>
    <description>kafka-demo</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
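Note that the spring-kafka dependency transitively brings in the kafka-clients library, which provides the ConsumerConfig, KafkaConsumer, and StringDeserializer classes used in the code above, so no separate Kafka client dependency is needed.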
Step 6: Run the Application
After completing all the above steps, run the application. It will start on port 8080 and the consumer will begin polling the kafka-topic topic.
Step 7: Sending Messages to the Topic
Now, we will test the consumer by sending messages to the topic with the Kafka command-line tools. First, create the topic (if it does not already exist) with the topics script (Windows):
.\bin\windows\kafka-topics.bat --create --topic kafka-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
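On Linux or macOS, the equivalent command uses the .sh script instead:
bin/kafka-topics.sh --create --topic kafka-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1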
Next, start a console producer for the same topic and type a message:
.\bin\windows\kafka-console-producer.bat --bootstrap-server localhost:9092 --topic kafka-topic
>hello kafka message
The console of the running Spring Boot application then prints the consumed record, for example: Consumed record with key null and value hello kafka message
By following these steps, we can set up a Kafka consumer in a Spring Boot application, allowing microservices to process messages asynchronously and efficiently.