Apache Kafka - Create Safe Producer using Java
Last Updated: 18 Mar, 2023
Apache Kafka producers write data to topics, and topics are made up of partitions. A producer automatically knows which broker and partition to write to based on the message, and if a Kafka broker in your cluster fails, the producer automatically recovers from it. This resilience is a large part of what makes Kafka so good and so widely used today. But how do you create a safe producer in Apache Kafka? The answer depends on the Kafka version you are using.
Kafka Version < 0.11
If you are using a Kafka version < 0.11, set the following producer properties to make your producer safe.
- acks = all (producer level): ensures the data is replicated to all in-sync replicas before an acknowledgment is received
- min.insync.replicas = 2 (broker/topic level): ensures at least two in-sync replicas have the data before the broker sends an acknowledgment
- retries = Integer.MAX_VALUE (producer level): ensures transient errors are retried indefinitely
- max.in.flight.requests.per.connection = 1 (producer level): ensures only one request is in flight at a time, preventing message reordering in case of retries
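The settings above can be sketched as plain producer properties, using the literal property keys (a minimal sketch; the class and method names are illustrative, and the bootstrap address is a placeholder):

```java
import java.util.Properties;

public class SafeProducerPre011 {
    // Producer-side safe settings for Kafka < 0.11 (literal property keys).
    // Note: min.insync.replicas is a broker/topic-level setting, so it is
    // configured on the broker or topic, not in these producer properties.
    static Properties safeProps(String bootstrapServer) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServer);
        // Wait for all in-sync replicas before the write is acknowledged
        props.setProperty("acks", "all");
        // Retry transient errors effectively forever
        props.setProperty("retries", Integer.toString(Integer.MAX_VALUE));
        // One request in flight at a time: retries cannot reorder messages
        props.setProperty("max.in.flight.requests.per.connection", "1");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(safeProps("127.0.0.1:9092"));
    }
}
```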
Kafka Version >= 0.11
If you are using a Kafka version >= 0.11, set the following producer properties to make your producer safe.
- enable.idempotence = true (producer level): makes the producer idempotent, so retried sends cannot introduce duplicates
- acks = all (producer level): ensures the data is replicated to all in-sync replicas before an acknowledgment is received
- min.insync.replicas = 2 (broker/topic level): ensures at least two in-sync replicas have the data before the broker sends an acknowledgment
- retries = Integer.MAX_VALUE (producer level): ensures transient errors are retried indefinitely
- max.in.flight.requests.per.connection = 5 (producer level): the default; with idempotence enabled, message ordering is still preserved
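These settings, too, can be sketched with the raw property keys that sit behind the ProducerConfig constants used later in this article (a sketch; the class and method names are illustrative, and the bootstrap address is a placeholder):

```java
import java.util.Properties;

public class SafeProducerV011Plus {
    // Producer-side safe settings for Kafka >= 0.11 (literal property keys).
    // min.insync.replicas is configured on the broker or topic, not here.
    static Properties safeProps(String bootstrapServer) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServer);
        // Idempotence: the broker deduplicates producer retries
        props.setProperty("enable.idempotence", "true");
        props.setProperty("acks", "all");
        props.setProperty("retries", Integer.toString(Integer.MAX_VALUE));
        // The default; with idempotence enabled, ordering is preserved
        props.setProperty("max.in.flight.requests.per.connection", "5");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(safeProps("127.0.0.1:9092"));
    }
}
```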
Note: Running a "safe producer" might impact throughput and latency; always test it for your use cases.
Create Safe Producer using Java
To create a safe producer using Java, we have to set these additional properties in the code:
Java
properties.setProperty(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
properties.setProperty(ProducerConfig.ACKS_CONFIG, "all");
properties.setProperty(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
properties.setProperty(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, "5");
Example Project
In this example, we will discuss the step-by-step implementation of a safe Apache Kafka producer using Java.
Step by Step Implementation
Step 1: Create a New Apache Kafka Project in IntelliJ
To create a new Apache Kafka Project in IntelliJ using Java and Maven please refer to How to Create an Apache Kafka Project in IntelliJ using Java and Maven.
Step 2: Install and Run Apache Kafka
To Install and Run Apache Kafka in your local system please refer to How to Install and Run Apache Kafka.
Step 3: Create Producer using Java
First, we have to create the producer properties. To create them, refer to the code snippet below.
Create Producer Properties:
Properties properties = new Properties();
properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
Add Additional Properties for Creating Safe Producer:
properties.setProperty(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
properties.setProperty(ProducerConfig.ACKS_CONFIG, "all");
properties.setProperty(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
properties.setProperty(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, "5");
Create the Producer:
KafkaProducer<String, String> producer = new KafkaProducer<>(properties);
Create a Producer Record:
ProducerRecord<String, String> record =
new ProducerRecord<>("first_geeksforgeeks_topic", "hello_geeksforgeeks");
Send data asynchronously:
producer.send(record);
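send() returns immediately and the record is delivered in the background. To observe the result, Kafka's producer also offers a send(record, callback) overload, whose Callback#onCompletion(RecordMetadata metadata, Exception exception) is invoked with either metadata (on success) or an exception (on failure). A broker-free sketch of that completion logic (the onCompletion helper below is hypothetical, modeling only the interface's contract):

```java
public class SendCompletionSketch {
    // Models Kafka's Callback#onCompletion contract: exactly one of the two
    // arguments is meaningful (a record summary on success, an exception on failure).
    static String onCompletion(String recordSummary, Exception exception) {
        if (exception != null) {
            return "delivery failed: " + exception.getMessage();
        }
        return "delivered: " + recordSummary;
    }

    public static void main(String[] args) {
        // With a real producer this logic would be attached as a lambda:
        //   producer.send(record, (metadata, e) ->
        //       System.out.println(onCompletion(
        //           metadata == null ? null : metadata.topic() + "-" + metadata.partition(), e)));
        System.out.println(onCompletion("first_geeksforgeeks_topic-0", null));
        System.out.println(onCompletion(null, new RuntimeException("broker unreachable")));
    }
}
```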
Flush and Close the Producer:
producer.flush();
producer.close();
Below is the complete code. Comments are added inside the code to explain it in more detail.
Java
package basics;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class KafkaProducerDemo {
    public static void main(String[] args) {
        String bootstrapServer = "127.0.0.1:9092";

        // Create Producer Properties
        Properties properties = new Properties();
        properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Create Safe Producer
        properties.setProperty(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        properties.setProperty(ProducerConfig.ACKS_CONFIG, "all");
        properties.setProperty(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
        properties.setProperty(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, "5");

        // Create the Producer
        KafkaProducer<String, String> producer = new KafkaProducer<>(properties);

        // Create a Producer Record
        ProducerRecord<String, String> record =
                new ProducerRecord<>("first_gfg_topic", "hello_geeksforgeeks!!");

        // Send the Record asynchronously
        producer.send(record);

        // Flush and Close the Producer
        producer.flush();
        producer.close();
    }
}