Spring Boot – Integrate with Apache Kafka for Streaming
Apache Kafka is a widely used distributed streaming platform that enables the development of scalable, fault-tolerant, and high-throughput applications. In this article, we'll walk you through the process of integrating Kafka with a Spring Boot application, providing detailed code examples and explanations along the way. By the end of this article, you'll gain a strong understanding of how to seamlessly incorporate Kafka into your Spring Boot projects, enhancing your application's performance and capabilities.
What are Kafka Streams?
Kafka Streams is a client library built on top of Apache Kafka. It enables the processing of unbounded streams of events in a declarative manner. Streaming data examples include stock market prices, system logs, or the number of users on a website at any given moment.
Kafka Streams bridges the worlds of Kafka topics and relational database tables, supporting operations such as joins, grouping, aggregation, and filtering on streaming events. A key concept in Kafka Streams is the processor topology, which describes the operations performed on one or more event streams. The topology is a directed acyclic graph (DAG) whose nodes fall into three categories:
- Source Nodes: Consume one or more Kafka topics and forward the data to successor nodes.
- Processor Nodes: Receive records from upstream nodes, process them, and optionally forward new records to downstream nodes. For instance, a source node named "Source" is added to the topology with the addSource method, and a processor node named "Process" with custom logic is added with the addProcessor method.
- Sink Nodes: Receive records from upstream nodes and write them to a Kafka topic.
The resulting acyclic graph is passed to a KafkaStreams instance, which consumes, processes, and produces records accordingly.
Processor API
The Processor API offers flexibility for defining and connecting custom processors to the processing topology. It allows the creation of stream processors that handle one record at a time and supports both stateless and stateful operations. Stateful operations connect stream processors to state stores.
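As a concrete illustration of these node types, here is a minimal Processor API sketch, assuming String keys and values. The topic names input-topic and output-topic and the processing logic (upper-casing each value) are illustrative; the node names "Source", "Process", and "Sink" mirror the description above.
Java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

public class UppercaseTopology {

    // Builds a Source -> Process -> Sink topology
    public static Topology build() {
        Topology topology = new Topology();

        // Source node: consumes records from "input-topic"
        topology.addSource("Source",
                Serdes.String().deserializer(), Serdes.String().deserializer(),
                "input-topic");

        // Processor node: handles one record at a time
        topology.addProcessor("Process", UppercaseProcessor::new, "Source");

        // Sink node: writes processed records to "output-topic"
        topology.addSink("Sink", "output-topic",
                Serdes.String().serializer(), Serdes.String().serializer(),
                "Process");

        return topology;
    }

    // A stateless processor that upper-cases each record's value
    static class UppercaseProcessor implements Processor<String, String, String, String> {

        private ProcessorContext<String, String> context;

        @Override
        public void init(ProcessorContext<String, String> context) {
            this.context = context;
        }

        @Override
        public void process(Record<String, String> record) {
            if (record.value() != null) {
                context.forward(record.withValue(record.value().toUpperCase()));
            }
        }
    }
}
The returned Topology can then be handed to a KafkaStreams instance together with the configuration described in the next sections.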
Dependencies
Dependencies are libraries that provide specific functionality for use in your application. In Spring Boot, dependency management and auto-configuration work together seamlessly. To integrate Kafka Streams with Spring Boot, add the following dependencies to your pom.xml file:
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>3.1.2</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>3.6.1</version>
</dependency>
Configuration
Spring Boot's auto-configuration feature configures your application automatically based on the jar dependencies you have added. Next, define the Kafka Streams configuration in a Java configuration class:
- @Configuration: Part of the Spring core framework; it marks the class as a source of @Bean definitions.
- @EnableKafka: Enables detection of @KafkaListener annotations so that listener containers can be created for the annotated endpoints.
- @EnableKafkaStreams: Enables the default Kafka Streams support auto-configured by Spring for Apache Kafka.
Java
@Configuration
@EnableKafka
@EnableKafkaStreams
public class KafkaConfig {

    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    KafkaStreamsConfiguration kStreamsConfig() {
        // The config keys are static imports from org.apache.kafka.streams.StreamsConfig
        Map<String, Object> props = new HashMap<>();
        props.put(APPLICATION_ID_CONFIG, "streams-app");
        props.put(BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        return new KafkaStreamsConfiguration(props);
    }

    // other config
}
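Note that the ${spring.kafka.bootstrap-servers} placeholder assumes the broker address is supplied elsewhere, for example as spring.kafka.bootstrap-servers=localhost:9092 in application.properties or the equivalent entry in application.yml.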
Basic Kafka Producer and Consumer
Before diving into Kafka Streams, it’s essential to understand how to integrate basic Kafka producers and consumers with Spring Boot.
Producer Configuration
To configure a Kafka producer in Spring Boot, specify the broker address and serializer settings, either in the application.yml file or programmatically as shown below, and make sure the required dependencies are present in your pom.xml file.
Java
@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
Consumer Configuration
Similarly, to configure a Kafka consumer in Spring Boot, specify the bootstrap-servers, group-id, key-deserializer, and value-deserializer properties, either in the application.yml file or programmatically as shown below, and add the required dependencies to your pom.xml file.
Code Example:
Java
@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(configProps);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
Producer and Consumer Example
To demonstrate Kafka producer and consumer functionality, first configure a Kafka producer to send messages to a Kafka topic. Then, configure a Kafka consumer using a Kafka listener.
- This involves setting up a producer bean and using its 'send' method to publish messages.
- Next, you configure a Kafka consumer using a Kafka listener with the @KafkaListener annotation, which allows the application to automatically receive and process messages from the Kafka topic to which the producer is sending data.
- This setup illustrates a complete cycle where the producer sends data to a topic, and the consumer processes that data as it arrives.
Code Example:
Java
@Service
public class KafkaProducerService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        kafkaTemplate.send("my_topic", message);
        System.out.println("Message sent: " + message);
    }
}

@Service
public class KafkaConsumerService {

    @KafkaListener(topics = "my_topic", groupId = "group_id")
    public void consume(String message) {
        System.out.println("Consumed message: " + message);
    }
}
Output:
KafkaProducerService: When you call sendMessage("Hello Kafka"), you will see:
Message sent: Hello Kafka
KafkaConsumerService: When the Kafka consumer receives the message "Hello Kafka" from the topic, you will see:
Consumed message: Hello Kafka
Testing Kafka Integration
You can test the producer and consumer setup by using a REST controller to trigger the producer and view the output from the consumer in the console.
Code Example:
Java
@RestController
public class KafkaController {

    private final KafkaProducerService kafkaProducerService;

    @Autowired
    public KafkaController(KafkaProducerService kafkaProducerService) {
        this.kafkaProducerService = kafkaProducerService;
    }

    @GetMapping("/send/{message}")
    public String sendMessage(@PathVariable String message) {
        kafkaProducerService.sendMessage(message);
        return "Message sent successfully!";
    }
}
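For example, with the application and a local Kafka broker running, a GET request to /send/HelloKafka publishes "HelloKafka" to my_topic, and the listener then prints Consumed message: HelloKafka to the console.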
Building a Topology
Now that we have set up the configuration, let’s build the topology for our application to keep a count of the words from input messages:
- @Component: Allows Spring to automatically detect and manage the class as a bean in the application context.
- @Autowired: Enables automatic dependency injection, so the Spring container wires collaborating beans together based on the declared dependencies.
Java
@Component
public class WordCountProcessor {

    private static final Serde<String> STRING_SERDE = Serdes.String();
    private static final Serde<Long> LONG_SERDE = Serdes.Long();

    @Autowired
    void buildPipeline(StreamsBuilder streamsBuilder) {
        // Read the raw messages from the input topic
        KStream<String, String> messageStream = streamsBuilder
            .stream("input-topic", Consumed.with(STRING_SERDE, STRING_SERDE));

        // Split each message into words, group by word and count,
        // materializing the result in a state store named "counts"
        KTable<String, Long> wordCounts = messageStream
            .mapValues((ValueMapper<String, String>) String::toLowerCase)
            .flatMapValues(value -> Arrays.asList(value.split("\\W+")))
            .groupBy((key, word) -> word, Grouped.with(STRING_SERDE, STRING_SERDE))
            .count(Materialized.as("counts"));

        // Write the running counts to the output topic
        wordCounts.toStream().to("output-topic", Produced.with(STRING_SERDE, LONG_SERDE));
    }
}
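The count(Materialized.as("counts")) step materializes the running totals in a local state store named "counts"; the REST controller in the next section queries this store interactively, so the store name must match in both places.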
Creating REST Endpoints
After defining the pipeline declaratively, create a REST controller. It provides endpoints to POST messages to the input topic and to GET the count for a given word (a sketch of the POST side follows the GET controller below).
- @GetMapping: Maps HTTP GET requests to a specific handler method in a Spring controller, making it easy to define the endpoints of a RESTful API.
- @PathVariable: Binds a template variable in the request URI to a handler method parameter.
Java
@RestController
public class WordCountController {

    @Autowired
    private StreamsBuilderFactoryBean factoryBean;

    @GetMapping("/count/{word}")
    public Long getWordCount(@PathVariable String word) {
        KafkaStreams kafkaStreams = factoryBean.getKafkaStreams();
        // Query the "counts" state store materialized by the word-count topology
        ReadOnlyKeyValueStore<String, Long> counts = kafkaStreams.store(
            StoreQueryParameters.fromNameAndType("counts", QueryableStoreTypes.keyValueStore())
        );
        return counts.get(word);
    }
}
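The controller above covers the query side. For the POST side mentioned earlier, a minimal sketch could reuse the KafkaTemplate<String, String> bean from the producer configuration to publish messages to the topology's input topic; the controller name and the /message mapping here are illustrative.
Java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class WordInputController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public WordInputController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publishes the request body to the word-count input topic
    @PostMapping("/message")
    public void publish(@RequestBody String message) {
        kafkaTemplate.send("input-topic", message);
    }
}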
Conclusion
This article walked you through how to integrate Apache Kafka for streaming in a Spring Boot application. We explored the basics of Kafka producers and consumers, configured Kafka Streams, built a simple topology, and set up REST endpoints for interacting with the stream data. By following these steps, you can enhance your Spring Boot applications with the robust capabilities of Apache Kafka for handling streaming data.