In this post, I'll walk through my implementation of a production-ready product microservice using Spring Boot and Apache Kafka that follows event-driven architecture principles.
Overview
The microservice exposes a REST API for creating products and publishes an event to Kafka whenever a new product is created, so other services in the system can react to product changes in real time. Key features:
- REST API for product creation
- Kafka event publishing
- Configurable Kafka topics
- Comprehensive error handling
- Both synchronous and asynchronous publishing modes
- Detailed logging
Architecture
graph TD
A[Client] -->|HTTP POST /products| B[Product Controller]
B --> C[Product Service]
C -->|Publish Event| D[Kafka]
D --> E[Consumer Services]
Technology Stack
- Framework: Spring Boot 3.4.5
- Language: Java 17
- Messaging: Apache Kafka
- Build Tool: Maven
- Logging: SLF4J with Logback
Implementation Highlights
Kafka Configuration
The service automatically creates its Kafka topic at startup with production-oriented settings: three partitions for parallel consumption, a replication factor of three, and min.insync.replicas=2 so a write is only acknowledged once at least two replicas have it:
@Bean
NewTopic createTopic() {
    return TopicBuilder.name("products-created-events-topic")
            .partitions(3)
            .replicas(3)
            .configs(Map.of("min.insync.replicas", "2"))
            .build();
}
Event Model
The product creation event contains all relevant product information:
public class ProductCreatedEvent {
    private String productId;
    private String title;
    private BigDecimal price;
    private Integer quantity;
    // getters/setters
}
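Assuming the JsonSerializer configuration shown above, an instance of this event would land on the topic as a JSON record roughly like the following (values are illustrative):

```json
{
  "productId": "<generated-uuid>",
  "title": "iphone",
  "price": 800,
  "quantity": 5
}
```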
REST API
The service exposes a simple endpoint for product creation:
@PostMapping
public ResponseEntity<Object> createProduct(@RequestBody CreateProductRestModel product) {
    try {
        String productId = productService.createProduct(product);
        return ResponseEntity.status(HttpStatus.CREATED).body(productId);
    } catch (Exception e) {
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body(new ErrorMessage(new Date(), e.getMessage(), "/products"));
    }
}
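The controller returns an ErrorMessage body on failure; that class is not shown in the post, but a minimal version consistent with the constructor call above might look like this (field names are assumptions):

```java
import java.util.Date;

// Minimal error payload matching new ErrorMessage(new Date(), message, path).
// Field names are assumptions; the original class is not shown in the post.
public class ErrorMessage {
    private final Date timestamp;
    private final String message;
    private final String path;

    public ErrorMessage(Date timestamp, String message, String path) {
        this.timestamp = timestamp;
        this.message = message;
        this.path = path;
    }

    public Date getTimestamp() { return timestamp; }
    public String getMessage() { return message; }
    public String getPath() { return path; }
}
```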
Event Publishing
The service supports both synchronous and asynchronous publishing:
// Synchronous: .get() blocks until the broker acknowledges the send
SendResult<String, ProductCreatedEvent> result = kafkaTemplate
        .send("products-created-events-topic", productId, productCreatedEvent)
        .get();

// Asynchronous: returns immediately; the result is handled via the future
CompletableFuture<SendResult<String, ProductCreatedEvent>> future =
        kafkaTemplate.send("products-created-events-topic", productId, productCreatedEvent);
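The synchronous variant blocks the request thread until the broker acknowledges, while the asynchronous one returns immediately, so success and failure have to be handled on the future. The sketch below uses a plain CompletableFuture standing in for Spring Kafka's send result to show that callback shape:

```java
import java.util.concurrent.CompletableFuture;

public class AsyncSendSketch {
    // Attaches success/failure handling to a send future, mirroring how the
    // CompletableFuture returned by kafkaTemplate.send(...) is typically handled.
    static String handleSend(CompletableFuture<String> sendFuture) {
        StringBuilder log = new StringBuilder();
        sendFuture
                .thenApply(metadata -> "sent: " + metadata)
                .exceptionally(ex -> "send failed: " + ex.getCause().getMessage())
                .thenAccept(log::append)
                .join(); // only for the demo; a real handler would not block
        return log.toString();
    }

    public static void main(String[] args) {
        // Simulated broker acknowledgement
        System.out.println(handleSend(CompletableFuture.completedFuture("partition=0, offset=42")));
        // Simulated failure
        CompletableFuture<String> failed = new CompletableFuture<>();
        failed.completeExceptionally(new RuntimeException("broker unavailable"));
        System.out.println(handleSend(failed));
    }
}
```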
Getting Started
Prerequisites
- JDK 17+
- Apache Kafka 3.0+
- Maven 3.6+
Running the Service
- Start Kafka (these commands assume ZooKeeper mode; Kafka 3.x can also run in KRaft mode without ZooKeeper):
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
- Run the application:
./mvnw spring-boot:run
- Test the API:
curl -X POST http://localhost:38079/products \
-H "Content-Type: application/json" \
-d '{"title":"iphone","price":800, "quantity":5}'
Future Enhancements
While the current implementation provides a solid foundation, here are some potential improvements:
- Database Integration: Add persistence layer using JPA/Hibernate
- Validation: Implement comprehensive input validation
- API Documentation: Add Swagger/OpenAPI support
- Security: Implement JWT authentication
- Monitoring: Add metrics and health checks
Conclusion
This microservice demonstrates how to effectively combine Spring Boot and Kafka to build scalable, event-driven systems. The decoupled architecture allows for easy extension and integration with other services.
The complete source code is available on GitHub.