This guide helps Kafka developers build and migrate applications to Azure Event Hubs. Whether you're connecting an existing Kafka application or building a new streaming solution, you'll find quickstarts, tutorials, and integration patterns organized by your development journey.
Prerequisites
Before you start developing, ensure you have:
- An Azure Event Hubs namespace with Kafka enabled (Standard tier or higher)
- Your preferred Kafka client library installed
- Connection string or Microsoft Entra credentials for authentication
For an overview of how Event Hubs works with Kafka, see Event Hubs for Apache Kafka.
Get started in 5 minutes
The fastest way to connect to Event Hubs is to modify your existing Kafka client configuration. No code changes are required; you only update your connection settings.
Quick start: Data streaming with Event Hubs using the Kafka protocol walks you through connecting producers and consumers with just a configuration change.
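Concretely, the change amounts to pointing the client at the namespace's Kafka endpoint on port 9093 and authenticating over SASL. A minimal sketch in Java-client properties form, where the namespace name and connection string are placeholders:

```properties
# Event Hubs exposes a Kafka endpoint on port 9093 of the namespace host
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# The username is the literal string "$ConnectionString"; the password is the
# namespace or event hub connection string copied from the Azure portal
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="$ConnectionString" \
    password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
```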
Language-specific quickstarts
Choose your language to get a working producer and consumer sample:
| Language | Sample | Client library |
|---|---|---|
| Java | Quickstart | Apache Kafka client |
| C# / .NET | Quickstart | Confluent .NET client |
| Python | Quickstart | Confluent Python client |
| Node.js | Quickstart | node-rdkafka |
| Go | Quickstart | Confluent Go client |
| Go (Sarama) | Quickstart | Sarama client |
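Whichever client you pick, the connection settings follow the same shape. As an illustration (not taken from the quickstarts), this sketch derives a librdkafka-style configuration dict from an Event Hubs connection string; the helper name is our own, and the namespace and key are made-up placeholders:

```python
def kafka_config_from_connection_string(conn_str: str) -> dict:
    """Build a librdkafka-style config dict from an Event Hubs connection string."""
    # Connection strings are semicolon-separated Key=Value pairs
    parts = dict(p.split("=", 1) for p in conn_str.split(";") if p)
    endpoint = parts["Endpoint"]  # e.g. sb://mynamespace.servicebus.windows.net/
    host = endpoint.removeprefix("sb://").rstrip("/")
    return {
        "bootstrap.servers": f"{host}:9093",   # Event Hubs Kafka endpoint
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",  # literal string, not a shell variable
        "sasl.password": conn_str,             # the full connection string
    }

# Placeholder namespace and key for illustration only
conn = ("Endpoint=sb://mynamespace.servicebus.windows.net/;"
        "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123=")
cfg = kafka_config_from_connection_string(conn)
```

The same five settings map onto each client library's configuration object, whatever its native syntax.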
Command-line tools
For testing and debugging, use these CLI tools:
| Tool | Sample | Description |
|---|---|---|
| Kafka CLI | Quickstart | Bundled with Apache Kafka distribution |
| kcat | Quickstart | Lightweight, fast CLI based on librdkafka |
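For example, kcat accepts the same settings as librdkafka through `-X` overrides. A sketch, with the namespace, topic, and `EVENTHUBS_CONNECTION_STRING` environment variable as placeholders:

```shell
# Produce (-P) to a topic over the Event Hubs Kafka endpoint; use -C to consume
kcat -b mynamespace.servicebus.windows.net:9093 \
     -X security.protocol=SASL_SSL \
     -X sasl.mechanisms=PLAIN \
     -X 'sasl.username=$ConnectionString' \
     -X "sasl.password=$EVENTHUBS_CONNECTION_STRING" \
     -t my-topic -P
```

Note the single quotes around `sasl.username=$ConnectionString`, which keep the shell from expanding it as a variable.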
Build streaming pipelines
Once you've connected your application, you can build more sophisticated streaming pipelines. This section covers integrations with popular stream processing frameworks and data integration tools.
Stream processing frameworks
Connect your stream processing applications to Event Hubs:
| Framework | Tutorial | Description |
|---|---|---|
| Apache Spark | Tutorial | Real-time streaming with Spark Structured Streaming |
| Apache Flink | Tutorial | Stateful stream processing with exactly-once semantics |
| Akka Streams | Tutorial | Reactive stream processing for Scala and Java |
| Azure Stream Analytics | Tutorial | No-code stream processing with SQL-like queries |
| Spring Cloud Stream | Tutorial | Spring Boot integration using Kafka binder |
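As one concrete example, Spark Structured Streaming reads from Event Hubs through its standard Kafka source. A PySpark sketch, assuming an existing `SparkSession` named `spark` with the Kafka connector package on the classpath; the namespace, credentials, and topic are placeholders:

```python
# JAAS string carrying the Event Hubs connection string as the SASL password
jaas = ('org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="$ConnectionString" password="<your connection string>";')

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
          .option("kafka.security.protocol", "SASL_SSL")
          .option("kafka.sasl.mechanism", "PLAIN")
          .option("kafka.sasl.jaas.config", jaas)
          .option("subscribe", "my-topic")
          .load())  # DataFrame with key, value, topic, partition, offset columns
```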
Data integration with Kafka Connect
Kafka Connect enables you to stream data between Event Hubs and external systems using pre-built connectors:
| Resource | Description |
|---|---|
| Kafka Connect integration | Deploy and configure Kafka Connect with Event Hubs |
| Kafka Connect tutorial (GitHub) | End-to-end example with FileStreamSource and FileStreamSink |
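The worker configuration follows the same pattern: the Connect worker, and the producer and consumer it runs internally, all authenticate to Event Hubs. A trimmed sketch of a distributed worker properties file (namespace and topic names are placeholders; see the integration guide for the complete set):

```properties
bootstrap.servers=mynamespace.servicebus.windows.net:9093
group.id=connect-cluster-group
# Internal topics where Connect stores connector configs, offsets, and status
config.storage.topic=connect-cluster-configs
offset.storage.topic=connect-cluster-offsets
status.storage.topic=connect-cluster-status
# SASL settings for the worker itself...
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<your connection string>";
# ...repeated with producer. and consumer. prefixes for Connect's own clients
producer.security.protocol=SASL_SSL
consumer.security.protocol=SASL_SSL
```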
Log aggregation and observability
Centralize logs from your infrastructure into Event Hubs:
| Tool | Tutorial | Description |
|---|---|---|
| Logstash | Tutorial | Elastic Stack log pipeline |
| Filebeat | Tutorial | Lightweight log shipper |
| FluentD | Tutorial | Unified logging layer |
| Apache NiFi | Tutorial | Visual dataflow management |
Migrate existing Kafka workloads
If you're migrating from an existing Kafka cluster, Event Hubs supports replication and hybrid scenarios.
Replicate data with MirrorMaker
Use Kafka MirrorMaker to replicate data from an existing Kafka cluster to Event Hubs:
| Resource | Description |
|---|---|
| Mirror a Kafka broker to Event Hubs | Step-by-step guide for MirrorMaker setup |
| MirrorMaker tutorial (GitHub) | Sample configurations and scripts |
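With legacy MirrorMaker, Event Hubs is simply the target cluster: the consumer config points at the source Kafka cluster, and the producer config points at Event Hubs. A sketch of the producer side (placeholders as before):

```properties
# mirror-eventhub.config — the MirrorMaker producer, targeting Event Hubs
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<your connection string>";
```

MirrorMaker is then launched with both files, along the lines of `kafka-mirror-maker.sh --consumer.config source-kafka.config --producer.config mirror-eventhub.config --whitelist=".*"` (legacy flags; MirrorMaker 2 instead takes a single `mm2.properties`).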
Migration planning
For a complete migration guide, including configuration mapping and feature differences, see Apache Kafka migration guide for Event Hubs.
Advanced scenarios
Schema management
Manage schemas for your Kafka applications:
| Resource | Description |
|---|---|
| Azure Schema Registry | Native schema registry built into Event Hubs |
| Confluent Schema Registry integration | Use Confluent Schema Registry with Event Hubs |
Authentication with OAuth / Microsoft Entra ID
For production workloads, use Microsoft Entra ID instead of connection strings:
| Resource | Description |
|---|---|
| OAuth tutorial (GitHub) | Java and Go samples for OAuth authentication |
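At the Kafka client level, switching to Microsoft Entra ID means replacing the PLAIN mechanism with OAUTHBEARER and plugging in a login callback that acquires tokens. A sketch of the Java client properties; `CustomAuthenticateCallbackHandler` stands in for the token-fetching class the samples implement:

```properties
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
# A class (provided by the samples) that acquires tokens from Microsoft Entra ID
sasl.login.callback.handler.class=CustomAuthenticateCallbackHandler
```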
Protocol interoperability
Event Hubs supports multiple protocols. Learn how to exchange events between Kafka and AMQP clients:
| Resource | Description |
|---|---|
| Interop tutorial (GitHub) | Exchange events between different protocols |
Configuration reference
For recommended Kafka client configurations when using Event Hubs, see Apache Kafka client configurations. This guide covers:
- Required connection settings
- Configurations that differ from Kafka defaults
- Event Hubs-specific constraints
- Troubleshooting common configuration issues