We're excited to announce native support for Apache Kafka in Grafbase Extensions, bringing real-time event streaming capabilities directly to your federated GraphQL APIs.
With this launch, you can now declaratively integrate Kafka topics into your federated GraphQL API as a virtual subgraph. That means no subgraph infrastructure, no manual stitching, and no extra services required. It's all managed natively within the Grafbase platform.
Grafbase Extensions let you customize and extend your federated GraphQL API without deploying separate subgraphs. Extensions accelerate GraphQL Federation adoption by making it easy to integrate services like databases, authentication, storage - and now event streaming - declaratively within your schema.
We've previously launched extensions like Postgres, gRPC, NATS, and Snowflake - which are all available on the Grafbase Extensions Marketplace.
Now we're introducing event streaming for Federation with the Kafka Extension.
The Kafka extension supports two primary operations via GraphQL directives:
- Publishing messages to Kafka topics
- Subscribing to topics for real-time updates
It abstracts the complexity of Kafka client configuration, connection pooling, and message serialization, so you can focus on your application logic.
First, install the Grafbase CLI if you haven't already:
```bash
curl -fsSL https://grafbase.com/downloads/cli | bash
```
Add the Kafka extension to your `grafbase.toml` configuration:

```toml
[extensions.kafka]
version = "0.1"

[[extensions.kafka.config.endpoint]]
bootstrap_servers = ["localhost:9092"]

# This is only needed when running `grafbase dev`
[subgraphs.kafka]
schema_path = "subgraph.graphql"
```
For production environments with authentication:
```toml
[[extensions.kafka.config.endpoint]]
name = "production"
bootstrap_servers = ["kafka-1.example.com:9092", "kafka-2.example.com:9092"]

[extensions.kafka.config.endpoint.tls]
system_ca = true

[extensions.kafka.config.endpoint.authentication]
type = "sasl_scram"
username = "my-kafka-user"
password = "my-kafka-password"
mechanism = "sha512"

# This is only needed when running `grafbase dev`
[subgraphs.kafka]
schema_path = "subgraph.graphql"
```
Create a schema file (`subgraph.graphql`) that uses Kafka directives:

```graphql
extend schema
  @link(
    url: "https://grafbase.com/extensions/kafka/0.1"
    import: ["@kafkaProducer", "@kafkaPublish", "@kafkaSubscription"]
  )
  @kafkaProducer(
    name: "eventProducer"
    topic: "user-events"
  )

type Mutation {
  publishUserEvent(userId: String!, input: UserEventInput!): Boolean!
    @kafkaPublish(producer: "eventProducer", key: "user-{{args.userId}}")
}

type Subscription {
  userEvents(userId: String!): UserEvent!
    @kafkaSubscription(
      topic: "user-events"
      keyFilter: "user-{{args.userId}}"
    )
}

input UserEventInput {
  action: String!
  metadata: JSON
}

type UserEvent {
  action: String!
  metadata: JSON
  timestamp: String!
}
```
Launch the development server to test your new API:
```bash
grafbase dev
```
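Once the development server is running, you can exercise the schema from the local GraphQL playground. For example, with the field names defined in the schema above (the argument values here are just placeholders):

```graphql
# Publish an event to the user-events topic
mutation PublishEvent {
  publishUserEvent(userId: "123", input: { action: "login", metadata: { device: "mobile" } })
}

# In a separate tab, listen for events keyed to the same user
subscription UserEvents {
  userEvents(userId: "123") {
    action
    metadata
    timestamp
  }
}
```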
After testing locally, publish your schema as a virtual subgraph:
```bash
grafbase login
grafbase publish \
  --name kafka \
  my-org/my-graph@branch \
  -m "add kafka integration" \
  --virtual
```
Start the Grafbase Gateway with your configuration:
```bash
export GRAFBASE_ACCESS_TOKEN=<your_access_token>
grafbase-gateway --graph my-graph@branch --config grafbase.toml
```
The Gateway handles all Kafka connections internally - your virtual subgraph doesn't need a URL because the extension manages all communication with your Kafka cluster.
An important consideration: the `@kafkaSubscription` directive is designed for delivering real-time notifications to frontend applications, not for implementing full-fledged Kafka consumers.
Here's why:
- No offset management: the extension makes no guarantees about message delivery order or exactly-once processing
- No consumer groups: each GraphQL subscription creates an ephemeral consumer
- Message filtering: use `keyFilter` and `selection` to deliver only relevant messages to clients
- Simplified delivery: perfect for UI updates, notifications, and real-time dashboards
For complex consumer scenarios requiring exactly-once processing, consumer group coordination, or manual offset management, implement a dedicated Kafka consumer service. Think of GraphQL subscriptions as a notification pipeline where occasional message loss is acceptable.
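To make the delivery model concrete, here is a rough Python sketch of how key-based filtering behaves conceptually. This illustrates the semantics only, not the extension's actual implementation, and the message shapes are invented for the example:

```python
def matches_key_filter(message_key: str, key_filter: str) -> bool:
    """Conceptually, a subscription only receives messages whose key
    matches the rendered keyFilter template (e.g. "user-123")."""
    return message_key == key_filter

# A stream of Kafka messages, each with a key and a payload.
messages = [
    {"key": "user-123", "payload": {"action": "login"}},
    {"key": "user-456", "payload": {"action": "logout"}},
    {"key": "user-123", "payload": {"action": "purchase"}},
]

# A subscription for userEvents(userId: "123") renders the keyFilter
# template "user-{{args.userId}}" to "user-123", so only the two
# matching messages are delivered to that client.
delivered = [m["payload"] for m in messages if matches_key_filter(m["key"], "user-123")]
print(delivered)  # the "login" and "purchase" events only
```

Because each subscription starts its own ephemeral consumer, a client that disconnects and reconnects simply starts receiving new messages; nothing replays from a stored offset.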
Example use cases perfect for GraphQL subscriptions:
```graphql
type Subscription {
  # Real-time order status updates for a user's dashboard
  myOrderUpdates(customerId: String!): OrderUpdate!
    @kafkaSubscription(
      topic: "order-updates"
      keyFilter: "customer-{{args.customerId}}"
    )

  # High-value transaction alerts
  highValueTransactions(threshold: Float!): Transaction!
    @kafkaSubscription(
      topic: "transactions"
      selection: "select(.amount > {{args.threshold}})"
    )
}
```
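The `selection` argument uses a jq-style predicate evaluated against each message payload; only matching messages reach the client. A rough Python equivalent of `select(.amount > {{args.threshold}})`, with invented payloads purely for illustration:

```python
def select_high_value(payload: dict, threshold: float) -> bool:
    """Rough Python equivalent of the jq-style filter
    select(.amount > {{args.threshold}})."""
    return payload.get("amount", 0.0) > threshold

# Example messages from the "transactions" topic.
transactions = [
    {"id": "t1", "amount": 49.99},
    {"id": "t2", "amount": 10_500.00},
    {"id": "t3", "amount": 980.00},
]

# A client subscribed with threshold: 1000.0 only sees the one
# transaction whose amount exceeds the threshold.
alerts = [t for t in transactions if select_high_value(t, 1000.0)]
print([t["id"] for t in alerts])  # ['t2']
```

The filtering happens in the gateway before delivery, so clients never see messages that fail the predicate.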
You can find a full example in our GitHub repository.
Event streaming now requires just a few lines of configuration. The Kafka extension brings Apache Kafka's power to your federated GraphQL architecture without requiring you to manage a separate subgraph for publishing or subscribing from Kafka. Whether you're adding real-time features to an existing API or building an event-driven system from scratch, Kafka + Grafbase gives you the best of both worlds: the reliability of Kafka with the developer experience of GraphQL Federation.