Sharing Events with Streaming Channels
Streaming channels provide a powerful way to share events between services. Unlike traditional message queues where each message is consumed by a single consumer, streaming channels allow multiple services to independently consume the same stream of events. Each service tracks its own position in the stream, enabling features like event replay and independent consumption rates.
Streaming channels are ideal for event-driven architectures where multiple services need to react to the same events independently: an event can be published once, but consumed multiple times.
Architecture
To understand the difference better, let's first assume we are using a Queue Channel, and that we have a Service Map as follows:
#[ServiceContext]
public function serviceMap(): DistributedServiceMap
{
    return DistributedServiceMap::initialize()
        ->withEventMapping(
            channelName: "ticket_events",
            subscriptionKeys: ["user.*"],
        )
        ->withEventMapping(
            channelName: "order_events",
            subscriptionKeys: ["user.*"],
        );
}

This will map to this flow:

As we can see, this requires delivering the same Events to both Channels: ticket_events and order_events. With Streaming Channels, however, we can decide that the publisher "announces" a given set of Events, and let every interested Service join the subscription instead:
This will map to this flow:

Then every Service can join the consumption from user_events, as Messages in Streaming Channels are not removed after being consumed. As a result, we have one shared event log that can be consumed by different Services.
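Using the same Service Map API as above, the streaming setup then needs only a single shared channel. A sketch, assuming the shared channel is named user_events:

```php
#[ServiceContext]
public function serviceMap(): DistributedServiceMap
{
    // One shared streaming channel replaces the per-service queue channels
    return DistributedServiceMap::initialize()
        ->withEventMapping(
            channelName: "user_events",
            subscriptionKeys: ["user.*"],
        );
}
```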
Benefits of Streaming Channels for Event Distribution
Independent Consumption: Each service maintains its own position in the event stream
Event Replay: Services can replay events from any point in the stream
Scalability: Multiple consumers can read from the same stream without affecting each other
Durability: Events are persisted and can be consumed multiple times
Decoupling: Publishers don't need to know about consumers
In Memory Streaming Channel
For testing and development, you can use in-memory streaming channels:
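A minimal sketch of such a channel definition. The builder class name used here is an assumption for illustration — verify it against the Ecotone version you have installed:

```php
#[ServiceContext]
public function userEventsChannel(): mixed
{
    // Builder class name is an assumption - check your Ecotone version's API.
    // Each consuming service tracks its own stream position under its group id.
    return InMemoryStreamingChannelBuilder::create("user_events")
        ->withMessageGroupId("ticket_service");
}
```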
Each service consuming from the same streaming channel should use a unique messageGroupId to track its position independently.
RabbitMQ Streaming Channel
RabbitMQ Streams provide high-throughput, persistent event streaming:
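A sketch of the channel definition, wiring in the two configuration options described below. The builder class name is an assumption — check the API of Ecotone's RabbitMQ module:

```php
#[ServiceContext]
public function userEventsChannel(): mixed
{
    // Builder class name is an assumption - verify against the RabbitMQ module
    return AmqpStreamChannelBuilder::create("user_events")
        ->withMessageGroupId("ticket_service") // unique per consuming service
        ->withCommitInterval(100);             // commit position every 100 messages
}
```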
Configuration Options:
messageGroupId: Unique identifier for this consumer group - each service should have its own
commitInterval: How often to commit the consumer position (in number of messages)
Kafka Streaming Channel
Kafka provides distributed, fault-tolerant event streaming:
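A sketch of the channel definition using the configuration options described below. The exact method names are assumptions — check the API of Ecotone's Kafka module:

```php
#[ServiceContext]
public function userEventsChannel(): mixed
{
    // Method names are assumptions - verify against the Kafka module's API
    return KafkaMessageChannelBuilder::create("user_events")
        ->withTopicName("user_events")        // Kafka topic to publish/consume from
        ->withMessageGroupId("order_service") // consumer group id, unique per service
        ->withCommitInterval(100);            // commit the offset every 100 messages
}
```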
Configuration Options:
topicName: The Kafka topic to publish/consume from
messageGroupId: Kafka consumer group ID - each service should have its own
commitInterval: How often to commit the offset (in number of messages)
Multi-Service Example
Here's a complete example showing how multiple services can share events using a streaming channel:
Shared Service Map and Channel
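A sketch of the configuration each service would share. The Service Map reuses the API shown earlier; the streaming channel builder name is an assumption, and in practice each service would set its own messageGroupId:

```php
// Shared by the User, Ticket and Order Services
#[ServiceContext]
public function serviceMap(): DistributedServiceMap
{
    return DistributedServiceMap::initialize()
        ->withEventMapping(
            channelName: "user_events",
            subscriptionKeys: ["user.*"],
        );
}

#[ServiceContext]
public function userEventsChannel(): mixed
{
    // Builder class name is an assumption - each consuming service
    // uses its own messageGroupId to track its position independently
    return InMemoryStreamingChannelBuilder::create("user_events")
        ->withMessageGroupId("ticket_service");
}
```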
Publisher Service (User Service):
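A sketch of the publishing side. The UserRegistered event class and the exact DistributedBus method signature are illustrative assumptions:

```php
final class UserService
{
    public function register(DistributedBus $distributedBus): void
    {
        // Routing key "user.registered" matches the "user.*" subscription keys.
        // Method signature and event class are illustrative - verify against
        // Ecotone's DistributedBus API.
        $distributedBus->convertAndPublishEvent(
            "user.registered",
            [],
            new UserRegistered(userId: "123"),
        );
    }
}
```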
Consumer Service 1 (Ticket Service):
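A sketch of the first consumer. The handler method and event class are illustrative; the subscription key matches the "user.*" pattern from the Service Map:

```php
final class TicketService
{
    // #[Distributed] marks the handler as reachable through the distributed bus
    #[Distributed]
    #[EventHandler("user.registered")]
    public function whenUserRegistered(UserRegistered $event): void
    {
        // e.g. create a welcome ticket for the new user (illustrative)
    }
}
```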
Consumer Service 2 (Order Service):
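A sketch of the second consumer, again with illustrative handler and event names. Because each service consumes under its own messageGroupId, both services receive the same event independently:

```php
final class OrderService
{
    #[Distributed]
    #[EventHandler("user.registered")]
    public function whenUserRegistered(UserRegistered $event): void
    {
        // e.g. initialize an empty order history for the user (illustrative)
    }
}
```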