Publishing#

The FastStream RedisBroker supports the same standard publishing use cases as the KafkaBroker, allowing you to publish messages to Redis channels with ease.

Below you will find guidance on how to utilize the RedisBroker for publishing messages, including creating publisher objects and using decorators for streamlined publishing workflows.

Basic Redis Channel Publishing#

The RedisBroker allows you to publish messages directly to Redis channels. You can use Python primitives and pydantic.BaseModel to define the content of the message.

To publish a message to a Redis channel, follow these steps:

  1. Create your RedisBroker instance

    broker = RedisBroker("redis://localhost:6379")
    
  2. Publish a message using the publish method

    await broker.publish(msg, "input_data")
    

This is the most straightforward way to use the RedisBroker to publish messages to Redis channels.
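
Putting both steps together, here is a minimal, self-contained sketch. The channel name "input_data", the message payload, and the use of the @app.after_startup hook to trigger a one-off publish are illustrative choices rather than requirements of the broker.

from faststream import FastStream
from faststream.redis import RedisBroker

broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)


@app.after_startup
async def publish_once() -> None:
    # Publish a plain string to the "input_data" channel; pydantic models
    # and other Python primitives are handled the same way.
    await broker.publish("Hello, Redis!", "input_data")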

Creating a publisher object#

For a more structured approach and to include your publishers in the AsyncAPI documentation, it's recommended to create publisher objects. Here's how to do it:

  1. Create your RedisBroker instance

    broker = RedisBroker("redis://localhost:6379")
    
  2. Create a publisher instance for a specific channel

    prepared_publisher = broker.publisher("input_data")
    
  3. Publish a message using the publish method of the prepared publisher

    await prepared_publisher.publish(msg)
    

When you encapsulate your broker within a FastStream object, the publisher will be documented in your service's AsyncAPI documentation.
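
For reference, here is a minimal sketch of this approach. As above, the channel name and the @app.after_startup hook used to trigger the publish are illustrative.

from faststream import FastStream
from faststream.redis import RedisBroker

broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)

# Created once and reused; it is also included in the AsyncAPI schema.
prepared_publisher = broker.publisher("input_data")


@app.after_startup
async def publish_once() -> None:
    await prepared_publisher.publish("Hello, Redis!")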

Decorating your publishing functions#

Decorators in FastStream provide a convenient way to define the data flow within your application. The RedisBroker allows you to use decorators to publish messages to Redis channels, similar to the KafkaBroker.

By decorating a function with both @broker.subscriber(...) and @broker.publisher(...), you create a DataPipeline unit that processes incoming messages and publishes the results to another channel. The order in which the decorators are applied does not matter, but the @broker.publisher(...) decorator only takes effect on a function that is also decorated with @broker.subscriber(...).

The decorated function should have a return type annotation to ensure the correct interpretation of the return value before it's published.

Here's an example of using decorators with RedisBroker:

from pydantic import BaseModel, Field, NonNegativeFloat

from faststream import FastStream
from faststream.redis import RedisBroker


class Data(BaseModel):
    data: NonNegativeFloat = Field(
        ..., examples=[0.5], description="Float data example"
    )


broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)


to_output_data = broker.publisher("output_data")


@to_output_data
@broker.subscriber("input_data")
async def on_input_data(msg: Data) -> Data:
    return Data(data=msg.data + 1.0)

  1. Initialize the RedisBroker instance: Start by creating a RedisBroker connected to your Redis server.

    broker = RedisBroker("redis://localhost:6379")
    
  2. Prepare your publisher object to be used as a decorator:

    to_output_data = broker.publisher("output_data")
    
  3. Create your processing logic: Implement a function that will process incoming messages and produce a response to be published to another Redis channel.

    async def on_input_data(msg: Data) -> Data:
        return Data(data=msg.data + 1.0)
    
  4. Decorate your processing function: Apply the @broker.subscriber(...) and @broker.publisher(...) decorators to your function to define the input channel and the output channel, respectively. Once your application is running, this decorated function will be triggered whenever a new message arrives on the "input_data" channel, and it will publish the result to the "output_data" channel.

    @to_output_data
    @broker.subscriber("input_data")
    async def on_input_data(msg: Data) -> Data:
        return Data(data=msg.data + 1.0)
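
To exercise this pipeline without a real Redis server, you can drive it through FastStream's in-memory TestRedisBroker. The sketch below assumes the example above is importable from the same module and that pytest with pytest-asyncio is installed; the test function name is arbitrary, and the assertion relies on the mock that FastStream attaches to each subscriber in test mode.

import pytest

from faststream.redis import TestRedisBroker


@pytest.mark.asyncio
async def test_on_input_data() -> None:
    async with TestRedisBroker(broker) as test_broker:
        # Publishing to "input_data" runs on_input_data in memory; its return
        # value is forwarded to the "output_data" channel by the publisher.
        await test_broker.publish(Data(data=0.5), "input_data")
        on_input_data.mock.assert_called_once()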