Lifespan Hooks#

Usage example#

Let's imagine that your application uses pydantic as its settings manager.

We highly recommend using pydantic for this purpose: FastStream already uses this dependency, so you don't have to install an additional package.

Also, let's imagine that you have several files with your application settings (.env, .env.development, .env.test, .env.production), and you want to switch between them at startup without any code changes.

FastStream allows you to do this easily by passing optional command-line arguments to your code.
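To make the mechanism concrete, here is a minimal sketch of what loading such a settings file involves. `load_env` is a hypothetical helper written for illustration; in the examples below, pydantic-settings does this work via the `_env_file` argument.

```python
import os
import tempfile


def load_env(path):
    """Parse simple KEY=VALUE lines.

    A rough sketch of what pydantic-settings does with the
    `_env_file` argument; `load_env` is a hypothetical helper.
    """
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    return values


# Write a throwaway .env-style file and load it
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("HOST=localhost:9092\n# a comment\n")
    path = f.name

assert load_env(path) == {"HOST": "localhost:9092"}
os.remove(path)
```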

Lifespan#

Let's write some code for our example:

Kafka:

from pydantic_settings import BaseSettings

from faststream import ContextRepo, FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker()
app = FastStream(broker)


class Settings(BaseSettings):
    host: str = "localhost:9092"


@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

Confluent Kafka:

from pydantic_settings import BaseSettings

from faststream import ContextRepo, FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker()
app = FastStream(broker)


class Settings(BaseSettings):
    host: str = "localhost:9092"


@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

RabbitMQ:

from pydantic_settings import BaseSettings

from faststream import ContextRepo, FastStream
from faststream.rabbit import RabbitBroker

broker = RabbitBroker()
app = FastStream(broker)


class Settings(BaseSettings):
    host: str = "amqp://guest:guest@localhost:5672/"


@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

NATS:

from pydantic_settings import BaseSettings

from faststream import ContextRepo, FastStream
from faststream.nats import NatsBroker

broker = NatsBroker()
app = FastStream(broker)


class Settings(BaseSettings):
    host: str = "nats://localhost:4222"


@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

Redis:

from pydantic_settings import BaseSettings

from faststream import ContextRepo, FastStream
from faststream.redis import RedisBroker

broker = RedisBroker()
app = FastStream(broker)


class Settings(BaseSettings):
    host: str = "redis://localhost:6379"


@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

Now this application can be run using the following command to manage the environment:

faststream run serve:app --env .env.test

Details#

Now let's look at this in a little more detail.

To begin with, we used a decorator

@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

to declare a function that should run when our application starts.

The next step is to declare the arguments that our function will receive:

@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

In this case, the env field will be passed to the setup function from the command-line arguments.
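The mapping from extra command-line flags to hook keyword arguments can be sketched with a toy parser. `parse_extra_options` is a hypothetical helper for illustration, not FastStream's actual CLI code:

```python
def parse_extra_options(argv):
    """Toy sketch: map `--key value` pairs to keyword arguments,
    the way extra CLI options reach startup hooks by name."""
    kwargs = {}
    args = iter(argv)
    for token in args:
        if token.startswith("--"):
            # e.g. `--env .env.test` becomes env=".env.test"
            kwargs[token[2:].replace("-", "_")] = next(args, True)
    return kwargs


assert parse_extra_options(["--env", ".env.test"]) == {"env": ".env.test"}
```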

Tip

The default lifecycle hooks are wrapped with the @apply_types decorator, so all context fields and dependencies are available in them.

Then, we initialized the settings of our application using the file passed to us from the command line:

@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

And put these settings into the global context:

@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

Note

Now we can access our settings anywhere in the application right from the context:

from faststream import Context, apply_types


@apply_types
async def func(settings=Context()): ...
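A toy sketch of how such name-based injection can work. `apply_types_toy` and `_context` are illustrative stand-ins, not FastStream internals:

```python
import inspect

# A global context dict standing in for FastStream's ContextRepo
_context = {"settings": {"host": "localhost:9092"}}


def apply_types_toy(func):
    """Resolve the function's parameters by name from the context."""
    def wrapper(**overrides):
        kwargs = {
            name: _context[name]
            for name in inspect.signature(func).parameters
            if name in _context
        }
        kwargs.update(overrides)
        return func(**kwargs)
    return wrapper


@apply_types_toy
def read_host(settings):
    return settings["host"]


assert read_host() == "localhost:9092"
```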

In the last step, we initialized our broker: now, when the application starts, it will be ready to receive messages.

@app.on_startup
async def setup(context: ContextRepo, env: str = ".env"):
    settings = Settings(_env_file=env)
    context.set_global("settings", settings)
    await broker.connect(settings.host)

Another example#

Now let's imagine that we have a machine learning model that needs to process messages from some broker.

Initializing such models usually takes a long time. It is wise to do this once at application startup, not while processing each message.

You could initialize your model somewhere at the top of your module/file. However, in that case the code would run whenever the module is imported, for example, during testing. It is unlikely that you want to run your model on every test run.

Therefore, it is worth initializing the model in the @app.on_startup hook.

Also, we don't want the model to terminate incorrectly when the application is stopped. To avoid this, we need the @app.on_shutdown hook.

Kafka:

from faststream import Context, ContextRepo, FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

ml_models = {}  # fake ML model


def fake_answer_to_everything_ml_model(x: float):
    return x * 42


@app.on_startup
async def setup_model(context: ContextRepo):
    # Load the ML model
    ml_models["answer_to_everything"] = fake_answer_to_everything_ml_model
    context.set_global("model", ml_models)


@app.on_shutdown
async def shutdown_model(model: dict = Context()):
    # Clean up the ML models and release the resources
    model.clear()


@broker.subscriber("test")
async def predict(x: float, model=Context()):
    result = model["answer_to_everything"](x)
    return {"result": result}

Confluent Kafka:

from faststream import Context, ContextRepo, FastStream
from faststream.confluent import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

ml_models = {}  # fake ML model


def fake_answer_to_everything_ml_model(x: float):
    return x * 42


@app.on_startup
async def setup_model(context: ContextRepo):
    # Load the ML model
    ml_models["answer_to_everything"] = fake_answer_to_everything_ml_model
    context.set_global("model", ml_models)


@app.on_shutdown
async def shutdown_model(model: dict = Context()):
    # Clean up the ML models and release the resources
    model.clear()


@broker.subscriber("test")
async def predict(x: float, model=Context()):
    result = model["answer_to_everything"](x)
    return {"result": result}

RabbitMQ:

from faststream import Context, ContextRepo, FastStream
from faststream.rabbit import RabbitBroker

broker = RabbitBroker("amqp://guest:guest@localhost:5672/")
app = FastStream(broker)

ml_models = {}  # fake ML model


def fake_answer_to_everything_ml_model(x: float):
    return x * 42


@app.on_startup
async def setup_model(context: ContextRepo):
    # Load the ML model
    ml_models["answer_to_everything"] = fake_answer_to_everything_ml_model
    context.set_global("model", ml_models)


@app.on_shutdown
async def shutdown_model(model: dict = Context()):
    # Clean up the ML models and release the resources
    model.clear()


@broker.subscriber("test")
async def predict(x: float, model=Context()):
    result = model["answer_to_everything"](x)
    return {"result": result}

NATS:

from faststream import Context, ContextRepo, FastStream
from faststream.nats import NatsBroker

broker = NatsBroker("nats://localhost:4222")
app = FastStream(broker)

ml_models = {}  # fake ML model


def fake_answer_to_everything_ml_model(x: float):
    return x * 42


@app.on_startup
async def setup_model(context: ContextRepo):
    # Load the ML model
    ml_models["answer_to_everything"] = fake_answer_to_everything_ml_model
    context.set_global("model", ml_models)


@app.on_shutdown
async def shutdown_model(model: dict = Context()):
    # Clean up the ML models and release the resources
    model.clear()


@broker.subscriber("test")
async def predict(x: float, model=Context()):
    result = model["answer_to_everything"](x)
    return {"result": result}

Redis:

from faststream import Context, ContextRepo, FastStream
from faststream.redis import RedisBroker

broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)

ml_models = {}  # fake ML model


def fake_answer_to_everything_ml_model(x: float):
    return x * 42


@app.on_startup
async def setup_model(context: ContextRepo):
    # Load the ML model
    ml_models["answer_to_everything"] = fake_answer_to_everything_ml_model
    context.set_global("model", ml_models)


@app.on_shutdown
async def shutdown_model(model: dict = Context()):
    # Clean up the ML models and release the resources
    model.clear()


@broker.subscriber("test")
async def predict(x: float, model=Context()):
    result = model["answer_to_everything"](x)
    return {"result": result}

Multiple hooks#

If you declare multiple lifecycle hooks, they will be called in the order they are registered:

from faststream import Context, ContextRepo, FastStream

app = FastStream()


@app.on_startup
async def setup(context: ContextRepo):
    context.set_global("field", 1)


@app.on_startup
async def setup_later(field: int = Context()):
    assert field == 1
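The registration-order behavior can be sketched with a toy decorator. `on_startup` below is a stand-in written for illustration, not FastStream's implementation:

```python
hooks = []


def on_startup(func):
    """Toy decorator: remember hooks in registration order."""
    hooks.append(func)
    return func


calls = []


@on_startup
def first():
    calls.append("first")


@on_startup
def second():
    calls.append("second")


# "Start the app": run hooks in the order they were registered
for hook in hooks:
    hook()

assert calls == ["first", "second"]
```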

Some more details#

Async or not async#

In the asynchronous version of the application, both asynchronous and synchronous methods can be used as hooks. In the synchronous version, only synchronous methods are available.
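One way an async application can support both kinds of hooks is to await the result only when it is awaitable; a minimal sketch under that assumption:

```python
import asyncio
import inspect


async def run_hook(hook):
    """Call a hook; await the result if it is awaitable."""
    result = hook()
    if inspect.isawaitable(result):
        result = await result
    return result


def sync_hook():
    return "sync"


async def async_hook():
    return "async"


assert asyncio.run(run_hook(sync_hook)) == "sync"
assert asyncio.run(run_hook(async_hook)) == "async"
```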

Command line arguments#

Command line arguments are available in all @app.on_startup hooks. To use them in other parts of the application, put them in the ContextRepo.

Broker initialization#

The @app.on_startup hooks are called BEFORE the broker is launched by the application. The @app.after_shutdown hooks are triggered AFTER stopping the broker.

If you want to perform some actions AFTER the broker is initialized (send messages, initialize objects, etc.), you should use the @app.after_startup hook.
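Putting the pieces together, the lifecycle can be summarized as an ordered list. The position of on_shutdown relative to broker shutdown is our assumption from the hook names; the other positions follow the text above:

```python
# Assumed application lifecycle ordering (see the note above)
lifecycle = [
    "on_startup",      # broker not started yet: cannot send messages
    "broker.start()",
    "after_startup",   # broker running: safe to publish, init objects
    "on_shutdown",
    "broker.stop()",
    "after_shutdown",  # broker already stopped
]

assert lifecycle.index("on_startup") < lifecycle.index("broker.start()")
assert lifecycle.index("after_startup") == lifecycle.index("broker.start()") + 1
assert lifecycle.index("after_shutdown") > lifecycle.index("broker.stop()")
```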