KafkaBroker
faststream.kafka.KafkaBroker
KafkaBroker(
bootstrap_servers: str | Iterable[str] = "localhost",
*,
protocol: str | None = None,
protocol_version: str = "auto",
client_id: str = "faststream-" + __version__,
security: BaseSecurity | None = None,
**kwargs: Any
)
Bases: KafkaLoggingMixin, BrokerAsyncUsecase[ConsumerRecord, ConsumerConnectionParams]
KafkaBroker is a class for managing Kafka message consumption and publishing.
It extends BrokerAsyncUsecase to handle asynchronous operations.
PARAMETER | DESCRIPTION |
---|---|
bootstrap_servers | Kafka bootstrap server(s). TYPE: Union[str, Iterable[str]] |
protocol | The protocol used (default is "kafka"). TYPE: str |
protocol_version | The Kafka protocol version (default is "auto"). TYPE: str |
client_id | The client ID for the Kafka client. TYPE: str |
**kwargs | Additional keyword arguments. TYPE: Any |
METHOD | DESCRIPTION |
---|---|
connect | Establishes a connection to Kafka. |
start | Starts the KafkaBroker and message handlers. |
publish | Publishes a message to Kafka. |
Initialize a KafkaBroker instance.
PARAMETER | DESCRIPTION |
---|---|
bootstrap_servers | Kafka bootstrap server(s). TYPE: Union[str, Iterable[str]] |
protocol | The protocol used (default is "kafka"). TYPE: str |
protocol_version | The Kafka protocol version (default is "auto"). TYPE: str |
client_id | The client ID for the Kafka client. TYPE: str |
security | Security protocol to use in communication with the broker (default is None). TYPE: Optional[BaseSecurity] |
**kwargs | Additional keyword arguments. TYPE: Any |
Source code in faststream/kafka/broker.py
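For orientation, here is a minimal usage sketch; the bootstrap address, topic names, and handler are illustrative and not part of the reference above:

```python
from faststream import FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)


@broker.subscriber("in-topic")
@broker.publisher("out-topic")
async def handle(msg: str) -> str:
    # Consumed from "in-topic"; the return value is published to "out-topic".
    return msg.upper()
```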
close async
close(
exc_type: Optional[Type[BaseException]] = None,
exc_val: Optional[BaseException] = None,
exec_tb: Optional[TracebackType] = None,
) -> None
Closes the object.
PARAMETER | DESCRIPTION |
---|---|
exc_type | The type of the exception being handled, if any. TYPE: Optional[Type[BaseException]] |
exc_val | The exception instance being handled, if any. TYPE: Optional[BaseException] |
exec_tb | The traceback of the exception being handled, if any. TYPE: Optional[TracebackType] |
RETURNS | DESCRIPTION |
---|---|
None | None |
RAISES | DESCRIPTION |
---|---|
NotImplementedError | If the method is not implemented. |
Source code in faststream/broker/core/asynchronous.py
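When the broker is driven outside a FastStream application, close is typically paired with an explicit start. A sketch (the address and topic name are illustrative):

```python
import asyncio

from faststream.kafka import KafkaBroker


async def main() -> None:
    broker = KafkaBroker("localhost:9092")
    await broker.start()
    try:
        await broker.publish("ping", topic="test-topic")
    finally:
        # Release the underlying producer/consumer connections.
        await broker.close()


asyncio.run(main())
```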
connect async
#
connect(
*args: Any, **kwargs: Any
) -> ConsumerConnectionParams
Establishes a connection to Kafka and returns connection parameters.
PARAMETER | DESCRIPTION |
---|---|
*args | Additional arguments. TYPE: Any |
**kwargs | Additional keyword arguments. TYPE: Any |
RETURNS | DESCRIPTION |
---|---|
ConsumerConnectionParams | The connection parameters. |
Source code in faststream/kafka/broker.py
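connect is normally called for you when the broker starts, but it can be awaited directly when only a connection is needed. A sketch, assuming a local broker:

```python
from faststream.kafka import KafkaBroker


async def check_connection() -> None:
    broker = KafkaBroker("localhost:9092")
    # connect() establishes the connection and returns the
    # ConsumerConnectionParams described above.
    await broker.connect()
    await broker.close()
```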
include_router
include_router(router: BrokerRouter[Any, MsgType]) -> None
Includes a router in the current object.
PARAMETER | DESCRIPTION |
---|---|
router | The router to be included. TYPE: BrokerRouter[Any, MsgType] |
RETURNS | DESCRIPTION |
---|---|
None | None |
Source code in faststream/broker/core/abc.py
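A common pattern is to declare handlers on a KafkaRouter and attach it to the broker. A sketch (the prefix and topic are illustrative):

```python
from faststream.kafka import KafkaBroker, KafkaRouter

router = KafkaRouter(prefix="service_")


@router.subscriber("events")
async def on_event(msg: str) -> None:
    ...


broker = KafkaBroker("localhost:9092")
broker.include_router(router)  # the handler now listens on "service_events"
```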
include_routers
include_routers(
*routers: BrokerRouter[Any, MsgType]
) -> None
Includes routers in the current object.
PARAMETER | DESCRIPTION |
---|---|
*routers | Variable length argument list of routers to include. TYPE: BrokerRouter[Any, MsgType] |
RETURNS | DESCRIPTION |
---|---|
None | None |
Source code in faststream/broker/core/abc.py
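include_routers is the variadic form of the same operation; for example (users_router and orders_router are hypothetical router objects):

```python
broker.include_routers(users_router, orders_router)
```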
publish async
Publish a message to Kafka.
PARAMETER | DESCRIPTION |
---|---|
*args | Positional arguments for message publishing. TYPE: Any |
**kwargs | Keyword arguments for message publishing. TYPE: Any |
RAISES | DESCRIPTION |
---|---|
RuntimeError | If KafkaBroker is not started yet. |
Source code in faststream/kafka/broker.py
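A typical call passes the payload positionally and the destination as a keyword; the payload, topic, and key below are illustrative:

```python
await broker.publish(
    {"user_id": 1},          # payload, serialized by FastStream
    topic="user-created",    # destination topic
    key=b"1",                # optional partition key
)
```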
publish_batch async
Publish a batch of messages to Kafka.
PARAMETER | DESCRIPTION |
---|---|
*args | Positional arguments for message publishing. TYPE: Any |
**kwargs | Keyword arguments for message publishing. TYPE: Any |
RAISES | DESCRIPTION |
---|---|
RuntimeError | If KafkaBroker is not started yet. |
Source code in faststream/kafka/broker.py
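Each positional argument becomes one message in the batch, with the destination topic passed as a keyword. A sketch (the payloads and topic are illustrative):

```python
await broker.publish_batch(
    "first", "second", "third",  # each positional argument is one message
    topic="logs",
)
```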
publisher
publisher(
topic: str,
key: bytes | None = None,
partition: int | None = None,
timestamp_ms: int | None = None,
headers: dict[str, str] | None = None,
reply_to: str = "",
batch: bool = False,
title: str | None = None,
description: str | None = None,
schema: Any | None = None,
include_in_schema: bool = True,
) -> Publisher
Create a message publisher for the specified topic.
PARAMETER | DESCRIPTION |
---|---|
topic | The topic to publish messages to. TYPE: str |
key | Message key. TYPE: Optional[bytes] |
partition | Partition to send the message to. TYPE: Optional[int] |
timestamp_ms | Message timestamp in milliseconds. TYPE: Optional[int] |
headers | Message headers. TYPE: Optional[dict[str, str]] |
reply_to | The topic to which responses should be sent. TYPE: str |
batch | Whether to publish messages in batches. TYPE: bool |
title | AsyncAPI title. TYPE: Optional[str] |
description | AsyncAPI description. TYPE: Optional[str] |
schema | AsyncAPI schema. TYPE: Optional[Any] |
include_in_schema | Whether to include the message publisher in the AsyncAPI schema. TYPE: bool |
RETURNS | DESCRIPTION |
---|---|
Publisher | A message publisher. |
Source code in faststream/kafka/broker.py
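The returned Publisher can be applied as a decorator so a handler's result is forwarded automatically; it can also publish imperatively via await to_output.publish(...) once the broker is started. A sketch (topic and handler names are illustrative):

```python
to_output = broker.publisher("output-topic")


@broker.subscriber("input-topic")
@to_output  # the handler's return value is published to "output-topic"
async def normalize(msg: str) -> str:
    return msg.lower()
```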
start async
Start the KafkaBroker and message handlers.
Source code in faststream/kafka/broker.py
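start is normally invoked for you when the enclosing FastStream application runs. A sketch of that entry point (the address is illustrative):

```python
import asyncio

from faststream import FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker("localhost:9092")
app = FastStream(broker)

# app.run() connects the broker, starts all registered handlers,
# and blocks until shutdown (the `faststream run module:app` CLI does the same).
asyncio.run(app.run())
```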
subscriber
subscriber(
*topics: str,
group_id: Optional[str] = None,
key_deserializer: Optional[
Callable[[bytes], Any]
] = None,
value_deserializer: Optional[
Callable[[bytes], Any]
] = None,
fetch_max_wait_ms: int = 500,
fetch_max_bytes: int = 52428800,
fetch_min_bytes: int = 1,
max_partition_fetch_bytes: int = 1 * 1024 * 1024,
auto_offset_reset: Literal[
"latest", "earliest", "none"
] = "latest",
auto_commit: bool = True,
auto_commit_interval_ms: int = 5000,
check_crcs: bool = True,
partition_assignment_strategy: Sequence[
AbstractPartitionAssignor
] = (RoundRobinPartitionAssignor,),
max_poll_interval_ms: int = 300000,
rebalance_timeout_ms: Optional[int] = None,
session_timeout_ms: int = 10000,
heartbeat_interval_ms: int = 3000,
consumer_timeout_ms: int = 200,
max_poll_records: Optional[int] = None,
exclude_internal_topics: bool = True,
isolation_level: Literal[
"read_uncommitted", "read_committed"
] = "read_uncommitted",
dependencies: Sequence[Depends] = (),
parser: Optional[
Union[
CustomParser[ConsumerRecord, KafkaMessage],
CustomParser[
Tuple[ConsumerRecord, ...], KafkaMessage
],
]
] = None,
decoder: Optional[CustomDecoder] = None,
middlewares: Optional[
Sequence[Callable[[ConsumerRecord], BaseMiddleware]]
] = None,
filter: Union[
Filter[KafkaMessage],
Filter[StreamMessage[Tuple[ConsumerRecord, ...]]],
] = default_filter,
batch: bool = False,
max_records: Optional[int] = None,
batch_timeout_ms: int = 200,
no_ack: bool = False,
title: Optional[str] = None,
description: Optional[str] = None,
include_in_schema: bool = True,
**original_kwargs: Any
) -> Callable[
[Callable[P_HandlerParams, T_HandlerReturn]],
Union[
HandlerCallWrapper[
ConsumerRecord, P_HandlerParams, T_HandlerReturn
],
HandlerCallWrapper[
Tuple[ConsumerRecord, ...],
P_HandlerParams,
T_HandlerReturn,
],
],
]
Create a message subscriber for the specified topics.
PARAMETER | DESCRIPTION |
---|---|
*topics | The topics to subscribe to. TYPE: str |
group_id | The Kafka consumer group ID. TYPE: Optional[str] |
key_deserializer | Key deserializer function. TYPE: Optional[Callable[[bytes], Any]] |
value_deserializer | Value deserializer function. TYPE: Optional[Callable[[bytes], Any]] |
fetch_max_wait_ms | The maximum time to wait for data. TYPE: int |
fetch_max_bytes | The maximum number of bytes to fetch. TYPE: int |
fetch_min_bytes | The minimum number of bytes to fetch. TYPE: int |
max_partition_fetch_bytes | The maximum bytes to fetch for a partition. TYPE: int |
auto_offset_reset | Auto offset reset policy. TYPE: Literal["latest", "earliest", "none"] |
auto_commit | Whether to enable auto-commit. TYPE: bool |
auto_commit_interval_ms | Auto-commit interval in milliseconds. TYPE: int |
check_crcs | Whether to check CRCs. TYPE: bool |
partition_assignment_strategy | Partition assignment strategy. TYPE: Sequence[AbstractPartitionAssignor] |
max_poll_interval_ms | Maximum poll interval in milliseconds. TYPE: int |
rebalance_timeout_ms | Rebalance timeout in milliseconds. TYPE: Optional[int] |
session_timeout_ms | Session timeout in milliseconds. TYPE: int |
heartbeat_interval_ms | Heartbeat interval in milliseconds. TYPE: int |
consumer_timeout_ms | Consumer timeout in milliseconds. TYPE: int |
max_poll_records | Maximum number of records to poll. TYPE: Optional[int] |
exclude_internal_topics | Whether to exclude internal topics. TYPE: bool |
isolation_level | Isolation level. TYPE: Literal["read_uncommitted", "read_committed"] |
dependencies | Additional dependencies for message handling. TYPE: Sequence[Depends] |
parser | Message parser. TYPE: Optional[CustomParser] |
decoder | Message decoder. TYPE: Optional[CustomDecoder] |
middlewares | Message middlewares. TYPE: Optional[Sequence[Callable[[ConsumerRecord], BaseMiddleware]]] |
filter | Message filter. TYPE: Filter |
batch | Whether to process messages in batches. TYPE: bool |
max_records | Maximum number of records to process in each batch. TYPE: Optional[int] |
batch_timeout_ms | Batch timeout in milliseconds. TYPE: int |
no_ack | Whether not to ack/nack/reject messages. TYPE: bool |
title | AsyncAPI title. TYPE: Optional[str] |
description | AsyncAPI description. TYPE: Optional[str] |
include_in_schema | Whether to include the message handler in the AsyncAPI schema. TYPE: bool |
**original_kwargs | Additional keyword arguments. TYPE: Any |
RETURNS | DESCRIPTION |
---|---|
Callable | A decorator that wraps a message handler function. |
Source code in faststream/kafka/broker.py
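A brief sketch of the per-message and batch subscription forms; the topics, group id, and handlers below are illustrative:

```python
from faststream.kafka import KafkaBroker

broker = KafkaBroker("localhost:9092")


@broker.subscriber("orders", group_id="order-workers")
async def on_order(order: dict) -> None:
    ...  # called once per decoded message


@broker.subscriber("metrics", batch=True, batch_timeout_ms=200, max_records=100)
async def on_metrics(batch: list[dict]) -> None:
    ...  # with batch=True the handler receives a list of decoded messages
```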