gwaramadze committed Nov 27, 2024
1 parent 5b502e9 commit eba26c5
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions docs/connectors/sinks/amazon-s3-sink.md
@@ -23,6 +23,8 @@ pip install quixstreams[s3]

It batches processed records in memory per topic partition and writes them to S3 objects in a specified bucket and prefix structure. Objects are organized by topic and partition, with each batch being written to a separate object named by its starting offset.

Batches are written to S3 during the commit phase of processing. This means the size of each batch (and therefore of each S3 object) is determined by your application's commit settings, via either the `commit_interval` or the `commit_every` parameter.
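
For example, you can tune how much data ends up in each object through the `Application` commit settings. The sketch below is illustrative only: `commit_interval` and `commit_every` are the parameters mentioned above, but the sink's import path and constructor arguments shown here are placeholders; see the configuration section of this page for the actual sink usage.

```python
from quixstreams import Application

# Placeholder import and constructor for the S3 sink -- refer to this page's
# configuration section for the exact module path and arguments.
from quixstreams.sinks.community.s3 import S3Sink

# Commit (and therefore flush a batch to S3) roughly every 10 seconds, or
# after 1000 processed records, whichever threshold is reached first.
app = Application(
    broker_address="localhost:9092",
    commit_interval=10.0,
    commit_every=1000,
)

s3_sink = S3Sink(bucket="my-bucket")  # placeholder arguments

sdf = app.dataframe(app.topic("input-topic"))
sdf.sink(s3_sink)

if __name__ == "__main__":
    app.run()
```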

!!! note

    The S3 bucket must already exist and be accessible. The sink does not create the bucket automatically. If the bucket does not exist or access is denied, an error will be raised when initializing the sink.
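
One way to satisfy this is to check for the bucket and create it before starting the application. This is a sketch using `boto3` (not part of the sink itself); the bucket name and region are placeholders.

```python
import boto3
from botocore.exceptions import ClientError


def ensure_bucket(bucket_name: str, region: str = "eu-west-1") -> None:
    """Create the bucket if it does not already exist."""
    s3 = boto3.client("s3", region_name=region)
    try:
        # Raises ClientError (404 or 403) if the bucket is missing or access is denied.
        s3.head_bucket(Bucket=bucket_name)
    except ClientError:
        # For us-east-1, omit CreateBucketConfiguration entirely.
        s3.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )


ensure_bucket("my-bucket")
```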
