Batching events
Batching allows a function to process multiple events in a single run. This is useful for high-load systems where it's more efficient to handle a batch of events together than to handle each event individually. Some use cases for batching include:
- Reducing the number of requests to an external API that supports batch operations.
- Creating a batch of database writes to reduce the number of transactions.
- Reducing the number of requests to your Inngest app to improve performance or lower serverless costs.
How to configure batching
```ts
inngest.createFunction(
  {
    id: "record-api-calls",
    batchEvents: {
      maxSize: 100,
      timeout: "5s",
    },
  },
  { event: "log/api.call" },
  async ({ events, step }) => {
    // NOTE: Use the `events` argument, which is an array of event payloads
    const attrs = events.map((evt) => {
      return {
        user_id: evt.data.user_id,
        endpoint: evt.data.endpoint,
        timestamp: toDateTime(evt.ts),
      };
    });

    const result = await step.run("record-data-to-db", async () => {
      return db.bulkWrite(attrs);
    });

    return { success: true, recorded: result.length };
  }
);
```
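With batching configured, a burst of events sent in a single call will be processed together once the batch fills or the timeout elapses. A minimal sketch, assuming the same `log/api.call` event and a hypothetical app ID:

```ts
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "my-app" }); // hypothetical app ID

// Send a burst of events; the batched function above receives them in groups.
await inngest.send(
  Array.from({ length: 250 }, (_, i) => ({
    name: "log/api.call",
    data: { user_id: `user_${i}`, endpoint: "/v1/items" },
  }))
);
```

With `maxSize: 100` and `timeout: "5s"`, these 250 events would typically be processed as two full batches of 100 and one partial batch of 50 once the timeout elapses.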
Configuration reference
maxSize
- The maximum number of events to add to a single batch.
timeout
- The duration of time to wait to add events to a batch. If the batch is not full after this time, the function will be invoked with whatever events are in the current batch, regardless of size.
Consider the overall batch size you will need to process, including the typical event payload size. Processing large batches can lead to memory or performance issues in your application.
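For example, a full batch of 100 events whose payloads average 10 kB (an assumed size, for illustration) means roughly 1 MB of event data held in memory for a single invocation.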
How batching works
When batching is enabled, Inngest creates a new batch when the first event is received. The batch is filled with events until `maxSize` is reached or the `timeout` is up. The function is then invoked with the full list of events in the batch.
Depending on your SDK, the `events` argument will contain the full list of events within a batch. This allows you to operate on all of them within a single function.
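If each item in the batch needs its own unit of work, you can wrap the per-event work in individual steps so that a single failure retries only that step rather than the whole batch. A minimal sketch, assuming a hypothetical `notifications/user.signup` event and a hypothetical `sendWelcomeEmail` helper:

```ts
inngest.createFunction(
  {
    id: "send-welcome-emails",
    batchEvents: { maxSize: 50, timeout: "10s" },
  },
  { event: "notifications/user.signup" }, // hypothetical event name
  async ({ events, step }) => {
    // One step per event in the batch; a failing email retries only its own step.
    await Promise.all(
      events.map((evt, i) =>
        step.run(`send-welcome-email-${i}`, () =>
          sendWelcomeEmail(evt.data.email) // hypothetical helper
        )
      )
    );
    return { processed: events.length };
  }
);
```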
Combining with other flow control methods
Batching does not work with all other flow control features.
You can combine batching with simple concurrency limits, but it will not work correctly with the concurrency `key` configuration option.
You cannot use batching with idempotency, rate limiting, or cancellation events.
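For example, a batched function can cap how many batch runs execute at once with a plain concurrency limit. A minimal sketch, reusing the `log/api.call` event from above with a hypothetical function ID:

```ts
inngest.createFunction(
  {
    id: "record-api-calls-throttled", // hypothetical function ID
    batchEvents: { maxSize: 100, timeout: "5s" },
    concurrency: { limit: 2 }, // a simple limit only; no `key` option with batching
  },
  { event: "log/api.call" },
  async ({ events }) => {
    // ...process the batch as in the earlier example...
    return { count: events.length };
  }
);
```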
Limitations
- The maximum batch size is 100. For the free tier, the maximum batch size is 25.