Event Storage

Batch creates a per-event backup of your event bus, message system, or data stream. We do this through our open source tool, plumber, which can be deployed to your infrastructure or attached to your system as a simple consumer.
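
The sketch below shows, under assumptions, what "attached as a simple consumer" means in practice: a standalone consumer group reads the same topic as your applications and forwards a copy of each event for archival. The topic name, group id, and archive endpoint are all hypothetical, and this is a conceptual illustration rather than plumber itself, which handles this wiring for you.

```python
# Conceptual sketch (not plumber itself) of attaching a consumer for archival.
# A separate consumer group reads the same topic as your applications, so
# archiving never interferes with existing consumers. Topic, group id, and
# endpoint are hypothetical.
import requests
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "event-archiver",       # its own group: app consumers are unaffected
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

while True:
    msg = consumer.poll(timeout=1.0)
    if msg is None or msg.error():
        continue
    # Forward the raw event bytes for archival (hypothetical endpoint).
    requests.post(
        "https://archive.example.com/events",
        data=msg.value(),
        headers={"X-Topic": msg.topic(), "X-Offset": str(msg.offset())},
    )
```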

Here are some of the benefits of using Batch's Event Storage:

Real-Time Archiving

Every event consumed by Batch is archived in real time. We do this by immediately making your data available in our fast storage while simultaneously indexing, batching, and transforming your events into Parquet files that are stored in S3.
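
As a rough illustration of this dual path, the sketch below writes each JSON event to a hot store immediately and, in parallel, batches events into a Parquet file uploaded to S3. The bucket name, batch size, and fast_store interface are hypothetical.

```python
# Rough sketch of the dual path, assuming JSON events: write each event to a
# fast store immediately, and batch events into Parquet files uploaded to S3.
# Bucket name, batch size, and the fast_store interface are hypothetical.
import io
import json
import uuid

import boto3
import pyarrow as pa
import pyarrow.parquet as pq

BATCH_SIZE = 1000
buffer = []
s3 = boto3.client("s3")

def archive(event_bytes: bytes, fast_store) -> None:
    event = json.loads(event_bytes)
    fast_store.insert(event)            # hot path: searchable immediately
    buffer.append(event)
    if len(buffer) >= BATCH_SIZE:       # cold path: batch -> Parquet -> S3
        table = pa.Table.from_pylist(buffer)
        out = io.BytesIO()
        pq.write_table(table, out)
        s3.put_object(
            Bucket="event-archive",
            Key=f"events/{uuid.uuid4()}.parquet",
            Body=out.getvalue(),
        )
        buffer.clear()
```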

By providing real-time retention for your event bus, we help companies achieve the following:

Save money by lowering the retention period of your Kafka cluster, or gain retention if your message bus does not support it.

Provide real-time data observability for monitoring business-critical event streams

Enable different teams to replay specific data for their own use case, whether that is analytics, testing, or backfilling new services (see the replay sketch after this list)
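
For the replay case, a hypothetical sketch might pull an archived Parquet batch back out of S3 and republish every event to a topic of the consuming team's choosing. The bucket, key, and topic names are illustrative.

```python
# Hypothetical replay sketch: read an archived Parquet batch from S3 and
# republish each event to a replay topic. Bucket, key, and topic names are
# illustrative.
import io
import json

import boto3
import pyarrow.parquet as pq
from confluent_kafka import Producer

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="event-archive", Key="events/2023-01-01.parquet")
table = pq.read_table(io.BytesIO(obj["Body"].read()))

producer = Producer({"bootstrap.servers": "localhost:9092"})
for event in table.to_pylist():
    producer.produce("orders-replay", value=json.dumps(event).encode())
producer.flush()
```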

Data Deserialization

If you encode your data with Avro, Protobuf, or Gzip, Batch will automatically deserialize events on the wire and index every field, enabling you to quickly and precisely pinpoint the data you’re looking for during search and replay.

The original blob is stored alongside the deserialized data, which allows you to replay the exact bytes you originally published to your bus.
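
Below is a minimal sketch of this keep-the-blob approach, using a Gzip-compressed JSON event as the simplest case; Avro or Protobuf payloads would be decoded against their schemas instead. The record layout here is hypothetical.

```python
# Minimal sketch of keeping the original blob next to the decoded fields,
# using a Gzip-compressed JSON event as the simplest case. The record layout
# is hypothetical.
import gzip
import json

def decode_event(raw: bytes) -> dict:
    return {
        "raw": raw,                                   # exact bytes, for faithful replay
        "decoded": json.loads(gzip.decompress(raw)),  # fields, indexed for search
    }

payload = gzip.compress(json.dumps({"order_id": 42, "status": "shipped"}).encode())
record = decode_event(payload)
print(record["decoded"])  # {'order_id': 42, 'status': 'shipped'}
```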

Automatic Schema Discovery

While most data platforms require you to define your schemas before importing your data, Batch automatically analyzes your events and generates a schema, which is then used for storing structured data in our fast storage and as Parquet in S3.

The generated schema also evolves automatically as you add new fields, further simplifying the process of rolling out changes to your event structure.

Continue pushing data, regardless of schema changes.
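
A toy sketch of how such discovery and evolution can work, deliberately simplistic and not Batch's actual algorithm: infer a field-to-type mapping from each incoming JSON event and merge it into the running schema, so new fields extend the schema instead of breaking ingestion.

```python
# Toy sketch of schema discovery and evolution (not Batch's actual algorithm):
# infer a field -> type mapping from each JSON event and merge it into the
# running schema, so new fields extend the schema rather than break ingestion.
import json

def infer_schema(event: dict) -> dict:
    return {field: type(value).__name__ for field, value in event.items()}

def evolve(schema: dict, event: dict) -> dict:
    merged = dict(schema)
    merged.update(infer_schema(event))  # new fields are added; none are dropped
    return merged

schema = {}
for raw in (b'{"order_id": 1}', b'{"order_id": 2, "coupon": "SAVE10"}'):
    schema = evolve(schema, json.loads(raw))

print(schema)  # {'order_id': 'int', 'coupon': 'str'}
```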

Reach out to us to schedule your demo and learn how Batch can improve your event-driven system!