From YouTube: Data Processing at Scale with Knative and Benthos - Mihai Todor & Murugappan Sevugan Chetty, Box

Description

Knative Serving provides push-based autoscaling (scaling on requests per second or concurrency), which leaves a requirement for some component to push those requests. This works well for real-time HTTP/gRPC traffic, but what about event processing and batch processing? For event processing, we could leverage webhooks or Knative Eventing (with its various sources, brokers, etc.). The challenge lies in processing batch data from databases, CSV files, and the like, which are common enterprise use cases. To attain autoscaling for each batch use case, a bespoke component would need to be developed, and this is where Benthos shines.

Benthos is a stateless data streaming engine that implements transaction-based resiliency with backpressure. When connected to at-least-once sources and sinks, it can guarantee at-least-once delivery without needing to persist messages during transit. Data transformations are expressed using a DSL, which provides a safe, fast, and powerful way to perform document mapping within Benthos.

In this session, Mihai and Murugappan will demo how to leverage the best of Knative and Benthos to process data at scale.
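For illustration, a Knative Service can be told to scale on RPS rather than the default concurrency metric through revision annotations. The sketch below is a minimal example, not from the talk; the service name `batch-processor`, the container image, and the target values are hypothetical placeholders.

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: batch-processor                 # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Scale on requests per second instead of the default concurrency metric.
        autoscaling.knative.dev/metric: "rps"
        autoscaling.knative.dev/target: "100"
        # Scale to zero between batch runs; cap the fan-out.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "50"
    spec:
      containers:
        - image: example.com/batch-processor:latest  # hypothetical image
```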
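A minimal Benthos config can then act as the pushing component for a batch source: reading CSV rows, reshaping them with a Bloblang mapping (Benthos's document-mapping DSL), and POSTing them to the Knative route. This is a sketch under assumptions; the file path, the field names (`user_id`, `amount`), and the service URL are hypothetical.

```yaml
input:
  file:
    paths: [ "./batches/*.csv" ]   # hypothetical batch files
    codec: csv                     # each row becomes a structured message keyed by header

pipeline:
  processors:
    # Bloblang mapping: keep selected columns, coerce types, add a timestamp.
    - mapping: |
        root.user_id = this.user_id
        root.amount  = this.amount.number()
        root.seen_at = now()

output:
  http_client:
    url: http://batch-processor.default.svc.cluster.local  # hypothetical Knative route
    verb: POST
    max_in_flight: 64  # parallel pushes are what drive the Knative autoscaler
```

Because Benthos only acknowledges a message back to the input once the HTTP call succeeds, the transaction-based, at-least-once delivery described above carries through the whole pipeline.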