Build a real-time data pipeline with Apache Kafka
Get your hands dirty with Apache Kafka by building a simple streaming application that ingests data, transforms it, and exposes it through APIs
Apache Kafka is an exciting technology and a popular choice for building reactive data pipelines. Using a central event log and the pub/sub pattern, developers can write modular components that combine in flexible ways.
During this talk, we will step through a simple app that receives data from several sources. We will write stream processors that subscribe to topics so they can normalize and transform data. Then we will write consumers that store data to isolate it from upstream failures and expose it through an HTTP API. Alongside our pipeline, we will build enough logging to inspect each step of the transformation process.
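The normalize-and-transform step described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the talk: an in-memory queue stands in for a Kafka topic (a real processor would consume via a client library such as confluent-kafka), and the field names (`temp_f`, `temp_c`, `sensor_id`) are hypothetical.

```python
from queue import Queue

# In-memory stand-ins for Kafka topics; a real stream processor would
# subscribe to and publish on actual topics via a Kafka client.
raw_readings = Queue()
normalized_readings = Queue()

def normalize(event: dict) -> dict:
    """Normalize a raw reading: unify field names and convert units.
    The incoming schemas here are hypothetical examples."""
    return {
        "sensor_id": str(event.get("id") or event.get("sensor")),
        "temperature_c": round(
            (event["temp_f"] - 32) * 5 / 9 if "temp_f" in event else event["temp_c"],
            2,
        ),
    }

def run_processor() -> None:
    """Drain the raw 'topic' and publish normalized events downstream."""
    while not raw_readings.empty():
        normalized_readings.put(normalize(raw_readings.get()))

# Simulate two upstream producers with differing schemas.
raw_readings.put({"id": 1, "temp_f": 212.0})
raw_readings.put({"sensor": "2", "temp_c": 20.0})
run_processor()
```

After `run_processor()` runs, both events sit on the downstream queue in a single shared schema, which is the property that lets later consumers stay simple.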
Any additional time will be spent discussing scaling, fault tolerance, deployment, and recovery.
streaming, kafka, data, pipeline, data pipeline, messaging
I have never given this talk before, and I haven't spoken at a conference either.
However, I am an improviser and host at Curious Comedy Theater and love an audience :D
Michael is a software engineer who started out as a copywriter. He worked for digital agencies, including ID Branding and Liquid Agency, before moving into software, where he has worked on Cascade Energy's SENSEI platform and OCHIN's Acuere product.
In his free time, Michael is an obsessive performer in Portland's burgeoning improv scene.