Practically all business data is produced as an infinite stream of events: sensor measurements, website engagements, geo-location data from industrial IoT devices, database modifications, stock trades, and financial transactions, to name a few. Successful data-driven organizations don't just discover valuable insights once; they do so continuously and in real time.
As stream processing frameworks like Apache Flink become the norm for real-time analytics and event-driven applications, SQL is making a comeback as a way to lower the entry barrier to streaming. But how do you take the leap from analytics on historical data to real-time insights on streams?
This talk will walk you through the full development pipeline, from initial interactive analysis to continuous production deployment, using Flink SQL's unique approach to unified batch and stream processing. You will learn how to use some of Flink's most powerful features, such as temporal table joins for working with historical data and complex pattern detection with MATCH_RECOGNIZE.
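As a taste of the pattern-detection feature mentioned above, here is a minimal sketch of a MATCH_RECOGNIZE query in Flink SQL; the `trades` table and its columns are hypothetical, and the query finds a V-shaped price dip (a run of falling prices followed by a recovery) per symbol:

```sql
SELECT *
FROM trades
    MATCH_RECOGNIZE (
        PARTITION BY symbol          -- detect patterns independently per symbol
        ORDER BY trade_time          -- rows must be ordered by event time
        MEASURES
            START_ROW.price AS start_price,
            LAST(DOWN.price) AS bottom_price,
            LAST(UP.price)   AS end_price
        ONE ROW PER MATCH
        AFTER MATCH SKIP PAST LAST ROW
        PATTERN (START_ROW DOWN+ UP+)  -- one start row, then drops, then rises
        DEFINE
            DOWN AS DOWN.price < PREV(DOWN.price),
            UP   AS UP.price > PREV(UP.price)
    ) AS T;
```

Each match emits a single summary row with the price at the start, bottom, and end of the dip.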
Seth Wiesman, Solutions Architect at Ververica
Seth Wiesman is a Solutions Architect at Ververica, where he works with engineering teams across various organizations to build the best possible stream processing architecture for their use cases.