
Unlocking Real-Time, Reliable, and Reusable Data

210 min · Intermediate · Sydney
Workshop · Confluent · Kafka · Tableflow · Real-time Data · Apache Iceberg · Data Streaming · Analytics · Apache Flink

Description

Tired of complex, high-maintenance data pipelines? This workshop tackles the challenge of moving operational data to analytics by guiding you through building a robust, end-to-end event streaming system. Unlock real-time insights and streamline your data architecture with industry-leading tools.

Abstract

Tired of complex, high-maintenance data pipelines? This workshop tackles the challenge of moving operational data to analytics by guiding you through building a robust, end-to-end event streaming system. Unlock real-time insights and streamline your data architecture with industry-leading tools.

What You'll Learn:

  • An introduction to the fundamentals of event streaming with Apache Kafka
  • How Apache Flink provides powerful capabilities for real-time stream processing and transformations
  • The role of Tableflow in seamlessly producing high-quality Apache Iceberg data, and how to access this data from AWS
  • Practical techniques for integrating Kafka Connectors for efficient data ingestion
  • How to build and deploy a complete event streaming pipeline from source to sink, leveraging these cutting-edge technologies
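One Kafka fundamental the workshop builds on is that events with the same key always land on the same partition, which is what preserves per-key ordering. A minimal sketch of that idea, with illustrative names throughout and `crc32` standing in for the murmur2 hash Kafka's default partitioner actually uses:

```python
import json
import zlib


def choose_partition(key: str, num_partitions: int) -> int:
    """Map an event key to a partition. This is a simplified stand-in:
    Kafka's default partitioner hashes the key bytes with murmur2."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions


def make_event(key: str, payload: dict) -> tuple[str, bytes]:
    """Serialize an event the way a Kafka producer sees it: a key plus value bytes."""
    return key, json.dumps(payload).encode("utf-8")


# Every event for one key hashes to one partition, so per-key order is preserved.
orders = [make_event("customer-42", {"order_id": i}) for i in range(3)]
partitions = {choose_partition(key, 6) for key, _ in orders}
print(len(partitions))  # 1 -- all "customer-42" events share a partition
```

In a real pipeline the producer client performs this mapping for you; the point of the sketch is only why choosing a meaningful key (customer ID, order ID) matters for ordering guarantees.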

Key Takeaways

  • Master the fundamentals of event streaming with Apache Kafka
  • Learn real-time stream processing and transformations with Apache Flink
  • Understand how Tableflow produces high-quality Apache Iceberg data
  • Integrate Kafka Connectors for efficient data ingestion
  • Build and deploy complete event streaming pipelines from source to sink
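To make the Flink takeaway concrete: a common stream transformation is a tumbling-window aggregation, where events are bucketed into fixed, non-overlapping time windows and counted per key. The sketch below uses plain Python rather than the Flink API, and the function name and event shape are illustrative only:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_size_s):
    """Bucket (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the kind of continuous aggregation a
    Flink window operator performs over an unbounded stream."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = ts - (ts % window_size_s)
        counts[(window_start, key)] += 1
    return dict(counts)


events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

Flink adds what this sketch leaves out: fault-tolerant state, event-time semantics with watermarks, and continuous evaluation as new events arrive.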

Prerequisites

  • Basic understanding of data engineering concepts
  • Familiarity with data pipelines and ETL/ELT processes
  • Interest in real-time data solutions