Unlocking Real-Time, Reliable, and Reusable Data

210 min · Intermediate · 12:00-14:00 · Max 20 participants · Melbourne

Workshop · Confluent · Kafka · Tableflow · Real-time Data · Apache Iceberg · Data Streaming · Analytics

Description

Most organisations struggle with an incredibly common task: moving operational data into the analytics estate. Huge effort goes into creating, monitoring, and maintaining data pipelines, and the business relies on them.

Abstract

Most organisations struggle with an incredibly common task: moving operational data into the analytics estate. Huge effort goes into creating, monitoring, and maintaining data pipelines, and the business relies on them. But current ETL and ELT processes are brittle: pipelines break, the same data is processed multiple times, and trust in the data erodes. Organisations spend time and money just keeping these pipelines running, at the expense of building meaningful data products.

Tableflow solves this problem: better, faster, more trustworthy data at the click of a button. Tableflow helps you unite analytics and operations with data streaming by converting streaming data into Apache Iceberg® tables for data warehouses, data lakes, and analytics engines. Confluent's unified platform, including Kafka, Flink, Stream Governance, and Tableflow for Iceberg and Delta Lake, helps businesses streamline analytics, reduce complexity, and maximise ROI.

The future of data is real-time, governed, and efficient. Let's explore how Shift Left can transform your analytics stack. Join us for lunch to learn more about Tableflow, including a demo.
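To make the "streams become tables" idea concrete, here is a minimal, purely illustrative Python sketch of the pattern Tableflow automates: events arriving on an operational stream are materialised into an append-only table that analytics engines can scan. The class and method names (`StreamToTable`, `on_event`, `scan`) are hypothetical and are not part of Confluent's, Kafka's, or Iceberg's APIs.

```python
from dataclasses import dataclass, field

@dataclass
class StreamToTable:
    """Illustrative sketch (not a real Confluent API): materialise a
    stream of events into an append-only, queryable table, the pattern
    Tableflow automates for Kafka topics and Apache Iceberg tables."""
    rows: list = field(default_factory=list)

    def on_event(self, event: dict) -> None:
        # Each streamed event lands as a committed table row.
        self.rows.append(dict(event))

    def scan(self, predicate=lambda row: True) -> list:
        # Analytics engines read the table, not the raw stream.
        return [row for row in self.rows if predicate(row)]

# Simulate operational events flowing from a stream into a table.
table = StreamToTable()
for event in [{"order_id": 1, "amount": 40}, {"order_id": 2, "amount": 75}]:
    table.on_event(event)

# Downstream analytics query the materialised table directly.
large_orders = table.scan(lambda r: r["amount"] > 50)
```

The point of the sketch is the division of labour: producers keep writing events, while consumers query a table abstraction instead of reprocessing the stream, which is what removes the brittle hand-built pipeline in the middle.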

Key Takeaways

  • Understand how Tableflow solves data pipeline challenges
  • Learn about real-time data streaming with Apache Iceberg
  • Explore Confluent's unified platform capabilities
  • Discover how to streamline analytics and reduce complexity
  • See practical demonstrations of Tableflow in action

Prerequisites

  • Basic understanding of data engineering concepts
  • Familiarity with data pipelines and ETL/ELT processes
  • Interest in real-time data solutions

Required Materials

  • Laptop with internet access