40 min

Data Modeling for Apache Kafka – Streams, Topics & More with Dani Traphagen
Streaming Audio: A Confluent podcast about Apache Kafka®

    • Technology

Helping users be successful when it comes to using Apache Kafka® is a large part of Dani Traphagen’s role as a senior systems engineer at Confluent. Whether she’s advising companies on implementing parts of Kafka or rebuilding their systems entirely from the ground up, Dani is passionate about event-driven architecture and the way streaming data provides real-time insights on business activity. 
She explains the concepts of a stream, topic, key, and stream-table duality, and how each of these pieces relates to the others. When it comes to data modeling, Dani covers important business requirements, including the need for a domain model, domain-driven design principles, and bounded contexts. She also discusses the attributes of data modeling—time, source, key, header, metadata, and payload—and explores the significance of data governance, data lineage, and performing joins.
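The stream-table duality mentioned above can be sketched in plain Python (this is a conceptual illustration only, not the actual Kafka Streams API): a stream is an append-only sequence of keyed events, while the corresponding table holds the latest value per key and can be rebuilt by replaying the stream.

```python
def table_from_stream(stream):
    """Fold a stream of (key, value) events into a table: latest value per key."""
    table = {}
    for key, value in stream:
        table[key] = value  # each event upserts the key's current value
    return table

def stream_from_table(table):
    """A table can be re-expressed as a changelog stream of its entries."""
    return list(table.items())

# Hypothetical data: a stream of page-view counts per user
events = [("alice", 1), ("bob", 1), ("alice", 2), ("alice", 3)]
print(table_from_stream(events))  # {'alice': 3, 'bob': 1}
```

In Kafka Streams this same duality appears as the relationship between a `KStream` and a `KTable`, where a table is the materialized latest state of a keyed changelog stream.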
EPISODE LINKS
    • Convert from table to stream and stream to table
    • Distributed, Real-Time Joins and Aggregations on User Activity Events Using Kafka Streams
    • KSQL in Action: Real-Time Streaming ETL from Oracle Transactional Data
    • KSQL in Action: Enriching CSV Events with Data from RDBMS into AWS
    • Journey to Event Driven – Part 4: Four Pillars of Event Streaming Microservices
    • Join the Confluent Community Slack
    • Fully managed Apache Kafka as a service! Try free.

