lakeFS Community

How Akamai processes 10Gb/s of events in real-time using Kafka and Spark


17:30 – 18:00 – Mingling and food 🙂
18:00 – 18:20 – Opening session
18:20 – 19:00 – How to implement Kafka Exactly-Once – Yulia Antonovsky – Senior II Software Engineer @ Akamai
19:00 – 19:40 – Deep dive into Spark 3 Data source read API – Kineret Raviv, Principal Software Developer @ Akamai

*********************** Note: ***********************
– The event will also be streamed live
– All sessions will be delivered in Hebrew

Title: How to implement Kafka Exactly-Once
Managing Kafka transactions the right way, and how to escape endless rebalance storms when running hundreds of consumers on the same topic.
Let’s talk about Kafka batch processing and how complicated it can get. We will review the issues we faced while building our ingest infrastructure in Azure, which processes large volumes of malicious traffic data at Akamai.
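For readers new to the topic, the pattern the talk centers on can be sketched as a toy model: in a transactional consume-transform-produce loop, the output records and the new consumer offset commit atomically, so a crash-and-retry never produces duplicates. This is plain Python for illustration, not the real kafka-clients API; the class name and the doubling transformation are invented.

```python
class ExactlyOnceProcessor:
    """Toy model of Kafka's transactional consume-transform-produce loop."""

    def __init__(self, input_records):
        self.input = input_records
        self.committed_offset = 0  # consumer position, committed atomically with output
        self.output = []           # stands in for the output topic

    def process_batch(self, size, crash_before_commit=False):
        # Always read from the last *committed* offset.
        batch = self.input[self.committed_offset:self.committed_offset + size]
        staged = [record * 2 for record in batch]  # placeholder transformation
        if crash_before_commit:
            # Transaction aborted: staged output never becomes visible and
            # the offset does not advance, so a retry re-reads the same batch.
            return
        # Commit point: output records and the new offset land together.
        self.output.extend(staged)
        self.committed_offset += len(batch)

p = ExactlyOnceProcessor([1, 2, 3, 4])
p.process_batch(2, crash_before_commit=True)  # crash mid-flight: nothing visible
p.process_batch(2)                            # retry: same batch, no duplicates
p.process_batch(2)
```

With real Kafka, the same guarantee comes from wrapping the produce calls and the offset commit in one producer transaction, rather than from an in-memory buffer.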

Title: Deep dive into Spark 3 Data source read API
At Akamai, we developed a new custom Spark data source and would like to share with you how we did it!
The input data for our analytics components is stored in a complex format, so we had to implement a custom data source to support it.
In this session, you will learn about Spark 3 data source API, what it is, what it contains, and how we applied it to our use case and architecture.
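To give a feel for the API ahead of the session: in the Spark 3 read path, the driver plans a set of input partitions, and each executor drives one partition reader through a next/get loop. The sketch below mocks that shape in plain Python; the real interfaces are Java/Scala ones under `org.apache.spark.sql.connector.read`, and the class names here only mirror them for illustration.

```python
class InputPartition:
    """A split of the input, planned on the driver."""
    def __init__(self, rows):
        self.rows = rows

class PartitionReader:
    """Executor-side reader: next() advances, get() returns the current row."""
    def __init__(self, partition):
        self._it = iter(partition.rows)
        self._current = None

    def next(self):
        try:
            self._current = next(self._it)
            return True
        except StopIteration:
            return False

    def get(self):
        return self._current

class Scan:
    """Driver-side plan: decides how the source splits into partitions."""
    def __init__(self, partitions):
        self._partitions = partitions

    def plan_input_partitions(self):
        return self._partitions

# Driver plans the partitions; each executor would run one reader loop.
scan = Scan([InputPartition([1, 2]), InputPartition([3])])
rows = []
for part in scan.plan_input_partitions():
    reader = PartitionReader(part)
    while reader.next():
        rows.append(reader.get())
```

A custom source mostly means deciding how `plan_input_partitions` splits your format and what each reader does to decode one split.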

