What if you could manage your data lake just like you manage code? With rollback, versioning, and branching capabilities on top of your existing data lake?
lakeFS is an open-source project that provides a Git-like version control interface for data lakes, with seamless integration with most data tools and frameworks. lakeFS enables you to easily implement parallel pipelines for experimentation, reproducibility, and CI/CD for data.
And now you can start playing around with lakeFS in a fully functional playground environment, with your own data and the tools you already use. Get your own isolated environment, integrate it with your stack, and see how lakeFS works in a setup similar to your own.
Please note: playground environments will be deleted after one week.
See these features in action:
- Full reproducibility of data and code
- Git-like operations: branch, commit, merge and revert
- Instant reversion of changes to data
- Petabyte-scale version control
- Zero-copy branching for frictionless experimentation (see the sketch after this list)
- Seamless integration with most data tools and frameworks (Spark, Hive, AWS Athena, Presto)
- Support for AWS S3, Azure Blob Storage, and Google Cloud Storage (GCS)
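
To make the Git-like, zero-copy workflow above concrete, here is a minimal PySpark sketch of reading from one branch and writing to another through lakeFS's S3-compatible gateway. Treat it as an illustration rather than a definitive setup: the endpoint, credentials, repository name (example-repo), and branch names are placeholders we assume for the example, and your Spark configuration may differ.

```python
from pyspark.sql import SparkSession

# All names below are illustrative placeholders: the lakeFS endpoint, the
# credentials, the repository ("example-repo"), and the branches ("main",
# "experiment") should be replaced with values from your own environment.
spark = (
    SparkSession.builder
    .appName("lakefs-playground-sketch")
    # Point Spark's S3A filesystem at the lakeFS S3-compatible gateway
    .config("spark.hadoop.fs.s3a.endpoint", "https://<your-lakefs-endpoint>")
    .config("spark.hadoop.fs.s3a.access.key", "<lakeFS access key id>")
    .config("spark.hadoop.fs.s3a.secret.key", "<lakeFS secret access key>")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

# Paths follow the pattern s3a://<repository>/<branch>/<object path>,
# so the same job can read production data from the "main" branch ...
events = spark.read.parquet("s3a://example-repo/main/events/")

# ... and write results to an isolated "experiment" branch (created
# beforehand, e.g. in the lakeFS UI) without copying any objects.
events.filter("event_type = 'purchase'") \
      .write.mode("overwrite") \
      .parquet("s3a://example-repo/experiment/purchases/")
```

Once the results on the experiment branch look right, they can be committed and merged back into main, or simply discarded, from the lakeFS UI, the lakectl CLI, or the API.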
Start playing here (no email or other information required)
Once you have experienced the value of lakeFS, go ahead and install lakeFS open source, or sign up for the lakeFS Cloud Beta. And don’t forget to share your feedback with us via our community on Slack or GitHub – this is the best way to let us know if a feature you need is missing!