What is lakeFS

lakeFS is an open source layer that delivers resilience and manageability to object-storage based data lakes.

With lakeFS you can build repeatable, atomic and versioned data lake operations - from complex ETL jobs to data science and analytics.

lakeFS supports AWS S3 and Google Cloud Storage as its underlying storage service. It is API compatible with S3 and works seamlessly with modern data frameworks such as Apache Spark, Hive, AWS Athena, and Presto.
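Because the lakeFS API is S3 compatible, S3 clients can usually be pointed at a lakeFS endpoint instead of S3 itself. As a minimal sketch (the endpoint, credentials, repository and branch names below are illustrative placeholders, not values from this document), a Spark job could be directed at a lakeFS installation by overriding the S3A endpoint:

    # Point the S3A filesystem at a lakeFS endpoint (placeholder endpoint and credentials)
    spark-shell \
      --conf spark.hadoop.fs.s3a.endpoint=https://lakefs.example.com \
      --conf spark.hadoop.fs.s3a.access.key=AKIAIOSFODNN7EXAMPLE \
      --conf spark.hadoop.fs.s3a.secret.key=wJalrXUtnFEMIK7MDENGbPxRfiCYEXAMPLEKEY \
      --conf spark.hadoop.fs.s3a.path.style.access=true

    # Inside the shell, object paths address a repository and branch, e.g.
    # spark.read.parquet("s3a://example-repo/main/events/")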

For more information see the Official Documentation.

Capabilities

Development Environment for Data

  • Experimentation - try tools, upgrade versions and evaluate code changes in isolation (see the branching sketch after this list).
  • Reproducibility - go back to any point in time to a consistent version of your data lake.
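A minimal sketch of this branching workflow with the lakectl CLI (repository, branch and path names are placeholders, and the exact URI syntax may differ between lakeFS versions):

    # Create an isolated branch to experiment on (hypothetical names)
    lakectl branch create lakefs://example-repo/experiment --source lakefs://example-repo/main

    # Run jobs against the new branch, then record the results as a commit
    lakectl commit lakefs://example-repo/experiment -m "trial run with upgraded Spark job"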

Continuous Data Integration

  • Ingest new data safely by enforcing best practices - make sure new data sources adhere to your lake’s conventions, such as format and schema enforcement and naming conventions (see the sketch after this list).
  • Metadata validation - prevent breaking changes from entering the production data environment.
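One way to apply this, sketched with the lakectl CLI under the same naming assumptions as above: land new data on an isolated branch, validate it, and only merge into the production branch once the checks pass.

    # Land new data on an isolated ingestion branch (hypothetical names)
    lakectl branch create lakefs://example-repo/ingest-batch-42 --source lakefs://example-repo/main
    lakectl fs upload lakefs://example-repo/ingest-batch-42/events/raw.parquet --source ./raw.parquet

    # ... run format, schema and naming validations against the ingestion branch here ...

    # Merge into production only after validation passes
    lakectl merge lakefs://example-repo/ingest-batch-42 lakefs://example-repo/main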

Continuous Data Deployment

  • Instantly revert changes to data - if low-quality data is exposed to your consumers, you can revert instantly to a former, consistent and correct snapshot of your data lake (see the sketch after this list).
  • Enforce cross-collection consistency - expose several collections of data that must stay synchronized to your consumers in one atomic, revertible action.
  • Prevent data quality issues by enabling
    • Testing of production data before exposing it to users / consumers
    • Testing of intermediate results in your DAG to avoid cascading quality issues
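As a hedged sketch of the revert capability (the commit ID and names are placeholders, and the exact subcommand and flags vary between lakeFS versions):

    # Roll the production branch back to a known-good commit (hypothetical values)
    lakectl branch revert lakefs://example-repo/main --commit a1b2c3d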

Getting Started

Docker (macOS, Linux)

  1. Ensure you have Docker & Docker Compose installed on your computer.

  2. Run the following command:

    curl https://compose.lakefs.io | docker-compose -f - up
  3. Open http://127.0.0.1:8000/setup in your web browser to set up an initial admin user, used to log in and send API requests.

Docker (Windows)

  1. Ensure you have Docker installed.

  2. Run the following command in PowerShell:

    Invoke-WebRequest https://compose.lakefs.io | Select-Object -ExpandProperty Content | docker-compose -f - up
  3. Open http://127.0.0.1:8000/setup in your web browser to set up an initial admin user, used to log in and send API requests.
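Once the initial setup is complete, the local installation exposes an S3-compatible endpoint, so standard S3 tooling can be pointed at it. A minimal sketch with the AWS CLI (the repository and branch names are placeholders for a repository you would create first, and the lakeFS access key/secret are assumed to be configured as AWS credentials):

    # List and copy objects through the local lakeFS S3 gateway (hypothetical repository/branch)
    aws s3 ls s3://example-repo/main/ --endpoint-url http://127.0.0.1:8000
    aws s3 cp ./report.csv s3://example-repo/main/reports/report.csv --endpoint-url http://127.0.0.1:8000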

Download the Binary

Alternatively, you can download the lakeFS binaries and run them directly.

Binaries are available at https://github.com/treeverse/lakeFS/releases.
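A minimal sketch of running the downloaded binary (the configuration file name is a placeholder; consult the documentation for the exact configuration schema, including the database and blockstore settings):

    # Start the lakeFS server with a local configuration file (hypothetical file name)
    lakefs run --config ./lakefs-config.yaml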

Setting up a repository

Please follow the Guide to Get Started to set up your local lakeFS installation.

For more detailed information on how to set up lakeFS, please visit the documentation.
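As a hedged sketch of that first step with the lakectl CLI (the repository name and storage namespace are placeholders; the namespace points at a bucket in your underlying object store):

    # Create a repository backed by an existing bucket (hypothetical names)
    lakectl repo create lakefs://example-repo s3://my-bucket/example-repo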

Community

Keep up to date and get lakeFS support via:

  • Slack (to get help from our team and other users).
  • Twitter (follow for updates and news).
  • YouTube (learn from video tutorials).
  • Contact us (for anything).

More information

Licensing

lakeFS is completely free and open source and licensed under the Apache 2.0 License.
