"The centre of your data pipeline."
Kedro is a workflow development tool that helps you build data pipelines that are robust, scalable, deployable, reproducible and versioned. We provide a standard approach so that you can:
- spend more time building your data pipeline,
- worry less about how to write production-ready code,
- standardise the way that your team collaborates across your project,
- work more efficiently.
Kedro was originally designed by Aris Valtazanos and Nikolaos Tsaousis to solve challenges they faced in their project work.
This work was later turned into a product thanks to the following contributors: Ivan Danov, Dmitrii Deriabin, Gordon Wrigley, Yetunde Dada, Nasef Khan, Kiyohito Kunii, Nikolaos Kaltsas, Meisam Emamjome, Peteris Erins, Lorena Balan, Richard Westenra and Anton Kirilenko.
`kedro` is a Python package. To install it, simply run:

```
pip install kedro
```
For more detailed installation instructions, including how to set up Python virtual environments, please visit our installation guide.
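The installation guide covers virtual environments in detail. As a quick illustration (a minimal sketch, not from the guide itself — the directory name is arbitrary), an isolated environment can be created with Python's standard-library `venv` module, after which Kedro is installed with the environment's own `pip`:

```python
# Minimal sketch: create a virtual environment with the stdlib `venv` module,
# equivalent to running `python -m venv kedro-env` from a shell.
# Kedro would then be installed inside it with `<env>/bin/pip install kedro`.
import os
import tempfile
import venv

env_dir = os.path.join(tempfile.mkdtemp(), "kedro-env")
venv.EnvBuilder(with_pip=False).create(env_dir)  # with_pip=False skips bootstrapping pip

bin_dir = "Scripts" if os.name == "nt" else "bin"  # layout differs on Windows
print(os.path.isdir(os.path.join(env_dir, bin_dir)))
```

Activating the environment before `pip install kedro` keeps Kedro and its dependencies separate from your system Python.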
- A standard and easy-to-use project template
- Configuration for credentials, logging, data loading and Jupyter Notebooks / Lab
- Test-driven development using `pytest`
- Sphinx integration to produce well-documented code
- Separation of the compute layer from the data handling layer, including support for different data formats and storage options
- Versioning for your data sets and machine learning models
- Support for pure Python functions, `nodes`, to break large chunks of code into small independent sections
- Automatic resolution of dependencies between `nodes`
- Visualise your data pipeline with Kedro-Viz, a tool that shows the pipeline structure of Kedro projects
Note: Read our FAQs to learn how we differ from workflow managers like Airflow and Luigi.
A pipeline visualisation generated using Kedro-Viz
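The node and dependency-resolution ideas above can be sketched in plain Python (this is a conceptual illustration only, not Kedro's actual API): each node is a pure function with named inputs and outputs, and the execution order is derived automatically from those names.

```python
# Conceptual sketch (NOT Kedro's API): nodes are pure functions declaring
# named inputs/outputs; run order is resolved from data availability.

def preprocess(raw):
    return [x * 2 for x in raw]

def train(features):
    return sum(features)

# Each node: (function, input names, output name). Note they are deliberately
# listed out of execution order.
nodes = [
    (train, ["features"], "model"),
    (preprocess, ["raw"], "features"),
]

def run_pipeline(nodes, data):
    """Repeatedly run any node whose inputs are available (automatic ordering)."""
    remaining = list(nodes)
    while remaining:
        for n in remaining:
            func, inputs, output = n
            if all(i in data for i in inputs):
                data[output] = func(*(data[i] for i in inputs))
                remaining.remove(n)
                break
        else:
            raise ValueError("Unresolvable dependencies between nodes")
    return data

result = run_pipeline(nodes, {"raw": [1, 2, 3]})
print(result["model"])  # 12
```

Because dependencies are declared as data names rather than hard-coded call order, small independent functions can be composed and reordered freely — the property Kedro pipelines rely on.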
- A plugin system that injects commands into the Kedro command line interface (CLI)
- List of officially supported plugins:
- Kedro-Airflow, making it easy to prototype your data pipeline in Kedro before deploying to Airflow, a workflow scheduler
- Kedro-Docker, a tool for packaging and shipping Kedro projects within containers
- Kedro can be deployed locally, on on-premise servers, or in the cloud (AWS, Azure and GCP), on servers or clusters (EMR, Azure HDInsight, GCP and Databricks)
You can find an overview of the Kedro architecture here.
Our documentation explains:
- A typical Kedro workflow
- How to set up the project configuration
- Building your first pipeline
- How to use the CLI offered by `kedro_cli.py` (`kedro new`, `kedro run`, ...)
Note: The CLI is a convenient tool for running `kedro` commands, but you can also invoke the Kedro CLI as a Python module with `python -m kedro`.
This CLI command will open the documentation for your current version of Kedro in a browser:

```
kedro docs
```
Documentation for the latest stable release can be found here.
Yes! Want to help build Kedro? Check out our guide to contributing.
We use Semantic Versioning. The best way to safely upgrade is to check our release notes for any notable breaking changes.
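Under Semantic Versioning, the version string itself tells you how risky an upgrade is — a major-version bump signals breaking changes. A small hedged sketch (the version numbers below are illustrative, not real Kedro releases):

```python
# Sketch: compare semantic version strings to flag potentially breaking
# upgrades. Under SemVer, a major-version bump signals breaking changes
# (and 0.x releases may additionally break on minor bumps).

def parse_semver(version):
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

def is_breaking_upgrade(current, target):
    # True when the target crosses a major-version boundary.
    return parse_semver(target)[0] > parse_semver(current)[0]

print(is_breaking_upgrade("0.14.3", "0.15.0"))  # False: same major version
print(is_breaking_upgrade("0.15.2", "1.0.0"))   # True: major bump
```

Even when an upgrade looks safe by version number alone, checking the release notes remains the reliable way to spot breaking changes.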
Once Kedro is installed, you can check your version as follows:

```
kedro --version
```

To later upgrade Kedro to a different version, simply run:

```
pip install kedro -U
```
Kedro is licensed under the Apache 2.0 License.
Do you want to be part of the team that builds Kedro and other great products at QuantumBlack? If so, you're in luck! QuantumBlack is currently hiring Software Engineers who love using data to drive their decisions. Take a look at our open positions and see if you're a fit.