KitOps is a packaging, versioning, and sharing system for AI/ML projects that uses open standards so it works with the AI/ML, development, and DevOps tools you are already using, and can be stored in your enterprise container registry. It's AI/ML platform engineering teams' preferred solution for securely packaging and versioning assets.
KitOps creates a ModelKit for your AI/ML project which includes everything you need to reproduce it locally or deploy it into production. You can even selectively unpack a ModelKit so different team members can save time and storage space by only grabbing what they need for a task. Because ModelKits are immutable, signable, and live in your existing container registry they're easy for organizations to track, control, and audit.
ModelKits simplify the handoffs between data scientists, application developers, and SREs working with LLMs and other AI/ML models. Teams and enterprises use KitOps as secure storage throughout the AI/ML project lifecycle.
Use KitOps to speed up and de-risk all types of AI/ML projects:
- Predictive models
- Large language models
- Computer vision models
- Multi-modal models
- Audio models
- And more
For our friends in the EU: ModelKits are the perfect way to create a library of model versions for EU AI Act compliance because they're tamper-proof, signable, and auditable.
- Create a runnable container from a ModelKit with a single command. See the KitOps deploy docs for details.
- Get the most out of KitOps' ModelKits by using them with the Jozu Hub repository, or continue using ModelKits with your existing OCI registry (even on-premises and air-gapped).
- Use KitOps with Dagger pipelines via our modules in the Daggerverse.
- KitOps works great with Red Hat InstructLab and Quay.io.
- Unified packaging: A ModelKit package includes models, datasets, configurations, and code. Add as much or as little as your project needs.
- Versioning: Each ModelKit is tagged so everyone knows which dataset and model work together.
- Tamper-proofing: Each ModelKit package includes an SHA digest for itself and for every artifact it holds.
- Selective unpacking: Unpack only what you need from a ModelKit with the `kit unpack --filter` command: just the model, just the dataset and code, or any other combination.
- Automation: Pack or unpack a ModelKit locally or as part of your CI/CD workflow for testing, integration, or deployment (e.g. GitHub Actions or Dagger).
- Deploy containers: Generate a basic or custom Docker container from any ModelKit.
- Kubernetes-ready: Generate a Kubernetes / KServe deployment config from any ModelKit.
- LLM fine-tuning: Use KitOps to fine-tune a large language model using LoRA.
- RAG pipelines: Create a RAG pipeline for tailoring an LLM with KitOps.
- Artifact signing: ModelKits and their assets can be signed so you can be confident of their provenance.
- Standards-based: Store ModelKits in any OCI 1.1-compliant container or artifact registry.
- Simple syntax: Kitfiles are easy to write and read, using familiar YAML syntax.
- Flexible: Reference base models using model parts, or store key-value pairs (or any YAML-compatible JSON data) in your Kitfile. Use it to keep features, hyperparameters, links to MLOps tool experiments, or validation output.
- Run locally: Kit's Dev Mode lets you run an LLM locally, configure it, and prompt/chat with it instantly.
- Universal: ModelKits can be used with any AI, ML, or LLM project, even multi-modal models.
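As a sketch of how the packaging and versioning features above fit together on the command line (the registry URL, repository name, and tag below are placeholders; `kit pack`, `kit push`, and `kit pull` are KitOps CLI commands):

```shell
# Pack the project in the current directory (reads ./Kitfile) and tag it
kit pack . -t registry.example.com/my-org/my-model:v1.0.0

# Push the ModelKit to any OCI 1.1-compliant registry
kit push registry.example.com/my-org/my-model:v1.0.0

# Later, pull it back down on another machine
kit pull registry.example.com/my-org/my-model:v1.0.0
```

Because the tag travels with the ModelKit, everyone pulling `v1.0.0` gets the same model, dataset, and code combination.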
There's a video of KitOps in action on the KitOps site.
- Install the CLI for your platform.
- Follow the Getting Started docs to learn to pack, unpack, and share a ModelKit.
- Test drive one of our ModelKit Quick Starts, each of which includes everything you need to run a model: a codebase, a dataset, documentation, and of course the model itself.
For those who prefer to build from source, follow these steps to get the latest version from our repository.
ModelKit: At the heart of KitOps is the ModelKit, an OCI-compliant packaging format for sharing all AI project artifacts: datasets, code, configurations, and models. By standardizing the way these components are packaged, versioned, and shared, ModelKits facilitate a more streamlined and collaborative development process that is compatible with any MLOps or DevOps tool.
Kitfile: A ModelKit is defined by a Kitfile - your AI/ML project's blueprint. It uses YAML to describe where to find each of the artifacts that will be packaged into the ModelKit. The Kitfile outlines what each part of the project is.
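To make the Kitfile concrete, here is a minimal sketch. The top-level sections (`package`, `model`, `code`, `datasets`) follow the Kitfile schema, but every name, path, and version shown is illustrative:

```yaml
# Hypothetical Kitfile for a small classification project.
# All names and paths below are placeholders.
manifestVersion: "1.0"

package:
  name: sentiment-classifier
  version: 1.0.0
  description: Example sentiment analysis project
  authors: ["Your Team"]

model:
  name: sentiment-model
  path: ./model.safetensors
  description: Serialized model weights

code:
  - path: ./src
    description: Training and inference scripts

datasets:
  - name: training-data
    path: ./data/train.csv
```

Each section maps a project artifact to a path on disk, which is what lets `kit pack` assemble them into one versioned ModelKit.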
Kit CLI: The Kit CLI not only enables users to create, manage, run, and deploy ModelKits, it also lets you pull only the pieces you need. Just need the serialized model for deployment? Use `kit unpack --model`. Only want the training datasets? Use `kit unpack --datasets`.
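For example, those selective unpack commands might look like this (the registry reference is a placeholder):

```shell
# Unpack only the serialized model into the current directory
kit unpack registry.example.com/my-org/my-model:v1.0.0 --model

# Unpack only the datasets
kit unpack registry.example.com/my-org/my-model:v1.0.0 --datasets
```

This is how different team members save time and storage by grabbing only what their task needs.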
For support, release updates, and general KitOps discussion, please join the KitOps Discord. Follow KitOps on X for daily updates.
If you need help, there are several ways to reach our community and maintainers, outlined in our support doc.
Your insights help KitOps evolve as an open standard for AI/ML. We deeply value the issues and feature requests we get from users in our community. To contribute your thoughts, navigate to the Issues tab and hit the green New Issue button. Our templates guide you in providing the essential details we need to address your request effectively.
We ❤️ our KitOps community and contributors. To learn more about the many ways you can contribute (you don't need to be a coder) and how to get started, see our Contributor's Guide. Please read our Governance and our Code of Conduct before contributing.
At KitOps, inclusivity, empathy, and responsibility are at our core. Please read our Code of Conduct to understand the values guiding our community.
We share our roadmap openly so anyone in the community can provide feedback and ideas. Let us know what you'd like to see by pinging us on Discord or creating an issue.