
Add list of pre-made ModelKits to docs #425

Merged · 4 commits · Jul 30, 2024
22 changes: 11 additions & 11 deletions README.md
@@ -25,7 +25,7 @@ Use KitOps to speed up and de-risk all types of AI projects from small analysis

### 🎉 New

Get the most out of KitOps' ModelKits by using them with the **[Jozu Hub](https://jozu.ml/discover)** repository (don't worry - ModelKits are still compatible with any OCI registry).
Get the most out of KitOps' ModelKits by using them with the **[Jozu Hub](https://jozu.ml/discover)** repository (don't worry - ModelKits are compatible with any OCI registry).


### Features
@@ -39,15 +39,15 @@ Get the most out of KitOps' ModelKits by using them with the **[Jozu Hub](https:
* 📝 **[Artifact signing](./docs/src/docs/next-steps.md):** ModelKits and their assets can be signed so you can be confident of their provenance.
* 🌈 **[Standards-based](https://kitops.ml/docs/modelkit/compatibility.html):** Store ModelKits in any OCI 1.1-compliant container or artifact registry.
* 🥧 **[Simple syntax](https://kitops.ml/docs/kitfile/kf-overview.html):** Kitfiles are easy to write and read, using a familiar YAML syntax.
* 🏃‍♂️‍➡️ **[Run locally](./docs/src/docs/quick-start.md#_8-run-an-llm-locally):** Kit's Dev Mode lets your run an LLM locally, configure it, and prompt/chat with it instantly.
* 🏃‍♂️‍➡️ **[Run locally](./docs/src/docs/dev-mode.md):** Kit's Dev Mode lets you run an LLM locally, configure it, and prompt/chat with it instantly.
* 🐳 **Deploy containers:** Generate a Docker container as part of your `kit unpack` (coming soon).
* 🚢 **Kubernetes-ready:** Generate a Kubernetes / KServe deployment config as part of your `kit unpack` (coming soon).
* 🩰 **Flexible:** Store key-value pairs, or any YAML-compatible JSON data in your Kitfile - use it to keep features, hyperparameters, links to MLOps tool experiments our validation output...whatever you want!
* 🩰 **[Flexible](./docs/src/docs/kitfile/format.md):** Store key-value pairs, or any YAML-compatible JSON data in your Kitfile - use it to keep features, hyperparameters, links to MLOps tool experiments or validation output...whatever you want!
* 🤗 **Universal:** ModelKits can be used with any AI, ML, or LLM project - even multi-modal models.

### See KitOps in Action

https://github.com/jozu-ai/kitops/assets/4766570/05ae1362-afd3-4e78-bfce-e982c17f8df2
There's a video of KitOps in action on the [KitOps site](https://kitops.ml/).

### What is in the box?

@@ -57,6 +57,8 @@ https://github.com/jozu-ai/kitops/assets/4766570/05ae1362-afd3-4e78-bfce-e982c17

**[Kit CLI](./docs/src/docs/cli/cli-reference.md):** The Kit CLI not only enables users to create, manage, run, and deploy ModelKits -- it lets you pull only the pieces you need. Just need the serialized model for deployment? Use `unpack --model`, or maybe you just want the training datasets? `unpack --datasets`.

You can pull pre-built ModelKits from [Jozu Hub](https://jozu.ml/discover).

## 🚀 Try Kit in under 15 Minutes

1. [Install the CLI](./docs/src/docs/cli/installation.md) for your platform.
@@ -78,9 +80,11 @@ We've been busy and shipping quickly!

You can see all the gory details in our [release changelogs](https://github.com/jozu-ai/kitops/releases).

## Your Voice Matters
## Need Help?

### Join KitOps community

### Need Help?
For support, release updates, and general KitOps discussion, please join the [KitOps Discord](https://discord.gg/Tapeh8agYy). Follow [KitOps on X](https://twitter.com/Kit_Ops) for daily updates.

If you need help, there are several ways to reach our community and [Maintainers](./MAINTAINERS.md), as outlined in our [support doc](./SUPPORT.md).

@@ -96,11 +100,7 @@ We ❤️ our Kit community and contributors. To learn more about the many ways

At KitOps, inclusivity, empathy, and responsibility are at our core. Please read our [Code of Conduct](./CODE-OF-CONDUCT.md) to understand the values guiding our community.

### Join KitOps community

For support, release updates, and general KitOps discussion, please join the [KitOps Discord](https://discord.gg/Tapeh8agYy). Follow [KitOps on X](https://twitter.com/Kit_Ops) for daily updates.

### Roadmap
## Roadmap

We [share our roadmap openly](./ROADMAP.md) so anyone in the community can provide feedback and ideas. Let us know what you'd like to see by pinging us on Discord or creating an issue.

2 changes: 2 additions & 0 deletions docs/.vitepress/config.mts
@@ -63,6 +63,7 @@ export default defineConfig({
{ text: 'Overview', link: '/docs/overview' },
{ text: 'Quick Start', link: '/docs/quick-start' },
{ text: 'Next Steps', link: '/docs/next-steps' },
{ text: 'Kit Dev', link: '/docs/dev-mode' },
{ text: 'Why KitOps?', link: '/docs/why-kitops' },
{ text: 'How it is Used', link: '/docs/use-cases' },
{ text: 'KitOps versus...', link: '/docs/versus' },
@@ -73,6 +74,7 @@ export default defineConfig({
items: [
{ text: 'Overview', link: '/docs/modelkit/intro' },
{ text: 'Specification', link: '/docs/modelkit/spec' },
{ text: 'Pre-built ModelKits', link: '/docs/modelkit/prebuilt-modelkits' },
{ text: 'Compatibility', link: '/docs/modelkit/compatibility' },
]
},
41 changes: 41 additions & 0 deletions docs/src/docs/dev-mode.md
@@ -0,0 +1,41 @@
# Kit Dev: Run an LLM Locally

:::info
This is a beta feature only available on macOS today. To provide feedback (we love that) you can [file an issue](https://github.com/jozu-ai/kitops/issues) in our GitHub repo, or [join our Discord](https://discord.gg/Tapeh8agYy) server.
:::

If you're using Kit with LLMs, you can quickly run the model locally to speed up integration, testing, or experimentation.

To run the ModelKit locally, first create a new directory for your LLM:

```sh
mkdir kitdev
cd kitdev
```

Now unpack an LLM ModelKit - there are several on [Jozu Hub](https://jozu.ml/discover), but here we're using Phi3 Mini because of its size:

```sh
kit unpack jozu.ml/jozu/phi3:3.8b-mini-instruct-4k-q4_K_M
```

Now start your LLM dev server locally using the [kit dev start command](./cli/cli-reference.md#kit-dev-start):

```sh
kit dev start .
```

In the command output, you'll see a URL you can use to interact with the LLM (there's a command flag if you want to always use the same port). You can control the model's parameters, change the prompt, or chat with the LLM.
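
For repeatable local testing it can help to pin the dev server to a fixed port. A minimal sketch, assuming the flag is named `--port` (check `kit dev start --help` for the exact flag name on your version):

```sh
# Assumption: the port flag is named --port; verify with `kit dev start --help`
kit dev start . --port 8080
```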

If you need to get logs, use the [dev logs command](./cli/cli-reference.md#kit-dev-logs):

```sh
kit dev logs
```

When you're done, don't forget to stop the Kit dev server:

```sh
kit dev stop
```
2 changes: 2 additions & 0 deletions docs/src/docs/modelkit/intro.md
@@ -4,6 +4,8 @@

ModelKit revolutionizes the way AI/ML artifacts are shared and managed throughout the lifecycle of AI/ML projects. As an OCI-compliant packaging format, ModelKit encapsulates datasets, code, configurations, and models into a single, standardized unit. This approach not only streamlines the development process but also ensures broad compatibility and integration with a vast array of tools and platforms.

Start with a [pre-built ModelKit](./prebuilt-modelkits.md), see the [ModelKit spec](./spec.md), or look over the [tool compatibility list](./compatibility.md).

## Key Features of ModelKit:

**Seamless Sharing and Collaboration:** ModelKit's standardized format fosters a collaborative environment, enabling teams to share and manage AI/ML artifacts effortlessly across different stages of development.
60 changes: 60 additions & 0 deletions docs/src/docs/modelkit/prebuilt-modelkits.md
@@ -0,0 +1,60 @@
# Pre-Built ModelKits

You can find many pre-built ModelKits on [Jozu Hub](https://jozu.ml/discover), covering popular models and datasets. These can be a great way to kick off a project if you don't have a model or dataset in-house already.

See the [models](#for-models) or the [datasets](#for-datasets) below.
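
Pulling any of these to your machine is a single command. As a sketch, using the Phi3 Mini ModelKit referenced elsewhere in these docs (confirm the current tag on its repository page, since tags may change):

```sh
# Unpack the Phi3 Mini ModelKit from Jozu Hub; check the repository page for the current tag
kit unpack jozu.ml/jozu/phi3:3.8b-mini-instruct-4k-q4_K_M
```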

## For Models

### Llama
* [Llama 3.1 8B](https://jozu.ml/repositories/42)
* [Llama 3.1 70B](https://jozu.ml/repositories/45)
* [Llama 3.0 8B](https://jozu.ml/repositories/26)
* [Llama 3.0 70B](https://jozu.ml/repositories/28)

### Mistral & Mixtral
* [Mistral v0.3 7B](https://jozu.ml/repositories/44)
* [Mistral v0.1 7B](https://jozu.ml/repositories/43)
* [Mixtral 8x7B](https://jozu.ml/repositories/33)
* [Mixtral 8x22B](https://jozu.ml/repositories/34)

### Falcon
* [Falcon 7B](https://jozu.ml/repositories/31)
* [Falcon 40B](https://jozu.ml/repositories/32)

### Microsoft
* [Phi3 Mini](https://jozu.ml/repositories/5)

### Google
* [Gemma 2B](https://jozu.ml/repositories/24)
* [Gemma 7B](https://jozu.ml/repositories/25)
* [BERT base uncased](https://jozu.ml/repositories/37)

### Qwen2
* [Qwen2 0.5B](https://jozu.ml/repositories/23)
* [Qwen2 7B](https://jozu.ml/repositories/30)

### Bloom
* [Bloom 560M](https://jozu.ml/repositories/19)
* [Bloom 1b1](https://jozu.ml/repositories/20)
* [Bloom 1b7](https://jozu.ml/repositories/21)
* [Bloom 3b](https://jozu.ml/repositories/18)

### YOLO
* [YOLO v10](https://jozu.ml/repositories/41)

### AST
* [AST Finetuned audioset](https://jozu.ml/repositories/39)
* [AST Finetuned audioset 10-10-0.4593](https://jozu.ml/repositories/40)

### KitOps
* [KitOps RAG pipeline example](https://jozu.ml/repositories/29)

## For Datasets
* [Databricks Dolly 15K](https://jozu.ml/repositories/36)
* [MMLU dataset](https://jozu.ml/repositories/35)
* [Medical QA shared task v1](https://jozu.ml/repositories/38)
* [Artifact Deduplication A](https://jozu.ml/repositories/15)
* [Artifact Deduplication B](https://jozu.ml/repositories/16)
* [Artifact Deduplication C](https://jozu.ml/repositories/17)
* [KitOps Data Versioning](https://jozu.ml/repositories/14)
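
If you only want the data files from one of these dataset ModelKits, a selective unpack is an option. A sketch with a hypothetical repository reference (look up the exact repository name and tag on its Jozu Hub page):

```sh
# Hypothetical reference shown for illustration; substitute the real repository/tag from Jozu Hub
kit unpack jozu.ml/jozu/dolly-15k:latest --datasets
```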
6 changes: 6 additions & 0 deletions docs/src/docs/next-steps.md
@@ -7,6 +7,10 @@ In this guide you'll learn how to:
* Read the Kitfile or manifest from a ModelKit
* Tag ModelKits and keep your registry tidy

::: info
If you're interested in running an LLM locally using Kit, you can jump to the [Kit Dev](./dev-mode.md) documentation.
:::

## Signing your ModelKit

Because ModelKits are OCI artifacts, they can be signed like any other OCI artifact (you may already sign your containers, for example).
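
As an illustrative sketch only - assuming you use a generic OCI signing tool such as cosign, and reusing the example repository from the quick start - signing a pushed ModelKit can be a one-liner:

```sh
# Sketch: assumes cosign is installed and the ModelKit has already been pushed
# to ghcr.io/jozubrad/mymodelkit:latest (the example repository from the quick start)
cosign sign ghcr.io/jozubrad/mymodelkit:latest
```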
@@ -310,4 +314,6 @@ mymodel champion Rajat Finetuning_SLM 13.1 MiB sha256:f268a74f

You can learn more about all the Kit CLI commands from our [command reference doc](./cli/cli-reference.md).

To learn about how to run an LLM locally using Kit, see our [Kit Dev](./dev-mode.md) documentation.

Thanks for taking some time to play with Kit. We'd love to hear what you think. Feel free to drop us an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy).
52 changes: 4 additions & 48 deletions docs/src/docs/quick-start.md
@@ -39,7 +39,7 @@ After entering your username and password, you'll see `Log in successful`. If yo

### 3/ Get a Sample ModelKit

Let's use the [unpack command](./cli/cli-reference.md#kit-unpack) to pull a sample ModelKit to our machine that we can play with. In this case we'll unpack the whole thing, but one of the great things about Kit is that you can also selectively unpack only the artifacts you need: just the model, the model and dataset, the code, the configuration...whatever you want. Check out the `unpack` [command reference](./cli/cli-reference.md#kit-unpack) for details.
Let's use the [unpack command](./cli/cli-reference.md#kit-unpack) to pull a [sample ModelKit](./modelkit/prebuilt-modelkits.md) to our machine that we can play with. In this case we'll unpack the whole thing, but one of the great things about Kit is that you can also selectively unpack only the artifacts you need: just the model, the model and dataset, the code, the configuration...whatever you want. Check out the `unpack` [command reference](./cli/cli-reference.md#kit-unpack) for details.
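
For example, to grab only the serialized model, add the `--model` filter to the same command. A sketch, reusing the Phi3 ModelKit reference from elsewhere in these docs (any ModelKit reference works):

```sh
# Sketch: selectively unpack only the model layer of a ModelKit
kit unpack jozu.ml/jozu/phi3:3.8b-mini-instruct-4k-q4_K_M --model
```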

You can grab <a href="https://jozu.ml/discover"
v-ga-track="{
@@ -117,59 +117,15 @@ Next, we would repeat the `kit pack` command in the previous step, being sure to

The [push command](./cli/cli-reference.md#kit-push) will copy the newly built ModelKit from your local repository to the remote repository you logged into earlier. The naming of your ModelKit will need to be the same as what you see in your `kit list` command (REPOSITORY:TAG). You can even copy and paste it. In my case it looks like:

```sh
kit push ghcr.io/jozubrad/mymodelkit:latest
```

### 8/ Run an LLM Locally

If you're using Kit with LLMs you can quickly run the model locally to speed integration, testing, or experimentation.

<!-- The syntax below makes an Info callout box in Vitpress at compile time -->
::: info
If you're not interested in running the ModelKit locally you can jump to the [Next Steps](next-steps.md) where you'll learn how to sign ModelKits, write your own Kitfiles, and maintain your repository.
:::

To run the ModelKit locally, first create a new directory for your LLM:

```sh
mkdir devmode
cd devmode
```

Now unpack an LLM ModelKit - we have [several](https://github.com/orgs/jozu-ai/packages), but I've chosen Phi3:

```sh
kit unpack ghcr.io/jozu-ai/phi3:3.8b-mini-instruct-4k-q4_K_M
```

Now start your LLM dev server locally using the [dev start command](./cli/cli-reference.md#kit-dev-start):

```sh
kit dev start .
```

In the command output you'll see a URL you can use to interact with the LLM (there's a command flag if you want to always use the same port). You can control parameters of the model, change the prompt, or chat with the LLM.

If you need to get logs use the [dev logs command](./cli/cli-reference.md#kit-dev-logs):

```sh
kit dev logs
```

When you're done don't forget to stop the LLM dev server:
<!-- replace with Jozu Hub once private repos are ready -->

```sh
kit dev stop
kit push ghcr.io/jozubrad/mymodelkit:latest
```

### Congratulations

You've learned how to unpack a ModelKit, pack one up, push it, and run an LLM locally. Anyone with access to your remote repository can now pull your new ModelKit and start playing with your model:

```sh
kit pull ghcr.io/jozubrad/mymodelkit:latest
```
You've learned how to unpack a ModelKit, pack one up, push it, and run an LLM locally. Anyone with access to your remote repository can now pull your new ModelKit and start playing with your model using the `kit pull` or `kit unpack` commands.

If you'd like to learn more about using Kit, try our [Next Steps with Kit](./next-steps.md) document that covers:
* Signing your ModelKit