
# Ollama Self-Hosting Setup

Basic setup to run self-hosted Ollama and use LLMs locally (works with Coolify).

## Requirements

- Ubuntu 22.04
- NVIDIA GPU
- Docker
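
A quick, optional way to confirm these prerequisites from a shell is sketched below; it assumes `lspci` and `lsb_release` are available (both ship with a standard Ubuntu install).

```bash
# Confirm the OS release (expecting Ubuntu 22.04)
lsb_release -ds

# Confirm an NVIDIA GPU is present on the PCI bus
lspci | grep -i nvidia

# Confirm Docker and the Compose plugin are installed
docker --version
docker compose version
```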

## Steps

1. Run `setup.bash` (installs the NVIDIA drivers, the CUDA toolkit, and the NVIDIA Container Toolkit).
2. If using Coolify, paste the `docker-compose.yaml` contents into the Docker Compose section of your project. Otherwise, run `docker compose up -d` inside the project folder on your machine.
3. Go to `localhost:3000` or the corresponding Coolify link for your service. A quick way to check that the GPU and the Ollama API are reachable is sketched below.
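
The following is a minimal verification sketch, not part of the repo's scripts. It checks that containers can see the GPU through the NVIDIA Container Toolkit and that the Ollama API answers on its default port (11434). The CUDA image tag is only an example; use any tag on Docker Hub that matches your installed driver.

```bash
# Verify that Docker containers can access the GPU via the NVIDIA Container Toolkit
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Start the stack (skip this if Coolify manages the deployment)
docker compose up -d

# Ollama listens on port 11434 by default; this lists the models it currently has
curl http://localhost:11434/api/tags
```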

## Installing models
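
A hedged sketch of pulling and running a model: it assumes the Ollama container is named `ollama` (adjust to whatever name your compose file or Coolify assigns) and uses `llama3` purely as an example model tag.

```bash
# Pull a model inside the Ollama container (container name "ollama" is an assumption)
docker exec -it ollama ollama pull llama3

# Chat with the model interactively from the terminal
docker exec -it ollama ollama run llama3

# Or pull a model through Ollama's HTTP API on the default port
curl http://localhost:11434/api/pull -d '{"model": "llama3"}'
```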

For more information, see the Ollama documentation.
