Repositories list (22 repositories)
tensorrtllm_backend (Public)
TensorRT-LLM (Public) - TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines. (See the usage sketch after this list.)
vllm (Public)
Pyramid-Flow (Public)
llama-stack (Public)
ngx-http-auth-jwt-module (Public)
deepctl (Public)
langchainjs (Public)
- Official TypeScript wrapper for DeepInfra Inference API
lm-evaluation-harness (Public)
langchain (Public)
litellm (Public)
fetch-stream-parser (Public)
fetch-event-source (Public)
cog (Public)
cog-llama-2 (Public)
transformers (Public)
superfans-gpu-controller (Public)
whisper-timestamped (Public)
sentence-transformers (Public)
- Source for https://fullstackdeeplearning.com
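The TensorRT-LLM entry above describes a Python API for defining LLMs and running them on compiled TensorRT engines. A minimal sketch of what using that high-level API can look like, assuming the tensorrt_llm package is installed in a GPU environment and using an illustrative placeholder model name:

# Minimal sketch of TensorRT-LLM's high-level Python API (assumptions:
# tensorrt_llm is installed, a compatible NVIDIA GPU is available, and
# the model name below is only a placeholder for illustration).
from tensorrt_llm import LLM, SamplingParams

# Constructing the LLM object loads the model and builds/loads a TensorRT engine for it.
llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

prompts = ["Hello, my name is", "The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, max_tokens=32)

# generate() runs inference on the built engine and returns one result per prompt.
for output in llm.generate(prompts, sampling_params):
    print(output.prompt, "->", output.outputs[0].text)

This is a sketch under the stated assumptions, not a definitive usage guide; see the TensorRT-LLM repository's own documentation for the supported models and build options.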