
Next.js Vllm UI - Self-hosted Web Interface for vLLM and Ollama
Next.js Vllm UI is a free and open-source, self-hosted app that gives you a reactive, user-friendly web interface for large language models (LLMs) served by backends such as vLLM and Ollama. It is easy to set up with a single command using Docker. It is built on top of Next.js and uses the Tailwind CSS framework.
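
Because the project ships as a Docker image, a single docker run command is typically enough to bring the interface up. The line below is only a sketch: the image name (ghcr.io/yoziru/nextjs-vllm-ui), the port, and the VLLM_URL environment variable are assumptions about how the published image is named and configured, so check the project's README for the exact values.

docker run --rm -d -p 3000:3000 -e VLLM_URL=http://host.docker.internal:8000 ghcr.io/yoziru/nextjs-vllm-ui:latest

If those assumptions hold, the web UI should then be reachable at http://localhost:3000, with chat requests forwarded to the vLLM (or Ollama) server at the backend URL you provided.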