micytao/vLLM MCP Server
Exposes vLLM capabilities to AI assistants, enabling chat completions, model management, and platform-aware container control with automatic detection of Docker/Podman and GPU availability across Linux, macOS, and Windows.
Scan Scheduled
This agent is queued for security scanning. It will be graded in the next scan batch.
What We Know
- URL: https://glama.ai/mcp/servers/zugcx9u26t
- Framework: mcp
- Sources: glama
- First Seen: Mar 16, 2026
- Repository: github.com/micytao/vllm-mcp-server