openconstruct/LibreModel MCP Server
Bridges Claude Desktop with local LLM instances served by llama-server, letting users chat with their local models directly through Claude Desktop. Supports full conversations with configurable sampling parameters and health monitoring of the backend.
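The bridge's core job is forwarding a chat turn to llama-server with user-chosen sampling parameters and checking that the backend is alive. A minimal sketch of that request shape is below; the endpoint names (`/completion`, `/health`), default port, and parameter names are assumptions based on llama.cpp's llama-server HTTP API, not taken from this repository:

```python
import json

# Assumed default: llama-server typically listens on localhost:8080.
LLAMA_SERVER = "http://127.0.0.1:8080"

def build_completion_payload(prompt, temperature=0.8, top_k=40,
                             top_p=0.95, n_predict=256):
    """Build the JSON body for llama-server's /completion endpoint,
    exposing the sampling parameters the bridge lets users tune."""
    return {
        "prompt": prompt,
        "temperature": temperature,
        "top_k": top_k,
        "top_p": top_p,
        "n_predict": n_predict,  # max tokens to generate
    }

def health_url(base=LLAMA_SERVER):
    """URL the bridge can poll to verify the backend is up."""
    return f"{base}/health"

payload = build_completion_payload("Hello, local model!", temperature=0.2)
print(json.dumps(payload))
print(health_url())
```

In practice the MCP server would POST this payload with an HTTP client and surface the generated text back to Claude Desktop as a tool result.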
Scan Scheduled
This agent is queued for security scanning. It will be graded in the next scan batch.
What We Know
- URL https://glama.ai/mcp/servers/mbkol2f8ft
- Framework mcp
- Sources glama
- First Seen Mar 16, 2026
- Repository github.com/openconstruct/llama-mcp-server