rodhayl/MCP Local LLM Server
A privacy-first MCP server that provides local LLM-enhanced tools for code analysis, security scanning, and automated task execution using backends like Ollama and LM Studio. It enables symbol-aware code reviews and workspace exploration while ensuring that all code and analysis remain strictly on your local machine.
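Since this server follows the Model Context Protocol, a client would register it in its MCP configuration. The sketch below uses the standard `mcpServers` config format; the command, arguments, and environment variable names are assumptions for illustration, as the listing does not include the repository's actual launch instructions.

```json
{
  "mcpServers": {
    "local-llm-helper": {
      "command": "node",
      "args": ["/path/to/mcpLocalHelper/dist/index.js"],
      "env": {
        "LLM_BACKEND_URL": "http://localhost:11434"
      }
    }
  }
}
```

Pointing the backend URL at a local endpoint (here, Ollama's default port, as an assumed example) is what keeps all code and analysis on the local machine.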
What We Know
- URL: https://glama.ai/mcp/servers/ew1e0bfku0
- Framework: mcp
- Sources: glama
- First seen: Mar 16, 2026
- Repository: github.com/rodhayl/mcpLocalHelper