ZIN MCP Client
The ZIN MCP Client acts as a friendly bridge that lets local AI models communicate with external software tools and services. In simple terms, it takes an AI running on a personal computer (via Ollama) and gives it the ability to "reach out" and use external programs to perform tasks on the user's behalf.
About this Protocol
How to Use
1. Installation
Prerequisites:
* Python 3.10+
* Ollama (for running local LLMs)
* uv (Recommended for dependency management)
Steps:
1. Download and Extract: Download the latest release from the GitHub releases page and unzip it.
```bash
unzip zin-mcp-client-<version>.zip
cd zin-mcp-client
```
2. Install uv:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
3. Install Dependencies:
Using uv (Recommended):
```bash
uv pip install -r requirements.txt
```
Using pip:
```bash
pip install -r requirements.txt
```
4. Ollama Setup:
Run a model with tool-calling capabilities (e.g., Llama 3.1):
```bash
ollama run llama3.1:8b   # pulls the model if needed and opens an interactive session
ollama serve             # starts the Ollama API server (skip if Ollama already runs as a service)
```
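Before launching the client, it can help to confirm that Ollama is actually reachable and the model has been pulled. The snippet below is an optional sketch, not part of the ZIN MCP Client; it assumes Ollama's default local API on port 11434.
```python
# Optional sanity check: is Ollama serving, and is the model available locally?
import json
import urllib.request

OLLAMA_TAGS = "http://localhost:11434/api/tags"  # Ollama's "list local models" endpoint
MODEL = "llama3.1:8b"

with urllib.request.urlopen(OLLAMA_TAGS, timeout=5) as resp:
    models = [m["name"] for m in json.load(resp).get("models", [])]

if MODEL in models:
    print(f"Ollama is up and {MODEL} is available.")
else:
    print(f"{MODEL} not found locally; run `ollama pull {MODEL}`. Installed: {models}")
```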
2. Configuration
The ZIN MCP Client uses a configuration file named mcp-config.json located in the root directory. It follows a structure similar to Claude Desktop's configuration.
JSON Configuration Example:
```json
{
  "mcpServers": {
    "jadx-mcp-server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/jadx-mcp-server/",
        "run",
        "jadx_mcp_server.py"
      ]
    },
    "apktool-mcp-server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/apktool-mcp-server/",
        "run",
        "apktool_mcp_server.py"
      ]
    }
  }
}
```
Note: Replace /path/to/uv and /path/to/...-mcp-server/ with the actual absolute paths on your system.
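To make the structure concrete, here is a rough sketch of how a client could turn one of these entries into a running stdio MCP server. It is illustrative only and assumes the example entry above; the actual ZIN MCP Client also performs the MCP initialize handshake and manages the server's lifecycle.
```python
# Illustrative only: how a single mcpServers entry maps onto a stdio subprocess.
import json
import subprocess

with open("mcp-config.json") as f:
    config = json.load(f)

entry = config["mcpServers"]["jadx-mcp-server"]

# Equivalent to running: <command> <args...>, e.g.
#   /path/to/uv --directory /path/to/jadx-mcp-server/ run jadx_mcp_server.py
proc = subprocess.Popen(
    [entry["command"], *entry["args"]],
    stdin=subprocess.PIPE,   # JSON-RPC requests are written here
    stdout=subprocess.PIPE,  # JSON-RPC responses are read back here
    text=True,
)
print(f"Started {entry['command']} (pid {proc.pid})")
```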
3. Available Tools
The ZIN MCP Client hosts no tools of its own; it is a bridge to the tools exposed by the MCP servers it connects to. It offers several ways to interface with those tools:
Execution Commands:
* Interactive CLI: uv run zin_mcp_client.py
* Web UI: uv run web_client.py
* Open Web UI Proxy: uv run mcp_proxy.py (Exposes a proxy at http://localhost:8000)
CLI Options:
* --server [name]: Specify which server from the config to use.
* --model [name]: Specify the Ollama model (e.g., llama3.1:8b).
* --config [path]: Provide a custom path to a configuration JSON file.
* --debug: Enable verbose logging, including the raw traffic exchanged with the LLM and the MCP servers.
4. Example Prompts
While specific text prompts aren't listed, the documentation suggests using the client for the following tasks via connected MCP servers:
- Vulnerability Research: "Perform code review to find vulnerabilities locally."
- Reverse Engineering: Using the Ghidra, JADX, or APKTool MCP servers to analyze binary files or Android applications.
- Tool Orchestration: Asking the agent to use specific tools from multiple connected servers simultaneously using the ReAct framework.
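Conceptually, that tool-orchestration loop alternates between letting the model reason and executing the tool calls it requests. The sketch below is not the ZIN MCP Client's implementation; it assumes Ollama's /api/chat tool-calling interface, and the list_classes tool and call_mcp_tool() dispatcher are hypothetical stand-ins for a tool advertised by a connected MCP server and the routing of that call to it.
```python
# A condensed sketch of a ReAct-style tool-calling loop over Ollama's /api/chat.
# NOT the ZIN MCP Client's code: list_classes and call_mcp_tool() are placeholders.
import json
import urllib.request

OLLAMA_CHAT = "http://localhost:11434/api/chat"
MODEL = "llama3.1:8b"

TOOLS = [{
    "type": "function",
    "function": {
        "name": "list_classes",
        "description": "List classes in the loaded APK (hypothetical JADX tool).",
        "parameters": {"type": "object", "properties": {}, "required": []},
    },
}]

def call_mcp_tool(name: str, arguments: dict) -> str:
    # Placeholder: the real client forwards this call to the matching MCP server.
    return json.dumps({"classes": ["com.example.MainActivity"]})

def chat(messages):
    payload = json.dumps({"model": MODEL, "messages": messages,
                          "tools": TOOLS, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_CHAT, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]

messages = [{"role": "user", "content": "List the classes in the APK."}]
while True:
    message = chat(messages)          # Reason: the model decides whether to act
    messages.append(message)
    tool_calls = message.get("tool_calls")
    if not tool_calls:                # No action requested -> final answer
        print(message["content"])
        break
    for call in tool_calls:           # Act: run each requested tool...
        fn = call["function"]
        result = call_mcp_tool(fn["name"], fn["arguments"])
        # ...then feed the observation back for the next reasoning step
        messages.append({"role": "tool", "content": result})
```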
Use Cases
Use Case 1: Private Mobile App Security Auditing
Problem: Security researchers often need to analyze APK files for vulnerabilities or malicious code, but uploading proprietary or sensitive binaries to cloud-based LLMs (like Claude or GPT-4) poses a significant data privacy risk.
Solution: ZIN MCP Client bridges local LLMs (via Ollama) with specialized security MCP servers like apktool-mcp-server and jadx-mcp-server. This creates a completely air-gapped or local environment where the AI can decompile, inspect, and explain code without the data ever leaving the machine.
Example: A researcher runs uv run zin_mcp_client.py --server jadx-mcp-server. They then prompt the local Llama 3.1 model: "Decompile app.apk and find any hardcoded API keys or insecure network configurations." The client invokes the local JADX tools and provides a summarized security report.
Use Case 2: Air-Gapped Malware Analysis using Ghidra
Problem: Malware analysts frequently work in isolated virtual machines without internet access to prevent malware from communicating with Command & Control (C2) servers. In these environments, cloud-based AI assistants are unavailable.
Solution: Because ZIN MCP Client is lightweight and runs entirely locally, it can be deployed within an air-gapped VM. It can connect to the GhidraMCP server to allow a local LLM to interact with Ghidra's decompiler and symbol tables.
Example: An analyst opens a suspicious binary in Ghidra and starts the ZIN MCP Client. They ask the LLM: "Analyze the function at address 0x401000 and explain its logic." The LLM uses the Ghidra toolset to retrieve the assembly, convert it to pseudo-code, and explain the malware's obfuscation technique to the analyst.
Use Case 3: Extending Open Web UI with Local Toolsets
Problem: Users of "Open Web UI" (a popular local ChatGPT-like interface) often find it difficult to connect their local LLMs to external tools like file systems, web scrapers, or custom scripts because the setup is complex.
Solution: ZIN MCP Client includes an mcp_proxy.py feature. This acts as a bridge that turns standard STDIO-based MCP servers into an API that Open Web UI can consume. This allows users to use a polished web interface while still leveraging powerful, local tool-calling capabilities.
Example: A developer configures ZIN's MCP Proxy to run a filesystem MCP server. In their Open Web UI settings, they add the ZIN proxy as a connection. Now, they can chat with their local model and say, "Search my 'Projects' folder for all Python files that don't have docstrings," and the model executes the search via the local proxy.
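Conceptually, the proxy turns newline-delimited JSON-RPC on a child process's stdio into something reachable over HTTP. The sketch below shows only that core idea; it is not the actual mcp_proxy.py, it omits the MCP initialize handshake, concurrency, and the endpoint format Open Web UI expects, and the filesystem server command shown is hypothetical (take the real command and args from mcp-config.json).
```python
# Conceptual sketch of stdio-to-HTTP bridging only -- NOT the actual mcp_proxy.py.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical filesystem MCP server; in practice, read this from mcp-config.json.
mcp_server = subprocess.Popen(
    ["/path/to/uv", "--directory", "/path/to/filesystem-mcp-server/", "run", "server.py"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

class RpcBridge(BaseHTTPRequestHandler):
    def do_POST(self):
        # One HTTP request in -> one newline-delimited JSON-RPC message to the server
        body = self.rfile.read(int(self.headers["Content-Length"])).decode()
        mcp_server.stdin.write(body.strip() + "\n")
        mcp_server.stdin.flush()
        reply = mcp_server.stdout.readline()  # one JSON-RPC response line back out
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

HTTPServer(("localhost", 8000), RpcBridge).serve_forever()
```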
Use Case 4: Rapid Testing and Debugging for MCP Server Developers
Problem: Developers building their own MCP servers need a fast, low-overhead way to test tool-calling logic and ReAct agent loops without the "weight" of a full IDE or complex desktop application like Claude Desktop.
Solution: ZIN MCP Client provides a specialized CLI with a --debug flag that prints every raw interaction between the LLM and the MCP server. Its lightweight nature makes it the ideal "test bench" for ensuring a new MCP server is responding correctly to JSON-RPC calls.
Example: A developer creates a new "Weather MCP." Instead of a complex setup, they run uv run zin_mcp_client.py --config my-test-config.json --debug. They can instantly see if the local LLM is correctly formatting the get_weather tool arguments and how the server handles the response, all within the terminal.
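For reference when reading --debug output, this is roughly the shape of a well-formed tool invocation under the Model Context Protocol (JSON-RPC 2.0). The get_weather tool and its city argument are hypothetical, matching the example above, and the client's exact log formatting may differ.
```python
# Build and print an MCP "tools/call" request for a hypothetical get_weather tool.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Berlin"}},
}
print(json.dumps(request, indent=2))
```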