THU Agent by CyberCraze: no rate limit for THU students!

Interactive terminal coding agent powered by the THU lab proxy OpenAI-compatible API.

The agent runs in your current terminal and works in your current directory. It can inspect files and propose shell commands, and it waits for your approval before running each one.

Platform Use

Linux

Use the built executable at dist/thu-agent:

./dist/thu-agent

To run it globally, copy or symlink it into a directory on your PATH, for example:

sudo install -m 755 dist/thu-agent /usr/local/bin/thu-agent

Then run:

thu-agent
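If the command is not found afterwards, the PATH lookup itself can be demonstrated with a throwaway directory (the stand-in binary and temp path below exist only for the demo; nothing real is installed):

```shell
# Demo of PATH resolution with a throwaway directory.
BIN_DIR="$(mktemp -d)"
install -m 755 /bin/true "$BIN_DIR/thu-agent"   # stand-in for the real binary
PATH="$BIN_DIR:$PATH"
command -v thu-agent   # prints the resolved path inside $BIN_DIR
```

The same lookup applies to the real binary: once the directory holding thu-agent appears in PATH, `command -v thu-agent` reports where the shell found it.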

Windows

Use the Windows executable at dist\thu-agent.exe after building it on Windows:

.\dist\thu-agent.exe

To run it globally on Windows, add the repo dist directory to your PATH, or copy the executable into a directory already on PATH.

Example PowerShell command to add the current repo dist directory for your user:

[Environment]::SetEnvironmentVariable(
  "Path",
  $env:Path + ";C:\Users\USER\Downloads\THU-deepseek-glm-api-mcp-server\dist",
  "User"
)

Then open a new terminal and run:

thu-agent.exe

Build it from Windows with:

powershell -ExecutionPolicy Bypass -File .\build_agent_windows.ps1

macOS

There is no packaged macOS binary in this repo.

Run the Python entrypoint directly:

python3 agent.py

If you want a global command on macOS, create a small wrapper in /usr/local/bin or another directory on your PATH. A symlink works only if agent.py is executable and starts with a python3 shebang:

sudo ln -sf "/absolute/path/to/agent.py" /usr/local/bin/thu-agent.py

Otherwise, define a shell alias that runs python3 agent.py from the repo.
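As a sketch of such a wrapper (AGENT_DIR is a placeholder for your clone's location; the demo writes into a temp directory instead of /usr/local/bin so no sudo is needed):

```shell
# Generate a minimal wrapper script; in real use, write it to /usr/local/bin.
AGENT_DIR="$HOME/path/to/repo"        # placeholder: your clone's location
WRAPPER="$(mktemp -d)/thu-agent"
cat > "$WRAPPER" <<EOF
#!/bin/sh
exec python3 "$AGENT_DIR/agent.py" "\$@"
EOF
chmod +x "$WRAPPER"
```

The `"\$@"` forwards any arguments (such as --model) through to agent.py unchanged.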

API Setup

The agent uses the THU lab proxy.

Create an API key first at:

https://lab.cs.tsinghua.edu.cn/ai-platform/c/new

Base URL:

https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1

Set your key with an environment variable:

export THU_LAB_PROXY_API_KEY='your_proxy_key_here'
export THU_LAB_PROXY_BASE_URL='https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1'
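If you start the agent from your own launch script, a one-line guard can fail fast when the key is missing (the demo value below is only there so the snippet runs standalone):

```shell
# Demo value so the snippet runs standalone; in real use, rely on your exported key.
THU_LAB_PROXY_API_KEY="demo"
# The :? expansion aborts with the given message if the variable is unset or empty.
: "${THU_LAB_PROXY_API_KEY:?export THU_LAB_PROXY_API_KEY before starting the agent}"
echo "key present"   # reached only when the key is set
```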

On Windows PowerShell:

$env:THU_LAB_PROXY_API_KEY='your_proxy_key_here'
$env:THU_LAB_PROXY_BASE_URL='https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1'

You can also launch the agent and paste the key when prompted. The agent saves it into a per-user global config file for reuse.

Config location:

  • Linux and macOS: ~/.thu-cybercraze-agent/.env
  • Windows: %USERPROFILE%\.thu-cybercraze-agent\.env
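The lookup order can be sketched as follows. The KEY=value layout of the .env file is an assumption, and the demo uses a throwaway config directory so your real file is untouched:

```shell
# Sketch: an exported environment variable wins; otherwise fall back to the saved config.
CONF_DIR="$(mktemp -d)"                      # stand-in for ~/.thu-cybercraze-agent
printf 'THU_LAB_PROXY_API_KEY=demo-key-123\n' > "$CONF_DIR/.env"

resolve_key() {
  if [ -n "$THU_LAB_PROXY_API_KEY" ]; then
    echo "$THU_LAB_PROXY_API_KEY"
  else
    sed -n 's/^THU_LAB_PROXY_API_KEY=//p' "$CONF_DIR/.env" | head -n1
  fi
}

unset THU_LAB_PROXY_API_KEY
resolve_key   # prints demo-key-123
```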

Start the Agent

From the repo root:

./dist/thu-agent

Or with Python:

python3 agent.py

You can also pass the model and key directly:

python3 agent.py --model deepseek-v3.2 --api-key "$THU_LAB_PROXY_API_KEY"

Model Selection

The startup picker shows the models currently wired into the agent.

Default model:

deepseek-v3.2

Currently supported models:

  • qwen3-max-thinking
  • qwen3-max
  • glm-5
  • glm-5-thinking
  • glm-4.7-thinking
  • kimi-k2.5
  • kimi-k2.5-thinking
  • minimax-m2.5
  • minimax-m2.5-thinking
  • qwen3.5-plus
  • qwen3.5-plus-thinking
  • qwen3.5-mini
  • deepseek-v3.2-thinking
  • deepseek-v3.2

In-Agent Commands

Slash commands available in the session:

  • /help
  • /model
  • /key
  • /pwd
  • /alwaysRun
  • /exit

While the agent is thinking or running a command, press Ctrl+C to cancel the current operation and return to the prompt without exiting the whole session.

Typical Workflow

  1. Start the agent.
  2. Choose a model or press Enter for the default.
  3. Reuse the saved API key or paste a new one.
  4. Type requests at the > prompt.
  5. Approve commands when the agent asks.

Example prompts:

  • list the files in this directory
  • write a hello world script in python
  • inspect this project and explain how to run it
  • create a small bash script that prints the current date
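For the last prompt, the kind of script the agent might produce looks like this (a plain sketch, not the agent's actual output):

```shell
#!/bin/sh
# Print the current date in ISO format.
date +%Y-%m-%d
```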

Command Approval

By default, the agent asks before running each command.

To auto-approve commands for the current session:

/alwaysRun

Use it carefully: every command the agent proposes will then run without asking.

Build

Linux build

bash build_agent.sh

Result:

dist/thu-agent

This build uses the current Python environment and PyInstaller, with extra excludes plus strip/optimize enabled to keep the binary smaller.

Windows build

Run this on Windows, not inside WSL:

py -3 -m pip install pyinstaller
powershell -ExecutionPolicy Bypass -File .\build_agent_windows.ps1

Result:

dist\thu-agent.exe

macOS run path

macOS users should run the Python entrypoint directly:

python3 agent.py

Direct API Test

You can test the proxy directly:

curl --location --request POST \
  'https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1/chat/completions' \
  --header 'Content-Type: application/json' \
  --header "Authorization: Bearer $THU_LAB_PROXY_API_KEY" \
  --data-raw '{
    "model": "deepseek-v3.2",
    "messages": [{"role": "user", "content": "Reply with exactly: ok"}],
    "temperature": 0.2,
    "repetition_penalty": 1.1,
    "stream": false
  }'
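To pull just the reply text out of the JSON response, pipe it through a small python3 one-liner (the canned RESPONSE below stands in for the proxy's real output):

```shell
# Extract choices[0].message.content from a chat-completions style response.
RESPONSE='{"choices":[{"message":{"role":"assistant","content":"ok"}}]}'
echo "$RESPONSE" | python3 -c 'import json,sys; print(json.load(sys.stdin)["choices"][0]["message"]["content"])'
# prints: ok
```

The same one-liner can be appended to the curl command above with a pipe.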

Notes

  • The Linux binary is already buildable from this repo.
  • The Windows .exe must be built from a Windows Python environment.
  • macOS users should run agent.py directly unless they package it themselves.
  • The MCP server code in server.py still uses the older backend and is separate from the interactive agent in agent.py.
