Caught it. The CodeBuddy SDK/CLI does go through the proxy, and `mitmdump` can decrypt the HTTPS traffic into plaintext.

I added a capture script that redacts sensitive data:
[scripts/mitm-redact.py](/home/wolves/project/codebuddy2api/scripts/mitm-redact.py)

It only logs request/response summaries; Authorization headers, API keys, tokens, and cookies are all redacted.
Key results from this capture:
```text
POST https://copilot.tencent.com/v2/chat/completions
POST https://copilot.tencent.com/v2/report
```

The main request follows the OpenAI Chat Completions format:

```json
{
  "model": "glm-5.1",
  "messages": [
    { "role": "system", "content": "..." },
    { "role": "user", "content": [{ "type": "text" }, { "type": "text" }] }
  ],
  "tools": ["Agent", "Read", "Write", "Edit", "MultiEdit", "Bash", "Glob", "Grep", "EnterPlanMode", "ExitPlanMode", "TaskCreate", "TaskGet", "TaskUpdate", "TaskList", "WebFetch", "WebSearch", "TaskStop", "TaskOutput", "Skill", "AskUserQuestion"],
  "temperature": 1,
  "stream": true,
  "stream_options": { "include_usage": true },
  "reasoning_effort": "medium"
}
```
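For the direct-call rewrite it helps to pin this payload down as a type. A sketch inferred from the capture — the `content` union, and `reasoning_effort` values other than `"medium"`, are assumptions, and only `{ "type": "text" }` content items were observed:

```typescript
// Hypothetical shape of the upstream request body, inferred from the capture.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  // String content or an array of typed parts; the part shape is assumed.
  content: string | { type: string; text?: string }[];
}

interface UpstreamChatRequest {
  model: string;
  messages: ChatMessage[];
  tools: string[]; // plain tool names, not OpenAI function-calling objects
  temperature: number;
  stream: boolean;
  stream_options: { include_usage: boolean };
  reasoning_effort: "low" | "medium" | "high"; // only "medium" was observed
}

// Sample payload mirroring the capture.
const sample: UpstreamChatRequest = {
  model: "glm-5.1",
  messages: [
    { role: "system", content: "..." },
    { role: "user", content: [{ type: "text" }] },
  ],
  tools: ["Read", "Write", "Bash"],
  temperature: 1,
  stream: true,
  stream_options: { include_usage: true },
  reasoning_effort: "medium",
};
```

Notable: `tools` is a flat list of tool names, not the OpenAI tool-object schema, so the relay cannot pass OpenAI-style `tools` through unchanged.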

Key request headers:

```text
X-API-Key: <redacted>
Authorization: <redacted>
X-Conversation-ID
X-Conversation-Request-ID
X-Conversation-Message-ID
X-Agent-Intent: craft
X-IDE-Type: CLI
X-IDE-Name: CLI
X-IDE-Version: 2.93.3
x-codebuddy-request: 1
User-Agent: CLI/2.93.3 CodeBuddy/2.93.3 ...
Content-Encoding: gzip
```

The response is likewise a stream of OpenAI SSE chunks:

```text
content-type: text/event-stream
object: chat.completion.chunk
model: glm-5.1-tencent
data: {...}
data: [DONE]
```
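The chunks above can be consumed with a small `data:`-line parser. A sketch only: it assumes each `data:` line arrives as complete JSON, whereas a real relay would need to buffer partial lines across network reads:

```typescript
// Parse SSE text into JSON chunk objects, with "DONE" for the terminator.
function parseSseData(raw: string): (Record<string, unknown> | "DONE")[] {
  const out: (Record<string, unknown> | "DONE")[] = [];
  for (const line of raw.split("\n")) {
    // Only `data:` fields matter here; comments and blank lines are skipped.
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") out.push("DONE");
    else out.push(JSON.parse(payload));
  }
  return out;
}
```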

Conclusion: the low-memory direct-connection approach is viable. At minimum, the main path is not a proprietary binary protocol; it is an OpenAI-compatible SSE endpoint at `copilot.tencent.com/v2/chat/completions`. The next step is to switch the current TS relay from the SDK to calling this upstream directly: replicate the required headers, gzip the request body, forward the SSE stream, and verify whether the `apikey` alone is enough for stable authentication.