Compare commits

...

15 Commits

Author SHA1 Message Date
槑囿脑袋
7fc7ac0dbd
Merge f95b040b07 into fd6986076a 2025-12-18 13:37:13 +08:00
dependabot[bot]
fd6986076a
chore(deps): bump jws from 4.0.0 to 4.0.1 (#11977)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 4.0.0 to 4.0.1.
- [Release notes](https://github.com/brianloveswords/node-jws/releases)
- [Changelog](https://github.com/auth0/node-jws/blob/master/CHANGELOG.md)
- [Commits](https://github.com/brianloveswords/node-jws/compare/v4.0.0...v4.0.1)

---
updated-dependencies:
- dependency-name: jws
  dependency-version: 4.0.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-18 13:34:39 +08:00
LiuVaayne
6309cc179d
feat(mcp): add Nowledge Mem builtin MCP server (#11875)
*  feat(mcp): add Nowledge Mem builtin MCP server

Add @cherry/nowLedgeMem as a new builtin MCP server that connects
to local Nowledge Mem service via HTTP at 127.0.0.1:14242/mcp.

- Add nowLedgeMem to BuiltinMCPServerNames type definitions
- Add HTTP transport handling in MCPService with APP header
- Add server config to builtinMCPServers array
- Add i18n translations (en-us, zh-cn, zh-tw)

* Fix Nowledge Mem server name typos across codebase

* 🌐 i18n: add missing translations for Nowledge Mem and Git Bash settings

Translate [to be translated] markers across 8 locale files:
- zh-tw, de-de, fr-fr, es-es, pt-pt, ru-ru: nowledgeMem description
- fr-fr, es-es, pt-pt, ru-ru, el-gr, ja-jp: xhigh reasoning chain option
- el-gr, ja-jp: Git Bash configuration strings

* 🐛 fix: address PR review comments for Nowledge Mem MCP

- Fix log message typo: use server.name instead of hardcoded "NowLedgeMem"
- Rename i18n key from "nowledgeMem" to "nowledge_mem" for consistency
- Update descriptions to warn about external dependency requirement
2025-12-18 13:34:06 +08:00
SuYao
c04529a23c
refactor: improve budget calculation logic (#11973)
* refactor: improve budget calculation logic

* Update src/renderer/src/aiCore/utils/__tests__/reasoning.test.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/renderer/src/aiCore/utils/__tests__/reasoning.test.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* [WIP] Address feedback on budget calculation logic refactor (#11974)

* Initial plan

* fix: revert budget calculation to linear interpolation formula

Reverted the budget calculation in getAnthropicThinkingBudget from
`tokenLimit.max * effortRatio` back to the original linear interpolation
formula `(tokenLimit.max - tokenLimit.min) * effortRatio + tokenLimit.min`.

The new formula was causing lower budgets for all effort ratios (e.g.,
LOW effort changed from 2609 to 1638 tokens, a 37% reduction). The linear
interpolation formula ensures budgets range from min (at effortRatio=0) to
max (at effortRatio=1), matching the behavior in other parts of the codebase
(lines 221, 597).

Updated tests to reflect the correct expected values with the linear
interpolation formula.

Co-authored-by: DeJeune <67425183+DeJeune@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: DeJeune <67425183+DeJeune@users.noreply.github.com>

* fix(test): reasoning

* fix: test

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
2025-12-18 13:30:41 +08:00
George·Dong
0f1b3afa72
feat: add Volcengine Doubao-Seed-1.8 model support (#11972)
- Add model definition: doubao-seed-1-8-251215
- Support thinking mode: reasoning_effort (minimal/low/medium/high)
- Support function calling
- Support image understanding (Vision)
- Update the regex to match seed-1.8 variants
- Add full test coverage

Modified files:
- src/renderer/src/config/models/default.ts
- src/renderer/src/config/models/reasoning.ts
- src/renderer/src/aiCore/utils/reasoning.ts
- src/renderer/src/config/models/vision.ts
- src/renderer/src/config/models/tooluse.ts
- src/renderer/src/config/models/__tests__/reasoning.test.ts
2025-12-18 13:30:23 +08:00
Phantom
0cf0072b51
feat: add default reasoning effort option to resolve confusion between undefined and none (#11942)
* feat(reasoning): add default reasoning effort option and update i18n

Add 'default' reasoning effort option to all reasoning models to represent no additional configuration. Update translations for new option and modify reasoning logic to handle default case. Also update store version and migration for new reasoning_effort field.

Update test cases and reasoning configuration to include default option. Add new lightbulb question icon for default reasoning state.

* fix(ThinkingButton): correct isThinkingEnabled condition to exclude 'default'

The condition now properly disables thinking when effort is 'default' to match intended behavior. Click thinking button will not switch reasoning effort to 'none'.

* refactor(types): improve reasoning_effort_cache documentation

Update comments to clarify the purpose and future direction of reasoning_effort_cache
Remove TODO and replace with FIXME suggesting external cache service

* feat(i18n): add reasoning effort descriptions and update thinking button logic

add descriptions for reasoning effort options in multiple languages
move reasoning effort label maps to component for better maintainability

* fix(aiCore): handle default reasoning_effort value consistently across providers

Ensure consistent behavior when reasoning_effort is 'default' or undefined by returning empty object

* test(reasoning): fix failing tests after 'default' option introduction

Fixed two test cases that were failing after the introduction of the 'default'
reasoning effort option:

1. getAnthropicReasoningParams test: Updated to explicitly set reasoning_effort
   to 'none' instead of empty settings, as undefined/empty now represents
   'default' behavior (no configuration override)

2. getGeminiReasoningParams test: Similarly updated to set reasoning_effort
   to 'none' for the disabled thinking test case

This aligns with the new semantic where:
- undefined/'default' = use model's default behavior (returns {})
- 'none' = explicitly disable reasoning (returns disabled config)
2025-12-18 13:00:23 +08:00
eeee0717
f95b040b07 Merge branch 'main' into feat/bonjour 2025-12-18 12:33:20 +08:00
eeee0717
483dcb1dfc fix: pr review 2025-12-18 11:32:13 +08:00
eeee0717
e711824701 chore: remove qrcode dependency 2025-12-18 10:48:32 +08:00
eeee0717
fc92f356ed fix: pr review 2025-12-18 10:06:34 +08:00
beyondkmp
150bb3e3a0
fix: auto-discover and persist Git Bash path on Windows for scoop (#11921)
* feat: auto-discover and persist Git Bash path on Windows

- Add autoDiscoverGitBash function to find and cache Git Bash path when needed
- Modify System_CheckGitBash IPC handler to auto-discover and persist path
- Update Claude Code service with fallback auto-discovery mechanism
- Git Bash path is now cached after first discovery, improving UX for Windows users

* update

* fix: remove redundant validation of auto-discovered Git Bash path

The autoDiscoverGitBash function already returns a validated path, so calling validateGitBashPath again is unnecessary.

Co-Authored-By: Claude <noreply@anthropic.com>

* update

* test: add unit tests for autoDiscoverGitBash function

Add comprehensive test coverage for autoDiscoverGitBash including:
- Discovery with no existing config path
- Validation of existing config paths
- Handling of invalid existing paths
- Config persistence verification
- Real-world scenarios (standard Git, portable Git, user-configured paths)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: remove unnecessary async keyword from System_CheckGitBash handler

The handler doesn't use await since autoDiscoverGitBash is synchronous.
Removes async for consistency with other IPC handlers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: rename misleading test to match actual behavior

Renamed "should not call configManager.set multiple times on single discovery"
to "should persist on each discovery when config remains undefined" to
accurately describe that each call to autoDiscoverGitBash persists when
the config mock returns undefined.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: use generic type parameter instead of type assertion

Replace `as string | undefined` with `get<string | undefined>()` for
better type safety when retrieving GitBashPath from config.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: simplify Git Bash path resolution in Claude Code service

Remove redundant validateGitBashPath call since autoDiscoverGitBash
already handles validation of configured paths before attempting
discovery. Also remove unused ConfigKeys and configManager imports.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: attempt auto-discovery when configured Git Bash path is invalid

Previously, if a user had an invalid configured path (e.g., Git was
moved or uninstalled), autoDiscoverGitBash would return null without
attempting to find a valid installation. Now it logs a warning and
attempts auto-discovery, providing a better user experience by
automatically fixing invalid configurations.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: ensure CLAUDE_CODE_GIT_BASH_PATH env var takes precedence over config

Previously, if a valid config path existed, the environment variable
CLAUDE_CODE_GIT_BASH_PATH was never checked. Now the precedence order is:

1. CLAUDE_CODE_GIT_BASH_PATH env var (highest - runtime override)
2. Configured path from settings
3. Auto-discovery via findGitBash

This allows users to temporarily override the configured path without
modifying their persistent settings.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: improve code quality and test robustness

- Remove duplicate logging in Claude Code service (autoDiscoverGitBash logs internally)
- Simplify Git Bash path initialization with ternary expression
- Add afterEach cleanup to restore original env vars in tests
- Extract mockExistingPaths helper to reduce test code duplication

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat: track Git Bash path source to distinguish manual vs auto-discovered

- Add GitBashPathSource type and GitBashPathInfo interface to shared constants
- Add GitBashPathSource config key to persist path origin ('manual' | 'auto')
- Update autoDiscoverGitBash to mark discovered paths as 'auto'
- Update setGitBashPath IPC to mark user-set paths as 'manual'
- Add getGitBashPathInfo API to retrieve path with source info
- Update AgentModal UI to show different text based on source:
  - Manual: "Using custom path" with clear button
  - Auto: "Auto-discovered" without clear button

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor: simplify Git Bash config UI as form field

- Replace large Alert components with compact form field
- Use static isWin constant instead of async platform detection
- Show Git Bash field only on Windows with auto-fill support
- Disable save button when Git Bash path is missing on Windows
- Add "Auto-discovered" hint for auto-detected paths
- Remove hasGitBash state, simplify checkGitBash logic

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* ui: add explicit select button for Git Bash path

Replace click-on-input interaction with a dedicated "Select" button
for clearer UX

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor: simplify Git Bash UI by removing clear button

- Remove handleClearGitBash function (no longer needed)
- Remove clear button from UI (auto-discover fills value, user can re-select)
- Remove auto-discovered hint (SourceHint)
- Remove unused SourceHint styled component

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat: add reset button to restore auto-discovered Git Bash path

- Add handleResetGitBash to clear manual setting and re-run auto-discovery
- Show "Reset" button only when source is 'manual'
- Show "Auto-discovered" hint when path was found automatically
- User can re-select if auto-discovered path is not suitable

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: re-run auto-discovery when resetting Git Bash path

When setGitBashPath(null) is called (reset), now automatically
re-runs autoDiscoverGitBash() to restore the auto-discovered path.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat(i18n): add Git Bash config translations

Add translations for:
- autoDiscoveredHint: hint text for auto-discovered paths
- placeholder: input placeholder for bash.exe selection
- tooltip: help tooltip text
- error.required: validation error message

Supported languages: en-US, zh-CN, zh-TW

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* update i18n

* fix: auto-discover Git Bash when getting path info

When getGitBashPathInfo() is called and no path is configured,
automatically trigger autoDiscoverGitBash() first. This handles
the upgrade scenario from old versions that don't have Git Bash
path configured.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-18 09:57:23 +08:00
eeee0717
0dc9658846 fix: pr review 2025-12-18 09:48:52 +08:00
eeee0717
8bab2c8ebc chore: update docs and tests 2025-12-17 21:50:45 +08:00
eeee0717
c2c416ea93 Merge branch 'main' into feat/bonjour 2025-12-17 21:03:13 +08:00
eeee0717
37db8f1c9c refactor: change qrcode landrop to lantransfer 2025-12-17 20:22:15 +08:00
68 changed files with 6489 additions and 1458 deletions

View File

@ -0,0 +1,850 @@
# Cherry Studio LAN Transfer Protocol Specification
> Version: 1.0
> Last updated: 2025-12
This document defines the LAN file transfer protocol between the Cherry Studio desktop client (Electron) and the mobile client (Expo).
---
## Table of Contents
1. [Protocol Overview](#1-protocol-overview)
2. [Service Discovery (Bonjour/mDNS)](#2-service-discovery-bonjourmdns)
3. [TCP Connection and Handshake](#3-tcp-connection-and-handshake)
4. [Message Format Specification](#4-message-format-specification)
5. [File Transfer Protocol](#5-file-transfer-protocol)
6. [Heartbeat and Keep-Alive](#6-heartbeat-and-keep-alive)
7. [Error Handling](#7-error-handling)
8. [Constants and Configuration](#8-constants-and-configuration)
9. [Complete Sequence Diagram](#9-complete-sequence-diagram)
10. [Mobile Implementation Guide](#10-mobile-implementation-guide)
---
## 1. Protocol Overview
### 1.1 Architecture Roles
| Role | Platform | Responsibilities |
| ---------- | ---------------- | --------------------------------------------------- |
| **Client** | Electron desktop | Scan for services, initiate connections, send files |
| **Server** | Expo mobile | Publish the service, accept connections, receive files |
### 1.2 Protocol Stack (v1)
```
┌─────────────────────────────────────┐
│ Application layer (file transfer)   │
├─────────────────────────────────────┤
│ Message layer (control: JSON + \n;  │
│                data: binary frames) │
├─────────────────────────────────────┤
│ Transport layer (TCP)               │
├─────────────────────────────────────┤
│ Discovery layer (Bonjour/mDNS)      │
└─────────────────────────────────────┘
```
### 1.3 Communication Flow Overview
```
1. Service discovery → the mobile device publishes an mDNS service; the desktop scans and discovers it
2. TCP handshake    → establish the connection and exchange device info (`version=1`)
3. File transfer    → control messages use JSON; `file_chunk` data is sent as binary frames
4. Keep-alive       → ping/pong heartbeat
```
---
## 2. Service Discovery (Bonjour/mDNS)
### 2.1 Service Type
| Property | Value |
| ----------------------- | -------------------- |
| Service type | `cherrystudio` |
| Protocol | `tcp` |
| Full service identifier | `_cherrystudio._tcp` |
### 2.2 Service Publication (Mobile)
The mobile device publishes the service via mDNS/Bonjour:
```typescript
// Service publication parameters
{
  name: "Cherry Studio Mobile", // Device name
  type: "cherrystudio", // Service type
  protocol: "tcp", // Protocol
  port: 53317, // TCP listening port
  txt: { // TXT record (optional)
    version: "1",
    platform: "ios" // or "android"
  }
}
```
### 2.3 Service Discovery (Desktop)
The desktop client scans for services and resolves their information:
```typescript
// Structure of a discovered service
type LocalTransferPeer = {
  id: string; // Unique identifier
  name: string; // Device name
  host?: string; // Hostname
  fqdn?: string; // Fully qualified domain name
  port?: number; // TCP port
  type?: string; // Service type
  protocol?: "tcp" | "udp"; // Protocol
  addresses: string[]; // IP address list
  txt?: Record<string, string>; // TXT record
  updatedAt: number; // Discovery timestamp
};
```
### 2.4 IP Address Selection Strategy
When a service advertises multiple IP addresses, prefer IPv4:
```typescript
// Prefer an IPv4 address
const preferredAddress = addresses.find((addr) => isIPv4(addr)) || addresses[0];
```
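On the Electron side, Node's built-in `net.isIPv4` can serve as the `isIPv4` helper used above. A minimal sketch (the function name `pickPeerAddress` is illustrative, not part of the protocol or service API):
```typescript
import { isIPv4 } from 'node:net'

// Pick a connect address from a discovered peer, preferring IPv4.
// Returns undefined when the peer exposes no addresses at all.
function pickPeerAddress(addresses: string[]): string | undefined {
  return addresses.find((addr) => isIPv4(addr)) ?? addresses[0]
}
```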
---
## 3. TCP Connection and Handshake
### 3.1 Connection Establishment
1. The client opens a TCP connection to the discovered `host:port`
2. Immediately after connecting, the client sends a handshake message
3. The client waits for the server's handshake acknowledgment
### 3.2 Handshake Messages (Protocol Version v1)
#### Client → Server: `handshake`
```typescript
type LanTransferHandshakeMessage = {
  type: "handshake";
  deviceName: string; // Device name
  version: string; // Protocol version, currently "1"
  platform?: string; // Platform: 'darwin' | 'win32' | 'linux'
  appVersion?: string; // App version
};
```
**Example:**
```json
{
  "type": "handshake",
  "deviceName": "Cherry Studio 1.7.2",
  "version": "1",
  "platform": "darwin",
  "appVersion": "1.7.2"
}
```
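The server replies with a `handshake_ack` (see section 4.4 and Appendix A). Below is a simplified sketch of the client side of this exchange, assuming the acknowledgment arrives as a single newline-terminated JSON packet; `connectAndHandshake` is an illustrative helper, not the actual service API:
```typescript
import { createConnection, type Socket } from 'node:net'

const HANDSHAKE_TIMEOUT_MS = 10_000

// Illustrative: open a TCP connection to a discovered peer and perform the v1 handshake.
function connectAndHandshake(host: string, port: number, deviceName: string): Promise<Socket> {
  return new Promise((resolve, reject) => {
    const socket = createConnection({ host, port })
    const timer = setTimeout(() => {
      socket.destroy()
      reject(new Error('Handshake timed out'))
    }, HANDSHAKE_TIMEOUT_MS)

    socket.once('connect', () => {
      // Send the handshake immediately after the TCP connection is established
      socket.write(`${JSON.stringify({ type: 'handshake', deviceName, version: '1' })}\n`)
    })

    socket.once('data', (buf) => {
      // Sketch assumption: the first packet is exactly one newline-terminated JSON message
      const msg = JSON.parse(buf.toString('utf8').trim())
      clearTimeout(timer)
      if (msg.type === 'handshake_ack' && msg.accepted) {
        resolve(socket)
      } else {
        socket.destroy()
        reject(new Error(msg.message || 'Handshake rejected'))
      }
    })

    socket.once('error', (err) => {
      clearTimeout(timer)
      reject(err)
    })
  })
}
```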
---
## 4. Message Format Specification (Hybrid Protocol)
v1 uses a hybrid "control JSON + binary data frames" protocol (streaming mode, no per-chunk ACK):
- **Control messages** (handshake, heartbeat, file_start/ack, file_end, file_complete): UTF-8 JSON, delimited by `\n`
- **Data messages** (`file_chunk`): binary frames delimited by Magic + total length, with no Base64 encoding
### 4.1 Control Message Encoding (JSON + `\n`)
| Property | Specification |
| ------------------ | ------------- |
| Character encoding | UTF-8 |
| Serialization | JSON |
| Message delimiter | `\n` (0x0A) |
```typescript
function sendControlMessage(socket: Socket, message: object): void {
  socket.write(`${JSON.stringify(message)}\n`);
}
```
### 4.2 `file_chunk` Binary Frame Format
To handle TCP fragmentation/coalescing and to avoid Base64 overhead, `file_chunk` is sent as a length-prefixed binary frame:
```
┌──────────┬──────────┬────────┬───────────────┬──────────────┬────────────┬───────────┐
│ Magic    │ TotalLen │ Type   │ TransferId Len│ TransferId   │ ChunkIdx   │ Data      │
│ 0x43 0x53│ (4B BE)  │ 0x01   │ (2B BE)       │ (UTF-8)      │ (4B BE)    │ (raw)     │
└──────────┴──────────┴────────┴───────────────┴──────────────┴────────────┴───────────┘
```
| Field | Size | Description |
| -------------- | ---- | --------------------------------------------------------------------- |
| Magic | 2B | Constant `0x43 0x53` ("CS"); distinguishes binary frames from JSON messages |
| TotalLen | 4B | Big-endian; total frame length, excluding Magic/TotalLen |
| Type | 1B | `0x01` means `file_chunk` |
| TransferId Len | 2B | Big-endian; byte length of the transferId string |
| TransferId | nB | UTF-8 transferId; length given by the previous field |
| ChunkIdx | 4B | Big-endian; chunk index, starting at 0 |
| Data | mB | Raw file bytes (unencoded) |
> Frame length calculation: `TotalLen = 1 + 2 + transferIdLen + 4 + dataLen` (i.e. the combined length of the Type through Data fields).
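For a 36-character UUID transferId and a full 512 KB chunk this gives `TotalLen = 1 + 2 + 36 + 4 + 524288 = 524331`. The following sketch encodes one frame according to the layout above; the function name is illustrative:
```typescript
// Encode one file_chunk binary frame per the v1 layout above.
// `data` holds the raw chunk bytes read from the file stream.
function encodeFileChunkFrame(transferId: string, chunkIndex: number, data: Buffer): Buffer {
  const transferIdBuf = Buffer.from(transferId, 'utf8')
  const totalLen = 1 + 2 + transferIdBuf.length + 4 + data.length // Type..Data

  const header = Buffer.alloc(2 + 4 + 1 + 2)
  header.writeUInt16BE(0x4353, 0)               // Magic "CS"
  header.writeUInt32BE(totalLen, 2)             // TotalLen (excludes Magic/TotalLen)
  header.writeUInt8(0x01, 6)                    // Type = file_chunk
  header.writeUInt16BE(transferIdBuf.length, 7) // TransferId length

  const chunkIdxBuf = Buffer.alloc(4)
  chunkIdxBuf.writeUInt32BE(chunkIndex, 0)

  return Buffer.concat([header, transferIdBuf, chunkIdxBuf, data])
}
```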
### 4.3 Message Parsing Strategy
1. Append incoming socket data to a buffer;
2. If the first two bytes are `0x43 0x53` → parse a binary frame:
   - At least 6 header bytes (Magic + TotalLen) are required; if fewer are buffered, wait for more data
   - Read `TotalLen` to determine the full frame length; if the buffer is shorter, keep waiting
   - Parse Type/TransferId/ChunkIdx/Data and hand them to the file-receiving logic
3. Otherwise, if the first byte is `{` → parse control messages as JSON + `\n`
4. For any other data, drop one byte and continue the loop to avoid stalling.
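A sketch of this buffering loop is shown below. It is illustrative only (the factory name and handler signatures are assumptions, not the actual implementation), but it follows the strategy above step by step:
```typescript
type FrameHandler = (transferId: string, chunkIndex: number, data: Buffer) => void
type ControlHandler = (message: Record<string, unknown>) => void

// Incrementally parse a mixed stream of binary frames and newline-delimited JSON.
// Call feed() on every socket 'data' event; leftover bytes stay buffered between calls.
function createStreamParser(onFrame: FrameHandler, onControl: ControlHandler) {
  let buffer = Buffer.alloc(0)

  return function feed(chunk: Buffer): void {
    buffer = Buffer.concat([buffer, chunk])

    while (buffer.length > 0) {
      if (buffer.length >= 2 && buffer.readUInt16BE(0) === 0x4353) {
        if (buffer.length < 6) return                    // need Magic + TotalLen first
        const totalLen = buffer.readUInt32BE(2)
        if (buffer.length < 6 + totalLen) return         // wait for the full frame
        const frame = buffer.subarray(6, 6 + totalLen)
        const idLen = frame.readUInt16BE(1)              // frame[0] is Type (0x01)
        const transferId = frame.subarray(3, 3 + idLen).toString('utf8')
        const chunkIndex = frame.readUInt32BE(3 + idLen)
        onFrame(transferId, chunkIndex, frame.subarray(3 + idLen + 4))
        buffer = buffer.subarray(6 + totalLen)
      } else if (buffer[0] === 0x7b /* '{' */) {
        const nl = buffer.indexOf(0x0a)
        if (nl === -1) return                            // wait for the trailing '\n'
        onControl(JSON.parse(buffer.subarray(0, nl).toString('utf8')))
        buffer = buffer.subarray(nl + 1)
      } else {
        buffer = buffer.subarray(1)                      // drop one stray byte and retry
      }
    }
  }
}
```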
### 4.4 Message Type Summary (v1)
| Type | Direction | Encoding | Purpose |
| ---------------- | --------------- | ------------ | -------------------------------------------------------- |
| `handshake` | Client → Server | JSON+\n | Handshake request (version=1) |
| `handshake_ack` | Server → Client | JSON+\n | Handshake response |
| `ping` | Client → Server | JSON+\n | Heartbeat request |
| `pong` | Server → Client | JSON+\n | Heartbeat response |
| `file_start` | Client → Server | JSON+\n | Start of a file transfer |
| `file_start_ack` | Server → Client | JSON+\n | Transfer accept/reject |
| `file_chunk` | Client → Server | Binary frame | File data chunk (no Base64; streaming, no per-chunk ACK) |
| `file_end` | Client → Server | JSON+\n | End of file transfer |
| `file_complete` | Server → Client | JSON+\n | Final transfer result |
All control messages share the general shape:
```
{"type":"message_type",...other fields...}\n
```
---
## 5. File Transfer Protocol
### 5.1 Transfer Flow
```
Client (Sender)                           Server (Receiver)
      |                                          |
      |──── 1. file_start ─────────────────────> |
      |        (file metadata)                   |
      |                                          |
      |<─── 2. file_start_ack ────────────────── |
      |        (accept/reject)                   |
      |                                          |
      |══════ chunk loop (streaming, no ACK) ════|
      |                                          |
      |──── 3. file_chunk [0] ─────────────────> |
      |                                          |
      |──── 3. file_chunk [1] ─────────────────> |
      |                                          |
      |   ... repeat until all chunks are sent   |
      |                                          |
      |══════════════════════════════════════════|
      |                                          |
      |──── 4. file_end ───────────────────────> |
      |        (all chunks sent)                 |
      |                                          |
      |<─── 5. file_complete ──────────────────── |
      |        (final result)                    |
```
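Tying the steps above together, here is a sketch of the sender-side flow. It is not the actual service implementation: `sendControlMessage` is from section 4.1, `encodeFileChunkFrame` is the sketch from section 4.2, `calculateFileChecksum` is the helper from section 5.3, and `waitForMessage` is an assumed helper that resolves with the next control message of a given type or rejects on timeout.
```typescript
import { randomUUID } from 'node:crypto'
import { createReadStream, statSync } from 'node:fs'
import { basename } from 'node:path'
import type { Socket } from 'node:net'

// Illustrative sender flow: file_start -> ack -> stream chunks -> file_end -> file_complete.
async function sendFile(socket: Socket, filePath: string): Promise<void> {
  const fileSize = statSync(filePath).size
  const chunkSize = 512 * 1024
  const transferId = randomUUID()

  sendControlMessage(socket, {
    type: 'file_start',
    transferId,
    fileName: basename(filePath),
    fileSize,
    mimeType: 'application/zip',
    checksum: await calculateFileChecksum(filePath), // whole-file SHA-256 (section 5.3)
    totalChunks: Math.ceil(fileSize / chunkSize),
    chunkSize
  })

  const ack = await waitForMessage(socket, 'file_start_ack', 10_000) // assumed helper
  if (!ack.accepted) throw new Error(ack.message || 'Transfer rejected')

  let chunkIndex = 0
  for await (const chunk of createReadStream(filePath, { highWaterMark: chunkSize })) {
    // Respect TCP backpressure: wait for 'drain' when the kernel buffer is full
    if (!socket.write(encodeFileChunkFrame(transferId, chunkIndex++, chunk as Buffer))) {
      await new Promise((resolve) => socket.once('drain', resolve))
    }
  }

  sendControlMessage(socket, { type: 'file_end', transferId })
  const complete = await waitForMessage(socket, 'file_complete', 60_000) // assumed helper
  if (!complete.success) throw new Error(complete.error || 'Transfer failed')
}
```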
### 5.2 Message Definitions
#### 5.2.1 `file_start` - Start Transfer
**Direction:** Client → Server
```typescript
type LanTransferFileStartMessage = {
  type: "file_start";
  transferId: string; // UUID, unique transfer identifier
  fileName: string; // File name (with extension)
  fileSize: number; // Total file size in bytes
  mimeType: string; // MIME type
  checksum: string; // SHA-256 hash of the whole file (hex)
  totalChunks: number; // Total number of chunks
  chunkSize: number; // Chunk size in bytes
};
```
**Example:**
```json
{
  "type": "file_start",
  "transferId": "550e8400-e29b-41d4-a716-446655440000",
  "fileName": "backup.zip",
  "fileSize": 524288000,
  "mimeType": "application/zip",
  "checksum": "a1b2c3d4e5f6789012345678901234567890abcdef1234567890abcdef123456",
  "totalChunks": 8000,
  "chunkSize": 65536
}
```
#### 5.2.2 `file_start_ack` - Transfer Acknowledgment
**Direction:** Server → Client
```typescript
type LanTransferFileStartAckMessage = {
  type: "file_start_ack";
  transferId: string; // Corresponding transfer ID
  accepted: boolean; // Whether the transfer is accepted
  message?: string; // Rejection reason
};
```
**Accept example:**
```json
{
  "type": "file_start_ack",
  "transferId": "550e8400-e29b-41d4-a716-446655440000",
  "accepted": true
}
```
**Reject example:**
```json
{
  "type": "file_start_ack",
  "transferId": "550e8400-e29b-41d4-a716-446655440000",
  "accepted": false,
  "message": "Insufficient storage space"
}
```
#### 5.2.3 `file_chunk` - Data Chunk
**Direction:** Client → Server (**binary frame**, see 4.2)
- No longer JSON/`\n`, and no Base64 encoding
- Frame structure: `Magic` + `TotalLen` + `Type` + `TransferId Len` + `TransferId` + `ChunkIdx` + `Data`
- `Type` is fixed at `0x01`; `Data` carries raw file bytes
- Transfer integrity relies on `file_start.checksum` (whole-file SHA-256); per-chunk checksums are optional and are not carried in the frame
#### 5.2.4 `file_chunk_ack` - Chunk Acknowledgment (not used in v1 streaming)
v1 uses streaming transfer and does not send per-chunk ACKs. This message type is kept only as a backward-compatibility reference and is never sent in practice.
#### 5.2.5 `file_end` - End of Transfer
**Direction:** Client → Server
```typescript
type LanTransferFileEndMessage = {
  type: "file_end";
  transferId: string; // Transfer ID
};
```
**Example:**
```json
{
  "type": "file_end",
  "transferId": "550e8400-e29b-41d4-a716-446655440000"
}
```
#### 5.2.6 `file_complete` - Transfer Complete
**Direction:** Server → Client
```typescript
type LanTransferFileCompleteMessage = {
  type: "file_complete";
  transferId: string; // Transfer ID
  success: boolean; // Whether the transfer succeeded
  filePath?: string; // Saved path (on success)
  error?: string; // Error message (on failure)
};
```
**Success example:**
```json
{
  "type": "file_complete",
  "transferId": "550e8400-e29b-41d4-a716-446655440000",
  "success": true,
  "filePath": "/storage/emulated/0/Documents/backup.zip"
}
```
**Failure example:**
```json
{
  "type": "file_complete",
  "transferId": "550e8400-e29b-41d4-a716-446655440000",
  "success": false,
  "error": "File checksum verification failed"
}
```
### 5.3 Checksum Algorithm
#### Whole-File Checksum (unchanged)
```typescript
async function calculateFileChecksum(filePath: string): Promise<string> {
  const hash = crypto.createHash("sha256");
  const stream = fs.createReadStream(filePath);
  for await (const chunk of stream) {
    hash.update(chunk);
  }
  return hash.digest("hex");
}
```
#### Per-Chunk Checksum
v1 does **not** transmit per-chunk checksums by default; integrity relies on the final whole-file checksum. If needed, a custom field can be added at the application layer (outside the protocol).
### 5.4 Verification Flow
**Sender (Client):**
1. Before sending, compute the SHA-256 of the whole file → `file_start.checksum`
2. Send each chunk as raw binary (no Base64)
**Receiver (Server):**
1. Use the binary data from each `file_chunk` directly
2. Write to disk as chunks arrive and update an incremental SHA-256 (recommended)
3. After all chunks are received, finalize the incremental hash to obtain the file's SHA-256
4. Compare it with `file_start.checksum` and report the result in `file_complete`
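A receiver-side sketch of steps 2–4, using Node APIs purely for illustration (a mobile implementation would use its platform's file and crypto APIs instead); the class name and shape are assumptions of this sketch:
```typescript
import { createHash, type Hash } from 'node:crypto'
import { createWriteStream, type WriteStream } from 'node:fs'

// Write each chunk straight to disk and hash it incrementally, so the whole
// file never has to be held in memory.
class IncrementalFileReceiver {
  private hash: Hash = createHash('sha256')
  private out: WriteStream
  private receivedBytes = 0

  constructor(tempPath: string, private expected: { fileSize: number; checksum: string }) {
    this.out = createWriteStream(tempPath)
  }

  onChunk(data: Buffer): void {
    this.out.write(data)
    this.hash.update(data)
    this.receivedBytes += data.length
  }

  // Called on file_end: close the file and compare the digest with file_start.checksum.
  async finalize(): Promise<{ success: boolean; error?: string }> {
    await new Promise<void>((resolve) => this.out.end(resolve))
    if (this.receivedBytes !== this.expected.fileSize) {
      return { success: false, error: 'Incomplete transfer' }
    }
    const digest = this.hash.digest('hex')
    return digest === this.expected.checksum
      ? { success: true }
      : { success: false, error: 'File checksum verification failed' }
  }
}
```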
### 5.5 Chunk Size Calculation
```typescript
const CHUNK_SIZE = 512 * 1024; // 512KB
const totalChunks = Math.ceil(fileSize / CHUNK_SIZE);
// The last chunk may be smaller than CHUNK_SIZE
const lastChunkSize = fileSize % CHUNK_SIZE || CHUNK_SIZE;
```
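For example, a 524,288,000-byte file with the default 512 KB (524,288-byte) chunk size yields `Math.ceil(524288000 / 524288) = 1000` chunks; since the size divides evenly, `fileSize % CHUNK_SIZE` is 0 and the `|| CHUNK_SIZE` fallback makes the last chunk a full 524,288 bytes.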
---
## 6. Heartbeat and Keep-Alive
### 6.1 Heartbeat Messages
#### `ping`
**Direction:** Client → Server
```typescript
type LanTransferPingMessage = {
  type: "ping";
  payload?: string; // Optional payload
};
```
```json
{
  "type": "ping",
  "payload": "heartbeat"
}
```
#### `pong`
**Direction:** Server → Client
```typescript
type LanTransferPongMessage = {
  type: "pong";
  received: boolean; // Acknowledges receipt
  payload?: string; // Echoes the ping payload
};
```
```json
{
  "type": "pong",
  "received": true,
  "payload": "heartbeat"
}
```
### 6.2 Heartbeat Strategy
- Immediately after a successful handshake, send one `ping` to verify the connection (see the sketch below)
- Optional: send periodic heartbeats to keep the connection alive
- `pong` should echo the `payload` from the `ping` (optional)
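A minimal liveness-check sketch for the client, reusing `sendControlMessage` from section 4.1 and the assumed `waitForMessage` helper; treat a missing `pong` within the timeout as a dead connection:
```typescript
import type { Socket } from 'node:net'

// Illustrative: send one ping right after the handshake and verify the echoed payload.
async function verifyConnection(socket: Socket): Promise<boolean> {
  sendControlMessage(socket, { type: 'ping', payload: 'heartbeat' })
  try {
    const pong = await waitForMessage(socket, 'pong', 10_000) // assumed helper with timeout
    return pong.received === true && pong.payload === 'heartbeat'
  } catch {
    return false
  }
}
```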
---
## 7. Error Handling
### 7.1 Timeout Configuration
| Operation | Timeout | Description |
| ----------------- | ------- | ---------------------------- |
| TCP connect | 10 s | Connection establishment |
| Handshake wait | 10 s | Waiting for `handshake_ack` |
| Transfer complete | 60 s | Waiting for `file_complete` |
### 7.2 Error Scenario Handling
| Scenario | Client handling | Server handling |
| ------------------------ | ------------------------------ | ----------------------- |
| TCP connection failure | Notify UI, allow retry | - |
| Handshake timeout | Disconnect, notify UI | Close the socket |
| Handshake rejected | Show the rejection reason | - |
| Chunk processing failure | Abort transfer, clean up state | Remove temporary file |
| Unexpected disconnect | Clean up state, notify UI | Remove temporary file |
| Insufficient storage | - | Send `accepted: false` |
### 7.3 Resource Cleanup
**Client side:**
```typescript
function cleanup(): void {
  // 1. Destroy the file read stream
  if (readStream) {
    readStream.destroy();
  }
  // 2. Clear transfer state
  activeTransfer = undefined;
  // 3. Close the socket (if needed)
  socket?.destroy();
}
```
**Server side:**
```typescript
function cleanup(): void {
  // 1. Close the file write stream
  if (writeStream) {
    writeStream.end();
  }
  // 2. Delete the incomplete temporary file
  if (tempFilePath) {
    fs.unlinkSync(tempFilePath);
  }
  // 3. Clear transfer state
  activeTransfer = undefined;
}
```
---
## 8. Constants and Configuration
### 8.1 Protocol Constants
```typescript
// Protocol version (v1 = control JSON + binary chunks + streaming)
export const LAN_TRANSFER_PROTOCOL_VERSION = "1";
// Service discovery
export const LAN_TRANSFER_SERVICE_TYPE = "cherrystudio";
export const LAN_TRANSFER_SERVICE_FULL_NAME = "_cherrystudio._tcp";
// TCP port
export const LAN_TRANSFER_TCP_PORT = 53317;
// File transfer (matches the binary frame format)
export const LAN_TRANSFER_CHUNK_SIZE = 512 * 1024; // 512KB
export const LAN_TRANSFER_GLOBAL_TIMEOUT_MS = 10 * 60 * 1000; // 10 minutes
// Timeouts
export const LAN_TRANSFER_HANDSHAKE_TIMEOUT_MS = 10_000; // 10s
export const LAN_TRANSFER_CHUNK_TIMEOUT_MS = 30_000; // 30s
export const LAN_TRANSFER_COMPLETE_TIMEOUT_MS = 60_000; // 60s
```
### 8.2 Supported File Types
Only ZIP files are currently supported:
```typescript
export const LAN_TRANSFER_ALLOWED_EXTENSIONS = [".zip"];
export const LAN_TRANSFER_ALLOWED_MIME_TYPES = [
  "application/zip",
  "application/x-zip-compressed",
];
```
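A sketch of how a receiver might use these constants to validate `file_start` before answering (the function name and return shape are illustrative, not the actual implementation):
```typescript
import { extname } from 'node:path'

// Illustrative file_start validation on the receiver, based on the constants above.
// Returns the file_start_ack payload to send back.
function validateFileStart(msg: { transferId: string; fileName: string; mimeType: string }) {
  const extensionOk = LAN_TRANSFER_ALLOWED_EXTENSIONS.includes(extname(msg.fileName).toLowerCase())
  const mimeOk = LAN_TRANSFER_ALLOWED_MIME_TYPES.includes(msg.mimeType)
  if (!extensionOk || !mimeOk) {
    return { type: 'file_start_ack', transferId: msg.transferId, accepted: false, message: 'Unsupported file type' }
  }
  return { type: 'file_start_ack', transferId: msg.transferId, accepted: true }
}
```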
---
## 9. Complete Sequence Diagram
### 9.1 Complete Transfer Flow (v1, streaming)
```
┌─────────┐              ┌─────────┐              ┌─────────┐
│ Renderer│              │  Main   │              │ Mobile  │
│  (UI)   │              │ Process │              │ Server  │
└────┬────┘              └────┬────┘              └────┬────┘
     │                        │                        │
     │ ═══════ Service discovery phase ═══════         │
     │                        │                        │
     │ startScan()            │                        │
     │───────────────────────>│                        │
     │                        │ mDNS browse            │
     │                        │ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ >│
     │                        │                        │
     │                        │<─ ─ service discovered ─│
     │                        │                        │
     │<── onServicesUpdated ──│                        │
     │                        │                        │
     │ ═══════ Handshake phase ═══════                 │
     │                        │                        │
     │ connect(peer)          │                        │
     │───────────────────────>│                        │
     │                        │────── TCP Connect ────>│
     │                        │                        │
     │                        │────── handshake ──────>│
     │                        │                        │
     │                        │<───── handshake_ack ───│
     │                        │                        │
     │                        │────── ping ───────────>│
     │                        │<───── pong ────────────│
     │                        │                        │
     │<── connect result ─────│                        │
     │                        │                        │
     │ ═══════ File transfer phase ═══════             │
     │                        │                        │
     │ sendFile(path)         │                        │
     │───────────────────────>│                        │
     │                        │────── file_start ─────>│
     │                        │                        │
     │                        │<───── file_start_ack ──│
     │                        │                        │
     │                        │═════ chunk send loop ══│
     │                        │                        │
     │                        │── file_chunk[0] (bin) ─>│
     │<── progress event ─────│                        │
     │                        │                        │
     │                        │── file_chunk[1] (bin) ─>│
     │<── progress event ─────│                        │
     │                        │                        │
     │                        │      ... repeat ...    │
     │                        │                        │
     │                        │════════════════════════│
     │                        │                        │
     │                        │────── file_end ───────>│
     │                        │                        │
     │                        │<───── file_complete ───│
     │                        │                        │
     │<── complete event ─────│                        │
     │<── sendFile result ────│                        │
     │                        │                        │
```
---
## 10. Mobile Implementation Guide (v1 Essentials)
### 10.1 Required Functionality
1. **mDNS service publication**
   - Publish the `_cherrystudio._tcp` service
   - Advertise TCP port `53317`
   - Optional: TXT record (version, platform info)
2. **TCP server**
   - Listen on the advertised port
   - Support a single connection or multiple connections
3. **Message parsing**
   - Control messages: UTF-8 JSON delimited by `\n`
   - Data messages: binary frames (Magic + TotalLen framing)
4. **Handshake handling**
   - Validate the `handshake` message
   - Reply with `handshake_ack`
   - Respond to `ping` messages
5. **File reception (streaming mode)**
   - Parse `file_start` and prepare to receive
   - Receive `file_chunk` binary frames, writing directly to a file/buffer while hashing incrementally
   - v1 sends no per-chunk ACK (streaming transfer)
   - Handle `file_end`: finalize the incremental hash and verify the checksum
   - Send the `file_complete` result
### 10.2 Recommended Libraries
**React Native / Expo:**
- mDNS: `react-native-zeroconf` or `@homielab/react-native-bonjour`
- TCP: `react-native-tcp-socket`
- Crypto: `expo-crypto` or `react-native-quick-crypto`
### 10.3 Receiver Pseudocode
```typescript
class FileReceiver {
  private transfer?: {
    id: string;
    fileName: string;
    fileSize: number;
    checksum: string;
    totalChunks: number;
    receivedChunks: number;
    tempPath: string;
    // v1: write to disk while receiving to avoid OOM on large files
    // stream: FileSystem writable stream (platform-specific wrapper)
  };
  handleMessage(message: any) {
    switch (message.type) {
      case "handshake":
        this.handleHandshake(message);
        break;
      case "ping":
        this.sendPong(message);
        break;
      case "file_start":
        this.handleFileStart(message);
        break;
      // v1: file_chunk arrives as a binary frame and never reaches this JSON branch
      case "file_end":
        this.handleFileEnd(message);
        break;
    }
  }
  handleFileStart(msg: LanTransferFileStartMessage) {
    // 1. Check available storage
    // 2. Create a temporary file
    // 3. Initialize transfer state
    // 4. Send file_start_ack
  }
  // v1: binary frames are parsed in the socket data stream, which then calls handleBinaryFileChunk
  handleBinaryFileChunk(transferId: string, chunkIndex: number, data: Buffer) {
    // Use the binary data directly; derive the length from chunkSize / the last chunk
    // Write to the file stream and update the incremental SHA-256
    this.transfer.receivedChunks++;
    // v1: streaming transfer, no per-chunk ACK
  }
  handleFileEnd(msg: LanTransferFileEndMessage) {
    // 1. Finalize all received chunks
    // 2. Verify the whole-file checksum
    // 3. Move the file to its final location
    // 4. Send file_complete
  }
}
```
---
## Appendix A: TypeScript Type Definitions
The complete type definitions live in `packages/shared/config/types.ts`:
```typescript
// Handshake messages
export interface LanTransferHandshakeMessage {
  type: "handshake";
  deviceName: string;
  version: string;
  platform?: string;
  appVersion?: string;
}
export interface LanTransferHandshakeAckMessage {
  type: "handshake_ack";
  accepted: boolean;
  message?: string;
}
// Heartbeat messages
export interface LanTransferPingMessage {
  type: "ping";
  payload?: string;
}
export interface LanTransferPongMessage {
  type: "pong";
  received: boolean;
  payload?: string;
}
// File transfer messages (Client -> Server)
export interface LanTransferFileStartMessage {
  type: "file_start";
  transferId: string;
  fileName: string;
  fileSize: number;
  mimeType: string;
  checksum: string;
  totalChunks: number;
  chunkSize: number;
}
export interface LanTransferFileChunkMessage {
  type: "file_chunk";
  transferId: string;
  chunkIndex: number;
  data: string; // Base64 encoded (not used in v1 binary-frame mode)
}
export interface LanTransferFileEndMessage {
  type: "file_end";
  transferId: string;
}
// File transfer response messages (Server -> Client)
export interface LanTransferFileStartAckMessage {
  type: "file_start_ack";
  transferId: string;
  accepted: boolean;
  message?: string;
}
// v1 streaming does not send per-chunk ACKs; this type is kept for backward-compatibility reference only
export interface LanTransferFileChunkAckMessage {
  type: "file_chunk_ack";
  transferId: string;
  chunkIndex: number;
  received: boolean;
  error?: string;
}
export interface LanTransferFileCompleteMessage {
  type: "file_complete";
  transferId: string;
  success: boolean;
  filePath?: string;
  error?: string;
}
// Constants
export const LAN_TRANSFER_TCP_PORT = 53317;
export const LAN_TRANSFER_CHUNK_SIZE = 512 * 1024;
export const LAN_TRANSFER_CHUNK_TIMEOUT_MS = 30_000;
```
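A small addition that can be layered on top of these definitions (not part of the shared types file, just a sketch): a discriminated union over the control messages, which enables exhaustive `switch` handling on `type`.
```typescript
// Illustrative union of all control messages for exhaustive switch handling.
export type LanTransferControlMessage =
  | LanTransferHandshakeMessage
  | LanTransferHandshakeAckMessage
  | LanTransferPingMessage
  | LanTransferPongMessage
  | LanTransferFileStartMessage
  | LanTransferFileStartAckMessage
  | LanTransferFileEndMessage
  | LanTransferFileCompleteMessage

// Minimal parse helper for one newline-delimited control line (assumes valid JSON input).
export function parseControlMessage(line: string): LanTransferControlMessage {
  return JSON.parse(line) as LanTransferControlMessage
}
```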
---
## Appendix B: Version History
| Version | Date | Changes |
| ------- | ------- | ----------------------------------------------------------- |
| 1.0 | 2025-12 | Initial release; binary frame format and streaming transfer |

View File

@ -87,6 +87,7 @@
"@napi-rs/system-ocr": "patch:@napi-rs/system-ocr@npm%3A1.0.2#~/.yarn/patches/@napi-rs-system-ocr-npm-1.0.2-59e7a78e8b.patch",
"@paymoapp/electron-shutdown-handler": "^1.1.2",
"@strongtz/win32-arm64-msvc": "^0.4.7",
"bonjour-service": "^1.3.0",
"emoji-picker-element-data": "^1",
"express": "^5.1.0",
"font-list": "^2.0.0",
@ -97,10 +98,8 @@
"node-stream-zip": "^1.15.0",
"officeparser": "^4.2.0",
"os-proxy-config": "^1.1.2",
"qrcode.react": "^4.2.0",
"selection-hook": "^1.0.12",
"sharp": "^0.34.3",
"socket.io": "^4.8.1",
"swagger-jsdoc": "^6.2.8",
"swagger-ui-express": "^5.0.1",
"tesseract.js": "patch:tesseract.js@npm%3A6.0.1#~/.yarn/patches/tesseract.js-npm-6.0.1-2562a7e46d.patch",

View File

@ -233,6 +233,8 @@ export enum IpcChannel {
Backup_ListS3Files = 'backup:listS3Files',
Backup_DeleteS3File = 'backup:deleteS3File',
Backup_CheckS3Connection = 'backup:checkS3Connection',
Backup_CreateLanTransferBackup = 'backup:createLanTransferBackup',
Backup_DeleteTempBackup = 'backup:deleteTempBackup',
// zip
Zip_Compress = 'zip:compress',
@ -244,6 +246,7 @@ export enum IpcChannel {
System_GetCpuName = 'system:getCpuName',
System_CheckGitBash = 'system:checkGitBash',
System_GetGitBashPath = 'system:getGitBashPath',
System_GetGitBashPathInfo = 'system:getGitBashPathInfo',
System_SetGitBashPath = 'system:setGitBashPath',
// DevTools
@ -380,10 +383,14 @@ export enum IpcChannel {
ClaudeCodePlugin_ReadContent = 'claudeCodePlugin:read-content',
ClaudeCodePlugin_WriteContent = 'claudeCodePlugin:write-content',
// WebSocket
WebSocket_Start = 'webSocket:start',
WebSocket_Stop = 'webSocket:stop',
WebSocket_Status = 'webSocket:status',
WebSocket_SendFile = 'webSocket:send-file',
WebSocket_GetAllCandidates = 'webSocket:get-all-candidates'
// Local Transfer
LocalTransfer_ListServices = 'local-transfer:list',
LocalTransfer_StartScan = 'local-transfer:start-scan',
LocalTransfer_StopScan = 'local-transfer:stop-scan',
LocalTransfer_ServicesUpdated = 'local-transfer:services-updated',
LocalTransfer_Connect = 'local-transfer:connect',
LocalTransfer_Disconnect = 'local-transfer:disconnect',
LocalTransfer_ClientEvent = 'local-transfer:client-event',
LocalTransfer_SendFile = 'local-transfer:send-file',
LocalTransfer_CancelTransfer = 'local-transfer:cancel-transfer'
}

View File

@ -488,3 +488,11 @@ export const MACOS_TERMINALS_WITH_COMMANDS: TerminalConfigWithCommand[] = [
// resources/scripts should be maintained manually
export const HOME_CHERRY_DIR = '.cherrystudio'
// Git Bash path configuration types
export type GitBashPathSource = 'manual' | 'auto'
export interface GitBashPathInfo {
path: string | null
source: GitBashPathSource | null
}

View File

@ -52,3 +52,196 @@ export interface WebSocketCandidatesResponse {
interface: string
priority: number
}
export type LocalTransferPeer = {
id: string
name: string
host?: string
fqdn?: string
port?: number
type?: string
protocol?: 'tcp' | 'udp'
addresses: string[]
txt?: Record<string, string>
updatedAt: number
}
export type LocalTransferState = {
services: LocalTransferPeer[]
isScanning: boolean
lastScanStartedAt?: number
lastUpdatedAt: number
lastError?: string
}
export type LanHandshakeRequestMessage = {
type: 'handshake'
deviceName: string
version: string
platform?: string
appVersion?: string
}
export type LanHandshakeAckMessage = {
type: 'handshake_ack'
accepted: boolean
message?: string
}
export type LocalTransferConnectPayload = {
peerId: string
metadata?: Record<string, string>
timeoutMs?: number
}
export type LanClientEvent =
| {
type: 'ping_sent'
payload: string
timestamp: number
peerId?: string
peerName?: string
}
| {
type: 'pong'
payload?: string
received?: boolean
timestamp: number
peerId?: string
peerName?: string
}
| {
type: 'socket_closed'
reason?: string
timestamp: number
peerId?: string
peerName?: string
}
| {
type: 'error'
message: string
timestamp: number
peerId?: string
peerName?: string
}
| {
type: 'file_transfer_progress'
transferId: string
fileName: string
bytesSent: number
totalBytes: number
chunkIndex: number
totalChunks: number
progress: number // 0-100
speed: number // bytes/sec
timestamp: number
peerId?: string
peerName?: string
}
| {
type: 'file_transfer_complete'
transferId: string
fileName: string
success: boolean
filePath?: string
error?: string
timestamp: number
peerId?: string
peerName?: string
}
// =============================================================================
// LAN File Transfer Protocol Types
// =============================================================================
// Constants for file transfer
export const LAN_TRANSFER_TCP_PORT = 53317
export const LAN_TRANSFER_CHUNK_SIZE = 512 * 1024 // 512KB
export const LAN_TRANSFER_MAX_FILE_SIZE = 500 * 1024 * 1024 // 500MB
export const LAN_TRANSFER_COMPLETE_TIMEOUT_MS = 60_000 // 60s - wait for file_complete after file_end
export const LAN_TRANSFER_GLOBAL_TIMEOUT_MS = 10 * 60 * 1000 // 10 minutes - global transfer timeout
// Binary protocol constants (v1)
export const LAN_TRANSFER_PROTOCOL_VERSION = '1'
export const LAN_BINARY_FRAME_MAGIC = 0x4353 // "CS" as uint16
export const LAN_BINARY_TYPE_FILE_CHUNK = 0x01
// Messages from Electron (Client/Sender) to Mobile (Server/Receiver)
/** Request to start file transfer */
export type LanFileStartMessage = {
type: 'file_start'
transferId: string
fileName: string
fileSize: number
mimeType: string // 'application/zip'
checksum: string // SHA-256 of entire file
totalChunks: number
chunkSize: number
}
/**
* File chunk data (JSON format)
* @deprecated Use binary frame format in protocol v1. This type is kept for reference only.
*/
export type LanFileChunkMessage = {
type: 'file_chunk'
transferId: string
chunkIndex: number
data: string // Base64 encoded
chunkChecksum: string // SHA-256 of this chunk
}
/** Notification that all chunks have been sent */
export type LanFileEndMessage = {
type: 'file_end'
transferId: string
}
/** Request to cancel file transfer */
export type LanFileCancelMessage = {
type: 'file_cancel'
transferId: string
reason?: string
}
// Messages from Mobile (Server/Receiver) to Electron (Client/Sender)
/** Acknowledgment of file transfer request */
export type LanFileStartAckMessage = {
type: 'file_start_ack'
transferId: string
accepted: boolean
message?: string // Rejection reason
}
/**
* Acknowledgment of file chunk received
* @deprecated Protocol v1 uses streaming mode without per-chunk acknowledgment.
* This type is kept for backward compatibility reference only.
*/
export type LanFileChunkAckMessage = {
type: 'file_chunk_ack'
transferId: string
chunkIndex: number
received: boolean
message?: string
}
/** Final result of file transfer */
export type LanFileCompleteMessage = {
type: 'file_complete'
transferId: string
success: boolean
filePath?: string // Path where file was saved on mobile
error?: string
// Enhanced error diagnostics
errorCode?: 'CHECKSUM_MISMATCH' | 'INCOMPLETE_TRANSFER' | 'DISK_ERROR' | 'CANCELLED'
receivedChunks?: number
receivedBytes?: number
}
/** Payload for sending a file via IPC */
export type LanFileSendPayload = {
filePath: string
}

View File

@ -19,8 +19,10 @@ import { agentService } from './services/agents'
import { apiServerService } from './services/ApiServerService'
import { appMenuService } from './services/AppMenuService'
import { configManager } from './services/ConfigManager'
import { nodeTraceService } from './services/NodeTraceService'
import { lanTransferClientService } from './services/lanTransfer'
import mcpService from './services/MCPService'
import { localTransferService } from './services/LocalTransferService'
import { nodeTraceService } from './services/NodeTraceService'
import powerMonitorService from './services/PowerMonitorService'
import {
CHERRY_STUDIO_PROTOCOL,
@ -156,6 +158,7 @@ if (!app.requestSingleInstanceLock()) {
registerShortcuts(mainWindow)
registerIpc(mainWindow, app)
localTransferService.startDiscovery({ resetList: true })
replaceDevtoolsFont(mainWindow)
@ -237,6 +240,9 @@ if (!app.requestSingleInstanceLock()) {
if (selectionService) {
selectionService.quit()
}
lanTransferClientService.dispose()
localTransferService.dispose()
})
app.on('will-quit', async () => {

View File

@ -6,11 +6,19 @@ import { loggerService } from '@logger'
import { isLinux, isMac, isPortable, isWin } from '@main/constant'
import { generateSignature } from '@main/integration/cherryai'
import anthropicService from '@main/services/AnthropicService'
import { findGitBash, getBinaryPath, isBinaryExists, runInstallScript, validateGitBashPath } from '@main/utils/process'
import {
autoDiscoverGitBash,
getBinaryPath,
getGitBashPathInfo,
isBinaryExists,
runInstallScript,
validateGitBashPath
} from '@main/utils/process'
import { handleZoomFactor } from '@main/utils/zoom'
import type { SpanEntity, TokenUsage } from '@mcp-trace/trace-core'
import type { UpgradeChannel } from '@shared/config/constant'
import { MIN_WINDOW_HEIGHT, MIN_WINDOW_WIDTH } from '@shared/config/constant'
import type { LocalTransferConnectPayload } from '@shared/config/types'
import { IpcChannel } from '@shared/IpcChannel'
import type { PluginError } from '@types'
import type {
@ -42,6 +50,8 @@ import { ExportService } from './services/ExportService'
import { fileStorage as fileManager } from './services/FileStorage'
import FileService from './services/FileSystemService'
import KnowledgeService from './services/KnowledgeService'
import { lanTransferClientService } from './services/lanTransfer'
import { localTransferService } from './services/LocalTransferService'
import mcpService from './services/MCPService'
import MemoryService from './services/memory/MemoryService'
import { openTraceWindow, setTraceWindowTitle } from './services/NodeTraceService'
@ -73,7 +83,6 @@ import {
import storeSyncService from './services/StoreSyncService'
import { themeService } from './services/ThemeService'
import VertexAIService from './services/VertexAIService'
import WebSocketService from './services/WebSocketService'
import { setOpenLinkExternal } from './services/WebviewService'
import { windowService } from './services/WindowService'
import { calculateDirectorySize, getResourcePath } from './utils'
@ -499,9 +508,8 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
}
try {
const customPath = configManager.get(ConfigKeys.GitBashPath) as string | undefined
const bashPath = findGitBash(customPath)
// Use autoDiscoverGitBash to handle auto-discovery and persistence
const bashPath = autoDiscoverGitBash()
if (bashPath) {
logger.info('Git Bash is available', { path: bashPath })
return true
@ -524,13 +532,22 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
return customPath ?? null
})
// Returns { path, source } where source is 'manual' | 'auto' | null
ipcMain.handle(IpcChannel.System_GetGitBashPathInfo, () => {
return getGitBashPathInfo()
})
ipcMain.handle(IpcChannel.System_SetGitBashPath, (_, newPath: string | null) => {
if (!isWin) {
return false
}
if (!newPath) {
// Clear manual setting and re-run auto-discovery
configManager.set(ConfigKeys.GitBashPath, null)
configManager.set(ConfigKeys.GitBashPathSource, null)
// Re-run auto-discovery to restore auto-discovered path if available
autoDiscoverGitBash()
return true
}
@ -539,7 +556,9 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
return false
}
// Set path with 'manual' source
configManager.set(ConfigKeys.GitBashPath, validated)
configManager.set(ConfigKeys.GitBashPathSource, 'manual')
return true
})
@ -566,6 +585,8 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
ipcMain.handle(IpcChannel.Backup_ListS3Files, backupManager.listS3Files.bind(backupManager))
ipcMain.handle(IpcChannel.Backup_DeleteS3File, backupManager.deleteS3File.bind(backupManager))
ipcMain.handle(IpcChannel.Backup_CheckS3Connection, backupManager.checkS3Connection.bind(backupManager))
ipcMain.handle(IpcChannel.Backup_CreateLanTransferBackup, backupManager.createLanTransferBackup.bind(backupManager))
ipcMain.handle(IpcChannel.Backup_DeleteTempBackup, backupManager.deleteTempBackup.bind(backupManager))
// file
ipcMain.handle(IpcChannel.File_Open, fileManager.open.bind(fileManager))
@ -1097,12 +1118,17 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
}
})
// WebSocket
ipcMain.handle(IpcChannel.WebSocket_Start, WebSocketService.start)
ipcMain.handle(IpcChannel.WebSocket_Stop, WebSocketService.stop)
ipcMain.handle(IpcChannel.WebSocket_Status, WebSocketService.getStatus)
ipcMain.handle(IpcChannel.WebSocket_SendFile, WebSocketService.sendFile)
ipcMain.handle(IpcChannel.WebSocket_GetAllCandidates, WebSocketService.getAllCandidates)
ipcMain.handle(IpcChannel.LocalTransfer_ListServices, () => localTransferService.getState())
ipcMain.handle(IpcChannel.LocalTransfer_StartScan, () => localTransferService.startDiscovery({ resetList: true }))
ipcMain.handle(IpcChannel.LocalTransfer_StopScan, () => localTransferService.stopDiscovery())
ipcMain.handle(IpcChannel.LocalTransfer_Connect, (_, payload: LocalTransferConnectPayload) =>
lanTransferClientService.connectAndHandshake(payload)
)
ipcMain.handle(IpcChannel.LocalTransfer_Disconnect, () => lanTransferClientService.disconnect())
ipcMain.handle(IpcChannel.LocalTransfer_SendFile, (_, payload: { filePath: string }) =>
lanTransferClientService.sendFile(payload.filePath)
)
ipcMain.handle(IpcChannel.LocalTransfer_CancelTransfer, () => lanTransferClientService.cancelTransfer())
ipcMain.handle(IpcChannel.APP_CrashRenderProcess, () => {
mainWindow.webContents.forcefullyCrashRenderer()

View File

@ -767,6 +767,56 @@ class BackupManager {
const s3Client = this.getS3Storage(s3Config)
return await s3Client.checkConnection()
}
/**
* Create a temporary backup for LAN transfer
* Creates a lightweight backup (skipBackupFile=true) in the temp directory
* Returns the path to the created ZIP file
*/
async createLanTransferBackup(_: Electron.IpcMainInvokeEvent, data: string): Promise<string> {
const timestamp = new Date()
.toISOString()
.replace(/[-:T.Z]/g, '')
.slice(0, 12)
const fileName = `cherry-studio.${timestamp}.zip`
const tempPath = path.join(app.getPath('temp'), 'cherry-studio', 'lan-transfer')
// Ensure temp directory exists
await fs.ensureDir(tempPath)
// Create backup with skipBackupFile=true (no Data folder)
const backupedFilePath = await this.backup(_, fileName, data, tempPath, true)
logger.info(`[BackupManager] Created LAN transfer backup at: ${backupedFilePath}`)
return backupedFilePath
}
/**
* Delete a temporary backup file after LAN transfer completes
*/
async deleteTempBackup(_: Electron.IpcMainInvokeEvent, filePath: string): Promise<boolean> {
try {
// Security check: only allow deletion within temp directory
const tempBase = path.normalize(path.join(app.getPath('temp'), 'cherry-studio', 'lan-transfer'))
const resolvedPath = path.normalize(path.resolve(filePath))
// Use normalized paths with trailing separator to prevent prefix attacks (e.g., /temp-evil)
if (!resolvedPath.startsWith(tempBase + path.sep) && resolvedPath !== tempBase) {
logger.warn(`[BackupManager] Attempted to delete file outside temp directory: ${filePath}`)
return false
}
if (await fs.pathExists(resolvedPath)) {
await fs.remove(resolvedPath)
logger.info(`[BackupManager] Deleted temp backup: ${resolvedPath}`)
return true
}
return false
} catch (error) {
logger.error('[BackupManager] Failed to delete temp backup:', error as Error)
return false
}
}
}
export default BackupManager

View File

@ -32,7 +32,8 @@ export enum ConfigKeys {
Proxy = 'proxy',
EnableDeveloperMode = 'enableDeveloperMode',
ClientId = 'clientId',
GitBashPath = 'gitBashPath'
GitBashPath = 'gitBashPath',
GitBashPathSource = 'gitBashPathSource' // 'manual' | 'auto' | null
}
export class ConfigManager {

View File

@ -0,0 +1,207 @@
import { loggerService } from '@logger'
import type { LocalTransferPeer, LocalTransferState } from '@shared/config/types'
import { IpcChannel } from '@shared/IpcChannel'
import type { Browser, Service } from 'bonjour-service'
import Bonjour from 'bonjour-service'
import { windowService } from './WindowService'
const SERVICE_TYPE = 'cherrystudio'
const SERVICE_PROTOCOL = 'tcp' as const
const logger = loggerService.withContext('LocalTransferService')
type StartDiscoveryOptions = {
resetList?: boolean
}
class LocalTransferService {
private static instance: LocalTransferService
private bonjour: Bonjour | null = null
private browser: Browser | null = null
private services = new Map<string, LocalTransferPeer>()
private isScanning = false
private lastScanStartedAt?: number
private lastUpdatedAt = Date.now()
private lastError?: string
private constructor() {}
public static getInstance(): LocalTransferService {
if (!LocalTransferService.instance) {
LocalTransferService.instance = new LocalTransferService()
}
return LocalTransferService.instance
}
public startDiscovery(options?: StartDiscoveryOptions): LocalTransferState {
if (options?.resetList) {
this.services.clear()
}
this.isScanning = true
this.lastScanStartedAt = Date.now()
this.lastUpdatedAt = Date.now()
this.lastError = undefined
this.restartBrowser()
this.broadcastState()
return this.getState()
}
public stopDiscovery(): LocalTransferState {
if (this.browser) {
try {
this.browser.stop()
} catch (error) {
logger.warn('Failed to stop local transfer browser', error as Error)
}
}
this.isScanning = false
this.lastUpdatedAt = Date.now()
this.broadcastState()
return this.getState()
}
public getState(): LocalTransferState {
const services = Array.from(this.services.values()).sort((a, b) => a.name.localeCompare(b.name))
return {
services,
isScanning: this.isScanning,
lastScanStartedAt: this.lastScanStartedAt,
lastUpdatedAt: this.lastUpdatedAt,
lastError: this.lastError
}
}
public getPeerById(id: string): LocalTransferPeer | undefined {
return this.services.get(id)
}
public dispose(): void {
this.stopDiscovery()
this.services.clear()
this.browser?.removeAllListeners()
this.browser = null
if (this.bonjour) {
try {
this.bonjour.destroy()
} catch (error) {
logger.warn('Failed to destroy Bonjour instance', error as Error)
}
this.bonjour = null
}
}
private getBonjour(): Bonjour {
if (!this.bonjour) {
this.bonjour = new Bonjour()
}
return this.bonjour
}
private restartBrowser(): void {
// Clean up existing browser
if (this.browser) {
this.browser.removeAllListeners()
try {
this.browser.stop()
} catch (error) {
logger.warn('Error while stopping Bonjour browser', error as Error)
}
this.browser = null
}
// Destroy and recreate Bonjour instance to prevent socket leaks
if (this.bonjour) {
try {
this.bonjour.destroy()
} catch (error) {
logger.warn('Error while destroying Bonjour instance', error as Error)
}
this.bonjour = null
}
const browser = this.getBonjour().find({ type: SERVICE_TYPE, protocol: SERVICE_PROTOCOL })
this.browser = browser
this.bindBrowserEvents(browser)
try {
browser.start()
logger.info('Local transfer discovery started')
} catch (error) {
const err = error instanceof Error ? error : new Error(String(error))
this.lastError = err.message
logger.error('Failed to start local transfer discovery', err)
}
}
private bindBrowserEvents(browser: Browser) {
browser.on('up', (service) => {
const peer = this.normalizeService(service)
logger.info(`LAN peer detected: ${peer.name} (${peer.addresses.join(', ')})`)
this.services.set(peer.id, peer)
this.lastUpdatedAt = Date.now()
this.broadcastState()
})
browser.on('down', (service) => {
const key = this.buildServiceKey(service.fqdn || service.name, service.host, service.port)
if (this.services.delete(key)) {
logger.info(`LAN peer removed: ${service.name}`)
this.lastUpdatedAt = Date.now()
this.broadcastState()
}
})
browser.on('error', (error) => {
const err = error instanceof Error ? error : new Error(String(error))
logger.error('Local transfer discovery error', err)
this.lastError = err.message
this.broadcastState()
})
}
private normalizeService(service: Service): LocalTransferPeer {
const addressCandidates = [...(service.addresses || []), service.referer?.address].filter(
(value): value is string => typeof value === 'string' && value.length > 0
)
const addresses = Array.from(new Set(addressCandidates))
const txtEntries = Object.entries(service.txt || {})
const txt =
txtEntries.length > 0
? Object.fromEntries(
txtEntries.map(([key, value]) => [key, value === undefined || value === null ? '' : String(value)])
)
: undefined
const peer: LocalTransferPeer = {
id: this.buildServiceKey(service.fqdn || service.name, service.host, service.port),
name: service.name,
host: service.host,
fqdn: service.fqdn,
port: service.port,
type: service.type,
protocol: service.protocol,
addresses,
txt,
updatedAt: Date.now()
}
return peer
}
private buildServiceKey(name?: string, host?: string, port?: number): string {
const raw = [name, host, port?.toString()].filter(Boolean).join('-')
return raw || `service-${Date.now()}`
}
private broadcastState() {
const mainWindow = windowService.getMainWindow()
if (!mainWindow || mainWindow.isDestroyed()) {
return
}
mainWindow.webContents.send(IpcChannel.LocalTransfer_ServicesUpdated, this.getState())
}
}
export const localTransferService = LocalTransferService.getInstance()

View File

@ -249,6 +249,26 @@ class McpService {
StdioClientTransport | SSEClientTransport | InMemoryTransport | StreamableHTTPClientTransport
> => {
// Create appropriate transport based on configuration
// Special case for nowledgeMem - uses HTTP transport instead of in-memory
if (isBuiltinMCPServer(server) && server.name === BuiltinMCPServerNames.nowledgeMem) {
const nowledgeMemUrl = 'http://127.0.0.1:14242/mcp'
const options: StreamableHTTPClientTransportOptions = {
fetch: async (url, init) => {
return net.fetch(typeof url === 'string' ? url : url.toString(), init)
},
requestInit: {
headers: {
...defaultAppHeaders(),
APP: 'Cherry Studio'
}
},
authProvider
}
getServerLogger(server).debug(`Using StreamableHTTPClientTransport for ${server.name}`)
return new StreamableHTTPClientTransport(new URL(nowledgeMemUrl), options)
}
if (isBuiltinMCPServer(server) && server.name !== BuiltinMCPServerNames.mcpAutoInstall) {
getServerLogger(server).debug(`Using in-memory transport`)
const [clientTransport, serverTransport] = InMemoryTransport.createLinkedPair()

View File

@ -1,359 +0,0 @@
import { loggerService } from '@logger'
import type { WebSocketCandidatesResponse, WebSocketStatusResponse } from '@shared/config/types'
import * as fs from 'fs'
import { networkInterfaces } from 'os'
import * as path from 'path'
import type { Socket } from 'socket.io'
import { Server } from 'socket.io'
import { windowService } from './WindowService'
const logger = loggerService.withContext('WebSocketService')
class WebSocketService {
private io: Server | null = null
private isStarted = false
private port = 7017
private connectedClients = new Set<string>()
private getLocalIpAddress(): string | undefined {
const interfaces = networkInterfaces()
// Network interface name patterns, ordered by priority
const interfacePriority = [
// macOS: Ethernet/Wi-Fi first
/^en[0-9]+$/, // en0, en1 (Ethernet/Wi-Fi)
/^(en|eth)[0-9]+$/, // Ethernet interfaces
/^wlan[0-9]+$/, // Wireless interfaces
// Windows: Ethernet/Wi-Fi first
/^(Ethernet|Wi-Fi|Local Area Connection)/,
/^(Wi-Fi|无线网络连接)/,
// Linux: Ethernet/Wi-Fi first
/^(eth|enp|wlp|wlan)[0-9]+/,
// Virtualization interfaces (low priority)
/^bridge[0-9]+$/, // Docker bridge
/^veth[0-9]+$/, // Docker veth
/^docker[0-9]+/, // Docker interfaces
/^br-[0-9a-f]+/, // Docker bridge
/^vmnet[0-9]+$/, // VMware
/^vboxnet[0-9]+$/, // VirtualBox
// VPN tunnel interfaces (low priority)
/^utun[0-9]+$/, // macOS VPN
/^tun[0-9]+$/, // Linux/Unix VPN
/^tap[0-9]+$/, // TAP interfaces
/^tailscale[0-9]*$/, // Tailscale VPN
/^wg[0-9]+$/ // WireGuard VPN
]
const candidates: Array<{ interface: string; address: string; priority: number }> = []
for (const [name, ifaces] of Object.entries(interfaces)) {
for (const iface of ifaces || []) {
if (iface.family === 'IPv4' && !iface.internal) {
// Compute the interface priority
let priority = 999 // Lowest priority by default
for (let i = 0; i < interfacePriority.length; i++) {
if (interfacePriority[i].test(name)) {
priority = i
break
}
}
candidates.push({
interface: name,
address: iface.address,
priority
})
}
}
}
if (candidates.length === 0) {
logger.warn('Unable to determine a LAN IP, falling back to the default IP: 127.0.0.1')
return '127.0.0.1'
}
// Sort by priority and pick the best candidate
candidates.sort((a, b) => a.priority - b.priority)
const best = candidates[0]
logger.info(`Selected LAN IP: ${best.address} (interface: ${best.interface})`)
return best.address
}
public start = async (): Promise<{ success: boolean; port?: number; error?: string }> => {
if (this.isStarted && this.io) {
return { success: true, port: this.port }
}
try {
this.io = new Server(this.port, {
cors: {
origin: '*',
methods: ['GET', 'POST']
},
transports: ['websocket', 'polling'],
allowEIO3: true,
pingTimeout: 60000,
pingInterval: 25000
})
this.io.on('connection', (socket: Socket) => {
this.connectedClients.add(socket.id)
const mainWindow = windowService.getMainWindow()
if (!mainWindow) {
logger.error('Main window is null, cannot send connection event')
} else {
mainWindow.webContents.send('websocket-client-connected', {
connected: true,
clientId: socket.id
})
logger.info(`Connection event sent to renderer, total clients: ${this.connectedClients.size}`)
}
socket.on('message', (data) => {
logger.info('Received message from mobile:', data)
mainWindow?.webContents.send('websocket-message-received', data)
socket.emit('message_received', { success: true })
})
socket.on('disconnect', () => {
logger.info(`Client disconnected: ${socket.id}`)
this.connectedClients.delete(socket.id)
if (this.connectedClients.size === 0) {
mainWindow?.webContents.send('websocket-client-connected', {
connected: false,
clientId: socket.id
})
}
})
})
// Engine-level event listeners
this.io.engine.on('connection_error', (err) => {
logger.error('Engine connection error:', err)
})
this.io.engine.on('connection', (rawSocket) => {
const remoteAddr = rawSocket.request.connection.remoteAddress
logger.info(`[Engine] Raw connection from: ${remoteAddr}`)
logger.info(`[Engine] Transport: ${rawSocket.transport.name}`)
rawSocket.on('packet', (packet: { type: string; data?: any }) => {
logger.info(
`[Engine] ← Packet from ${remoteAddr}: type="${packet.type}"`,
packet.data ? { data: packet.data } : {}
)
})
rawSocket.on('packetCreate', (packet: { type: string; data?: any }) => {
logger.info(`[Engine] → Packet to ${remoteAddr}: type="${packet.type}"`)
})
rawSocket.on('close', (reason: string) => {
logger.warn(`[Engine] Connection closed from ${remoteAddr}, reason: ${reason}`)
})
rawSocket.on('error', (error: Error) => {
logger.error(`[Engine] Connection error from ${remoteAddr}:`, error)
})
})
// Listen for Socket.IO handshake failures
this.io.on('connection_error', (err) => {
logger.error('[Socket.IO] Connection error during handshake:', err)
})
this.isStarted = true
logger.info(`WebSocket server started on port ${this.port}`)
return { success: true, port: this.port }
} catch (error) {
logger.error('Failed to start WebSocket server:', error as Error)
return {
success: false,
error: error instanceof Error ? error.message : 'Unknown error'
}
}
}
public stop = async (): Promise<{ success: boolean }> => {
if (!this.isStarted || !this.io) {
return { success: true }
}
try {
await new Promise<void>((resolve) => {
this.io!.close(() => {
resolve()
})
})
this.io = null
this.isStarted = false
this.connectedClients.clear()
logger.info('WebSocket server stopped')
return { success: true }
} catch (error) {
logger.error('Failed to stop WebSocket server:', error as Error)
return { success: false }
}
}
public getStatus = async (): Promise<WebSocketStatusResponse> => {
return {
isRunning: this.isStarted,
port: this.isStarted ? this.port : undefined,
ip: this.isStarted ? this.getLocalIpAddress() : undefined,
clientConnected: this.connectedClients.size > 0
}
}
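// Enumerate every non-internal IPv4 address, ranked with the same interface priority list, so callers can present connection candidates.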
public getAllCandidates = async (): Promise<WebSocketCandidatesResponse[]> => {
const interfaces = networkInterfaces()
// Network interface name patterns, ordered by priority
const interfacePriority = [
// macOS: Ethernet/Wi-Fi first
/^en[0-9]+$/, // en0, en1 (Ethernet/Wi-Fi)
/^(en|eth)[0-9]+$/, // Ethernet interfaces
/^wlan[0-9]+$/, // Wireless interfaces
// Windows: Ethernet/Wi-Fi first
/^(Ethernet|Wi-Fi|Local Area Connection)/,
/^(Wi-Fi|无线网络连接)/, // "无线网络连接" = Wireless Network Connection on Chinese-locale Windows
// Linux: Ethernet/Wi-Fi first
/^(eth|enp|wlp|wlan)[0-9]+/,
// Virtualization interfaces (low priority)
/^bridge[0-9]+$/, // Docker bridge
/^veth[0-9]+$/, // Docker veth
/^docker[0-9]+/, // Docker interfaces
/^br-[0-9a-f]+/, // Docker bridge
/^vmnet[0-9]+$/, // VMware
/^vboxnet[0-9]+$/, // VirtualBox
// VPN tunnel interfaces (low priority)
/^utun[0-9]+$/, // macOS VPN
/^tun[0-9]+$/, // Linux/Unix VPN
/^tap[0-9]+$/, // TAP interfaces
/^tailscale[0-9]*$/, // Tailscale VPN
/^wg[0-9]+$/ // WireGuard VPN
]
const candidates: Array<{ host: string; interface: string; priority: number }> = []
for (const [name, ifaces] of Object.entries(interfaces)) {
for (const iface of ifaces || []) {
if (iface.family === 'IPv4' && !iface.internal) {
// Compute the interface priority (lower index = higher priority)
let priority = 999 // Lowest priority by default
for (let i = 0; i < interfacePriority.length; i++) {
if (interfacePriority[i].test(name)) {
priority = i
break
}
}
candidates.push({
host: iface.address,
interface: name,
priority
})
logger.debug(`Found interface: ${name} -> ${iface.address} (priority: ${priority})`)
}
}
}
// Sort by priority before returning
candidates.sort((a, b) => a.priority - b.priority)
logger.info(
`Found ${candidates.length} IP candidates: ${candidates.map((c) => `${c.host}(${c.interface})`).join(', ')}`
)
return candidates
}
public sendFile = async (
_: Electron.IpcMainInvokeEvent,
filePath: string
): Promise<{ success: boolean; error?: string }> => {
if (!this.isStarted || !this.io) {
const errorMsg = 'WebSocket server is not running.'
logger.error(errorMsg)
return { success: false, error: errorMsg }
}
if (this.connectedClients.size === 0) {
const errorMsg = 'No client connected.'
logger.error(errorMsg)
return { success: false, error: errorMsg }
}
const mainWindow = windowService.getMainWindow()
return new Promise((resolve, reject) => {
const stats = fs.statSync(filePath)
const totalSize = stats.size
const filename = path.basename(filePath)
const stream = fs.createReadStream(filePath)
let bytesSent = 0
const startTime = Date.now()
logger.info(`Starting file transfer: ${filename} (${this.formatFileSize(totalSize)})`)
// Signal the client that a file transfer is starting, including the filename and total size
this.io!.emit('zip-file-start', { filename, totalSize })
stream.on('data', (chunk) => {
bytesSent += chunk.length
const progress = (bytesSent / totalSize) * 100
// Send the file chunk to the client
this.io!.emit('zip-file-chunk', chunk)
// Send a progress update to the renderer process
mainWindow?.webContents.send('file-send-progress', { progress })
// Log progress at roughly 10% intervals
if (Math.floor(progress) % 10 === 0) {
const elapsed = (Date.now() - startTime) / 1000
const speed = elapsed > 0 ? bytesSent / elapsed : 0
logger.info(`Transfer progress: ${Math.floor(progress)}% (${this.formatFileSize(speed)}/s)`)
}
})
stream.on('end', () => {
const totalTime = (Date.now() - startTime) / 1000
const avgSpeed = totalTime > 0 ? totalSize / totalTime : 0
logger.info(
`File transfer completed: ${filename} in ${totalTime.toFixed(1)}s (${this.formatFileSize(avgSpeed)}/s)`
)
// Make sure a final 100% progress update is sent
mainWindow?.webContents.send('file-send-progress', { progress: 100 })
// Signal the client that the file transfer has finished
this.io!.emit('zip-file-end')
resolve({ success: true })
})
stream.on('error', (error) => {
logger.error(`File transfer failed: ${filename}`, error)
reject({
success: false,
error: error instanceof Error ? error.message : 'Unknown error'
})
})
})
}
private formatFileSize(bytes: number): string {
if (bytes === 0) return '0 B'
const k = 1024
const sizes = ['B', 'KB', 'MB', 'GB']
const i = Math.floor(Math.log(bytes) / Math.log(k))
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i]
}
}
export default new WebSocketService()
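A minimal receiving-side sketch (not part of the diff) of how a Socket.IO client could consume the zip-file-* events emitted by sendFile above. The socket.io-client usage, the URL placeholder, and the ArrayBuffer handling are illustrative assumptions; the event names and payload shapes come from the service code.

import { io } from 'socket.io-client'

// Hypothetical receiver: collect the streamed ZIP into memory.
const socket = io('http://<desktop-lan-ip>:<port>', { transports: ['websocket'] })
const chunks: Uint8Array[] = []
let expectedSize = 0

socket.on('zip-file-start', ({ filename, totalSize }: { filename: string; totalSize: number }) => {
  chunks.length = 0
  expectedSize = totalSize
  console.log(`Incoming file: ${filename} (${totalSize} bytes)`)
})

socket.on('zip-file-chunk', (chunk: ArrayBuffer) => {
  // Binary payloads typically arrive as ArrayBuffer in browsers (Buffer under Node).
  chunks.push(new Uint8Array(chunk))
})

socket.on('zip-file-end', () => {
  const received = chunks.reduce((total, c) => total + c.length, 0)
  console.log(`Transfer finished: ${received}/${expectedSize} bytes received`)
})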

View File

@@ -0,0 +1,277 @@
import { beforeEach, describe, expect, it, vi } from 'vitest'
// Use vi.hoisted to define mocks that are available during hoisting
const { mockLogger } = vi.hoisted(() => ({
mockLogger: {
info: vi.fn(),
warn: vi.fn(),
error: vi.fn()
}
}))
vi.mock('@logger', () => ({
loggerService: {
withContext: () => mockLogger
}
}))
vi.mock('electron', () => ({
app: {
getPath: vi.fn((key: string) => {
if (key === 'temp') return '/tmp'
if (key === 'userData') return '/mock/userData'
return '/mock/unknown'
})
}
}))
vi.mock('fs-extra', () => ({
default: {
pathExists: vi.fn(),
remove: vi.fn(),
ensureDir: vi.fn(),
copy: vi.fn(),
readdir: vi.fn(),
stat: vi.fn(),
readFile: vi.fn(),
writeFile: vi.fn(),
createWriteStream: vi.fn(),
createReadStream: vi.fn()
},
pathExists: vi.fn(),
remove: vi.fn(),
ensureDir: vi.fn(),
copy: vi.fn(),
readdir: vi.fn(),
stat: vi.fn(),
readFile: vi.fn(),
writeFile: vi.fn(),
createWriteStream: vi.fn(),
createReadStream: vi.fn()
}))
vi.mock('../WindowService', () => ({
windowService: {
getMainWindow: vi.fn()
}
}))
vi.mock('../WebDav', () => ({
default: vi.fn()
}))
vi.mock('../S3Storage', () => ({
default: vi.fn()
}))
vi.mock('../../utils', () => ({
getDataPath: vi.fn(() => '/mock/data')
}))
vi.mock('archiver', () => ({
default: vi.fn()
}))
vi.mock('node-stream-zip', () => ({
default: vi.fn()
}))
// Import after mocks
import * as fs from 'fs-extra'
import BackupManager from '../BackupManager'
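// These tests verify that deleteTempBackup only removes paths inside the temp lan-transfer directory and rejects traversal, prefix, and escape attempts.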
describe('BackupManager.deleteTempBackup - Security Tests', () => {
let backupManager: BackupManager
beforeEach(() => {
vi.clearAllMocks()
backupManager = new BackupManager()
})
describe('Normal Operations', () => {
it('should delete valid file in allowed directory', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const validPath = '/tmp/cherry-studio/lan-transfer/backup.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, validPath)
expect(result).toBe(true)
expect(fs.remove).toHaveBeenCalledWith(validPath)
expect(mockLogger.info).toHaveBeenCalledWith(expect.stringContaining('Deleted temp backup'))
})
it('should delete file in nested subdirectory', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const nestedPath = '/tmp/cherry-studio/lan-transfer/sub/dir/file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, nestedPath)
expect(result).toBe(true)
expect(fs.remove).toHaveBeenCalledWith(nestedPath)
})
it('should return false when file does not exist', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(false as never)
const missingPath = '/tmp/cherry-studio/lan-transfer/missing.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, missingPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
})
describe('Path Traversal Attacks', () => {
it('should block basic directory traversal attack (../../../../etc/passwd)', async () => {
const attackPath = '/tmp/cherry-studio/lan-transfer/../../../../etc/passwd'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.pathExists).not.toHaveBeenCalled()
expect(fs.remove).not.toHaveBeenCalled()
expect(mockLogger.warn).toHaveBeenCalledWith(expect.stringContaining('outside temp directory'))
})
it('should block absolute path escape (/etc/passwd)', async () => {
const attackPath = '/etc/passwd'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
expect(mockLogger.warn).toHaveBeenCalled()
})
it('should block traversal with multiple slashes', async () => {
const attackPath = '/tmp/cherry-studio/lan-transfer/../../../etc/passwd'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
it('should block relative path traversal from current directory', async () => {
const attackPath = '../../../etc/passwd'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
it('should block traversal to parent directory', async () => {
const attackPath = '/tmp/cherry-studio/lan-transfer/../backup/secret.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
})
describe('Prefix Attacks', () => {
it('should block similar prefix attack (lan-transfer-evil)', async () => {
const attackPath = '/tmp/cherry-studio/lan-transfer-evil/file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
expect(mockLogger.warn).toHaveBeenCalled()
})
it('should block path without separator (lan-transferx)', async () => {
const attackPath = '/tmp/cherry-studio/lan-transferx'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
it('should block different temp directory prefix', async () => {
const attackPath = '/tmp-evil/cherry-studio/lan-transfer/file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, attackPath)
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
})
describe('Error Handling', () => {
it('should return false and log error on permission denied', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockRejectedValue(new Error('EACCES: permission denied') as never)
const validPath = '/tmp/cherry-studio/lan-transfer/file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, validPath)
expect(result).toBe(false)
expect(mockLogger.error).toHaveBeenCalledWith(
expect.stringContaining('Failed to delete'),
expect.any(Error)
)
})
it('should return false on fs.pathExists error', async () => {
vi.mocked(fs.pathExists).mockRejectedValue(new Error('ENOENT') as never)
const validPath = '/tmp/cherry-studio/lan-transfer/file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, validPath)
expect(result).toBe(false)
expect(mockLogger.error).toHaveBeenCalled()
})
it('should handle empty path string', async () => {
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, '')
expect(result).toBe(false)
expect(fs.remove).not.toHaveBeenCalled()
})
})
describe('Edge Cases', () => {
it('should allow deletion of the temp directory itself', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const tempDir = '/tmp/cherry-studio/lan-transfer'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, tempDir)
expect(result).toBe(true)
expect(fs.remove).toHaveBeenCalledWith(tempDir)
})
it('should handle path with trailing slash', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const pathWithSlash = '/tmp/cherry-studio/lan-transfer/sub/'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, pathWithSlash)
// path.normalize removes trailing slash
expect(result).toBe(true)
})
it('should handle file with special characters in name', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const specialPath = '/tmp/cherry-studio/lan-transfer/file with spaces & (special).zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, specialPath)
expect(result).toBe(true)
expect(fs.remove).toHaveBeenCalled()
})
it('should handle path with double slashes', async () => {
vi.mocked(fs.pathExists).mockResolvedValue(true as never)
vi.mocked(fs.remove).mockResolvedValue(undefined as never)
const doubleSlashPath = '/tmp/cherry-studio//lan-transfer//file.zip'
const result = await backupManager.deleteTempBackup({} as Electron.IpcMainInvokeEvent, doubleSlashPath)
// path.normalize handles double slashes
expect(result).toBe(true)
})
})
})

View File

@@ -0,0 +1,481 @@
import { EventEmitter } from 'events'
import { afterEach, beforeEach, describe, expect, it, type Mock, vi } from 'vitest'
// Create mock objects before vi.mock calls
const mockLogger = {
info: vi.fn(),
warn: vi.fn(),
error: vi.fn()
}
let mockMainWindow: {
isDestroyed: Mock
webContents: { send: Mock }
} | null = null
let mockBrowser: EventEmitter & {
start: Mock
stop: Mock
removeAllListeners: Mock
}
let mockBonjour: {
find: Mock
destroy: Mock
}
// Mock dependencies before importing the service
vi.mock('@logger', () => ({
loggerService: {
withContext: () => mockLogger
}
}))
vi.mock('../WindowService', () => ({
windowService: {
getMainWindow: vi.fn(() => mockMainWindow)
}
}))
vi.mock('bonjour-service', () => ({
default: vi.fn(() => mockBonjour)
}))
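// Each test re-imports the service after vi.resetModules() so the module-level singleton is recreated against fresh mocks.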
describe('LocalTransferService', () => {
beforeEach(() => {
vi.clearAllMocks()
vi.resetModules()
// Reset mock objects
mockMainWindow = {
isDestroyed: vi.fn(() => false),
webContents: { send: vi.fn() }
}
mockBrowser = Object.assign(new EventEmitter(), {
start: vi.fn(),
stop: vi.fn(),
removeAllListeners: vi.fn()
})
mockBonjour = {
find: vi.fn(() => mockBrowser),
destroy: vi.fn()
}
})
afterEach(() => {
vi.resetAllMocks()
})
describe('startDiscovery', () => {
it('should set isScanning to true and start browser', async () => {
const { localTransferService } = await import('../LocalTransferService')
const state = localTransferService.startDiscovery()
expect(state.isScanning).toBe(true)
expect(state.lastScanStartedAt).toBeDefined()
expect(mockBonjour.find).toHaveBeenCalledWith({ type: 'cherrystudio', protocol: 'tcp' })
expect(mockBrowser.start).toHaveBeenCalled()
})
it('should clear services when resetList is true', async () => {
const { localTransferService } = await import('../LocalTransferService')
// First, start discovery and add a service
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
fqdn: 'test.local'
})
expect(localTransferService.getState().services).toHaveLength(1)
// Now restart with resetList
const state = localTransferService.startDiscovery({ resetList: true })
expect(state.services).toHaveLength(0)
})
it('should broadcast state after starting discovery', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
expect(mockMainWindow?.webContents.send).toHaveBeenCalled()
})
it('should handle browser.start() error', async () => {
mockBrowser.start.mockImplementation(() => {
throw new Error('Failed to start mDNS')
})
const { localTransferService } = await import('../LocalTransferService')
const state = localTransferService.startDiscovery()
expect(state.lastError).toBe('Failed to start mDNS')
expect(mockLogger.error).toHaveBeenCalled()
})
})
describe('stopDiscovery', () => {
it('should set isScanning to false and stop browser', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
const state = localTransferService.stopDiscovery()
expect(state.isScanning).toBe(false)
expect(mockBrowser.stop).toHaveBeenCalled()
})
it('should handle browser.stop() error gracefully', async () => {
mockBrowser.stop.mockImplementation(() => {
throw new Error('Stop failed')
})
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
// Should not throw
expect(() => localTransferService.stopDiscovery()).not.toThrow()
expect(mockLogger.warn).toHaveBeenCalled()
})
it('should broadcast state after stopping', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
vi.clearAllMocks()
localTransferService.stopDiscovery()
expect(mockMainWindow?.webContents.send).toHaveBeenCalled()
})
})
describe('browser events', () => {
it('should add service on "up" event', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
fqdn: 'test.local',
type: 'cherrystudio',
protocol: 'tcp'
})
const state = localTransferService.getState()
expect(state.services).toHaveLength(1)
expect(state.services[0].name).toBe('Test Service')
expect(state.services[0].port).toBe(12345)
expect(state.services[0].addresses).toContain('192.168.1.100')
})
it('should remove service on "down" event', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
// Add service
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
fqdn: 'test.local'
})
expect(localTransferService.getState().services).toHaveLength(1)
// Remove service
mockBrowser.emit('down', {
name: 'Test Service',
host: 'localhost',
port: 12345,
fqdn: 'test.local'
})
expect(localTransferService.getState().services).toHaveLength(0)
expect(mockLogger.info).toHaveBeenCalledWith(expect.stringContaining('removed'))
})
it('should set lastError on "error" event', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('error', new Error('Discovery failed'))
const state = localTransferService.getState()
expect(state.lastError).toBe('Discovery failed')
expect(mockLogger.error).toHaveBeenCalled()
})
it('should handle non-Error objects in error event', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('error', 'String error message')
const state = localTransferService.getState()
expect(state.lastError).toBe('String error message')
})
})
describe('getState', () => {
it('should return sorted services by name', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Zebra Service',
host: 'host1',
port: 1001,
addresses: ['192.168.1.1']
})
mockBrowser.emit('up', {
name: 'Alpha Service',
host: 'host2',
port: 1002,
addresses: ['192.168.1.2']
})
const state = localTransferService.getState()
expect(state.services[0].name).toBe('Alpha Service')
expect(state.services[1].name).toBe('Zebra Service')
})
it('should include all state properties', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
const state = localTransferService.getState()
expect(state).toHaveProperty('services')
expect(state).toHaveProperty('isScanning')
expect(state).toHaveProperty('lastScanStartedAt')
expect(state).toHaveProperty('lastUpdatedAt')
})
})
describe('getPeerById', () => {
it('should return peer when exists', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
fqdn: 'test.local'
})
const services = localTransferService.getState().services
const peer = localTransferService.getPeerById(services[0].id)
expect(peer).toBeDefined()
expect(peer?.name).toBe('Test Service')
})
it('should return undefined when peer does not exist', async () => {
const { localTransferService } = await import('../LocalTransferService')
const peer = localTransferService.getPeerById('non-existent-id')
expect(peer).toBeUndefined()
})
})
describe('normalizeService', () => {
it('should deduplicate addresses', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100', '192.168.1.100', '10.0.0.1'],
referer: { address: '192.168.1.100' }
})
const services = localTransferService.getState().services
expect(services[0].addresses).toHaveLength(2)
expect(services[0].addresses).toContain('192.168.1.100')
expect(services[0].addresses).toContain('10.0.0.1')
})
it('should filter empty addresses', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100', '', null as any]
})
const services = localTransferService.getState().services
expect(services[0].addresses).toEqual(['192.168.1.100'])
})
it('should convert txt null/undefined values to empty strings', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
txt: {
version: '1.0',
nullValue: null,
undefinedValue: undefined,
numberValue: 42
}
})
const services = localTransferService.getState().services
expect(services[0].txt).toEqual({
version: '1.0',
nullValue: '',
undefinedValue: '',
numberValue: '42'
})
})
it('should not include txt when empty', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100'],
txt: {}
})
const services = localTransferService.getState().services
expect(services[0].txt).toBeUndefined()
})
})
describe('dispose', () => {
it('should clean up all resources', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
mockBrowser.emit('up', {
name: 'Test Service',
host: 'localhost',
port: 12345,
addresses: ['192.168.1.100']
})
localTransferService.dispose()
expect(localTransferService.getState().services).toHaveLength(0)
expect(localTransferService.getState().isScanning).toBe(false)
expect(mockBrowser.removeAllListeners).toHaveBeenCalled()
expect(mockBonjour.destroy).toHaveBeenCalled()
})
it('should handle bonjour.destroy() error gracefully', async () => {
mockBonjour.destroy.mockImplementation(() => {
throw new Error('Destroy failed')
})
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
// Should not throw
expect(() => localTransferService.dispose()).not.toThrow()
expect(mockLogger.warn).toHaveBeenCalled()
})
it('should be safe to call multiple times', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
expect(() => {
localTransferService.dispose()
localTransferService.dispose()
}).not.toThrow()
})
})
describe('broadcastState', () => {
it('should not throw when main window is null', async () => {
mockMainWindow = null
const { localTransferService } = await import('../LocalTransferService')
// Should not throw
expect(() => localTransferService.startDiscovery()).not.toThrow()
})
it('should not throw when main window is destroyed', async () => {
mockMainWindow = {
isDestroyed: vi.fn(() => true),
webContents: { send: vi.fn() }
}
const { localTransferService } = await import('../LocalTransferService')
// Should not throw
expect(() => localTransferService.startDiscovery()).not.toThrow()
expect(mockMainWindow.webContents.send).not.toHaveBeenCalled()
})
})
describe('restartBrowser', () => {
it('should destroy old bonjour instance to prevent socket leaks', async () => {
const { localTransferService } = await import('../LocalTransferService')
// First start
localTransferService.startDiscovery()
expect(mockBonjour.destroy).not.toHaveBeenCalled()
// Restart - should destroy old instance
localTransferService.startDiscovery()
expect(mockBonjour.destroy).toHaveBeenCalled()
})
it('should remove all listeners from old browser', async () => {
const { localTransferService } = await import('../LocalTransferService')
localTransferService.startDiscovery()
localTransferService.startDiscovery()
expect(mockBrowser.removeAllListeners).toHaveBeenCalled()
})
})
})

View File

@@ -15,8 +15,8 @@ import { query } from '@anthropic-ai/claude-agent-sdk'
import { loggerService } from '@logger'
import { config as apiConfigService } from '@main/apiServer/config'
import { validateModelId } from '@main/apiServer/utils'
-import { ConfigKeys, configManager } from '@main/services/ConfigManager'
-import { validateGitBashPath } from '@main/utils/process'
+import { isWin } from '@main/constant'
+import { autoDiscoverGitBash } from '@main/utils/process'
import getLoginShellEnvironment from '@main/utils/shell-env'
import { app } from 'electron'
@@ -109,7 +109,8 @@ class ClaudeCodeService implements AgentServiceInterface {
Object.entries(loginShellEnv).filter(([key]) => !key.toLowerCase().endsWith('_proxy'))
) as Record<string, string>
-const customGitBashPath = validateGitBashPath(configManager.get(ConfigKeys.GitBashPath) as string | undefined)
+// Auto-discover Git Bash path on Windows (already logs internally)
+const customGitBashPath = isWin ? autoDiscoverGitBash() : null
const env = {
...loginShellEnvWithoutProxies,

View File

@@ -0,0 +1,525 @@
import * as crypto from 'node:crypto'
import { createConnection, type Socket } from 'node:net'
import { loggerService } from '@logger'
import type {
LanClientEvent,
LanFileCompleteMessage,
LanHandshakeAckMessage,
LocalTransferConnectPayload,
LocalTransferPeer
} from '@shared/config/types'
import { LAN_TRANSFER_GLOBAL_TIMEOUT_MS } from '@shared/config/types'
import { IpcChannel } from '@shared/IpcChannel'
import { localTransferService } from '../LocalTransferService'
import { windowService } from '../WindowService'
import {
abortTransfer,
buildHandshakeMessage,
calculateFileChecksum,
cleanupTransfer,
createDataHandler,
createTransferState,
formatFileSize,
HANDSHAKE_PROTOCOL_VERSION,
pickHost,
sendFileEnd,
sendFileStart,
sendTestPing,
streamFileChunks,
validateFile,
waitForFileComplete,
waitForFileStartAck
} from './handlers'
import { ResponseManager } from './responseManager'
import type { ActiveFileTransfer, ConnectionContext, FileTransferContext } from './types'
const DEFAULT_HANDSHAKE_TIMEOUT_MS = 10_000
const logger = loggerService.withContext('LanTransferClientService')
/**
* LAN Transfer Client Service
*
* Handles outgoing file transfers to LAN peers via TCP.
* Protocol v1 with streaming mode (no per-chunk acknowledgment).
*/
class LanTransferClientService {
private socket: Socket | null = null
private currentPeer?: LocalTransferPeer
private dataHandler?: ReturnType<typeof createDataHandler>
private responseManager = new ResponseManager()
private isConnecting = false
private activeTransfer?: ActiveFileTransfer
private lastConnectOptions?: LocalTransferConnectPayload
private consecutiveJsonErrors = 0
private static readonly MAX_CONSECUTIVE_JSON_ERRORS = 3
private reconnectPromise: Promise<void> | null = null
constructor() {
this.responseManager.setTimeoutCallback(() => void this.disconnect())
}
/**
* Connect to a LAN peer and perform handshake.
*/
public async connectAndHandshake(options: LocalTransferConnectPayload): Promise<LanHandshakeAckMessage> {
if (this.isConnecting) {
throw new Error('LAN transfer client is busy')
}
const peer = localTransferService.getPeerById(options.peerId)
if (!peer) {
throw new Error('Selected LAN peer is no longer available')
}
if (!peer.port) {
throw new Error('Selected peer does not expose a TCP port')
}
const host = pickHost(peer)
if (!host) {
throw new Error('Unable to resolve a reachable host for the peer')
}
await this.disconnect()
this.isConnecting = true
return new Promise<LanHandshakeAckMessage>((resolve, reject) => {
const socket = createConnection({ host, port: peer.port as number }, () => {
logger.info(`Connected to LAN peer ${peer.name} (${host}:${peer.port})`)
socket.setKeepAlive(true, 30_000)
this.socket = socket
this.currentPeer = peer
this.attachSocketListeners(socket)
this.responseManager.waitForResponse(
'handshake_ack',
options.timeoutMs ?? DEFAULT_HANDSHAKE_TIMEOUT_MS,
(payload) => {
const ack = payload as LanHandshakeAckMessage
if (!ack.accepted) {
const message = ack.message || 'Handshake rejected by remote device'
logger.warn(`Handshake rejected by ${peer.name}: ${message}`)
this.broadcastClientEvent({
type: 'error',
message,
timestamp: Date.now()
})
reject(new Error(message))
void this.disconnect()
return
}
logger.info(`Handshake accepted by ${peer.name}`)
socket.setTimeout(0)
this.isConnecting = false
this.lastConnectOptions = options
sendTestPing(this.createConnectionContext())
resolve(ack)
},
(error) => {
this.isConnecting = false
reject(error)
}
)
const handshakeMessage = buildHandshakeMessage()
this.sendControlMessage(handshakeMessage)
})
socket.setTimeout(options.timeoutMs ?? DEFAULT_HANDSHAKE_TIMEOUT_MS, () => {
const error = new Error('Handshake timed out')
logger.error('LAN transfer socket timeout', error)
this.broadcastClientEvent({
type: 'error',
message: error.message,
timestamp: Date.now()
})
reject(error)
socket.destroy(error)
void this.disconnect()
})
socket.once('error', (error) => {
logger.error('LAN transfer socket error', error as Error)
const message = error instanceof Error ? error.message : String(error)
this.broadcastClientEvent({
type: 'error',
message,
timestamp: Date.now()
})
this.isConnecting = false
reject(error instanceof Error ? error : new Error(message))
void this.disconnect()
})
socket.once('close', () => {
logger.info('LAN transfer socket closed')
if (this.socket === socket) {
this.socket = null
this.dataHandler?.resetBuffer()
this.responseManager.rejectAll(new Error('LAN transfer socket closed'))
this.currentPeer = undefined
abortTransfer(this.activeTransfer, new Error('LAN transfer socket closed'))
}
this.isConnecting = false
this.broadcastClientEvent({
type: 'socket_closed',
reason: 'connection_closed',
timestamp: Date.now()
})
})
})
}
/**
* Disconnect from the current peer.
*/
public async disconnect(): Promise<void> {
const socket = this.socket
if (!socket) {
return
}
this.socket = null
this.dataHandler?.resetBuffer()
this.currentPeer = undefined
this.responseManager.rejectAll(new Error('LAN transfer socket disconnected'))
abortTransfer(this.activeTransfer, new Error('LAN transfer socket disconnected'))
const DISCONNECT_TIMEOUT_MS = 3000
await new Promise<void>((resolve) => {
const timeout = setTimeout(() => {
logger.warn('Disconnect timeout, forcing cleanup')
socket.removeAllListeners()
resolve()
}, DISCONNECT_TIMEOUT_MS)
socket.once('close', () => {
clearTimeout(timeout)
resolve()
})
socket.destroy()
})
}
/**
* Dispose the service and clean up all resources.
*/
public dispose(): void {
this.responseManager.rejectAll(new Error('LAN transfer client disposed'))
cleanupTransfer(this.activeTransfer)
this.activeTransfer = undefined
if (this.socket) {
this.socket.destroy()
this.socket = null
}
this.dataHandler?.resetBuffer()
this.isConnecting = false
}
/**
* Send a ZIP file to the connected peer.
*/
public async sendFile(filePath: string): Promise<LanFileCompleteMessage> {
await this.ensureConnection()
if (this.activeTransfer) {
throw new Error('A file transfer is already in progress')
}
// Validate file
const { stats, fileName } = await validateFile(filePath)
// Calculate checksum
logger.info('Calculating file checksum...')
const checksum = await calculateFileChecksum(filePath)
logger.info(`File checksum: ${checksum.substring(0, 16)}...`)
// The connection can drop while the file is being validated and checksummed; make sure it is still ready before starting the transfer.
await this.ensureConnection()
// Initialize transfer state
const transferId = crypto.randomUUID()
this.activeTransfer = createTransferState(transferId, fileName, stats.size, checksum)
logger.info(
`Starting file transfer: ${fileName} (${formatFileSize(stats.size)}, ${this.activeTransfer.totalChunks} chunks)`
)
// Global timeout
const globalTimeoutError = new Error('Transfer timed out (global timeout exceeded)')
const globalTimeoutHandle = setTimeout(() => {
logger.warn('Global transfer timeout exceeded, aborting transfer', { transferId, fileName })
abortTransfer(this.activeTransfer, globalTimeoutError)
}, LAN_TRANSFER_GLOBAL_TIMEOUT_MS)
try {
const result = await this.performFileTransfer(filePath, transferId, fileName)
return result
} catch (error) {
const message = error instanceof Error ? error.message : String(error)
logger.error(`File transfer failed: ${message}`)
this.broadcastClientEvent({
type: 'file_transfer_complete',
transferId,
fileName,
success: false,
error: message,
timestamp: Date.now()
})
throw error
} finally {
clearTimeout(globalTimeoutHandle)
cleanupTransfer(this.activeTransfer)
this.activeTransfer = undefined
}
}
/**
* Cancel the current file transfer.
*/
public cancelTransfer(): void {
if (!this.activeTransfer) {
logger.warn('No active transfer to cancel')
return
}
const { transferId, fileName } = this.activeTransfer
logger.info(`Cancelling file transfer: ${fileName}`)
this.activeTransfer.isCancelled = true
try {
this.sendControlMessage({
type: 'file_cancel',
transferId,
reason: 'Cancelled by user'
})
} catch (error) {
// Expected when connection is already broken
logger.warn('Failed to send cancel message', error as Error)
}
abortTransfer(this.activeTransfer, new Error('Transfer cancelled by user'))
}
// =============================================================================
// Private Methods
// =============================================================================
private async ensureConnection(): Promise<void> {
// Check socket is valid and writable (not just undestroyed)
if (this.socket && !this.socket.destroyed && this.socket.writable && this.currentPeer) {
return
}
if (!this.lastConnectOptions) {
throw new Error('No active connection. Please connect to a peer first.')
}
// Prevent concurrent reconnection attempts
if (this.reconnectPromise) {
logger.debug('Waiting for existing reconnection attempt...')
await this.reconnectPromise
return
}
logger.info('Connection lost, attempting to reconnect...')
this.reconnectPromise = this.connectAndHandshake(this.lastConnectOptions)
.then(() => {
// Handshake succeeded, connection restored
})
.finally(() => {
this.reconnectPromise = null
})
await this.reconnectPromise
}
private async performFileTransfer(
filePath: string,
transferId: string,
fileName: string
): Promise<LanFileCompleteMessage> {
const transfer = this.activeTransfer!
const ctx = this.createFileTransferContext()
// Step 1: Send file_start
sendFileStart(ctx, transfer)
// Step 2: Wait for file_start_ack
const startAck = await waitForFileStartAck(ctx, transferId, transfer.abortController.signal)
if (!startAck.accepted) {
throw new Error(startAck.message || 'Transfer rejected by receiver')
}
logger.info('Received file_start_ack: accepted')
// Step 3: Stream file chunks
await streamFileChunks(this.socket!, filePath, transfer, transfer.abortController.signal, (bytesSent, chunkIndex) =>
this.onTransferProgress(transfer, bytesSent, chunkIndex)
)
// Step 4: Send file_end
sendFileEnd(ctx, transferId)
// Step 5: Wait for file_complete
const result = await waitForFileComplete(ctx, transferId, transfer.abortController.signal)
logger.info(`File transfer ${result.success ? 'completed' : 'failed'}`)
// Broadcast completion
this.broadcastClientEvent({
type: 'file_transfer_complete',
transferId,
fileName,
success: result.success,
filePath: result.filePath,
error: result.error,
timestamp: Date.now()
})
return result
}
private onTransferProgress(transfer: ActiveFileTransfer, bytesSent: number, chunkIndex: number): void {
const progress = (bytesSent / transfer.fileSize) * 100
const elapsed = (Date.now() - transfer.startedAt) / 1000
const speed = elapsed > 0 ? bytesSent / elapsed : 0
this.broadcastClientEvent({
type: 'file_transfer_progress',
transferId: transfer.transferId,
fileName: transfer.fileName,
bytesSent,
totalBytes: transfer.fileSize,
chunkIndex,
totalChunks: transfer.totalChunks,
progress: Math.round(progress * 100) / 100,
speed,
timestamp: Date.now()
})
}
private attachSocketListeners(socket: Socket): void {
this.dataHandler = createDataHandler((line) => this.handleControlLine(line))
socket.on('data', (chunk: Buffer) => {
try {
this.dataHandler?.handleData(chunk)
} catch (error) {
logger.error('Data handler error', error as Error)
void this.disconnect()
}
})
}
private handleControlLine(line: string): void {
let payload: Record<string, unknown>
try {
payload = JSON.parse(line)
this.consecutiveJsonErrors = 0 // Reset on successful parse
} catch {
this.consecutiveJsonErrors++
logger.warn('Received invalid JSON control message', { line, consecutiveErrors: this.consecutiveJsonErrors })
if (this.consecutiveJsonErrors >= LanTransferClientService.MAX_CONSECUTIVE_JSON_ERRORS) {
const message = `Protocol error: ${this.consecutiveJsonErrors} consecutive invalid messages, disconnecting`
logger.error(message)
this.broadcastClientEvent({
type: 'error',
message,
timestamp: Date.now()
})
void this.disconnect()
}
return
}
const type = payload?.type as string | undefined
if (!type) {
logger.warn('Received control message without type', payload)
return
}
// Try to resolve a pending response
const transferId = payload?.transferId as string | undefined
const chunkIndex = payload?.chunkIndex as number | undefined
if (this.responseManager.tryResolve(type, payload, transferId, chunkIndex)) {
return
}
logger.info('Received control message', payload)
if (type === 'pong') {
this.broadcastClientEvent({
type: 'pong',
payload: payload?.payload as string | undefined,
received: payload?.received as boolean | undefined,
timestamp: Date.now()
})
return
}
// Ignore late-arriving file transfer messages
const fileTransferMessageTypes = ['file_start_ack', 'file_complete']
if (fileTransferMessageTypes.includes(type)) {
logger.debug('Ignoring late file transfer message', { type, payload })
return
}
this.broadcastClientEvent({
type: 'error',
message: `Unexpected control message type: ${type}`,
timestamp: Date.now()
})
}
private sendControlMessage(message: Record<string, unknown>): void {
if (!this.socket || this.socket.destroyed || !this.socket.writable) {
throw new Error('Socket is not connected')
}
const payload = JSON.stringify(message)
this.socket.write(`${payload}\n`)
}
private createConnectionContext(): ConnectionContext {
return {
socket: this.socket,
currentPeer: this.currentPeer,
sendControlMessage: (msg) => this.sendControlMessage(msg),
broadcastClientEvent: (event) => this.broadcastClientEvent(event)
}
}
private createFileTransferContext(): FileTransferContext {
return {
...this.createConnectionContext(),
activeTransfer: this.activeTransfer,
setActiveTransfer: (transfer) => {
this.activeTransfer = transfer
},
waitForResponse: (type, timeoutMs, resolve, reject, transferId, chunkIndex, abortSignal) => {
this.responseManager.waitForResponse(type, timeoutMs, resolve, reject, transferId, chunkIndex, abortSignal)
}
}
}
private broadcastClientEvent(event: LanClientEvent): void {
const mainWindow = windowService.getMainWindow()
if (!mainWindow || mainWindow.isDestroyed()) {
return
}
mainWindow.webContents.send(IpcChannel.LocalTransfer_ClientEvent, {
...event,
peerId: event.peerId ?? this.currentPeer?.id,
peerName: event.peerName ?? this.currentPeer?.name
})
}
}
export const lanTransferClientService = new LanTransferClientService()
// Re-export for backward compatibility
export { HANDSHAKE_PROTOCOL_VERSION }
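A minimal usage sketch (not part of the diff) of how a caller might drive this client. The relative import path and helper name are illustrative; the `{ peerId, type: 'connect' }` payload shape follows the unit tests in the next file, and error handling is elided.

import { lanTransferClientService } from './LanTransferClientService'

// Hypothetical helper: connect, stream a backup ZIP, then tear the connection down.
async function sendBackupToPeer(peerId: string, zipPath: string) {
  // The handshake uses the 10s default timeout unless timeoutMs is supplied.
  await lanTransferClientService.connectAndHandshake({ peerId, type: 'connect' })
  try {
    // Resolves with the receiver's file_complete message (success flag, optional filePath/error).
    return await lanTransferClientService.sendFile(zipPath)
  } finally {
    await lanTransferClientService.disconnect()
  }
}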

View File

@@ -0,0 +1,137 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
// Mock dependencies before importing the service
vi.mock('node:net', async (importOriginal) => {
const actual = (await importOriginal()) as Record<string, unknown>
return {
...actual,
createConnection: vi.fn()
}
})
vi.mock('electron', () => ({
app: {
getName: vi.fn(() => 'Cherry Studio'),
getVersion: vi.fn(() => '1.0.0')
}
}))
vi.mock('../../LocalTransferService', () => ({
localTransferService: {
getPeerById: vi.fn()
}
}))
vi.mock('../../WindowService', () => ({
windowService: {
getMainWindow: vi.fn(() => ({
isDestroyed: () => false,
webContents: {
send: vi.fn()
}
}))
}
}))
// Import after mocks
import { localTransferService } from '../../LocalTransferService'
describe('LanTransferClientService', () => {
beforeEach(() => {
vi.clearAllMocks()
vi.resetModules()
})
afterEach(() => {
vi.resetAllMocks()
})
describe('connectAndHandshake - validation', () => {
it('should throw error when peer is not found', async () => {
vi.mocked(localTransferService.getPeerById).mockReturnValue(undefined)
const { lanTransferClientService } = await import('../LanTransferClientService')
await expect(
lanTransferClientService.connectAndHandshake({
peerId: 'non-existent',
type: 'connect'
})
).rejects.toThrow('Selected LAN peer is no longer available')
})
it('should throw error when peer has no port', async () => {
vi.mocked(localTransferService.getPeerById).mockReturnValue({
id: 'test-peer',
name: 'Test Peer',
addresses: ['192.168.1.100'],
updatedAt: Date.now()
})
const { lanTransferClientService } = await import('../LanTransferClientService')
await expect(
lanTransferClientService.connectAndHandshake({
peerId: 'test-peer',
type: 'connect'
})
).rejects.toThrow('Selected peer does not expose a TCP port')
})
it('should throw error when no reachable host', async () => {
vi.mocked(localTransferService.getPeerById).mockReturnValue({
id: 'test-peer',
name: 'Test Peer',
port: 12345,
addresses: [],
updatedAt: Date.now()
})
const { lanTransferClientService } = await import('../LanTransferClientService')
await expect(
lanTransferClientService.connectAndHandshake({
peerId: 'test-peer',
type: 'connect'
})
).rejects.toThrow('Unable to resolve a reachable host for the peer')
})
})
describe('cancelTransfer', () => {
it('should not throw when no active transfer', async () => {
const { lanTransferClientService } = await import('../LanTransferClientService')
// Should not throw, just log warning
expect(() => lanTransferClientService.cancelTransfer()).not.toThrow()
})
})
describe('dispose', () => {
it('should clean up resources without throwing', async () => {
const { lanTransferClientService } = await import('../LanTransferClientService')
// Should not throw
expect(() => lanTransferClientService.dispose()).not.toThrow()
})
})
describe('sendFile', () => {
it('should throw error when not connected', async () => {
const { lanTransferClientService } = await import('../LanTransferClientService')
await expect(lanTransferClientService.sendFile('/path/to/file.zip')).rejects.toThrow(
'No active connection. Please connect to a peer first.'
)
})
})
describe('HANDSHAKE_PROTOCOL_VERSION', () => {
it('should export protocol version', async () => {
const { HANDSHAKE_PROTOCOL_VERSION } = await import('../LanTransferClientService')
expect(HANDSHAKE_PROTOCOL_VERSION).toBe('1')
})
})
})

View File

@@ -0,0 +1,103 @@
import { EventEmitter } from 'node:events'
import type { Socket } from 'node:net'
import { beforeEach, describe, expect, it, vi } from 'vitest'
import { BINARY_TYPE_FILE_CHUNK, sendBinaryChunk } from '../binaryProtocol'
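// Frame layout asserted by these tests:
//   magic "CS" (2 bytes) | totalLen UInt32BE | type (1 byte) | tidLen UInt16BE | transferId | chunkIndex UInt32BE | chunk data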
describe('binaryProtocol', () => {
describe('sendBinaryChunk', () => {
let mockSocket: Socket
let writtenBuffers: Buffer[]
beforeEach(() => {
writtenBuffers = []
mockSocket = Object.assign(new EventEmitter(), {
destroyed: false,
writable: true,
write: vi.fn((buffer: Buffer) => {
writtenBuffers.push(Buffer.from(buffer))
return true
}),
cork: vi.fn(),
uncork: vi.fn()
}) as unknown as Socket
})
it('should send binary chunk with correct frame format', () => {
const transferId = 'test-uuid-1234'
const chunkIndex = 5
const data = Buffer.from('test data chunk')
const result = sendBinaryChunk(mockSocket, transferId, chunkIndex, data)
expect(result).toBe(true)
expect(mockSocket.cork).toHaveBeenCalled()
expect(mockSocket.uncork).toHaveBeenCalled()
expect(mockSocket.write).toHaveBeenCalledTimes(2)
// Verify header structure
const header = writtenBuffers[0]
// Magic bytes "CS"
expect(header[0]).toBe(0x43)
expect(header[1]).toBe(0x53)
// Type byte
const typeOffset = 2 + 4 // magic + totalLen
expect(header[typeOffset]).toBe(BINARY_TYPE_FILE_CHUNK)
// TransferId length
const tidLenOffset = typeOffset + 1
const tidLen = header.readUInt16BE(tidLenOffset)
expect(tidLen).toBe(Buffer.from(transferId).length)
// ChunkIndex
const chunkIdxOffset = tidLenOffset + 2 + tidLen
expect(header.readUInt32BE(chunkIdxOffset)).toBe(chunkIndex)
// Data buffer
expect(writtenBuffers[1].toString()).toBe('test data chunk')
})
it('should return false when socket write returns false (backpressure)', () => {
;(mockSocket.write as ReturnType<typeof vi.fn>).mockReturnValueOnce(false)
const result = sendBinaryChunk(mockSocket, 'test-id', 0, Buffer.from('data'))
expect(result).toBe(false)
})
it('should correctly calculate totalLen in frame header', () => {
const transferId = 'uuid-1234'
const data = Buffer.from('chunk data here')
sendBinaryChunk(mockSocket, transferId, 0, data)
const header = writtenBuffers[0]
const totalLen = header.readUInt32BE(2) // After magic bytes
// totalLen = type(1) + tidLen(2) + tid(n) + idx(4) + data(m)
const expectedTotalLen = 1 + 2 + Buffer.from(transferId).length + 4 + data.length
expect(totalLen).toBe(expectedTotalLen)
})
it('should throw error when socket is not writable', () => {
mockSocket.writable = false
expect(() => sendBinaryChunk(mockSocket, 'test-id', 0, Buffer.from('data'))).toThrow('Socket is not writable')
})
it('should throw error when socket is destroyed', () => {
mockSocket.destroyed = true
expect(() => sendBinaryChunk(mockSocket, 'test-id', 0, Buffer.from('data'))).toThrow('Socket is not writable')
})
})
describe('BINARY_TYPE_FILE_CHUNK', () => {
it('should be 0x01', () => {
expect(BINARY_TYPE_FILE_CHUNK).toBe(0x01)
})
})
})

View File

@@ -0,0 +1,265 @@
import { EventEmitter } from 'node:events'
import type { Socket } from 'node:net'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import {
buildHandshakeMessage,
createDataHandler,
getAbortError,
HANDSHAKE_PROTOCOL_VERSION,
pickHost,
waitForSocketDrain
} from '../../handlers/connection'
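// The control channel is newline-delimited JSON: createDataHandler buffers partial lines, trims and skips empty ones, and rejects any single message larger than 1 MB.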
// Mock electron app
vi.mock('electron', () => ({
app: {
getName: vi.fn(() => 'Cherry Studio'),
getVersion: vi.fn(() => '1.0.0')
}
}))
describe('connection handlers', () => {
describe('buildHandshakeMessage', () => {
it('should build handshake message with correct structure', () => {
const message = buildHandshakeMessage()
expect(message.type).toBe('handshake')
expect(message.deviceName).toBe('Cherry Studio')
expect(message.version).toBe(HANDSHAKE_PROTOCOL_VERSION)
expect(message.appVersion).toBe('1.0.0')
expect(typeof message.platform).toBe('string')
})
it('should use protocol version 1', () => {
expect(HANDSHAKE_PROTOCOL_VERSION).toBe('1')
})
})
describe('pickHost', () => {
it('should prefer IPv4 addresses', () => {
const peer = {
id: '1',
name: 'Test',
addresses: ['fe80::1', '192.168.1.100', '::1'],
updatedAt: Date.now()
}
expect(pickHost(peer)).toBe('192.168.1.100')
})
it('should fall back to first address if no IPv4', () => {
const peer = {
id: '1',
name: 'Test',
addresses: ['fe80::1', '::1'],
updatedAt: Date.now()
}
expect(pickHost(peer)).toBe('fe80::1')
})
it('should fall back to host property if no addresses', () => {
const peer = {
id: '1',
name: 'Test',
host: 'example.local',
addresses: [],
updatedAt: Date.now()
}
expect(pickHost(peer)).toBe('example.local')
})
it('should return undefined if no addresses or host', () => {
const peer = {
id: '1',
name: 'Test',
addresses: [],
updatedAt: Date.now()
}
expect(pickHost(peer)).toBeUndefined()
})
})
describe('createDataHandler', () => {
it('should parse complete lines from buffer', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from('{"type":"test"}\n'))
expect(lines).toEqual(['{"type":"test"}'])
})
it('should handle partial lines across multiple chunks', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from('{"type":'))
handler.handleData(Buffer.from('"test"}\n'))
expect(lines).toEqual(['{"type":"test"}'])
})
it('should handle multiple lines in single chunk', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from('{"a":1}\n{"b":2}\n'))
expect(lines).toEqual(['{"a":1}', '{"b":2}'])
})
it('should reset buffer', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from('partial'))
handler.resetBuffer()
handler.handleData(Buffer.from('{"complete":true}\n'))
expect(lines).toEqual(['{"complete":true}'])
})
it('should trim whitespace from lines', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from(' {"type":"test"} \n'))
expect(lines).toEqual(['{"type":"test"}'])
})
it('should skip empty lines', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
handler.handleData(Buffer.from('\n\n{"type":"test"}\n\n'))
expect(lines).toEqual(['{"type":"test"}'])
})
it('should throw error when buffer exceeds MAX_LINE_BUFFER_SIZE', () => {
const handler = createDataHandler(vi.fn())
// Create a buffer larger than 1MB (MAX_LINE_BUFFER_SIZE)
const largeData = 'x'.repeat(1024 * 1024 + 1)
expect(() => handler.handleData(Buffer.from(largeData))).toThrow('Control message too large')
})
it('should reset buffer after exceeding MAX_LINE_BUFFER_SIZE', () => {
const lines: string[] = []
const handler = createDataHandler((line) => lines.push(line))
// Create a buffer larger than 1MB
const largeData = 'x'.repeat(1024 * 1024 + 1)
try {
handler.handleData(Buffer.from(largeData))
} catch {
// Expected error
}
// Buffer should be reset, so lineBuffer should be empty
expect(handler.lineBuffer).toBe('')
})
})
describe('waitForSocketDrain', () => {
let mockSocket: Socket & EventEmitter
beforeEach(() => {
mockSocket = Object.assign(new EventEmitter(), {
destroyed: false,
writable: true,
write: vi.fn(),
off: vi.fn(),
removeAllListeners: vi.fn()
}) as unknown as Socket & EventEmitter
})
afterEach(() => {
vi.resetAllMocks()
})
it('should throw error when abort signal is already aborted', async () => {
const abortController = new AbortController()
abortController.abort(new Error('Already aborted'))
await expect(waitForSocketDrain(mockSocket, abortController.signal)).rejects.toThrow('Already aborted')
})
it('should throw error when socket is destroyed', async () => {
mockSocket.destroyed = true
const abortController = new AbortController()
await expect(waitForSocketDrain(mockSocket, abortController.signal)).rejects.toThrow('Socket is closed')
})
it('should resolve when drain event is emitted', async () => {
const abortController = new AbortController()
const drainPromise = waitForSocketDrain(mockSocket, abortController.signal)
// Emit drain event after a short delay
setImmediate(() => mockSocket.emit('drain'))
await expect(drainPromise).resolves.toBeUndefined()
})
it('should reject when close event is emitted', async () => {
const abortController = new AbortController()
const drainPromise = waitForSocketDrain(mockSocket, abortController.signal)
setImmediate(() => mockSocket.emit('close'))
await expect(drainPromise).rejects.toThrow('Socket closed while waiting for drain')
})
it('should reject when error event is emitted', async () => {
const abortController = new AbortController()
const drainPromise = waitForSocketDrain(mockSocket, abortController.signal)
setImmediate(() => mockSocket.emit('error', new Error('Network error')))
await expect(drainPromise).rejects.toThrow('Network error')
})
it('should reject when abort signal is triggered', async () => {
const abortController = new AbortController()
const drainPromise = waitForSocketDrain(mockSocket, abortController.signal)
setImmediate(() => abortController.abort(new Error('User cancelled')))
await expect(drainPromise).rejects.toThrow('User cancelled')
})
})
describe('getAbortError', () => {
it('should return Error reason directly', () => {
const originalError = new Error('Original')
const signal = { aborted: true, reason: originalError } as AbortSignal
expect(getAbortError(signal, 'Fallback')).toBe(originalError)
})
it('should create Error from string reason', () => {
const signal = { aborted: true, reason: 'String reason' } as AbortSignal
expect(getAbortError(signal, 'Fallback').message).toBe('String reason')
})
it('should use fallback for empty reason', () => {
const signal = { aborted: true, reason: '' } as AbortSignal
expect(getAbortError(signal, 'Fallback').message).toBe('Fallback')
})
})
})

View File

@@ -0,0 +1,210 @@
import { EventEmitter } from 'node:events'
import type * as fs from 'node:fs'
import type { Socket } from 'node:net'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { abortTransfer, cleanupTransfer, createTransferState, formatFileSize, streamFileChunks } from '../../handlers/fileTransfer'
import type { ActiveFileTransfer } from '../../types'
// Mock binaryProtocol
vi.mock('../../binaryProtocol', () => ({
sendBinaryChunk: vi.fn().mockReturnValue(true)
}))
// Mock connection handlers
vi.mock('./connection', () => ({
waitForSocketDrain: vi.fn().mockResolvedValue(undefined),
getAbortError: vi.fn((signal, fallback) => {
const reason = (signal as AbortSignal & { reason?: unknown }).reason
if (reason instanceof Error) return reason
if (typeof reason === 'string' && reason.length > 0) return new Error(reason)
return new Error(fallback)
})
}))
// Note: validateFile and calculateFileChecksum tests are skipped because
// the test environment has globally mocked node:fs and node:os modules.
// These functions are tested through integration tests instead.
describe('fileTransfer handlers', () => {
describe('createTransferState', () => {
it('should create transfer state with correct defaults', () => {
const state = createTransferState('uuid-123', 'test.zip', 1024000, 'abc123')
expect(state.transferId).toBe('uuid-123')
expect(state.fileName).toBe('test.zip')
expect(state.fileSize).toBe(1024000)
expect(state.checksum).toBe('abc123')
expect(state.bytesSent).toBe(0)
expect(state.currentChunk).toBe(0)
expect(state.isCancelled).toBe(false)
expect(state.abortController).toBeInstanceOf(AbortController)
})
it('should calculate totalChunks based on chunk size', () => {
// 512KB chunk size
const state = createTransferState('id', 'test.zip', 1024 * 1024, 'checksum') // 1MB
expect(state.totalChunks).toBe(2) // 1MB / 512KB = 2
})
})
describe('abortTransfer', () => {
it('should abort transfer and destroy stream', () => {
const mockStream = {
destroyed: false,
destroy: vi.fn()
} as unknown as fs.ReadStream
const transfer: ActiveFileTransfer = {
transferId: 'test',
fileName: 'test.zip',
fileSize: 1000,
checksum: 'abc',
totalChunks: 1,
chunkSize: 512000,
bytesSent: 0,
currentChunk: 0,
startedAt: Date.now(),
stream: mockStream,
isCancelled: false,
abortController: new AbortController()
}
const error = new Error('Test abort')
abortTransfer(transfer, error)
expect(transfer.isCancelled).toBe(true)
expect(transfer.abortController.signal.aborted).toBe(true)
expect(mockStream.destroy).toHaveBeenCalledWith(error)
})
it('should handle undefined transfer', () => {
expect(() => abortTransfer(undefined, new Error('test'))).not.toThrow()
})
it('should not abort already aborted controller', () => {
const transfer: ActiveFileTransfer = {
transferId: 'test',
fileName: 'test.zip',
fileSize: 1000,
checksum: 'abc',
totalChunks: 1,
chunkSize: 512000,
bytesSent: 0,
currentChunk: 0,
startedAt: Date.now(),
isCancelled: false,
abortController: new AbortController()
}
transfer.abortController.abort()
// Should not throw when aborting again
expect(() => abortTransfer(transfer, new Error('test'))).not.toThrow()
})
})
describe('cleanupTransfer', () => {
it('should cleanup transfer resources', () => {
const mockStream = {
destroyed: false,
destroy: vi.fn()
} as unknown as fs.ReadStream
const transfer: ActiveFileTransfer = {
transferId: 'test',
fileName: 'test.zip',
fileSize: 1000,
checksum: 'abc',
totalChunks: 1,
chunkSize: 512000,
bytesSent: 0,
currentChunk: 0,
startedAt: Date.now(),
stream: mockStream,
isCancelled: false,
abortController: new AbortController()
}
cleanupTransfer(transfer)
expect(transfer.abortController.signal.aborted).toBe(true)
expect(mockStream.destroy).toHaveBeenCalled()
})
it('should handle undefined transfer', () => {
expect(() => cleanupTransfer(undefined)).not.toThrow()
})
})
describe('formatFileSize', () => {
it('should format 0 bytes', () => {
expect(formatFileSize(0)).toBe('0 B')
})
it('should format bytes', () => {
expect(formatFileSize(500)).toBe('500 B')
})
it('should format kilobytes', () => {
expect(formatFileSize(1024)).toBe('1 KB')
expect(formatFileSize(2048)).toBe('2 KB')
})
it('should format megabytes', () => {
expect(formatFileSize(1024 * 1024)).toBe('1 MB')
expect(formatFileSize(5 * 1024 * 1024)).toBe('5 MB')
})
it('should format gigabytes', () => {
expect(formatFileSize(1024 * 1024 * 1024)).toBe('1 GB')
})
it('should format with decimal precision', () => {
expect(formatFileSize(1536)).toBe('1.5 KB')
expect(formatFileSize(1.5 * 1024 * 1024)).toBe('1.5 MB')
})
})
// Note: streamFileChunks depends on fs.createReadStream, which is globally mocked
// in the test environment, so these tests exercise the streaming logic only with
// mock sockets and an early-abort scenario.
describe('streamFileChunks', () => {
let mockSocket: Socket & EventEmitter
let mockProgress: ReturnType<typeof vi.fn>
beforeEach(() => {
vi.clearAllMocks()
mockSocket = Object.assign(new EventEmitter(), {
destroyed: false,
writable: true,
write: vi.fn().mockReturnValue(true),
cork: vi.fn(),
uncork: vi.fn()
}) as unknown as Socket & EventEmitter
mockProgress = vi.fn()
})
afterEach(() => {
vi.resetAllMocks()
})
it('should throw when abort signal is already aborted', async () => {
const transfer = createTransferState('test-id', 'test.zip', 1024, 'checksum')
transfer.abortController.abort(new Error('Already cancelled'))
await expect(
streamFileChunks(mockSocket, '/fake/path.zip', transfer, transfer.abortController.signal, mockProgress)
).rejects.toThrow()
})
// Note: Full integration testing of streamFileChunks with actual file streaming
// requires a real file system, which cannot be easily mocked in ESM.
// The abort signal test above verifies the early abort path.
// Additional streaming tests are covered through integration tests.
})
})

View File

@ -0,0 +1,177 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { ResponseManager } from '../responseManager'
describe('ResponseManager', () => {
let manager: ResponseManager
beforeEach(() => {
vi.useFakeTimers()
manager = new ResponseManager()
})
afterEach(() => {
vi.useRealTimers()
})
describe('buildResponseKey', () => {
it('should build key with type only', () => {
expect(manager.buildResponseKey('handshake_ack')).toBe('handshake_ack')
})
it('should build key with type and transferId', () => {
expect(manager.buildResponseKey('file_start_ack', 'uuid-123')).toBe('file_start_ack:uuid-123')
})
it('should build key with type, transferId, and chunkIndex', () => {
expect(manager.buildResponseKey('file_chunk_ack', 'uuid-123', 5)).toBe('file_chunk_ack:uuid-123:5')
})
})
describe('waitForResponse', () => {
it('should resolve when tryResolve is called with matching key', async () => {
const resolvePromise = new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('handshake_ack', 5000, resolve, reject)
})
const payload = { type: 'handshake_ack', accepted: true }
const resolved = manager.tryResolve('handshake_ack', payload)
expect(resolved).toBe(true)
await expect(resolvePromise).resolves.toEqual(payload)
})
it('should reject on timeout', async () => {
const resolvePromise = new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('handshake_ack', 1000, resolve, reject)
})
vi.advanceTimersByTime(1001)
await expect(resolvePromise).rejects.toThrow('Timeout waiting for handshake_ack')
})
it('should call onTimeout callback when timeout occurs', async () => {
const onTimeout = vi.fn()
manager.setTimeoutCallback(onTimeout)
const resolvePromise = new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('test', 1000, resolve, reject)
})
vi.advanceTimersByTime(1001)
await expect(resolvePromise).rejects.toThrow()
expect(onTimeout).toHaveBeenCalled()
})
it('should reject when abort signal is triggered', async () => {
const abortController = new AbortController()
const resolvePromise = new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('test', 10000, resolve, reject, undefined, undefined, abortController.signal)
})
abortController.abort(new Error('User cancelled'))
await expect(resolvePromise).rejects.toThrow('User cancelled')
})
it('should replace existing response with same key', async () => {
const firstReject = vi.fn()
const secondResolve = vi.fn()
const secondReject = vi.fn()
manager.waitForResponse('test', 5000, vi.fn(), firstReject)
manager.waitForResponse('test', 5000, secondResolve, secondReject)
// First should be cleared (no rejection since it's replaced)
const payload = { type: 'test' }
manager.tryResolve('test', payload)
expect(secondResolve).toHaveBeenCalledWith(payload)
})
})
describe('tryResolve', () => {
it('should return false when no matching response', () => {
expect(manager.tryResolve('nonexistent', {})).toBe(false)
})
it('should match with transferId', async () => {
const resolvePromise = new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('file_start_ack', 5000, resolve, reject, 'uuid-123')
})
const payload = { type: 'file_start_ack', transferId: 'uuid-123' }
manager.tryResolve('file_start_ack', payload, 'uuid-123')
await expect(resolvePromise).resolves.toEqual(payload)
})
})
describe('rejectAll', () => {
it('should reject all pending responses', async () => {
const promises = [
new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('test1', 5000, resolve, reject)
}),
new Promise<unknown>((resolve, reject) => {
manager.waitForResponse('test2', 5000, resolve, reject, 'uuid')
})
]
manager.rejectAll(new Error('Connection closed'))
await expect(promises[0]).rejects.toThrow('Connection closed')
await expect(promises[1]).rejects.toThrow('Connection closed')
})
})
describe('clearPendingResponse', () => {
it('should clear specific response by key', () => {
manager.waitForResponse('test', 5000, vi.fn(), vi.fn())
manager.clearPendingResponse('test')
expect(manager.tryResolve('test', {})).toBe(false)
})
it('should clear all responses when no key provided', () => {
manager.waitForResponse('test1', 5000, vi.fn(), vi.fn())
manager.waitForResponse('test2', 5000, vi.fn(), vi.fn())
manager.clearPendingResponse()
expect(manager.tryResolve('test1', {})).toBe(false)
expect(manager.tryResolve('test2', {})).toBe(false)
})
})
describe('getAbortError', () => {
it('should return Error reason directly', () => {
const originalError = new Error('Original error')
const signal = { aborted: true, reason: originalError } as AbortSignal
const error = manager.getAbortError(signal, 'Fallback')
expect(error).toBe(originalError)
})
it('should create Error from string reason', () => {
const signal = { aborted: true, reason: 'String reason' } as AbortSignal
const error = manager.getAbortError(signal, 'Fallback')
expect(error.message).toBe('String reason')
})
it('should use fallback message when no reason', () => {
const signal = { aborted: true } as AbortSignal
const error = manager.getAbortError(signal, 'Fallback message')
expect(error.message).toBe('Fallback message')
})
})
})

View File

@ -0,0 +1,67 @@
import type { Socket } from 'node:net'
/**
* Binary protocol constants (v1)
*/
export const BINARY_TYPE_FILE_CHUNK = 0x01
/**
* Send file chunk as binary frame (protocol v1 - streaming mode)
*
* Frame format:
* ```
 * ┌───────────┬──────────┬──────┬────────────────┬────────────┬──────────┬───────┐
 * │   Magic   │ TotalLen │ Type │ TransferId Len │ TransferId │ ChunkIdx │ Data  │
 * │ 0x43 0x53 │ (4B BE)  │ 0x01 │    (2B BE)     │ (variable) │ (4B BE)  │ (raw) │
 * └───────────┴──────────┴──────┴────────────────┴────────────┴──────────┴───────┘
* ```
*
* @param socket - TCP socket to write to
* @param transferId - UUID of the transfer
* @param chunkIndex - Index of the chunk (0-based)
* @param data - Raw chunk data buffer
* @returns true if data was buffered, false if backpressure should be applied
*/
export function sendBinaryChunk(socket: Socket, transferId: string, chunkIndex: number, data: Buffer): boolean {
if (!socket || socket.destroyed || !socket.writable) {
throw new Error('Socket is not writable')
}
const tidBuffer = Buffer.from(transferId, 'utf8')
const tidLen = tidBuffer.length
// totalLen = type(1) + tidLen(2) + tid(n) + idx(4) + data(m)
const totalLen = 1 + 2 + tidLen + 4 + data.length
const header = Buffer.allocUnsafe(2 + 4 + 1 + 2 + tidLen + 4)
let offset = 0
// Magic (2 bytes): "CS"
header[offset++] = 0x43
header[offset++] = 0x53
// TotalLen (4 bytes, Big-Endian)
header.writeUInt32BE(totalLen, offset)
offset += 4
// Type (1 byte)
header[offset++] = BINARY_TYPE_FILE_CHUNK
// TransferId length (2 bytes, Big-Endian)
header.writeUInt16BE(tidLen, offset)
offset += 2
// TransferId (variable)
tidBuffer.copy(header, offset)
offset += tidLen
// ChunkIndex (4 bytes, Big-Endian)
header.writeUInt32BE(chunkIndex, offset)
socket.cork()
const wroteHeader = socket.write(header)
const wroteData = socket.write(data)
socket.uncork()
return wroteHeader && wroteData
}
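
For orientation, a receiver that has already buffered one complete frame could decode it roughly as below. This is a hypothetical sketch based only on the layout documented above; the actual receiver implementation is not part of this diff, and `parseBinaryChunk` is an illustrative name.

```typescript
import { Buffer } from 'node:buffer'

// Hypothetical decoder for a single frame produced by sendBinaryChunk.
// Assumes the complete frame is already buffered; not the real receiver code.
export function parseBinaryChunk(frame: Buffer): { transferId: string; chunkIndex: number; data: Buffer } {
  if (frame[0] !== 0x43 || frame[1] !== 0x53) {
    throw new Error('Invalid magic bytes')
  }
  const totalLen = frame.readUInt32BE(2) // type(1) + tidLen(2) + tid(n) + idx(4) + data(m)
  let offset = 6
  const type = frame[offset++]
  if (type !== 0x01 /* BINARY_TYPE_FILE_CHUNK */) {
    throw new Error(`Unsupported frame type: ${type}`)
  }
  const tidLen = frame.readUInt16BE(offset)
  offset += 2
  const transferId = frame.subarray(offset, offset + tidLen).toString('utf8')
  offset += tidLen
  const chunkIndex = frame.readUInt32BE(offset)
  offset += 4
  const dataLength = totalLen - (1 + 2 + tidLen + 4)
  const data = frame.subarray(offset, offset + dataLength)
  return { transferId, chunkIndex, data }
}
```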

View File

@ -0,0 +1,162 @@
import { isIP, type Socket } from 'node:net'
import { platform } from 'node:os'
import { loggerService } from '@logger'
import type { LanHandshakeRequestMessage, LocalTransferPeer } from '@shared/config/types'
import { app } from 'electron'
import type { ConnectionContext } from '../types'
export const HANDSHAKE_PROTOCOL_VERSION = '1'
/** Maximum size for line buffer to prevent memory exhaustion from malicious peers */
const MAX_LINE_BUFFER_SIZE = 1024 * 1024 // 1MB limit for control messages
const logger = loggerService.withContext('LanTransferConnection')
/**
* Build a handshake request message with device info.
*/
export function buildHandshakeMessage(): LanHandshakeRequestMessage {
return {
type: 'handshake',
deviceName: app.getName(),
version: HANDSHAKE_PROTOCOL_VERSION,
platform: platform(),
appVersion: app.getVersion()
}
}
/**
* Pick the best host address from a peer's available addresses.
* Prefers IPv4 addresses over IPv6.
*/
export function pickHost(peer: LocalTransferPeer): string | undefined {
const preferred = peer.addresses?.find((addr) => isIP(addr) === 4) || peer.addresses?.[0]
return preferred || peer.host
}
/**
* Send a test ping message after successful handshake.
*/
export function sendTestPing(ctx: ConnectionContext): void {
const payload = 'hello world'
try {
ctx.sendControlMessage({ type: 'ping', payload })
logger.info('Sent LAN ping test payload')
ctx.broadcastClientEvent({
type: 'ping_sent',
payload,
timestamp: Date.now()
})
} catch (error) {
const message = error instanceof Error ? error.message : String(error)
logger.error('Failed to send LAN test ping', error as Error)
ctx.broadcastClientEvent({
type: 'error',
message,
timestamp: Date.now()
})
}
}
/**
 * Create a handler that parses newline-delimited control messages from socket data.
 * Returns the current line buffer along with handleData and resetBuffer helpers.
*/
export function createDataHandler(onControlLine: (line: string) => void): {
lineBuffer: string
handleData: (chunk: Buffer) => void
resetBuffer: () => void
} {
let lineBuffer = ''
return {
get lineBuffer() {
return lineBuffer
},
handleData(chunk: Buffer) {
lineBuffer += chunk.toString('utf8')
// Prevent memory exhaustion from malicious peers sending data without newlines
if (lineBuffer.length > MAX_LINE_BUFFER_SIZE) {
logger.error('Line buffer exceeded maximum size, resetting')
lineBuffer = ''
throw new Error('Control message too large')
}
let newlineIndex = lineBuffer.indexOf('\n')
while (newlineIndex !== -1) {
const line = lineBuffer.slice(0, newlineIndex).trim()
lineBuffer = lineBuffer.slice(newlineIndex + 1)
if (line.length > 0) {
onControlLine(line)
}
newlineIndex = lineBuffer.indexOf('\n')
}
},
resetBuffer() {
lineBuffer = ''
}
}
}
/**
* Wait for socket to drain (backpressure handling).
*/
export async function waitForSocketDrain(socket: Socket, abortSignal: AbortSignal): Promise<void> {
if (abortSignal.aborted) {
throw getAbortError(abortSignal, 'Transfer aborted while waiting for socket drain')
}
if (socket.destroyed) {
throw new Error('Socket is closed')
}
await new Promise<void>((resolve, reject) => {
const cleanup = () => {
socket.off('drain', onDrain)
socket.off('close', onClose)
socket.off('error', onError)
abortSignal.removeEventListener('abort', onAbort)
}
const onDrain = () => {
cleanup()
resolve()
}
const onClose = () => {
cleanup()
reject(new Error('Socket closed while waiting for drain'))
}
const onError = (error: Error) => {
cleanup()
reject(error)
}
const onAbort = () => {
cleanup()
reject(getAbortError(abortSignal, 'Transfer aborted while waiting for socket drain'))
}
socket.once('drain', onDrain)
socket.once('close', onClose)
socket.once('error', onError)
abortSignal.addEventListener('abort', onAbort, { once: true })
})
}
/**
* Get the error from an abort signal, or create a fallback error.
*/
export function getAbortError(signal: AbortSignal, fallbackMessage: string): Error {
const reason = (signal as AbortSignal & { reason?: unknown }).reason
if (reason instanceof Error) {
return reason
}
if (typeof reason === 'string' && reason.length > 0) {
return new Error(reason)
}
return new Error(fallbackMessage)
}
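
A minimal wiring sketch for these helpers, assuming a connected socket and JSON control messages; the `attachControlChannel` name and the message shape are illustrative, not part of the actual service.

```typescript
import type { Socket } from 'node:net'
import { createDataHandler } from './connection'

// Hypothetical glue code: feed raw socket data through the newline-delimited parser.
export function attachControlChannel(socket: Socket, onMessage: (message: { type: string }) => void): void {
  const handler = createDataHandler((line) => onMessage(JSON.parse(line)))
  socket.on('data', (chunk: Buffer) => {
    try {
      handler.handleData(chunk)
    } catch (error) {
      // handleData has already reset its buffer; drop the connection on oversized control messages
      socket.destroy(error as Error)
    }
  })
}
```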

View File

@ -0,0 +1,267 @@
import * as crypto from 'node:crypto'
import * as fs from 'node:fs'
import type { Socket } from 'node:net'
import * as path from 'node:path'
import { loggerService } from '@logger'
import type {
LanFileCompleteMessage,
LanFileEndMessage,
LanFileStartAckMessage,
LanFileStartMessage
} from '@shared/config/types'
import {
LAN_TRANSFER_CHUNK_SIZE,
LAN_TRANSFER_COMPLETE_TIMEOUT_MS,
LAN_TRANSFER_MAX_FILE_SIZE
} from '@shared/config/types'
import { sendBinaryChunk } from '../binaryProtocol'
import type { ActiveFileTransfer, FileTransferContext } from '../types'
import { getAbortError, waitForSocketDrain } from './connection'
const DEFAULT_FILE_START_ACK_TIMEOUT_MS = 30_000 // 30s for file_start_ack
const logger = loggerService.withContext('LanTransferFileHandler')
/**
* Validate a file for transfer.
* Checks existence, type, extension, and size limits.
*/
export async function validateFile(filePath: string): Promise<{ stats: fs.Stats; fileName: string }> {
let stats: fs.Stats
try {
stats = await fs.promises.stat(filePath)
} catch (error) {
const nodeError = error as NodeJS.ErrnoException
if (nodeError.code === 'ENOENT') {
throw new Error(`File not found: ${filePath}`)
} else if (nodeError.code === 'EACCES') {
throw new Error(`Permission denied: ${filePath}`)
} else if (nodeError.code === 'ENOTDIR') {
throw new Error(`Invalid path: ${filePath}`)
} else {
throw new Error(`Cannot access file: ${filePath} (${nodeError.code || 'unknown error'})`)
}
}
if (!stats.isFile()) {
throw new Error('Path is not a file')
}
const fileName = path.basename(filePath)
const ext = path.extname(fileName).toLowerCase()
if (ext !== '.zip') {
throw new Error('Only ZIP files are supported')
}
if (stats.size > LAN_TRANSFER_MAX_FILE_SIZE) {
throw new Error(`File too large. Maximum size is ${formatFileSize(LAN_TRANSFER_MAX_FILE_SIZE)}`)
}
return { stats, fileName }
}
/**
* Calculate SHA-256 checksum of a file.
*/
export async function calculateFileChecksum(filePath: string): Promise<string> {
return new Promise((resolve, reject) => {
const hash = crypto.createHash('sha256')
const stream = fs.createReadStream(filePath)
stream.on('data', (data) => hash.update(data))
stream.on('end', () => resolve(hash.digest('hex')))
stream.on('error', reject)
})
}
/**
* Create initial transfer state for a new file transfer.
*/
export function createTransferState(
transferId: string,
fileName: string,
fileSize: number,
checksum: string
): ActiveFileTransfer {
const chunkSize = LAN_TRANSFER_CHUNK_SIZE
const totalChunks = Math.ceil(fileSize / chunkSize)
return {
transferId,
fileName,
fileSize,
checksum,
totalChunks,
chunkSize,
bytesSent: 0,
currentChunk: 0,
startedAt: Date.now(),
isCancelled: false,
abortController: new AbortController()
}
}
/**
* Send file_start message to receiver.
*/
export function sendFileStart(ctx: FileTransferContext, transfer: ActiveFileTransfer): void {
const startMessage: LanFileStartMessage = {
type: 'file_start',
transferId: transfer.transferId,
fileName: transfer.fileName,
fileSize: transfer.fileSize,
mimeType: 'application/zip',
checksum: transfer.checksum,
totalChunks: transfer.totalChunks,
chunkSize: transfer.chunkSize
}
ctx.sendControlMessage(startMessage)
logger.info('Sent file_start message')
}
/**
* Wait for file_start_ack from receiver.
*/
export function waitForFileStartAck(
ctx: FileTransferContext,
transferId: string,
abortSignal?: AbortSignal
): Promise<LanFileStartAckMessage> {
return new Promise((resolve, reject) => {
ctx.waitForResponse(
'file_start_ack',
DEFAULT_FILE_START_ACK_TIMEOUT_MS,
(payload) => resolve(payload as LanFileStartAckMessage),
reject,
transferId,
undefined,
abortSignal
)
})
}
/**
* Wait for file_complete from receiver after all chunks sent.
*/
export function waitForFileComplete(
ctx: FileTransferContext,
transferId: string,
abortSignal?: AbortSignal
): Promise<LanFileCompleteMessage> {
return new Promise((resolve, reject) => {
ctx.waitForResponse(
'file_complete',
LAN_TRANSFER_COMPLETE_TIMEOUT_MS,
(payload) => resolve(payload as LanFileCompleteMessage),
reject,
transferId,
undefined,
abortSignal
)
})
}
/**
* Send file_end message to receiver.
*/
export function sendFileEnd(ctx: FileTransferContext, transferId: string): void {
const endMessage: LanFileEndMessage = {
type: 'file_end',
transferId
}
ctx.sendControlMessage(endMessage)
logger.info('Sent file_end message')
}
/**
* Stream file chunks to the receiver (v1 streaming mode - no per-chunk acknowledgment).
*/
export async function streamFileChunks(
socket: Socket,
filePath: string,
transfer: ActiveFileTransfer,
abortSignal: AbortSignal,
onProgress: (bytesSent: number, chunkIndex: number) => void
): Promise<void> {
const { chunkSize, transferId } = transfer
const stream = fs.createReadStream(filePath, { highWaterMark: chunkSize })
transfer.stream = stream
let chunkIndex = 0
let bytesSent = 0
try {
for await (const chunk of stream) {
if (abortSignal.aborted) {
throw getAbortError(abortSignal, 'Transfer aborted')
}
const buffer = Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk)
bytesSent += buffer.length
// Send chunk as binary frame (v1 streaming) with backpressure handling
const canContinue = sendBinaryChunk(socket, transferId, chunkIndex, buffer)
if (!canContinue) {
await waitForSocketDrain(socket, abortSignal)
}
// Update progress
transfer.bytesSent = bytesSent
transfer.currentChunk = chunkIndex
onProgress(bytesSent, chunkIndex)
chunkIndex++
}
logger.info(`File streaming completed: ${chunkIndex} chunks sent`)
} catch (error) {
logger.error('File streaming failed', error as Error)
throw error
}
}
/**
* Abort an active transfer and clean up resources.
*/
export function abortTransfer(transfer: ActiveFileTransfer | undefined, error: Error): void {
if (!transfer) {
return
}
transfer.isCancelled = true
if (!transfer.abortController.signal.aborted) {
transfer.abortController.abort(error)
}
if (transfer.stream && !transfer.stream.destroyed) {
transfer.stream.destroy(error)
}
}
/**
* Clean up transfer resources without error.
*/
export function cleanupTransfer(transfer: ActiveFileTransfer | undefined): void {
if (!transfer) {
return
}
if (!transfer.abortController.signal.aborted) {
transfer.abortController.abort()
}
if (transfer.stream && !transfer.stream.destroyed) {
transfer.stream.destroy()
}
}
/**
* Format bytes into human-readable size string.
*/
export function formatFileSize(bytes: number): string {
if (bytes === 0) return '0 B'
const k = 1024
const sizes = ['B', 'KB', 'MB', 'GB']
const i = Math.floor(Math.log(bytes) / Math.log(k))
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i]
}
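
Taken together, a client-side send flow composes these handlers roughly as follows. This is a hypothetical sketch: `ctx` and `socket` are assumed to come from the client service, the progress callback is left as a stub, and error handling is reduced to the essentials.

```typescript
import { randomUUID } from 'node:crypto'
import type { Socket } from 'node:net'
import type { FileTransferContext } from '../types'
import {
  abortTransfer,
  calculateFileChecksum,
  cleanupTransfer,
  createTransferState,
  sendFileEnd,
  sendFileStart,
  streamFileChunks,
  validateFile,
  waitForFileComplete,
  waitForFileStartAck
} from './fileTransfer'

// Hypothetical end-to-end send flow; not the actual LanTransferClientService code.
export async function sendZipFile(ctx: FileTransferContext, socket: Socket, filePath: string): Promise<void> {
  const { stats, fileName } = await validateFile(filePath)
  const checksum = await calculateFileChecksum(filePath)
  const transfer = createTransferState(randomUUID(), fileName, stats.size, checksum)
  ctx.setActiveTransfer(transfer)
  try {
    sendFileStart(ctx, transfer)
    await waitForFileStartAck(ctx, transfer.transferId, transfer.abortController.signal)
    await streamFileChunks(socket, filePath, transfer, transfer.abortController.signal, (_bytesSent, _chunkIndex) => {
      // Forward progress to the renderer here (event shape omitted in this sketch)
    })
    sendFileEnd(ctx, transfer.transferId)
    await waitForFileComplete(ctx, transfer.transferId, transfer.abortController.signal)
  } catch (error) {
    abortTransfer(transfer, error as Error)
    throw error
  } finally {
    cleanupTransfer(transfer)
    ctx.setActiveTransfer(undefined)
  }
}
```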

View File

@ -0,0 +1,22 @@
export {
buildHandshakeMessage,
createDataHandler,
getAbortError,
HANDSHAKE_PROTOCOL_VERSION,
pickHost,
sendTestPing,
waitForSocketDrain
} from './connection'
export {
abortTransfer,
calculateFileChecksum,
cleanupTransfer,
createTransferState,
formatFileSize,
sendFileEnd,
sendFileStart,
streamFileChunks,
validateFile,
waitForFileComplete,
waitForFileStartAck
} from './fileTransfer'

View File

@ -0,0 +1,21 @@
/**
* LAN Transfer Client Module
*
* Protocol: v1.0 (streaming mode)
*
* Features:
* - Binary frame format for file chunks (no base64 overhead)
* - Streaming mode (no per-chunk acknowledgment)
* - JSON messages for control flow (handshake, file_start, file_end, etc.)
* - Global timeout protection
* - Backpressure handling
*
* Binary Frame Format:
*
 * ┌───────────┬──────────┬──────┬────────────────┬────────────┬──────────┬───────┐
 * │   Magic   │ TotalLen │ Type │ TransferId Len │ TransferId │ ChunkIdx │ Data  │
 * │ 0x43 0x53 │ (4B BE)  │ 0x01 │    (2B BE)     │ (variable) │ (4B BE)  │ (raw) │
 * └───────────┴──────────┴──────┴────────────────┴────────────┴──────────┴───────┘
*
*/
export { HANDSHAKE_PROTOCOL_VERSION, lanTransferClientService } from './LanTransferClientService'
export type { ActiveFileTransfer, ConnectionContext, FileTransferContext, PendingResponse } from './types'

View File

@ -0,0 +1,144 @@
import type { PendingResponse } from './types'
/**
* Manages pending response handlers for awaiting control messages.
* Handles timeouts, abort signals, and cleanup.
*/
export class ResponseManager {
private pendingResponses = new Map<string, PendingResponse>()
private onTimeout?: () => void
/**
* Set a callback to be called when a response times out.
* Typically used to trigger disconnect on timeout.
*/
setTimeoutCallback(callback: () => void): void {
this.onTimeout = callback
}
/**
* Build a composite key for identifying pending responses.
*/
buildResponseKey(type: string, transferId?: string, chunkIndex?: number): string {
const parts = [type]
if (transferId !== undefined) parts.push(transferId)
if (chunkIndex !== undefined) parts.push(String(chunkIndex))
return parts.join(':')
}
/**
* Register a response listener with timeout and optional abort signal.
*/
waitForResponse(
type: string,
timeoutMs: number,
resolve: (payload: unknown) => void,
reject: (error: Error) => void,
transferId?: string,
chunkIndex?: number,
abortSignal?: AbortSignal
): void {
const responseKey = this.buildResponseKey(type, transferId, chunkIndex)
// Clear any existing response with the same key
this.clearPendingResponse(responseKey)
const timeoutHandle = setTimeout(() => {
this.clearPendingResponse(responseKey)
const error = new Error(`Timeout waiting for ${type}`)
reject(error)
this.onTimeout?.()
}, timeoutMs)
const pending: PendingResponse = {
type,
transferId,
chunkIndex,
resolve,
reject,
timeoutHandle,
abortSignal
}
if (abortSignal) {
const abortListener = () => {
this.clearPendingResponse(responseKey)
reject(this.getAbortError(abortSignal, `Aborted while waiting for ${type}`))
}
pending.abortListener = abortListener
abortSignal.addEventListener('abort', abortListener, { once: true })
}
this.pendingResponses.set(responseKey, pending)
}
/**
* Try to resolve a pending response by type and optional identifiers.
* Returns true if a matching response was found and resolved.
*/
tryResolve(type: string, payload: unknown, transferId?: string, chunkIndex?: number): boolean {
const responseKey = this.buildResponseKey(type, transferId, chunkIndex)
const pendingResponse = this.pendingResponses.get(responseKey)
if (pendingResponse) {
const resolver = pendingResponse.resolve
this.clearPendingResponse(responseKey)
resolver(payload)
return true
}
return false
}
/**
* Clear a single pending response by key, or all responses if no key provided.
*/
clearPendingResponse(key?: string): void {
if (key) {
const pending = this.pendingResponses.get(key)
if (pending?.timeoutHandle) {
clearTimeout(pending.timeoutHandle)
}
if (pending?.abortSignal && pending.abortListener) {
pending.abortSignal.removeEventListener('abort', pending.abortListener)
}
this.pendingResponses.delete(key)
} else {
// Clear all pending responses
for (const pending of this.pendingResponses.values()) {
if (pending.timeoutHandle) {
clearTimeout(pending.timeoutHandle)
}
if (pending.abortSignal && pending.abortListener) {
pending.abortSignal.removeEventListener('abort', pending.abortListener)
}
}
this.pendingResponses.clear()
}
}
/**
* Reject all pending responses with the given error.
*/
rejectAll(error: Error): void {
for (const key of Array.from(this.pendingResponses.keys())) {
const pending = this.pendingResponses.get(key)
this.clearPendingResponse(key)
pending?.reject(error)
}
}
/**
* Get the abort error from an abort signal, or create a fallback error.
*/
getAbortError(signal: AbortSignal, fallbackMessage: string): Error {
const reason = (signal as AbortSignal & { reason?: unknown }).reason
if (reason instanceof Error) {
return reason
}
if (typeof reason === 'string' && reason.length > 0) {
return new Error(reason)
}
return new Error(fallbackMessage)
}
}
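
A minimal usage sketch, assuming the caller wraps waitForResponse in a Promise and the control-message parser routes incoming payloads into tryResolve; the message shape and function names here are illustrative.

```typescript
import { ResponseManager } from './responseManager'

const manager = new ResponseManager()

// Caller side: register interest in a response before sending the request.
export function awaitFileStartAck(transferId: string): Promise<unknown> {
  return new Promise((resolve, reject) => {
    manager.waitForResponse('file_start_ack', 30_000, resolve, reject, transferId)
  })
}

// Parser side: hand every incoming control message to the manager.
export function onControlMessage(message: { type: string; transferId?: string }): void {
  if (!manager.tryResolve(message.type, message, message.transferId)) {
    // No pending listener matched this key; the message can be logged or ignored.
  }
}
```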

View File

@ -0,0 +1,65 @@
import type * as fs from 'node:fs'
import type { Socket } from 'node:net'
import type { LanClientEvent, LocalTransferPeer } from '@shared/config/types'
/**
* Pending response handler for awaiting control messages
*/
export type PendingResponse = {
type: string
transferId?: string
chunkIndex?: number
resolve: (payload: unknown) => void
reject: (error: Error) => void
timeoutHandle?: NodeJS.Timeout
abortSignal?: AbortSignal
abortListener?: () => void
}
/**
* Active file transfer state tracking
*/
export type ActiveFileTransfer = {
transferId: string
fileName: string
fileSize: number
checksum: string
totalChunks: number
chunkSize: number
bytesSent: number
currentChunk: number
startedAt: number
stream?: fs.ReadStream
isCancelled: boolean
abortController: AbortController
}
/**
* Context interface for connection handlers
* Provides access to service methods without circular dependencies
*/
export type ConnectionContext = {
socket: Socket | null
currentPeer?: LocalTransferPeer
sendControlMessage: (message: Record<string, unknown>) => void
broadcastClientEvent: (event: LanClientEvent) => void
}
/**
* Context interface for file transfer handlers
* Extends connection context with transfer-specific methods
*/
export type FileTransferContext = ConnectionContext & {
activeTransfer?: ActiveFileTransfer
setActiveTransfer: (transfer: ActiveFileTransfer | undefined) => void
waitForResponse: (
type: string,
timeoutMs: number,
resolve: (payload: unknown) => void,
reject: (error: Error) => void,
transferId?: string,
chunkIndex?: number,
abortSignal?: AbortSignal
) => void
}

View File

@ -1,9 +1,21 @@
import { configManager } from '@main/services/ConfigManager'
import { execFileSync } from 'child_process'
import fs from 'fs'
import path from 'path'
import { beforeEach, describe, expect, it, vi } from 'vitest'
import { findExecutable, findGitBash, validateGitBashPath } from '../process'
import { autoDiscoverGitBash, findExecutable, findGitBash, validateGitBashPath } from '../process'
// Mock configManager
vi.mock('@main/services/ConfigManager', () => ({
ConfigKeys: {
GitBashPath: 'gitBashPath'
},
configManager: {
get: vi.fn(),
set: vi.fn()
}
}))
// Mock dependencies
vi.mock('child_process')
@ -695,4 +707,284 @@ describe.skipIf(process.platform !== 'win32')('process utilities', () => {
})
})
})
describe('autoDiscoverGitBash', () => {
const originalEnvVar = process.env.CLAUDE_CODE_GIT_BASH_PATH
beforeEach(() => {
vi.mocked(configManager.get).mockReset()
vi.mocked(configManager.set).mockReset()
delete process.env.CLAUDE_CODE_GIT_BASH_PATH
})
afterEach(() => {
// Restore original environment variable
if (originalEnvVar !== undefined) {
process.env.CLAUDE_CODE_GIT_BASH_PATH = originalEnvVar
} else {
delete process.env.CLAUDE_CODE_GIT_BASH_PATH
}
})
/**
* Helper to mock fs.existsSync with a set of valid paths
*/
const mockExistingPaths = (...validPaths: string[]) => {
vi.mocked(fs.existsSync).mockImplementation((p) => validPaths.includes(p as string))
}
describe('with no existing config path', () => {
it('should discover and persist Git Bash path when not configured', () => {
const bashPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
vi.mocked(configManager.get).mockReturnValue(undefined)
process.env.ProgramFiles = 'C:\\Program Files'
mockExistingPaths(gitPath, bashPath)
const result = autoDiscoverGitBash()
expect(result).toBe(bashPath)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', bashPath)
})
it('should return null and not persist when Git Bash is not found', () => {
vi.mocked(configManager.get).mockReturnValue(undefined)
vi.mocked(fs.existsSync).mockReturnValue(false)
vi.mocked(execFileSync).mockImplementation(() => {
throw new Error('Not found')
})
const result = autoDiscoverGitBash()
expect(result).toBeNull()
expect(configManager.set).not.toHaveBeenCalled()
})
})
describe('environment variable precedence', () => {
it('should use env var over valid config path', () => {
const envPath = 'C:\\EnvGit\\bin\\bash.exe'
const configPath = 'C:\\ConfigGit\\bin\\bash.exe'
process.env.CLAUDE_CODE_GIT_BASH_PATH = envPath
vi.mocked(configManager.get).mockReturnValue(configPath)
mockExistingPaths(envPath, configPath)
const result = autoDiscoverGitBash()
// Env var should take precedence
expect(result).toBe(envPath)
// Should not persist env var path (it's a runtime override)
expect(configManager.set).not.toHaveBeenCalled()
})
it('should fall back to config path when env var is invalid', () => {
const envPath = 'C:\\Invalid\\bash.exe'
const configPath = 'C:\\ConfigGit\\bin\\bash.exe'
process.env.CLAUDE_CODE_GIT_BASH_PATH = envPath
vi.mocked(configManager.get).mockReturnValue(configPath)
// Env path is invalid (doesn't exist), only config path exists
mockExistingPaths(configPath)
const result = autoDiscoverGitBash()
// Should fall back to config path
expect(result).toBe(configPath)
expect(configManager.set).not.toHaveBeenCalled()
})
it('should fall back to auto-discovery when both env var and config are invalid', () => {
const envPath = 'C:\\InvalidEnv\\bash.exe'
const configPath = 'C:\\InvalidConfig\\bash.exe'
const discoveredPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
process.env.CLAUDE_CODE_GIT_BASH_PATH = envPath
process.env.ProgramFiles = 'C:\\Program Files'
vi.mocked(configManager.get).mockReturnValue(configPath)
// Both env and config paths are invalid, only standard Git exists
mockExistingPaths(gitPath, discoveredPath)
const result = autoDiscoverGitBash()
expect(result).toBe(discoveredPath)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', discoveredPath)
})
})
describe('with valid existing config path', () => {
it('should validate and return existing path without re-discovering', () => {
const existingPath = 'C:\\CustomGit\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(existingPath)
mockExistingPaths(existingPath)
const result = autoDiscoverGitBash()
expect(result).toBe(existingPath)
// Should not call findGitBash or persist again
expect(configManager.set).not.toHaveBeenCalled()
// Should not call execFileSync (which findGitBash would use for discovery)
expect(execFileSync).not.toHaveBeenCalled()
})
it('should not override existing valid config with auto-discovery', () => {
const existingPath = 'C:\\CustomGit\\bin\\bash.exe'
const discoveredPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(existingPath)
mockExistingPaths(existingPath, discoveredPath)
const result = autoDiscoverGitBash()
expect(result).toBe(existingPath)
expect(configManager.set).not.toHaveBeenCalled()
})
})
describe('with invalid existing config path', () => {
it('should attempt auto-discovery when existing path does not exist', () => {
const existingPath = 'C:\\NonExistent\\bin\\bash.exe'
const discoveredPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
vi.mocked(configManager.get).mockReturnValue(existingPath)
process.env.ProgramFiles = 'C:\\Program Files'
// Invalid path doesn't exist, but Git is installed at standard location
mockExistingPaths(gitPath, discoveredPath)
const result = autoDiscoverGitBash()
// Should discover and return the new path
expect(result).toBe(discoveredPath)
// Should persist the discovered path (overwrites invalid)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', discoveredPath)
})
it('should attempt auto-discovery when existing path is not bash.exe', () => {
const existingPath = 'C:\\CustomGit\\bin\\git.exe'
const discoveredPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
vi.mocked(configManager.get).mockReturnValue(existingPath)
process.env.ProgramFiles = 'C:\\Program Files'
// Invalid path exists but is not bash.exe (validation will fail)
// Git is installed at standard location
mockExistingPaths(existingPath, gitPath, discoveredPath)
const result = autoDiscoverGitBash()
// Should discover and return the new path
expect(result).toBe(discoveredPath)
// Should persist the discovered path (overwrites invalid)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', discoveredPath)
})
it('should return null when existing path is invalid and discovery fails', () => {
const existingPath = 'C:\\NonExistent\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(existingPath)
vi.mocked(fs.existsSync).mockReturnValue(false)
vi.mocked(execFileSync).mockImplementation(() => {
throw new Error('Not found')
})
const result = autoDiscoverGitBash()
// Both validation and discovery failed
expect(result).toBeNull()
// Should not persist when discovery fails
expect(configManager.set).not.toHaveBeenCalled()
})
})
describe('config persistence verification', () => {
it('should persist discovered path with correct config key', () => {
const bashPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
vi.mocked(configManager.get).mockReturnValue(undefined)
process.env.ProgramFiles = 'C:\\Program Files'
mockExistingPaths(gitPath, bashPath)
autoDiscoverGitBash()
// Verify the exact call to configManager.set
expect(configManager.set).toHaveBeenCalledTimes(1)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', bashPath)
})
it('should persist on each discovery when config remains undefined', () => {
const bashPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
vi.mocked(configManager.get).mockReturnValue(undefined)
process.env.ProgramFiles = 'C:\\Program Files'
mockExistingPaths(gitPath, bashPath)
autoDiscoverGitBash()
autoDiscoverGitBash()
// Each call discovers and persists since config remains undefined (mocked)
expect(configManager.set).toHaveBeenCalledTimes(2)
})
})
describe('real-world scenarios', () => {
it('should discover and persist standard Git for Windows installation', () => {
const gitPath = 'C:\\Program Files\\Git\\cmd\\git.exe'
const bashPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(undefined)
process.env.ProgramFiles = 'C:\\Program Files'
mockExistingPaths(gitPath, bashPath)
const result = autoDiscoverGitBash()
expect(result).toBe(bashPath)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', bashPath)
})
it('should discover portable Git via where.exe and persist', () => {
const gitPath = 'D:\\PortableApps\\Git\\bin\\git.exe'
const bashPath = 'D:\\PortableApps\\Git\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(undefined)
vi.mocked(fs.existsSync).mockImplementation((p) => {
const pathStr = p?.toString() || ''
// Common git paths don't exist
if (pathStr.includes('Program Files\\Git\\cmd\\git.exe')) return false
if (pathStr.includes('Program Files (x86)\\Git\\cmd\\git.exe')) return false
// Portable bash path exists
if (pathStr === bashPath) return true
return false
})
vi.mocked(execFileSync).mockReturnValue(gitPath)
const result = autoDiscoverGitBash()
expect(result).toBe(bashPath)
expect(configManager.set).toHaveBeenCalledWith('gitBashPath', bashPath)
})
it('should respect user-configured path over auto-discovery', () => {
const userConfiguredPath = 'D:\\MyGit\\bin\\bash.exe'
const systemPath = 'C:\\Program Files\\Git\\bin\\bash.exe'
vi.mocked(configManager.get).mockReturnValue(userConfiguredPath)
mockExistingPaths(userConfiguredPath, systemPath)
const result = autoDiscoverGitBash()
expect(result).toBe(userConfiguredPath)
expect(configManager.set).not.toHaveBeenCalled()
// Verify findGitBash was not called for discovery
expect(execFileSync).not.toHaveBeenCalled()
})
})
})
})

View File

@ -1,4 +1,5 @@
import { loggerService } from '@logger'
import type { GitBashPathInfo, GitBashPathSource } from '@shared/config/constant'
import { HOME_CHERRY_DIR } from '@shared/config/constant'
import { execFileSync, spawn } from 'child_process'
import fs from 'fs'
@ -6,6 +7,7 @@ import os from 'os'
import path from 'path'
import { isWin } from '../constant'
import { ConfigKeys, configManager } from '../services/ConfigManager'
import { getResourcePath } from '.'
const logger = loggerService.withContext('Utils:Process')
@ -59,7 +61,7 @@ export async function getBinaryPath(name?: string): Promise<string> {
export async function isBinaryExists(name: string): Promise<boolean> {
const cmd = await getBinaryPath(name)
return await fs.existsSync(cmd)
return fs.existsSync(cmd)
}
/**
@ -225,3 +227,77 @@ export function validateGitBashPath(customPath?: string | null): string | null {
logger.debug('Validated custom Git Bash path', { path: resolved })
return resolved
}
/**
* Auto-discover and persist Git Bash path if not already configured
* Only called when Git Bash is actually needed
*
* Precedence order:
* 1. CLAUDE_CODE_GIT_BASH_PATH environment variable (highest - runtime override)
* 2. Configured path from settings (manual or auto)
* 3. Auto-discovery via findGitBash (only if no valid config exists)
*/
export function autoDiscoverGitBash(): string | null {
if (!isWin) {
return null
}
// 1. Check environment variable override first (highest priority)
const envOverride = process.env.CLAUDE_CODE_GIT_BASH_PATH
if (envOverride) {
const validated = validateGitBashPath(envOverride)
if (validated) {
logger.debug('Using CLAUDE_CODE_GIT_BASH_PATH override', { path: validated })
return validated
}
logger.warn('CLAUDE_CODE_GIT_BASH_PATH provided but path is invalid', { path: envOverride })
}
// 2. Check if a path is already configured
const existingPath = configManager.get<string | undefined>(ConfigKeys.GitBashPath)
const existingSource = configManager.get<GitBashPathSource | undefined>(ConfigKeys.GitBashPathSource)
if (existingPath) {
const validated = validateGitBashPath(existingPath)
if (validated) {
return validated
}
// Existing path is invalid, try to auto-discover
logger.warn('Existing Git Bash path is invalid, attempting auto-discovery', {
path: existingPath,
source: existingSource
})
}
// 3. Try to find Git Bash via auto-discovery
const discoveredPath = findGitBash()
if (discoveredPath) {
// Persist the discovered path with 'auto' source
configManager.set(ConfigKeys.GitBashPath, discoveredPath)
configManager.set(ConfigKeys.GitBashPathSource, 'auto')
logger.info('Auto-discovered Git Bash path', { path: discoveredPath })
}
return discoveredPath
}
/**
* Get Git Bash path info including source
* If no path is configured, triggers auto-discovery first
*/
export function getGitBashPathInfo(): GitBashPathInfo {
if (!isWin) {
return { path: null, source: null }
}
let path = configManager.get<string | null>(ConfigKeys.GitBashPath) ?? null
let source = configManager.get<GitBashPathSource | null>(ConfigKeys.GitBashPathSource) ?? null
// If no path configured, trigger auto-discovery (handles upgrade from old versions)
if (!path) {
path = autoDiscoverGitBash()
source = path ? 'auto' : null
}
return { path, source }
}

View File

@ -2,9 +2,17 @@ import type { PermissionUpdate } from '@anthropic-ai/claude-agent-sdk'
import { electronAPI } from '@electron-toolkit/preload'
import type { SpanEntity, TokenUsage } from '@mcp-trace/trace-core'
import type { SpanContext } from '@opentelemetry/api'
import type { TerminalConfig, UpgradeChannel } from '@shared/config/constant'
import type { GitBashPathInfo, TerminalConfig, UpgradeChannel } from '@shared/config/constant'
import type { LogLevel, LogSourceWithContext } from '@shared/config/logger'
import type { FileChangeEvent, WebviewKeyEvent } from '@shared/config/types'
import type {
FileChangeEvent,
LanClientEvent,
LanFileCompleteMessage,
LanHandshakeAckMessage,
LocalTransferConnectPayload,
LocalTransferState,
WebviewKeyEvent
} from '@shared/config/types'
import type { MCPServerLogEntry } from '@shared/config/types'
import { IpcChannel } from '@shared/IpcChannel'
import type { Notification } from '@types'
@ -126,6 +134,7 @@ const api = {
getCpuName: () => ipcRenderer.invoke(IpcChannel.System_GetCpuName),
checkGitBash: (): Promise<boolean> => ipcRenderer.invoke(IpcChannel.System_CheckGitBash),
getGitBashPath: (): Promise<string | null> => ipcRenderer.invoke(IpcChannel.System_GetGitBashPath),
getGitBashPathInfo: (): Promise<GitBashPathInfo> => ipcRenderer.invoke(IpcChannel.System_GetGitBashPathInfo),
setGitBashPath: (newPath: string | null): Promise<boolean> =>
ipcRenderer.invoke(IpcChannel.System_SetGitBashPath, newPath)
},
@ -171,7 +180,11 @@ const api = {
listS3Files: (s3Config: S3Config) => ipcRenderer.invoke(IpcChannel.Backup_ListS3Files, s3Config),
deleteS3File: (fileName: string, s3Config: S3Config) =>
ipcRenderer.invoke(IpcChannel.Backup_DeleteS3File, fileName, s3Config),
checkS3Connection: (s3Config: S3Config) => ipcRenderer.invoke(IpcChannel.Backup_CheckS3Connection, s3Config)
checkS3Connection: (s3Config: S3Config) => ipcRenderer.invoke(IpcChannel.Backup_CheckS3Connection, s3Config),
createLanTransferBackup: (data: string): Promise<string> =>
ipcRenderer.invoke(IpcChannel.Backup_CreateLanTransferBackup, data),
deleteTempBackup: (filePath: string): Promise<boolean> =>
ipcRenderer.invoke(IpcChannel.Backup_DeleteTempBackup, filePath)
},
file: {
select: (options?: OpenDialogOptions): Promise<FileMetadata[] | null> =>
@ -588,12 +601,32 @@ const api = {
writeContent: (options: WritePluginContentOptions): Promise<PluginResult<void>> =>
ipcRenderer.invoke(IpcChannel.ClaudeCodePlugin_WriteContent, options)
},
webSocket: {
start: () => ipcRenderer.invoke(IpcChannel.WebSocket_Start),
stop: () => ipcRenderer.invoke(IpcChannel.WebSocket_Stop),
status: () => ipcRenderer.invoke(IpcChannel.WebSocket_Status),
sendFile: (filePath: string) => ipcRenderer.invoke(IpcChannel.WebSocket_SendFile, filePath),
getAllCandidates: () => ipcRenderer.invoke(IpcChannel.WebSocket_GetAllCandidates)
localTransfer: {
getState: (): Promise<LocalTransferState> => ipcRenderer.invoke(IpcChannel.LocalTransfer_ListServices),
startScan: (): Promise<LocalTransferState> => ipcRenderer.invoke(IpcChannel.LocalTransfer_StartScan),
stopScan: (): Promise<LocalTransferState> => ipcRenderer.invoke(IpcChannel.LocalTransfer_StopScan),
connect: (payload: LocalTransferConnectPayload): Promise<LanHandshakeAckMessage> =>
ipcRenderer.invoke(IpcChannel.LocalTransfer_Connect, payload),
disconnect: (): Promise<void> => ipcRenderer.invoke(IpcChannel.LocalTransfer_Disconnect),
onServicesUpdated: (callback: (state: LocalTransferState) => void): (() => void) => {
const channel = IpcChannel.LocalTransfer_ServicesUpdated
const listener = (_: Electron.IpcRendererEvent, state: LocalTransferState) => callback(state)
ipcRenderer.on(channel, listener)
return () => {
ipcRenderer.removeListener(channel, listener)
}
},
onClientEvent: (callback: (event: LanClientEvent) => void): (() => void) => {
const channel = IpcChannel.LocalTransfer_ClientEvent
const listener = (_: Electron.IpcRendererEvent, event: LanClientEvent) => callback(event)
ipcRenderer.on(channel, listener)
return () => {
ipcRenderer.removeListener(channel, listener)
}
},
sendFile: (filePath: string): Promise<LanFileCompleteMessage> =>
ipcRenderer.invoke(IpcChannel.LocalTransfer_SendFile, { filePath }),
cancelTransfer: (): Promise<void> => ipcRenderer.invoke(IpcChannel.LocalTransfer_CancelTransfer)
}
}
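
On the renderer side, assuming the preload `api` object above is exposed as `window.api` via contextBridge (not shown in this diff), the localTransfer surface could be driven roughly like this; `sendBackupToPeer` is an illustrative name and only the call order connect → sendFile → disconnect is shown.

```typescript
import type { LocalTransferConnectPayload } from '@shared/config/types'

// Hypothetical renderer-side usage sketch; `window.api` is assumed to carry the
// preload surface defined above, with typings provided elsewhere in the app.
export async function sendBackupToPeer(payload: LocalTransferConnectPayload, filePath: string): Promise<void> {
  const unsubscribe = window.api.localTransfer.onClientEvent((event) => {
    console.log('lan transfer event', event)
  })
  try {
    await window.api.localTransfer.connect(payload)
    const result = await window.api.localTransfer.sendFile(filePath)
    console.log('file_complete received', result)
  } finally {
    unsubscribe()
    await window.api.localTransfer.disconnect()
  }
}
```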

View File

@ -142,6 +142,10 @@ export class OpenAIAPIClient extends OpenAIBaseClient<
return { thinking: { type: reasoningEffort ? 'enabled' : 'disabled' } }
}
if (reasoningEffort === 'default') {
return {}
}
if (!reasoningEffort) {
// DeepSeek hybrid inference models (v3.1, and possibly more in the future)
// Different providers expose thinking control differently; handle them uniformly here
@ -303,7 +307,7 @@ export class OpenAIAPIClient extends OpenAIBaseClient<
// Grok models/Perplexity models/OpenAI models
if (isSupportedReasoningEffortModel(model)) {
// Check whether the model supports the selected option
const supportedOptions = getModelSupportedReasoningEffortOptions(model)
const supportedOptions = getModelSupportedReasoningEffortOptions(model)?.filter((option) => option !== 'default')
if (supportedOptions?.includes(reasoningEffort)) {
return {
reasoning_effort: reasoningEffort

View File

@ -18,7 +18,7 @@ vi.mock('@renderer/services/AssistantService', () => ({
toolUseMode: assistant.settings?.toolUseMode ?? 'prompt',
defaultModel: assistant.defaultModel,
customParameters: assistant.settings?.customParameters ?? [],
reasoning_effort: assistant.settings?.reasoning_effort,
reasoning_effort: assistant.settings?.reasoning_effort ?? 'default',
reasoning_effort_cache: assistant.settings?.reasoning_effort_cache,
qwenThinkMode: assistant.settings?.qwenThinkMode
})

View File

@ -11,6 +11,7 @@ import { beforeEach, describe, expect, it, vi } from 'vitest'
import {
getAnthropicReasoningParams,
getAnthropicThinkingBudget,
getBedrockReasoningParams,
getCustomParameters,
getGeminiReasoningParams,
@ -89,7 +90,8 @@ vi.mock('@renderer/config/models', async (importOriginal) => {
isQwenAlwaysThinkModel: vi.fn(() => false),
isSupportedThinkingTokenHunyuanModel: vi.fn(() => false),
isSupportedThinkingTokenModel: vi.fn(() => false),
isGPT51SeriesModel: vi.fn(() => false)
isGPT51SeriesModel: vi.fn(() => false),
findTokenLimit: vi.fn(actual.findTokenLimit)
}
})
@ -596,7 +598,7 @@ describe('reasoning utils', () => {
expect(result).toEqual({})
})
it('should return disabled thinking when no reasoning effort', async () => {
it('should return disabled thinking when reasoning effort is none', async () => {
const { isReasoningModel, isSupportedThinkingTokenClaudeModel } = await import('@renderer/config/models')
vi.mocked(isReasoningModel).mockReturnValue(true)
@ -611,7 +613,9 @@ describe('reasoning utils', () => {
const assistant: Assistant = {
id: 'test',
name: 'Test',
settings: {}
settings: {
reasoning_effort: 'none'
}
} as Assistant
const result = getAnthropicReasoningParams(assistant, model)
@ -647,7 +651,7 @@ describe('reasoning utils', () => {
expect(result).toEqual({
thinking: {
type: 'enabled',
budgetTokens: 2048
budgetTokens: 4096
}
})
})
@ -675,7 +679,7 @@ describe('reasoning utils', () => {
expect(result).toEqual({})
})
it('should disable thinking for Flash models without reasoning effort', async () => {
it('should disable thinking for Flash models when reasoning effort is none', async () => {
const { isReasoningModel, isSupportedThinkingTokenGeminiModel } = await import('@renderer/config/models')
vi.mocked(isReasoningModel).mockReturnValue(true)
@ -690,7 +694,9 @@ describe('reasoning utils', () => {
const assistant: Assistant = {
id: 'test',
name: 'Test',
settings: {}
settings: {
reasoning_effort: 'none'
}
} as Assistant
const result = getGeminiReasoningParams(assistant, model)
@ -725,7 +731,7 @@ describe('reasoning utils', () => {
const result = getGeminiReasoningParams(assistant, model)
expect(result).toEqual({
thinkingConfig: {
thinkingBudget: 16448,
thinkingBudget: expect.any(Number),
includeThoughts: true
}
})
@ -889,7 +895,7 @@ describe('reasoning utils', () => {
expect(result).toEqual({
reasoningConfig: {
type: 'enabled',
budgetTokens: 2048
budgetTokens: 4096
}
})
})
@ -990,4 +996,89 @@ describe('reasoning utils', () => {
})
})
})
describe('getAnthropicThinkingBudget', () => {
it('should return undefined when reasoningEffort is undefined', async () => {
const result = getAnthropicThinkingBudget(4096, undefined, 'claude-3-7-sonnet')
expect(result).toBeUndefined()
})
it('should return undefined when reasoningEffort is none', async () => {
const result = getAnthropicThinkingBudget(4096, 'none', 'claude-3-7-sonnet')
expect(result).toBeUndefined()
})
it('should return undefined when tokenLimit is not found', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue(undefined)
const result = getAnthropicThinkingBudget(4096, 'medium', 'unknown-model')
expect(result).toBeUndefined()
})
it('should calculate budget correctly when maxTokens is provided', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue({ min: 1024, max: 32768 })
const result = getAnthropicThinkingBudget(4096, 'medium', 'claude-3-7-sonnet')
// EFFORT_RATIO['medium'] = 0.5
// budget = Math.floor((32768 - 1024) * 0.5 + 1024)
// = Math.floor(31744 * 0.5 + 1024) = Math.floor(15872 + 1024) = 16896
// budgetTokens = Math.min(16896, 4096) = 4096
// result = Math.max(1024, 4096) = 4096
expect(result).toBe(4096)
})
it('should use tokenLimit.max when maxTokens is undefined', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue({ min: 1024, max: 32768 })
const result = getAnthropicThinkingBudget(undefined, 'medium', 'claude-3-7-sonnet')
// When maxTokens is undefined, budget is not constrained by maxTokens
// EFFORT_RATIO['medium'] = 0.5
// budget = Math.floor((32768 - 1024) * 0.5 + 1024)
// = Math.floor(31744 * 0.5 + 1024) = Math.floor(15872 + 1024) = 16896
// result = Math.max(1024, 16896) = 16896
expect(result).toBe(16896)
})
it('should enforce minimum budget of 1024', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue({ min: 100, max: 1000 })
const result = getAnthropicThinkingBudget(500, 'low', 'claude-3-7-sonnet')
// EFFORT_RATIO['low'] = 0.05
// budget = Math.floor((1000 - 100) * 0.05 + 100)
// = Math.floor(900 * 0.05 + 100) = Math.floor(45 + 100) = 145
// budgetTokens = Math.min(145, 500) = 145
// result = Math.max(1024, 145) = 1024
expect(result).toBe(1024)
})
it('should respect effort ratio for high reasoning effort', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue({ min: 1024, max: 32768 })
const result = getAnthropicThinkingBudget(8192, 'high', 'claude-3-7-sonnet')
// EFFORT_RATIO['high'] = 0.8
// budget = Math.floor((32768 - 1024) * 0.8 + 1024)
// = Math.floor(31744 * 0.8 + 1024) = Math.floor(25395.2 + 1024) = 26419
// budgetTokens = Math.min(26419, 8192) = 8192
// result = Math.max(1024, 8192) = 8192
expect(result).toBe(8192)
})
it('should use full token limit when maxTokens is undefined and reasoning effort is high', async () => {
const { findTokenLimit } = await import('@renderer/config/models')
vi.mocked(findTokenLimit).mockReturnValue({ min: 1024, max: 32768 })
const result = getAnthropicThinkingBudget(undefined, 'high', 'claude-3-7-sonnet')
// When maxTokens is undefined, budget is not constrained by maxTokens
// EFFORT_RATIO['high'] = 0.8
// budget = Math.floor((32768 - 1024) * 0.8 + 1024)
// = Math.floor(31744 * 0.8 + 1024) = Math.floor(25395.2 + 1024) = 26419
// result = Math.max(1024, 26419) = 26419
expect(result).toBe(26419)
})
})
})

View File

@ -10,6 +10,7 @@ import {
GEMINI_FLASH_MODEL_REGEX,
getModelSupportedReasoningEffortOptions,
isDeepSeekHybridInferenceModel,
isDoubaoSeed18Model,
isDoubaoSeedAfter251015,
isDoubaoThinkingAutoModel,
isGemini3ThinkingTokenModel,
@ -64,7 +65,7 @@ export function getReasoningEffort(assistant: Assistant, model: Model): Reasonin
// If reasoningEffort is unset (or left at 'default'), apply no extra reasoning settings.
// Models that support reasoning control generally have reasoning_effort defined;
// the unset case covers reasoning models without such control, e.g. DeepSeek Reasoner.
if (!reasoningEffort) {
if (!reasoningEffort || reasoningEffort === 'default') {
return {}
}
@ -329,7 +330,7 @@ export function getReasoningEffort(assistant: Assistant, model: Model): Reasonin
// Grok models/Perplexity models/OpenAI models, use reasoning_effort
if (isSupportedReasoningEffortModel(model)) {
// Check whether the model supports the selected option
const supportedOptions = getModelSupportedReasoningEffortOptions(model)
const supportedOptions = getModelSupportedReasoningEffortOptions(model)?.filter((option) => option !== 'default')
if (supportedOptions?.includes(reasoningEffort)) {
return {
reasoningEffort
@ -389,7 +390,7 @@ export function getReasoningEffort(assistant: Assistant, model: Model): Reasonin
// Use thinking, doubao, zhipu, etc.
if (isSupportedThinkingTokenDoubaoModel(model)) {
if (isDoubaoSeedAfter251015(model)) {
if (isDoubaoSeedAfter251015(model) || isDoubaoSeed18Model(model)) {
return { reasoningEffort }
}
if (reasoningEffort === 'high') {
@ -427,7 +428,7 @@ export function getOpenAIReasoningParams(
let reasoningEffort = assistant?.settings?.reasoning_effort
if (!reasoningEffort) {
if (!reasoningEffort || reasoningEffort === 'default') {
return {}
}
@ -479,16 +480,14 @@ export function getAnthropicThinkingBudget(
return undefined
}
const budgetTokens = Math.max(
1024,
Math.floor(
Math.min(
(tokenLimit.max - tokenLimit.min) * effortRatio + tokenLimit.min,
(maxTokens || DEFAULT_MAX_TOKENS) * effortRatio
)
)
)
return budgetTokens
const budget = Math.floor((tokenLimit.max - tokenLimit.min) * effortRatio + tokenLimit.min)
let budgetTokens = budget
if (maxTokens !== undefined) {
budgetTokens = Math.min(budget, maxTokens)
}
return Math.max(1024, budgetTokens)
}
/**
@ -505,7 +504,11 @@ export function getAnthropicReasoningParams(
const reasoningEffort = assistant?.settings?.reasoning_effort
if (reasoningEffort === undefined || reasoningEffort === 'none') {
if (!reasoningEffort || reasoningEffort === 'default') {
return {}
}
if (reasoningEffort === 'none') {
return {
thinking: {
type: 'disabled'
@ -560,6 +563,10 @@ export function getGeminiReasoningParams(
const reasoningEffort = assistant?.settings?.reasoning_effort
if (!reasoningEffort || reasoningEffort === 'default') {
return {}
}
// Gemini reasoning parameters
if (isSupportedThinkingTokenGeminiModel(model)) {
if (reasoningEffort === undefined || reasoningEffort === 'none') {
@ -620,10 +627,6 @@ export function getXAIReasoningParams(assistant: Assistant, model: Model): Pick<
const { reasoning_effort: reasoningEffort } = getAssistantSettings(assistant)
if (!reasoningEffort || reasoningEffort === 'none') {
return {}
}
switch (reasoningEffort) {
case 'auto':
case 'minimal':
@ -634,6 +637,10 @@ export function getXAIReasoningParams(assistant: Assistant, model: Model): Pick<
return { reasoningEffort }
case 'xhigh':
return { reasoningEffort: 'high' }
case 'default':
case 'none':
default:
return {}
}
}
@ -650,7 +657,7 @@ export function getBedrockReasoningParams(
const reasoningEffort = assistant?.settings?.reasoning_effort
if (reasoningEffort === undefined) {
if (reasoningEffort === undefined || reasoningEffort === 'default') {
return {}
}

View File

@ -113,6 +113,18 @@ export function MdiLightbulbOn(props: SVGProps<SVGSVGElement>) {
)
}
export function MdiLightbulbQuestion(props: SVGProps<SVGSVGElement>) {
// {/* Icon from Material Design Icons by Pictogrammers - https://github.com/Templarian/MaterialDesign/blob/master/LICENSE */}
return (
<svg xmlns="http://www.w3.org/2000/svg" width="1em" height="1em" viewBox="0 0 24 24" {...props}>
<path
fill="currentColor"
d="M8 2C11.9 2 15 5.1 15 9C15 11.4 13.8 13.5 12 14.7V17C12 17.6 11.6 18 11 18H5C4.4 18 4 17.6 4 17V14.7C2.2 13.5 1 11.4 1 9C1 5.1 4.1 2 8 2M5 21V20H11V21C11 21.6 10.6 22 10 22H6C5.4 22 5 21.6 5 21M8 4C5.2 4 3 6.2 3 9C3 11.1 4.2 12.8 6 13.6V16H10V13.6C11.8 12.8 13 11.1 13 9C13 6.2 10.8 4 8 4M20.5 14.5V16H19V14.5H20.5M18.5 9.5H17V9C17 7.3 18.3 6 20 6S23 7.3 23 9C23 10 22.5 10.9 21.7 11.4L21.4 11.6C20.8 12 20.5 12.6 20.5 13.3V13.5H19V13.3C19 12.1 19.6 11 20.6 10.4L20.9 10.2C21.3 9.9 21.5 9.5 21.5 9C21.5 8.2 20.8 7.5 20 7.5S18.5 8.2 18.5 9V9.5Z"
/>
</svg>
)
}
export function BingLogo(props: SVGProps<SVGSVGElement>) {
return (
<svg

View File

@ -1,553 +0,0 @@
import { loggerService } from '@logger'
import { AppLogo } from '@renderer/config/env'
import { SettingHelpText, SettingRow } from '@renderer/pages/settings'
import type { WebSocketCandidatesResponse } from '@shared/config/types'
import { Alert, Button, Modal, Progress, Spin } from 'antd'
import { QRCodeSVG } from 'qrcode.react'
import { useCallback, useEffect, useMemo, useState } from 'react'
import { useTranslation } from 'react-i18next'
import { TopView } from '../TopView'
const logger = loggerService.withContext('ExportToPhoneLanPopup')
interface Props {
resolve: (data: any) => void
}
type ConnectionPhase = 'initializing' | 'waiting_qr_scan' | 'connecting' | 'connected' | 'disconnected' | 'error'
type TransferPhase = 'idle' | 'preparing' | 'sending' | 'completed' | 'error'
const LoadingQRCode: React.FC = () => {
const { t } = useTranslation()
return (
<div style={{ display: 'flex', flexDirection: 'column', alignItems: 'center', gap: '12px' }}>
<Spin />
<span style={{ fontSize: '14px', color: 'var(--color-text-2)' }}>
{t('settings.data.export_to_phone.lan.generating_qr')}
</span>
</div>
)
}
const ScanQRCode: React.FC<{ qrCodeValue: string }> = ({ qrCodeValue }) => {
const { t } = useTranslation()
return (
<div style={{ display: 'flex', flexDirection: 'column', alignItems: 'center', gap: '12px' }}>
<QRCodeSVG
marginSize={2}
value={qrCodeValue}
level="H"
size={200}
imageSettings={{
src: AppLogo,
width: 40,
height: 40,
excavate: true
}}
/>
<span style={{ fontSize: '12px', color: 'var(--color-text-2)' }}>
{t('settings.data.export_to_phone.lan.scan_qr')}
</span>
</div>
)
}
const ConnectingAnimation: React.FC = () => {
const { t } = useTranslation()
return (
<div style={{ display: 'flex', flexDirection: 'column', alignItems: 'center', gap: '12px' }}>
<div
style={{
width: '160px',
height: '160px',
display: 'flex',
flexDirection: 'column',
alignItems: 'center',
justifyContent: 'center',
border: '2px dashed var(--color-status-warning)',
borderRadius: '12px',
backgroundColor: 'var(--color-status-warning)'
}}>
<Spin size="large" />
<span style={{ fontSize: '14px', color: 'var(--color-text)', marginTop: '12px' }}>
{t('settings.data.export_to_phone.lan.status.connecting')}
</span>
</div>
</div>
)
}
const ConnectedDisplay: React.FC = () => {
const { t } = useTranslation()
return (
<div style={{ display: 'flex', flexDirection: 'column', alignItems: 'center', gap: '12px' }}>
<div
style={{
width: '160px',
height: '160px',
display: 'flex',
flexDirection: 'column',
alignItems: 'center',
justifyContent: 'center',
border: '2px dashed var(--color-status-success)',
borderRadius: '12px',
backgroundColor: 'var(--color-status-success)'
}}>
<span style={{ fontSize: '48px' }}>📱</span>
<span style={{ fontSize: '14px', color: 'var(--color-text)', marginTop: '8px' }}>
{t('settings.data.export_to_phone.lan.connected')}
</span>
</div>
</div>
)
}
const ErrorQRCode: React.FC<{ error: string | null }> = ({ error }) => {
const { t } = useTranslation()
return (
<div
style={{
display: 'flex',
flexDirection: 'column',
alignItems: 'center',
gap: '12px',
padding: '20px',
border: `1px solid var(--color-error)`,
borderRadius: '8px',
backgroundColor: 'var(--color-error)'
}}>
<span style={{ fontSize: '48px' }}></span>
<span style={{ fontSize: '14px', color: 'var(--color-text)' }}>
{t('settings.data.export_to_phone.lan.connection_failed')}
</span>
{error && <span style={{ fontSize: '12px', color: 'var(--color-text-2)' }}>{error}</span>}
</div>
)
}
const PopupContainer: React.FC<Props> = ({ resolve }) => {
const [isOpen, setIsOpen] = useState(true)
const [connectionPhase, setConnectionPhase] = useState<ConnectionPhase>('initializing')
const [transferPhase, setTransferPhase] = useState<TransferPhase>('idle')
const [qrCodeValue, setQrCodeValue] = useState('')
const [selectedFolderPath, setSelectedFolderPath] = useState<string | null>(null)
const [sendProgress, setSendProgress] = useState(0)
const [error, setError] = useState<string | null>(null)
const [autoCloseCountdown, setAutoCloseCountdown] = useState<number | null>(null)
const { t } = useTranslation()
// Derived state
const isConnected = connectionPhase === 'connected'
const canSend = isConnected && selectedFolderPath && transferPhase === 'idle'
const isSending = transferPhase === 'preparing' || transferPhase === 'sending'
// Status text mapping
const connectionStatusText = useMemo(() => {
const statusMap = {
initializing: t('settings.data.export_to_phone.lan.status.initializing'),
waiting_qr_scan: t('settings.data.export_to_phone.lan.status.waiting_qr_scan'),
connecting: t('settings.data.export_to_phone.lan.status.connecting'),
connected: t('settings.data.export_to_phone.lan.status.connected'),
disconnected: t('settings.data.export_to_phone.lan.status.disconnected'),
error: t('settings.data.export_to_phone.lan.status.error')
}
return statusMap[connectionPhase]
}, [connectionPhase, t])
const transferStatusText = useMemo(() => {
const statusMap = {
idle: '',
preparing: t('settings.data.export_to_phone.lan.status.preparing'),
sending: t('settings.data.export_to_phone.lan.status.sending'),
completed: t('settings.data.export_to_phone.lan.status.completed'),
error: t('settings.data.export_to_phone.lan.status.error')
}
return statusMap[transferPhase]
}, [transferPhase, t])
// Status style mapping
const connectionStatusStyles = useMemo(() => {
const styleMap = {
initializing: {
bg: 'var(--color-background-mute)',
border: 'var(--color-border-mute)'
},
waiting_qr_scan: {
bg: 'var(--color-primary-mute)',
border: 'var(--color-primary-soft)'
},
connecting: { bg: 'var(--color-status-warning)', border: 'var(--color-status-warning)' },
connected: {
bg: 'var(--color-status-success)',
border: 'var(--color-status-success)'
},
disconnected: { bg: 'var(--color-error)', border: 'var(--color-error)' },
error: { bg: 'var(--color-error)', border: 'var(--color-error)' }
}
return styleMap[connectionPhase]
}, [connectionPhase])
const initWebSocket = useCallback(async () => {
try {
setConnectionPhase('initializing')
await window.api.webSocket.start()
const { port, ip } = await window.api.webSocket.status()
if (ip && port) {
const candidatesData = await window.api.webSocket.getAllCandidates()
const optimizeConnectionInfo = () => {
const ipToNumber = (ip: string) => {
return ip.split('.').reduce((acc, octet) => (acc << 8) + parseInt(octet), 0)
}
const compressedData = [
'CSA',
ipToNumber(ip),
candidatesData.map((candidate: WebSocketCandidatesResponse) => ipToNumber(candidate.host)),
port, // port number
Date.now() % 86400000
]
return compressedData
}
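// Hypothetical example: ipToNumber('192.168.1.5') === 3232235781, so the payload would look roughly
// like ['CSA', 3232235781, [...], 8080, 43200000] before being JSON-encoded into the QR code.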
const compressedData = optimizeConnectionInfo()
const qrCodeValue = JSON.stringify(compressedData)
setQrCodeValue(qrCodeValue)
setConnectionPhase('waiting_qr_scan')
} else {
setError(t('settings.data.export_to_phone.lan.error.no_ip'))
setConnectionPhase('error')
}
} catch (error) {
setError(
`${t('settings.data.export_to_phone.lan.error.init_failed')}: ${error instanceof Error ? error.message : ''}`
)
setConnectionPhase('error')
logger.error('Failed to initialize WebSocket:', error as Error)
}
}, [t])
const handleClientConnected = useCallback((_event: any, data: { connected: boolean }) => {
logger.info(`Client connection status: ${data.connected ? 'connected' : 'disconnected'}`)
if (data.connected) {
setConnectionPhase('connected')
setError(null)
} else {
setConnectionPhase('disconnected')
}
}, [])
const handleMessageReceived = useCallback((_event: any, data: any) => {
logger.info(`Received message from mobile: ${JSON.stringify(data)}`)
}, [])
const handleSendProgress = useCallback(
(_event: any, data: { progress: number }) => {
const progress = data.progress
setSendProgress(progress)
if (transferPhase === 'preparing' && progress > 0) {
setTransferPhase('sending')
}
if (progress >= 100) {
setTransferPhase('completed')
// Start a 3-second countdown to auto-close
setAutoCloseCountdown(3)
}
},
[transferPhase]
)
const handleSelectZip = useCallback(async () => {
const result = await window.api.file.select()
if (result) {
setSelectedFolderPath(result[0].path)
}
}, [])
const handleSendZip = useCallback(async () => {
if (!selectedFolderPath) {
setError(t('settings.data.export_to_phone.lan.error.no_file'))
return
}
setTransferPhase('preparing')
setError(null)
setSendProgress(0)
try {
logger.info(`Starting file transfer: ${selectedFolderPath}`)
await window.api.webSocket.sendFile(selectedFolderPath)
} catch (error) {
setError(
`${t('settings.data.export_to_phone.lan.error.send_failed')}: ${error instanceof Error ? error.message : ''}`
)
setTransferPhase('error')
logger.error('Failed to send file:', error as Error)
}
}, [selectedFolderPath, t])
// Try to close the popup - show a confirmation if a transfer is in progress
const handleCancel = useCallback(() => {
if (isSending) {
window.modal.confirm({
title: t('settings.data.export_to_phone.lan.confirm_close_title'),
content: t('settings.data.export_to_phone.lan.confirm_close_message'),
centered: true,
okButtonProps: {
danger: true
},
okText: t('settings.data.export_to_phone.lan.force_close'),
onOk: () => setIsOpen(false)
})
} else {
setIsOpen(false)
}
}, [isSending, t])
// Clean up and close
const handleClose = useCallback(async () => {
try {
// Proactively close the WebSocket connection
if (isConnected || connectionPhase !== 'disconnected') {
logger.info('Closing popup, stopping WebSocket')
await window.api.webSocket.stop()
}
} catch (error) {
logger.error('Failed to stop WebSocket on close:', error as Error)
}
resolve({})
}, [resolve, isConnected, connectionPhase])
useEffect(() => {
initWebSocket()
const removeClientConnectedListener = window.electron.ipcRenderer.on(
'websocket-client-connected',
handleClientConnected
)
const removeMessageReceivedListener = window.electron.ipcRenderer.on(
'websocket-message-received',
handleMessageReceived
)
const removeSendProgressListener = window.electron.ipcRenderer.on('file-send-progress', handleSendProgress)
return () => {
removeClientConnectedListener()
removeMessageReceivedListener()
removeSendProgressListener()
window.api.webSocket.stop()
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [])
// Auto-close countdown
useEffect(() => {
if (autoCloseCountdown === null) return
if (autoCloseCountdown <= 0) {
logger.debug('Auto-closing popup after transfer completion')
setIsOpen(false)
return
}
const timer = setTimeout(() => {
setAutoCloseCountdown(autoCloseCountdown - 1)
}, 1000)
return () => clearTimeout(timer)
}, [autoCloseCountdown])
// Status indicator component
const StatusIndicator = useCallback(
() => (
<div
style={{
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
gap: '8px',
padding: '5px 12px',
width: '100%',
backgroundColor: connectionStatusStyles.bg,
border: `1px solid ${connectionStatusStyles.border}`,
marginBottom: 10
}}>
<span style={{ fontSize: '14px', fontWeight: '500', color: 'var(--color-text)' }}>{connectionStatusText}</span>
</div>
),
[connectionStatusStyles, connectionStatusText]
)
// QR code display component - use explicit conditional rendering to avoid type mismatches
const QRCodeDisplay = useCallback(() => {
switch (connectionPhase) {
case 'waiting_qr_scan':
case 'disconnected':
return <ScanQRCode qrCodeValue={qrCodeValue} />
case 'initializing':
return <LoadingQRCode />
case 'connecting':
return <ConnectingAnimation />
case 'connected':
return <ConnectedDisplay />
case 'error':
return <ErrorQRCode error={error} />
default:
return null
}
}, [connectionPhase, qrCodeValue, error])
// Transfer progress component
const TransferProgress = useCallback(() => {
if (!isSending && transferPhase !== 'completed') return null
return (
<div style={{ paddingTop: '20px' }}>
<div
style={{
display: 'flex',
flexDirection: 'column',
gap: '8px',
padding: '12px',
border: `1px solid var(--color-border)`,
borderRadius: '8px',
backgroundColor: 'var(--color-background-mute)'
}}>
<div
style={{
display: 'flex',
justifyContent: 'space-between',
alignItems: 'center',
fontSize: '14px',
fontWeight: '500'
}}>
<span style={{ color: 'var(--color-text)' }}>
{t('settings.data.export_to_phone.lan.transfer_progress')}
</span>
<span
style={{ color: transferPhase === 'completed' ? 'var(--color-status-success)' : 'var(--color-primary)' }}>
{transferPhase === 'completed' ? '✅ ' + t('common.completed') : `${Math.round(sendProgress)}%`}
</span>
</div>
<Progress
percent={Math.round(sendProgress)}
status={transferPhase === 'completed' ? 'success' : 'active'}
showInfo={false}
/>
</div>
</div>
)
}, [isSending, transferPhase, sendProgress, t])
const AutoCloseCountdown = useCallback(() => {
if (transferPhase !== 'completed' || autoCloseCountdown === null || autoCloseCountdown <= 0) return null
return (
<div
style={{
fontSize: '12px',
color: 'var(--color-text-2)',
textAlign: 'center',
paddingTop: '4px'
}}>
{t('settings.data.export_to_phone.lan.auto_close_tip', { seconds: autoCloseCountdown })}
</div>
)
}, [transferPhase, autoCloseCountdown, t])
// Error display component
const ErrorDisplay = useCallback(() => {
if (!error || transferPhase !== 'error') return null
return (
<div
style={{
padding: '12px',
border: `1px solid var(--color-error)`,
borderRadius: '8px',
backgroundColor: 'var(--color-error)',
textAlign: 'center'
}}>
<span style={{ fontSize: '14px', color: 'var(--color-text)' }}> {error}</span>
</div>
)
}, [error, transferPhase])
return (
<Modal
open={isOpen}
onCancel={handleCancel}
afterClose={handleClose}
title={t('settings.data.export_to_phone.lan.title')}
centered
closable={!isSending}
maskClosable={false}
keyboard={true}
footer={null}
styles={{ body: { paddingBottom: 10 } }}>
<SettingRow>
<StatusIndicator />
</SettingRow>
<Alert message={t('settings.data.export_to_phone.lan.content')} type="info" style={{ borderRadius: 0 }} />
<SettingRow style={{ display: 'flex', justifyContent: 'center', minHeight: '180px', marginBlock: 25 }}>
<QRCodeDisplay />
</SettingRow>
<SettingRow style={{ display: 'flex', alignItems: 'center', marginBlock: 10 }}>
<div style={{ display: 'flex', gap: 10, justifyContent: 'center', width: '100%' }}>
<Button onClick={handleSelectZip} disabled={isSending}>
{t('settings.data.export_to_phone.lan.selectZip')}
</Button>
<Button type="primary" onClick={handleSendZip} disabled={!canSend} loading={isSending}>
{transferStatusText || t('settings.data.export_to_phone.lan.sendZip')}
</Button>
</div>
</SettingRow>
<SettingHelpText
style={{
overflow: 'hidden',
textOverflow: 'ellipsis',
whiteSpace: 'nowrap',
textAlign: 'center'
}}>
{selectedFolderPath || t('settings.data.export_to_phone.lan.noZipSelected')}
</SettingHelpText>
<TransferProgress />
<AutoCloseCountdown />
<ErrorDisplay />
</Modal>
)
}
const TopViewKey = 'ExportToPhoneLanPopup'
export default class ExportToPhoneLanPopup {
static topviewId = 0
static hide() {
TopView.hide(TopViewKey)
}
static show() {
return new Promise<any>((resolve) => {
TopView.show(
<PopupContainer
resolve={(v) => {
resolve(v)
TopView.hide(TopViewKey)
}}
/>,
TopViewKey
)
})
}
}

View File

@ -0,0 +1,97 @@
import { cn } from '@renderer/utils'
import type { FC, KeyboardEventHandler } from 'react'
import { useTranslation } from 'react-i18next'
import { ProgressIndicator } from './ProgressIndicator'
import type { LanDeviceCardProps } from './types'
export const LanDeviceCard: FC<LanDeviceCardProps> = ({
service,
transferState,
isConnected,
handshakeInProgress,
isDisabled,
onSendFile
}) => {
const { t } = useTranslation()
// Device info
const deviceName = service.txt?.modelName || t('common.unknown')
const platform = service.txt?.platform
const appVersion = service.txt?.appVersion
const platformInfo = [platform, appVersion].filter(Boolean).join(' ')
const displayTitle = platformInfo ? `${deviceName} (${platformInfo})` : deviceName
// Address info
const primaryAddress = service.addresses?.[0]
const addressesWithPort = primaryAddress ? (service.port ? `${primaryAddress}:${service.port}` : primaryAddress) : ''
// Progress visibility
const shouldShowProgress =
transferState && ['selecting', 'transferring', 'completed', 'failed'].includes(transferState.status)
// Status text
const statusText = handshakeInProgress
? t('settings.data.export_to_phone.lan.handshake.in_progress')
: isConnected
? t('settings.data.export_to_phone.lan.connected')
: t('settings.data.export_to_phone.lan.send_file')
// Event handlers
const handleClick = () => {
if (isDisabled) return
onSendFile(service.id)
}
const handleKeyDown: KeyboardEventHandler<HTMLDivElement> = (event) => {
if (event.key === 'Enter' || event.key === ' ') {
event.preventDefault()
handleClick()
}
}
return (
<div
role="button"
tabIndex={0}
onClick={handleClick}
onKeyDown={handleKeyDown}
className={cn(
// Base styles
'flex cursor-pointer flex-col gap-2 rounded-xl border p-3 outline-none transition-all duration-[120ms]',
// Hover state
'hover:-translate-y-px hover:border-[var(--color-primary-hover)] hover:shadow-md',
// Focus state
'focus-visible:border-[var(--color-primary)] focus-visible:shadow-[0_0_0_2px_rgba(24,144,255,0.2)]',
// Connected state
isConnected
? 'border-[var(--color-primary)] bg-[rgba(24,144,255,0.04)]'
: 'border-[var(--color-border)] bg-[var(--color-background)]',
// Disabled state
isDisabled && 'pointer-events-none translate-y-0 opacity-70 shadow-none'
)}>
{/* Header */}
<div className="flex items-center justify-between gap-2">
<div className="flex flex-col gap-1">
<div className="break-words font-semibold text-[var(--color-text-1)] text-sm">{displayTitle}</div>
<span className="text-[var(--color-text-2)] text-xs">{statusText}</span>
</div>
</div>
{/* Meta Row - IP Address */}
<div className="flex flex-col gap-1">
<span className="text-[11px] text-[var(--color-text-3)] uppercase tracking-[0.03em]">
{t('settings.data.export_to_phone.lan.ip_addresses')}
</span>
<span className="break-words text-[var(--color-text)] text-xs">{addressesWithPort || t('common.unknown')}</span>
</div>
{/* Footer with Progress */}
<div className="flex flex-wrap items-center justify-between gap-2 text-[11px] text-[var(--color-text-3)]">
{shouldShowProgress && transferState && (
<ProgressIndicator transferState={transferState} handshakeInProgress={handshakeInProgress} />
)}
</div>
</div>
)
}

View File

@ -0,0 +1,55 @@
import { cn } from '@renderer/utils'
import type { FC } from 'react'
import { useTranslation } from 'react-i18next'
import type { ProgressIndicatorProps } from './types'
export const ProgressIndicator: FC<ProgressIndicatorProps> = ({ transferState, handshakeInProgress }) => {
const { t } = useTranslation()
const progressPercent = Math.min(100, Math.max(0, transferState.progress ?? 0))
const progressLabel = (() => {
if (transferState.status === 'failed') {
return transferState.error || t('common.unknown_error')
}
if (transferState.status === 'selecting') {
return handshakeInProgress
? t('settings.data.export_to_phone.lan.handshake.in_progress')
: t('settings.data.export_to_phone.lan.status.preparing')
}
return `${Math.round(progressPercent)}%`
})()
const isFailed = transferState.status === 'failed'
const isCompleted = transferState.status === 'completed'
return (
<div className="flex min-w-[180px] flex-1 flex-col gap-1">
{/* Label Row */}
<div
className={cn(
'flex items-center justify-between gap-1.5 text-[11px]',
isFailed ? 'text-[var(--color-error)]' : 'text-[var(--color-text-2)]'
)}>
<span className="flex-1 overflow-hidden text-ellipsis whitespace-nowrap">{transferState.fileName}</span>
<span className="shrink-0 whitespace-nowrap">{progressLabel}</span>
</div>
{/* Progress Track */}
<div className="relative h-1.5 w-full overflow-hidden rounded-full bg-[var(--color-border)]">
<div
className={cn(
'h-full rounded-full transition-[width] duration-[120ms]',
isFailed
? 'bg-[var(--color-error)]'
: isCompleted
? 'bg-[var(--color-status-success)]'
: 'bg-[var(--color-primary)]'
)}
style={{ width: `${progressPercent}%` }}
/>
</div>
</div>
)
}

View File

@ -0,0 +1,397 @@
import { loggerService } from '@logger'
import { getBackupData } from '@renderer/services/BackupService'
import type { LocalTransferPeer } from '@shared/config/types'
import { useCallback, useEffect, useMemo, useReducer, useRef } from 'react'
import { useTranslation } from 'react-i18next'
import type { LanPeerTransferState, LanTransferAction, LanTransferReducerState } from './types'
const logger = loggerService.withContext('useLanTransfer')
// ==========================================
// Initial State
// ==========================================
export const initialState: LanTransferReducerState = {
open: true,
lanState: null,
lanHandshakePeerId: null,
lastHandshakeResult: null,
fileTransferState: {},
tempBackupPath: null
}
// ==========================================
// Reducer
// ==========================================
export function lanTransferReducer(state: LanTransferReducerState, action: LanTransferAction): LanTransferReducerState {
switch (action.type) {
case 'SET_OPEN':
return { ...state, open: action.payload }
case 'SET_LAN_STATE':
return { ...state, lanState: action.payload }
case 'SET_HANDSHAKE_PEER_ID':
return { ...state, lanHandshakePeerId: action.payload }
case 'SET_HANDSHAKE_RESULT':
return { ...state, lastHandshakeResult: action.payload }
case 'SET_TEMP_BACKUP_PATH':
return { ...state, tempBackupPath: action.payload }
case 'UPDATE_TRANSFER_STATE': {
const { peerId, state: transferState } = action.payload
return {
...state,
fileTransferState: {
...state.fileTransferState,
[peerId]: {
...(state.fileTransferState[peerId] ?? { progress: 0, status: 'idle' as const }),
...transferState
}
}
}
}
case 'SET_TRANSFER_STATE': {
const { peerId, state: transferState } = action.payload
return {
...state,
fileTransferState: {
...state.fileTransferState,
[peerId]: transferState
}
}
}
case 'CLEANUP_STALE_PEERS': {
const activeIds = action.payload
const newFileTransferState: Record<string, LanPeerTransferState> = {}
for (const id of Object.keys(state.fileTransferState)) {
if (activeIds.has(id)) {
newFileTransferState[id] = state.fileTransferState[id]
}
}
return {
...state,
fileTransferState: newFileTransferState,
lastHandshakeResult:
state.lastHandshakeResult && activeIds.has(state.lastHandshakeResult.peerId)
? state.lastHandshakeResult
: null,
lanHandshakePeerId:
state.lanHandshakePeerId && activeIds.has(state.lanHandshakePeerId) ? state.lanHandshakePeerId : null
}
}
case 'RESET_CONNECTION_STATE':
return {
...state,
fileTransferState: {},
lastHandshakeResult: null,
lanHandshakePeerId: null,
tempBackupPath: null
}
default:
return state
}
}
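// Illustrative sketch with a hypothetical peer id:
// dispatch({ type: 'UPDATE_TRANSFER_STATE', payload: { peerId: 'peer-1', state: { progress: 40 } } })
// merges the partial state onto the peer's existing entry (or onto { progress: 0, status: 'idle' } when
// none exists yet), whereas SET_TRANSFER_STATE replaces the entry wholesale.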
// ==========================================
// Hook Return Type
// ==========================================
export interface UseLanTransferReturn {
// State
state: LanTransferReducerState
// Derived values
lanDevices: LocalTransferPeer[]
isAnyTransferring: boolean
lastError: string | undefined
// Actions
handleSendFile: (peerId: string) => Promise<void>
handleModalCancel: () => void
getTransferState: (peerId: string) => LanPeerTransferState | undefined
isConnected: (peerId: string) => boolean
isHandshakeInProgress: (peerId: string) => boolean
// Dispatch (for advanced use)
dispatch: React.Dispatch<LanTransferAction>
}
// ==========================================
// Hook
// ==========================================
export function useLanTransfer(): UseLanTransferReturn {
const { t } = useTranslation()
const [state, dispatch] = useReducer(lanTransferReducer, initialState)
const isSendingRef = useRef(false)
// ==========================================
// Derived Values
// ==========================================
const lanDevices = useMemo(() => state.lanState?.services ?? [], [state.lanState])
const isAnyTransferring = useMemo(
() => Object.values(state.fileTransferState).some((s) => s.status === 'transferring' || s.status === 'selecting'),
[state.fileTransferState]
)
const lastError = state.lanState?.lastError
// ==========================================
// LAN State Sync
// ==========================================
const syncLanState = useCallback(async () => {
if (!window.api?.localTransfer) {
logger.warn('Local transfer bridge is unavailable')
return
}
try {
const nextState = await window.api.localTransfer.getState()
dispatch({ type: 'SET_LAN_STATE', payload: nextState })
} catch (error) {
logger.error('Failed to sync LAN state', error as Error)
}
}, [])
// ==========================================
// Send File Handler
// ==========================================
const handleSendFile = useCallback(
async (peerId: string) => {
if (!window.api?.localTransfer || isSendingRef.current) {
return
}
isSendingRef.current = true
dispatch({
type: 'SET_TRANSFER_STATE',
payload: { peerId, state: { progress: 0, status: 'selecting' } }
})
let backupPath: string | null = null
try {
// Step 0: Ensure handshake (connect if needed)
if (!state.lastHandshakeResult?.ack.accepted || state.lastHandshakeResult.peerId !== peerId) {
dispatch({ type: 'SET_HANDSHAKE_PEER_ID', payload: peerId })
try {
const ack = await window.api.localTransfer.connect({ peerId })
dispatch({
type: 'SET_HANDSHAKE_RESULT',
payload: { peerId, ack, timestamp: Date.now() }
})
if (!ack.accepted) {
throw new Error(ack.message || t('settings.data.export_to_phone.lan.connection_failed'))
}
} finally {
dispatch({ type: 'SET_HANDSHAKE_PEER_ID', payload: null })
}
}
// Step 1: Create temporary backup
logger.info('Creating temporary backup for LAN transfer...')
const backupData = await getBackupData()
backupPath = await window.api.backup.createLanTransferBackup(backupData)
dispatch({ type: 'SET_TEMP_BACKUP_PATH', payload: backupPath })
// Extract filename from path
const fileName = backupPath.split(/[/\\]/).pop() || 'backup.zip'
// Step 2: Set transferring state
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: { peerId, state: { fileName, progress: 0, status: 'transferring' } }
})
// Step 3: Send file
logger.info(`Sending backup file: ${backupPath}`)
const result = await window.api.localTransfer.sendFile(backupPath)
if (result.success) {
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: { peerId, state: { progress: 100, status: 'completed' } }
})
} else {
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: { peerId, state: { status: 'failed', error: result.error } }
})
}
} catch (error) {
const message = error instanceof Error ? error.message : String(error)
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: { peerId, state: { status: 'failed', error: message } }
})
logger.error('Failed to send file', error as Error)
} finally {
// Step 4: Clean up temp file
if (backupPath) {
try {
await window.api.backup.deleteTempBackup(backupPath)
logger.info('Cleaned up temporary backup file')
} catch (cleanupError) {
logger.warn('Failed to clean up temp backup', cleanupError as Error)
}
dispatch({ type: 'SET_TEMP_BACKUP_PATH', payload: null })
}
isSendingRef.current = false
}
},
[state.lastHandshakeResult, t]
)
// ==========================================
// Teardown
// ==========================================
// Use ref to track temp backup path for cleanup without causing effect re-runs
const tempBackupPathRef = useRef<string | null>(null)
tempBackupPathRef.current = state.tempBackupPath
const teardownLan = useCallback(async () => {
if (!window.api?.localTransfer) {
return
}
try {
await window.api.localTransfer.cancelTransfer?.()
} catch (error) {
logger.warn('Failed to cancel LAN transfer on close', error as Error)
}
try {
await window.api.localTransfer.disconnect?.()
} catch (error) {
logger.warn('Failed to disconnect LAN on close', error as Error)
}
// Clean up temp backup if exists (use ref to get current value)
if (tempBackupPathRef.current) {
try {
await window.api.backup.deleteTempBackup(tempBackupPathRef.current)
} catch (error) {
logger.warn('Failed to cleanup temp backup on close', error as Error)
}
}
dispatch({ type: 'RESET_CONNECTION_STATE' })
}, []) // No dependencies - uses ref for current value
const handleModalCancel = useCallback(() => {
void teardownLan()
dispatch({ type: 'SET_OPEN', payload: false })
}, [teardownLan])
// ==========================================
// Effects
// ==========================================
// Initial sync and service listener
useEffect(() => {
if (!window.api?.localTransfer) {
return
}
syncLanState()
const removeListener = window.api.localTransfer.onServicesUpdated((lanState) => {
dispatch({ type: 'SET_LAN_STATE', payload: lanState })
})
return () => {
removeListener?.()
}
}, [syncLanState])
// Client events listener (progress, completion)
useEffect(() => {
if (!window.api?.localTransfer) {
return
}
const removeListener = window.api.localTransfer.onClientEvent((event) => {
const key = event.peerId ?? 'global'
if (event.type === 'file_transfer_progress') {
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: {
peerId: key,
state: {
transferId: event.transferId,
fileName: event.fileName,
progress: event.progress,
speed: event.speed,
status: 'transferring'
}
}
})
} else if (event.type === 'file_transfer_complete') {
dispatch({
type: 'UPDATE_TRANSFER_STATE',
payload: {
peerId: key,
state: {
progress: event.success ? 100 : undefined,
status: event.success ? 'completed' : 'failed',
error: event.error
}
}
})
}
})
return () => {
removeListener?.()
}
}, [])
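// A progress event from the bridge might look like this (hypothetical values):
// { type: 'file_transfer_progress', peerId: 'peer-1', transferId: 't-42', fileName: 'backup.zip', progress: 37, speed: 1048576 }
// Events without a peerId are tracked under the 'global' key.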
// Cleanup stale peers when services change
useEffect(() => {
const activeIds = new Set(lanDevices.map((s) => s.id))
dispatch({ type: 'CLEANUP_STALE_PEERS', payload: activeIds })
}, [lanDevices])
// Cleanup on unmount only (teardownLan is stable with no deps)
useEffect(() => {
return () => {
void teardownLan()
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [])
// ==========================================
// Helper Functions
// ==========================================
const getTransferState = useCallback((peerId: string) => state.fileTransferState[peerId], [state.fileTransferState])
const isConnected = useCallback(
(peerId: string) =>
state.lastHandshakeResult?.peerId === peerId && state.lastHandshakeResult?.ack.accepted === true,
[state.lastHandshakeResult]
)
const isHandshakeInProgress = useCallback(
(peerId: string) => state.lanHandshakePeerId === peerId,
[state.lanHandshakePeerId]
)
return {
state,
lanDevices,
isAnyTransferring,
lastError,
handleSendFile,
handleModalCancel,
getTransferState,
isConnected,
isHandshakeInProgress,
dispatch
}
}

View File

@ -0,0 +1,37 @@
import { TopView } from '@renderer/components/TopView'
import { getHideCallback, PopupContainer } from './popup'
import type { PopupResolveData } from './types'
// Re-export types for external use
export type { LanPeerTransferState } from './types'
const TopViewKey = 'LanTransferPopup'
export default class LanTransferPopup {
static topviewId = 0
static hide() {
// Try to use the registered callback for proper cleanup, fallback to TopView.hide
const callback = getHideCallback()
if (callback) {
callback()
} else {
TopView.hide(TopViewKey)
}
}
static show() {
return new Promise<PopupResolveData>((resolve) => {
TopView.show(
<PopupContainer
resolve={(v) => {
resolve(v)
TopView.hide(TopViewKey)
}}
/>,
TopViewKey
)
})
}
}

View File

@ -0,0 +1,88 @@
import { Modal } from 'antd'
import { TriangleAlert } from 'lucide-react'
import type { FC } from 'react'
import { useMemo } from 'react'
import { useTranslation } from 'react-i18next'
import { useLanTransfer } from './hook'
import { LanDeviceCard } from './LanDeviceCard'
import type { PopupContainerProps } from './types'
// Module-level callback for external hide access
let hideCallback: (() => void) | null = null
export const setHideCallback = (cb: () => void) => {
hideCallback = cb
}
export const getHideCallback = () => hideCallback
export const PopupContainer: FC<PopupContainerProps> = ({ resolve }) => {
const { t } = useTranslation()
const {
state,
lanDevices,
isAnyTransferring,
lastError,
handleSendFile,
handleModalCancel,
getTransferState,
isConnected,
isHandshakeInProgress
} = useLanTransfer()
const contentTitle = useMemo(() => t('settings.data.export_to_phone.lan.title'), [t])
const onClose = () => resolve({})
// Register hide callback for external access
setHideCallback(handleModalCancel)
return (
<Modal
open={state.open}
onCancel={handleModalCancel}
afterClose={onClose}
footer={null}
centered
title={contentTitle}
transitionName="animation-move-down">
<div className="flex flex-col gap-3">
{/* Error Display */}
{lastError && <div className="text-[var(--color-error)] text-xs">{lastError}</div>}
{/* Device List */}
<div className="mt-2 flex flex-col gap-3">
{lanDevices.length === 0 ? (
// Warning when no devices
<div className="flex w-full items-center gap-2.5 rounded-[10px] border border-[rgba(255,159,41,0.4)] border-dashed bg-[rgba(255,159,41,0.1)] px-3.5 py-3">
<TriangleAlert size={20} className="text-orange-400" />
<span className="flex-1 text-[#ff9f29] text-[13px] leading-[1.4]">
{t('settings.data.export_to_phone.lan.no_connection_warning')}
</span>
</div>
) : (
// Device cards
lanDevices.map((service) => {
const transferState = getTransferState(service.id)
const connected = isConnected(service.id)
const handshakeInProgress = isHandshakeInProgress(service.id)
const isCardDisabled = isAnyTransferring || handshakeInProgress
return (
<LanDeviceCard
key={service.id}
service={service}
transferState={transferState}
isConnected={connected}
handshakeInProgress={handshakeInProgress}
isDisabled={isCardDisabled}
onSendFile={handleSendFile}
/>
)
})
)}
</div>
</div>
</Modal>
)
}

View File

@ -0,0 +1,84 @@
import type { LanHandshakeAckMessage, LocalTransferPeer, LocalTransferState } from '@shared/config/types'
// ==========================================
// Transfer Status
// ==========================================
export type TransferStatus = 'idle' | 'selecting' | 'transferring' | 'completed' | 'failed'
// ==========================================
// Per-Peer Transfer State
// ==========================================
export interface LanPeerTransferState {
transferId?: string
fileName?: string
progress: number
speed?: number
status: TransferStatus
error?: string
}
// ==========================================
// Handshake Result
// ==========================================
export type HandshakeResult = {
peerId: string
ack: LanHandshakeAckMessage
timestamp: number
} | null
// ==========================================
// Reducer State
// ==========================================
export interface LanTransferReducerState {
open: boolean
lanState: LocalTransferState | null
lanHandshakePeerId: string | null
lastHandshakeResult: HandshakeResult
fileTransferState: Record<string, LanPeerTransferState>
tempBackupPath: string | null
}
// ==========================================
// Reducer Actions
// ==========================================
export type LanTransferAction =
| { type: 'SET_OPEN'; payload: boolean }
| { type: 'SET_LAN_STATE'; payload: LocalTransferState | null }
| { type: 'SET_HANDSHAKE_PEER_ID'; payload: string | null }
| { type: 'SET_HANDSHAKE_RESULT'; payload: HandshakeResult }
| { type: 'SET_TEMP_BACKUP_PATH'; payload: string | null }
| { type: 'UPDATE_TRANSFER_STATE'; payload: { peerId: string; state: Partial<LanPeerTransferState> } }
| { type: 'SET_TRANSFER_STATE'; payload: { peerId: string; state: LanPeerTransferState } }
| { type: 'CLEANUP_STALE_PEERS'; payload: Set<string> }
| { type: 'RESET_CONNECTION_STATE' }
// ==========================================
// Component Props
// ==========================================
export interface LanDeviceCardProps {
service: LocalTransferPeer
transferState?: LanPeerTransferState
isConnected: boolean
handshakeInProgress: boolean
isDisabled: boolean
onSendFile: (peerId: string) => void
}
export interface ProgressIndicatorProps {
transferState: LanPeerTransferState
handshakeInProgress: boolean
}
export interface PopupResolveData {
// Empty for now, can be extended
}
export interface PopupContainerProps {
resolve: (data: PopupResolveData) => void
}

View File

@ -3,6 +3,7 @@ import { ErrorBoundary } from '@renderer/components/ErrorBoundary'
import { HelpTooltip } from '@renderer/components/TooltipIcons'
import { TopView } from '@renderer/components/TopView'
import { permissionModeCards } from '@renderer/config/agent'
import { isWin } from '@renderer/config/constant'
import { useAgents } from '@renderer/hooks/agents/useAgents'
import { useUpdateAgent } from '@renderer/hooks/agents/useUpdateAgent'
import SelectAgentBaseModelButton from '@renderer/pages/home/components/SelectAgentBaseModelButton'
@ -16,7 +17,8 @@ import type {
UpdateAgentForm
} from '@renderer/types'
import { AgentConfigurationSchema, isAgentType } from '@renderer/types'
import { Alert, Button, Input, Modal, Select } from 'antd'
import type { GitBashPathInfo } from '@shared/config/constant'
import { Button, Input, Modal, Select } from 'antd'
import { AlertTriangleIcon } from 'lucide-react'
import type { ChangeEvent, FormEvent } from 'react'
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
@ -59,8 +61,7 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
const isEditing = (agent?: AgentWithTools) => agent !== undefined
const [form, setForm] = useState<BaseAgentForm>(() => buildAgentForm(agent))
const [hasGitBash, setHasGitBash] = useState<boolean>(true)
const [customGitBashPath, setCustomGitBashPath] = useState<string>('')
const [gitBashPathInfo, setGitBashPathInfo] = useState<GitBashPathInfo>({ path: null, source: null })
useEffect(() => {
if (open) {
@ -68,29 +69,15 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
}
}, [agent, open])
const checkGitBash = useCallback(
async (showToast = false) => {
try {
const [gitBashInstalled, savedPath] = await Promise.all([
window.api.system.checkGitBash(),
window.api.system.getGitBashPath().catch(() => null)
])
setCustomGitBashPath(savedPath ?? '')
setHasGitBash(gitBashInstalled)
if (showToast) {
if (gitBashInstalled) {
window.toast.success(t('agent.gitBash.success', 'Git Bash detected successfully!'))
} else {
window.toast.error(t('agent.gitBash.notFound', 'Git Bash not found. Please install it first.'))
}
}
} catch (error) {
logger.error('Failed to check Git Bash:', error as Error)
setHasGitBash(true) // Default to true on error to avoid false warnings
}
},
[t]
)
const checkGitBash = useCallback(async () => {
if (!isWin) return
try {
const pathInfo = await window.api.system.getGitBashPathInfo()
setGitBashPathInfo(pathInfo)
} catch (error) {
logger.error('Failed to check Git Bash:', error as Error)
}
}, [])
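// The resolved info is expected to look like { path: 'C:\\Program Files\\Git\\bin\\bash.exe', source: 'auto' }
// (hypothetical path) or { path: null, source: null } when nothing is found.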
useEffect(() => {
checkGitBash()
@ -119,24 +106,22 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
return
}
setCustomGitBashPath(pickedPath)
await checkGitBash(true)
await checkGitBash()
} catch (error) {
logger.error('Failed to pick Git Bash path', error as Error)
window.toast.error(t('agent.gitBash.pick.failed', 'Failed to set Git Bash path'))
}
}, [checkGitBash, t])
const handleClearGitBash = useCallback(async () => {
const handleResetGitBash = useCallback(async () => {
try {
// Clear manual setting and re-run auto-discovery
await window.api.system.setGitBashPath(null)
setCustomGitBashPath('')
await checkGitBash(true)
await checkGitBash()
} catch (error) {
logger.error('Failed to clear Git Bash path', error as Error)
window.toast.error(t('agent.gitBash.pick.failed', 'Failed to set Git Bash path'))
logger.error('Failed to reset Git Bash path', error as Error)
}
}, [checkGitBash, t])
}, [checkGitBash])
const onPermissionModeChange = useCallback((value: PermissionMode) => {
setForm((prev) => {
@ -268,6 +253,12 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
return
}
if (isWin && !gitBashPathInfo.path) {
window.toast.error(t('agent.gitBash.error.required', 'Git Bash path is required on Windows'))
loadingRef.current = false
return
}
if (isEditing(agent)) {
if (!agent) {
loadingRef.current = false
@ -327,7 +318,8 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
t,
updateAgent,
afterSubmit,
addAgent
addAgent,
gitBashPathInfo.path
]
)
@ -346,66 +338,6 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
footer={null}>
<StyledForm onSubmit={onSubmit}>
<FormContent>
{!hasGitBash && (
<Alert
message={t('agent.gitBash.error.title', 'Git Bash Required')}
description={
<div>
<div style={{ marginBottom: 8 }}>
{t(
'agent.gitBash.error.description',
'Git Bash is required to run agents on Windows. The agent cannot function without it. Please install Git for Windows from'
)}{' '}
<a
href="https://git-scm.com/download/win"
onClick={(e) => {
e.preventDefault()
window.api.openWebsite('https://git-scm.com/download/win')
}}
style={{ textDecoration: 'underline' }}>
git-scm.com
</a>
</div>
<Button size="small" onClick={() => checkGitBash(true)}>
{t('agent.gitBash.error.recheck', 'Recheck Git Bash Installation')}
</Button>
<Button size="small" style={{ marginLeft: 8 }} onClick={handlePickGitBash}>
{t('agent.gitBash.pick.button', 'Select Git Bash Path')}
</Button>
</div>
}
type="error"
showIcon
style={{ marginBottom: 16 }}
/>
)}
{hasGitBash && customGitBashPath && (
<Alert
message={t('agent.gitBash.found.title', 'Git Bash configured')}
description={
<div style={{ display: 'flex', flexDirection: 'column', gap: 8 }}>
<div>
{t('agent.gitBash.customPath', {
defaultValue: 'Using custom path: {{path}}',
path: customGitBashPath
})}
</div>
<div style={{ display: 'flex', gap: 8 }}>
<Button size="small" onClick={handlePickGitBash}>
{t('agent.gitBash.pick.button', 'Select Git Bash Path')}
</Button>
<Button size="small" onClick={handleClearGitBash}>
{t('agent.gitBash.clear.button', 'Clear custom path')}
</Button>
</div>
</div>
}
type="success"
showIcon
style={{ marginBottom: 16 }}
/>
)}
<FormRow>
<FormItem style={{ flex: 1 }}>
<Label>
@ -439,6 +371,40 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
/>
</FormItem>
{isWin && (
<FormItem>
<div className="flex items-center gap-2">
<Label>
Git Bash <RequiredMark>*</RequiredMark>
</Label>
<HelpTooltip
title={t(
'agent.gitBash.tooltip',
'Git Bash is required to run agents on Windows. Install from git-scm.com if not available.'
)}
/>
</div>
<GitBashInputWrapper>
<Input
value={gitBashPathInfo.path ?? ''}
readOnly
placeholder={t('agent.gitBash.placeholder', 'Select bash.exe path')}
/>
<Button size="small" onClick={handlePickGitBash}>
{t('common.select', 'Select')}
</Button>
{gitBashPathInfo.source === 'manual' && (
<Button size="small" onClick={handleResetGitBash}>
{t('common.reset', 'Reset')}
</Button>
)}
</GitBashInputWrapper>
{gitBashPathInfo.path && gitBashPathInfo.source === 'auto' && (
<SourceHint>{t('agent.gitBash.autoDiscoveredHint', 'Auto-discovered')}</SourceHint>
)}
</FormItem>
)}
<FormItem>
<Label>
{t('agent.settings.tooling.permissionMode.title', 'Permission mode')} <RequiredMark>*</RequiredMark>
@ -511,7 +477,11 @@ const PopupContainer: React.FC<Props> = ({ agent, afterSubmit, resolve }) => {
<FormFooter>
<Button onClick={onCancel}>{t('common.close')}</Button>
<Button type="primary" htmlType="submit" loading={loadingRef.current} disabled={!hasGitBash}>
<Button
type="primary"
htmlType="submit"
loading={loadingRef.current}
disabled={isWin && !gitBashPathInfo.path}>
{isEditing(agent) ? t('common.confirm') : t('common.add')}
</Button>
</FormFooter>
@ -582,6 +552,21 @@ const FormItem = styled.div`
gap: 8px;
`
const GitBashInputWrapper = styled.div`
display: flex;
gap: 8px;
align-items: center;
input {
flex: 1;
}
`
const SourceHint = styled.span`
font-size: 12px;
color: var(--color-text-3);
`
const Label = styled.label`
font-size: 14px;
color: var(--color-text-1);

View File

@ -631,7 +631,7 @@ describe('Reasoning option configuration', () => {
it('restricts GPT-5 Pro reasoning to high effort only', () => {
expect(MODEL_SUPPORTED_REASONING_EFFORT.gpt5pro).toEqual(['high'])
expect(MODEL_SUPPORTED_OPTIONS.gpt5pro).toEqual(['high'])
expect(MODEL_SUPPORTED_OPTIONS.gpt5pro).toEqual(['default', 'high'])
})
})
@ -733,6 +733,11 @@ describe('getThinkModelType - Comprehensive Coverage', () => {
expect(getThinkModelType(createModel({ id: 'doubao-seed-1-6-lite-251015' }))).toBe('doubao_after_251015')
})
it('should return doubao_after_251015 for Doubao-Seed-1.8 models', () => {
expect(getThinkModelType(createModel({ id: 'doubao-seed-1-8-251215' }))).toBe('doubao_after_251015')
expect(getThinkModelType(createModel({ id: 'doubao-seed-1.8' }))).toBe('doubao_after_251015')
})
it('should return doubao_no_auto for other Doubao thinking models', () => {
expect(getThinkModelType(createModel({ id: 'doubao-1.5-thinking-vision-pro' }))).toBe('doubao_no_auto')
})
@ -863,6 +868,7 @@ describe('getThinkModelType - Comprehensive Coverage', () => {
// auto > after_251015 > no_auto
expect(getThinkModelType(createModel({ id: 'doubao-seed-1.6' }))).toBe('doubao')
expect(getThinkModelType(createModel({ id: 'doubao-seed-1-6-251015' }))).toBe('doubao_after_251015')
expect(getThinkModelType(createModel({ id: 'doubao-seed-1-8-251215' }))).toBe('doubao_after_251015')
expect(getThinkModelType(createModel({ id: 'doubao-1.5-thinking-vision-pro' }))).toBe('doubao_no_auto')
})
@ -1672,10 +1678,26 @@ describe('getModelSupportedReasoningEffortOptions', () => {
describe('OpenAI models', () => {
it('should return correct options for o-series models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o3' }))).toEqual(['low', 'medium', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o3-mini' }))).toEqual(['low', 'medium', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o4' }))).toEqual(['low', 'medium', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o3' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o3-mini' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'o4' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-oss-reasoning' }))).toEqual([
'default',
'low',
'medium',
'high'
@ -1685,17 +1707,22 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for deep research models', () => {
// Note: Deep research models need to be actual OpenAI reasoning models to be detected
// 'sonar-deep-research' from Perplexity is the primary deep research model
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'sonar-deep-research' }))).toEqual(['medium'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'sonar-deep-research' }))).toEqual([
'default',
'medium'
])
})
it('should return correct options for GPT-5 models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5' }))).toEqual([
'default',
'minimal',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-preview' }))).toEqual([
'default',
'minimal',
'low',
'medium',
@ -1704,17 +1731,22 @@ describe('getModelSupportedReasoningEffortOptions', () => {
})
it('should return correct options for GPT-5 Pro models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-pro' }))).toEqual(['high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-pro-preview' }))).toEqual(['high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-pro' }))).toEqual(['default', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-pro-preview' }))).toEqual([
'default',
'high'
])
})
it('should return correct options for GPT-5 Codex models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-codex' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5-codex-mini' }))).toEqual([
'default',
'low',
'medium',
'high'
@ -1723,18 +1755,21 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for GPT-5.1 models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5.1' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5.1-preview' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5.1-mini' }))).toEqual([
'default',
'none',
'low',
'medium',
@ -1744,11 +1779,13 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for GPT-5.1 Codex models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5.1-codex' }))).toEqual([
'default',
'none',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gpt-5.1-codex-mini' }))).toEqual([
'default',
'none',
'medium',
'high'
@ -1758,19 +1795,24 @@ describe('getModelSupportedReasoningEffortOptions', () => {
describe('Grok models', () => {
it('should return correct options for Grok 3 mini', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'grok-3-mini' }))).toEqual(['low', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'grok-3-mini' }))).toEqual([
'default',
'low',
'high'
])
})
it('should return correct options for Grok 4 Fast', () => {
expect(
getModelSupportedReasoningEffortOptions(createModel({ id: 'grok-4-fast', provider: 'openrouter' }))
).toEqual(['none', 'auto'])
).toEqual(['default', 'none', 'auto'])
})
})
describe('Gemini models', () => {
it('should return correct options for Gemini Flash models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-2.5-flash-latest' }))).toEqual([
'default',
'none',
'low',
'medium',
@ -1778,6 +1820,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
'auto'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-flash-latest' }))).toEqual([
'default',
'none',
'low',
'medium',
@ -1788,12 +1831,14 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for Gemini Pro models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-2.5-pro-latest' }))).toEqual([
'default',
'low',
'medium',
'high',
'auto'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-pro-latest' }))).toEqual([
'default',
'low',
'medium',
'high',
@ -1803,11 +1848,13 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for Gemini 3 models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-3-flash' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'gemini-3-pro-preview' }))).toEqual([
'default',
'low',
'medium',
'high'
@ -1818,24 +1865,28 @@ describe('getModelSupportedReasoningEffortOptions', () => {
describe('Qwen models', () => {
it('should return correct options for controllable Qwen models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'qwen-plus' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'qwen-turbo' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'qwen-flash' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'qwen3-8b' }))).toEqual([
'default',
'none',
'low',
'medium',
@ -1853,11 +1904,13 @@ describe('getModelSupportedReasoningEffortOptions', () => {
describe('Doubao models', () => {
it('should return correct options for auto-thinking Doubao models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'doubao-seed-1.6' }))).toEqual([
'default',
'none',
'auto',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'doubao-1-5-thinking-pro-m' }))).toEqual([
'default',
'none',
'auto',
'high'
@ -1866,12 +1919,14 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for Doubao models after 251015', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'doubao-seed-1-6-251015' }))).toEqual([
'default',
'minimal',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'doubao-seed-1-6-lite-251015' }))).toEqual([
'default',
'minimal',
'low',
'medium',
@ -1881,6 +1936,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
it('should return correct options for other Doubao thinking models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'doubao-1.5-thinking-vision-pro' }))).toEqual([
'default',
'none',
'high'
])
@ -1889,28 +1945,43 @@ describe('getModelSupportedReasoningEffortOptions', () => {
describe('Other providers', () => {
it('should return correct options for Hunyuan models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'hunyuan-a13b' }))).toEqual(['none', 'auto'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'hunyuan-a13b' }))).toEqual([
'default',
'none',
'auto'
])
})
it('should return correct options for Zhipu models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'glm-4.5' }))).toEqual(['none', 'auto'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'glm-4.6' }))).toEqual(['none', 'auto'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'glm-4.5' }))).toEqual([
'default',
'none',
'auto'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'glm-4.6' }))).toEqual([
'default',
'none',
'auto'
])
})
it('should return correct options for Perplexity models', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'sonar-deep-research' }))).toEqual(['medium'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'sonar-deep-research' }))).toEqual([
'default',
'medium'
])
})
it('should return correct options for DeepSeek hybrid models', () => {
expect(
getModelSupportedReasoningEffortOptions(createModel({ id: 'deepseek-v3.1', provider: 'deepseek' }))
).toEqual(['none', 'auto'])
).toEqual(['default', 'none', 'auto'])
expect(
getModelSupportedReasoningEffortOptions(createModel({ id: 'deepseek-v3.2', provider: 'openrouter' }))
).toEqual(['none', 'auto'])
).toEqual(['default', 'none', 'auto'])
expect(
getModelSupportedReasoningEffortOptions(createModel({ id: 'deepseek-chat', provider: 'deepseek' }))
).toEqual(['none', 'auto'])
).toEqual(['default', 'none', 'auto'])
})
})
@ -1925,7 +1996,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
provider: 'openrouter'
})
)
).toEqual(['none', 'auto'])
).toEqual(['default', 'none', 'auto'])
expect(
getModelSupportedReasoningEffortOptions(
@ -1934,7 +2005,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
name: 'gpt-5.1'
})
)
).toEqual(['none', 'low', 'medium', 'high'])
).toEqual(['default', 'none', 'low', 'medium', 'high'])
// Qwen models work well for name-based fallback
expect(
@ -1944,7 +2015,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
name: 'qwen-plus'
})
)
).toEqual(['none', 'low', 'medium', 'high'])
).toEqual(['default', 'none', 'low', 'medium', 'high'])
})
it('should use id result when id matches', () => {
@ -1955,7 +2026,7 @@ describe('getModelSupportedReasoningEffortOptions', () => {
name: 'Different Name'
})
)
).toEqual(['none', 'low', 'medium', 'high'])
).toEqual(['default', 'none', 'low', 'medium', 'high'])
expect(
getModelSupportedReasoningEffortOptions(
@ -1964,20 +2035,27 @@ describe('getModelSupportedReasoningEffortOptions', () => {
name: 'Some other name'
})
)
).toEqual(['low', 'medium', 'high'])
).toEqual(['default', 'low', 'medium', 'high'])
})
})
describe('Case sensitivity', () => {
it('should handle case insensitive model IDs', () => {
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'GPT-5.1' }))).toEqual([
'default',
'none',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'O3-MINI' }))).toEqual(['low', 'medium', 'high'])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'O3-MINI' }))).toEqual([
'default',
'low',
'medium',
'high'
])
expect(getModelSupportedReasoningEffortOptions(createModel({ id: 'Gemini-2.5-Flash-Latest' }))).toEqual([
'default',
'none',
'low',
'medium',

View File

@ -746,6 +746,12 @@ export const SYSTEM_MODELS: Record<SystemProviderId | 'defaultModel', Model[]> =
}
],
doubao: [
{
id: 'doubao-seed-1-8-251215',
provider: 'doubao',
name: 'Doubao-Seed-1.8',
group: 'Doubao-Seed-1.8'
},
{
id: 'doubao-1-5-vision-pro-32k-250115',
provider: 'doubao',

View File

@ -59,31 +59,31 @@ export const MODEL_SUPPORTED_REASONING_EFFORT = {
// Mapping from model type to supported options
export const MODEL_SUPPORTED_OPTIONS: ThinkingOptionConfig = {
default: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.default] as const,
o: MODEL_SUPPORTED_REASONING_EFFORT.o,
openai_deep_research: MODEL_SUPPORTED_REASONING_EFFORT.openai_deep_research,
gpt5: [...MODEL_SUPPORTED_REASONING_EFFORT.gpt5] as const,
gpt5pro: MODEL_SUPPORTED_REASONING_EFFORT.gpt5pro,
gpt5_codex: MODEL_SUPPORTED_REASONING_EFFORT.gpt5_codex,
gpt5_1: MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1,
gpt5_1_codex: MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1_codex,
gpt5_2: MODEL_SUPPORTED_REASONING_EFFORT.gpt5_2,
gpt5_1_codex_max: MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1_codex_max,
gpt52pro: MODEL_SUPPORTED_REASONING_EFFORT.gpt52pro,
grok: MODEL_SUPPORTED_REASONING_EFFORT.grok,
grok4_fast: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.grok4_fast] as const,
gemini: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.gemini] as const,
gemini_pro: MODEL_SUPPORTED_REASONING_EFFORT.gemini_pro,
gemini3: MODEL_SUPPORTED_REASONING_EFFORT.gemini3,
qwen: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.qwen] as const,
qwen_thinking: MODEL_SUPPORTED_REASONING_EFFORT.qwen_thinking,
doubao: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.doubao] as const,
doubao_no_auto: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.doubao_no_auto] as const,
doubao_after_251015: MODEL_SUPPORTED_REASONING_EFFORT.doubao_after_251015,
hunyuan: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.hunyuan] as const,
zhipu: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.zhipu] as const,
perplexity: MODEL_SUPPORTED_REASONING_EFFORT.perplexity,
deepseek_hybrid: ['none', ...MODEL_SUPPORTED_REASONING_EFFORT.deepseek_hybrid] as const
default: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.default] as const,
o: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.o] as const,
openai_deep_research: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.openai_deep_research] as const,
gpt5: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5] as const,
gpt5pro: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5pro] as const,
gpt5_codex: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5_codex] as const,
gpt5_1: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1] as const,
gpt5_1_codex: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1_codex] as const,
gpt5_2: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5_2] as const,
gpt5_1_codex_max: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt5_1_codex_max] as const,
gpt52pro: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gpt52pro] as const,
grok: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.grok] as const,
grok4_fast: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.grok4_fast] as const,
gemini: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.gemini] as const,
gemini_pro: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gemini_pro] as const,
gemini3: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.gemini3] as const,
qwen: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.qwen] as const,
qwen_thinking: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.qwen_thinking] as const,
doubao: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.doubao] as const,
doubao_no_auto: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.doubao_no_auto] as const,
doubao_after_251015: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.doubao_after_251015] as const,
hunyuan: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.hunyuan] as const,
zhipu: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.zhipu] as const,
perplexity: ['default', ...MODEL_SUPPORTED_REASONING_EFFORT.perplexity] as const,
deepseek_hybrid: ['default', 'none', ...MODEL_SUPPORTED_REASONING_EFFORT.deepseek_hybrid] as const
} as const
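A minimal standalone sketch of the composition pattern above, assuming simplified stand-ins for MODEL_SUPPORTED_REASONING_EFFORT and the real ThinkingModelType keys; it only illustrates how 'default' is prepended as the first element (plus 'none' for families that can disable thinking) and how a lookup resolves, not the project's actual types.

type ReasoningEffortOption = 'default' | 'none' | 'minimal' | 'low' | 'medium' | 'high' | 'xhigh' | 'auto'

// Hypothetical base table mirroring the shape of MODEL_SUPPORTED_REASONING_EFFORT (sketch only).
const BASE_EFFORT = {
  o: ['low', 'medium', 'high'],
  gpt5_1: ['none', 'low', 'medium', 'high'],
  gemini: ['low', 'medium', 'high', 'auto']
} as const

// Every family gets 'default' prepended; families that can fully disable thinking also get 'none'.
const SUPPORTED_OPTIONS = {
  o: ['default', ...BASE_EFFORT.o],
  gpt5_1: ['default', ...BASE_EFFORT.gpt5_1],
  gemini: ['default', 'none', ...BASE_EFFORT.gemini]
} as const

type Family = keyof typeof SUPPORTED_OPTIONS

// 'default' is always first, so a UI can treat it as the "no extra configuration" choice.
const optionsFor = (family: Family): readonly ReasoningEffortOption[] => SUPPORTED_OPTIONS[family]

console.log(optionsFor('gemini')) // ['default', 'none', 'low', 'medium', 'high', 'auto']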
const withModelIdAndNameAsId = <T>(model: Model, fn: (model: Model) => T): { idResult: T; nameResult: T } => {
@ -146,7 +146,7 @@ const _getThinkModelType = (model: Model): ThinkingModelType => {
} else if (isSupportedThinkingTokenDoubaoModel(model)) {
if (isDoubaoThinkingAutoModel(model)) {
thinkingModelType = 'doubao'
} else if (isDoubaoSeedAfter251015(model)) {
} else if (isDoubaoSeedAfter251015(model) || isDoubaoSeed18Model(model)) {
thinkingModelType = 'doubao_after_251015'
} else {
thinkingModelType = 'doubao_no_auto'
@ -191,20 +191,28 @@ const _getModelSupportedReasoningEffortOptions = (model: Model): ReasoningEffort
* - The model is null/undefined
* - The model doesn't support reasoning effort or thinking tokens
*
* All reasoning models support the 'default' option (always the first element),
* which represents no additional configuration for thinking behavior.
*
* @example
* // OpenAI o-series models support low, medium, high
* // OpenAI o-series models support default, low, medium, high
* getModelSupportedReasoningEffortOptions({ id: 'o3-mini', ... })
* // Returns: ['low', 'medium', 'high']
* // Returns: ['default', 'low', 'medium', 'high']
* // 'default' = no additional configuration for thinking behavior
*
* @example
* // GPT-5.1 models support none, low, medium, high
* // GPT-5.1 models support default, none, low, medium, high
* getModelSupportedReasoningEffortOptions({ id: 'gpt-5.1', ... })
* // Returns: ['none', 'low', 'medium', 'high']
* // Returns: ['default', 'none', 'low', 'medium', 'high']
* // 'default' = no additional configuration
* // 'none' = explicitly disable reasoning
*
* @example
* // Gemini Flash models support none, low, medium, high, auto
* // Gemini Flash models support default, none, low, medium, high, auto
* getModelSupportedReasoningEffortOptions({ id: 'gemini-2.5-flash-latest', ... })
* // Returns: ['none', 'low', 'medium', 'high', 'auto']
* // Returns: ['default', 'none', 'low', 'medium', 'high', 'auto']
* // 'default' = no additional configuration
* // 'auto' = let the model automatically decide
*
* @example
* // Non-reasoning models return undefined
@ -214,7 +222,7 @@ const _getModelSupportedReasoningEffortOptions = (model: Model): ReasoningEffort
* @example
* // Name fallback when id doesn't match
* getModelSupportedReasoningEffortOptions({ id: 'custom-id', name: 'gpt-5.1', ... })
* // Returns: ['none', 'low', 'medium', 'high']
* // Returns: ['default', 'none', 'low', 'medium', 'high']
*/
export const getModelSupportedReasoningEffortOptions = (
model: Model | undefined | null
@ -449,7 +457,7 @@ export function isQwenAlwaysThinkModel(model?: Model): boolean {
// Regex for Doubao models that support thinking mode
export const DOUBAO_THINKING_MODEL_REGEX =
/doubao-(?:1[.-]5-thinking-vision-pro|1[.-]5-thinking-pro-m|seed-1[.-]6(?:-flash)?(?!-(?:thinking)(?:-|$))|seed-code(?:-preview)?(?:-\d+)?)(?:-[\w-]+)*/i
/doubao-(?:1[.-]5-thinking-vision-pro|1[.-]5-thinking-pro-m|seed-1[.-][68](?:-flash)?(?!-(?:thinking)(?:-|$))|seed-code(?:-preview)?(?:-\d+)?)(?:-[\w-]+)*/i
// Doubao models that support auto: doubao-seed-1.6-xxx, doubao-seed-1-6-xxx, doubao-1-5-thinking-pro-m-xxx
// Auto thinking is no longer supported after version 251015, see https://console.volcengine.com/ark/region:ark+cn-beijing/model/detail?Id=doubao-seed-1-6
@ -467,6 +475,11 @@ export function isDoubaoSeedAfter251015(model: Model): boolean {
return result
}
export function isDoubaoSeed18Model(model: Model): boolean {
const pattern = /doubao-seed-1[.-]8(?:-[\w-]+)?/i
return pattern.test(model.id) || pattern.test(model.name)
}
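A quick sanity check of the widened seed-1.[68] matching, using the two regexes from this file but simplified to take a plain id string instead of a Model, applied to a few illustrative ids (only doubao-seed-1-8-251215 appears in the diff above), so this is a sketch rather than the project's helpers:

// Copied from DOUBAO_THINKING_MODEL_REGEX above; the character class [68] now covers seed-1.6 and seed-1.8.
const thinkingRegex =
  /doubao-(?:1[.-]5-thinking-vision-pro|1[.-]5-thinking-pro-m|seed-1[.-][68](?:-flash)?(?!-(?:thinking)(?:-|$))|seed-code(?:-preview)?(?:-\d+)?)(?:-[\w-]+)*/i

// Same pattern as isDoubaoSeed18Model, applied to a bare id string for illustration.
const isSeed18 = (id: string): boolean => /doubao-seed-1[.-]8(?:-[\w-]+)?/i.test(id)

for (const id of ['doubao-seed-1-6-250615', 'doubao-seed-1-8-251215', 'doubao-seed-1-6-thinking']) {
  console.log(id, thinkingRegex.test(id), isSeed18(id))
}
// doubao-seed-1-6-250615   -> true,  false
// doubao-seed-1-8-251215   -> true,  true   (routed to the doubao_after_251015 option set via the new branch)
// doubao-seed-1-6-thinking -> false, false  (explicit -thinking variants stay excluded by the lookahead)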
export function isSupportedThinkingTokenDoubaoModel(model?: Model): boolean {
if (!model) {
return false

View File

@ -25,7 +25,7 @@ export const FUNCTION_CALLING_MODELS = [
'learnlm(?:-[\\w-]+)?',
'gemini(?:-[\\w-]+)?', // Gemini embedding models are already excluded earlier
'grok-3(?:-[\\w-]+)?',
'doubao-seed-1[.-]6(?:-[\\w-]+)?',
'doubao-seed-1[.-][68](?:-[\\w-]+)?',
'doubao-seed-code(?:-[\\w-]+)?',
'kimi-k2(?:-[\\w-]+)?',
'ling-\\w+(?:-[\\w-]+)?',

View File

@ -45,7 +45,7 @@ const visionAllowedModels = [
'deepseek-vl(?:[\\w-]+)?',
'kimi-latest',
'gemma-3(?:-[\\w-]+)',
'doubao-seed-1[.-]6(?:-[\\w-]+)?',
'doubao-seed-1[.-][68](?:-[\\w-]+)?',
'doubao-seed-code(?:-[\\w-]+)?',
'kimi-thinking-preview',
`gemma3(?:[-:\\w]+)?`,

View File

@ -5,7 +5,7 @@
*/
import { loggerService } from '@logger'
import type { AgentType, BuiltinMCPServerName, BuiltinOcrProviderId, ThinkingOption } from '@renderer/types'
import type { AgentType, BuiltinMCPServerName, BuiltinOcrProviderId } from '@renderer/types'
import { BuiltinMCPServerNames } from '@renderer/types'
import i18n from './index'
@ -310,20 +310,6 @@ export const getHttpMessageLabel = (key: string): string => {
return getLabel(httpMessageKeyMap, key)
}
const reasoningEffortOptionsKeyMap: Record<ThinkingOption, string> = {
none: 'assistants.settings.reasoning_effort.off',
minimal: 'assistants.settings.reasoning_effort.minimal',
high: 'assistants.settings.reasoning_effort.high',
low: 'assistants.settings.reasoning_effort.low',
medium: 'assistants.settings.reasoning_effort.medium',
auto: 'assistants.settings.reasoning_effort.default',
xhigh: 'assistants.settings.reasoning_effort.xhigh'
} as const
export const getReasoningEffortOptionsLabel = (key: string): string => {
return getLabel(reasoningEffortOptionsKeyMap, key)
}
const fileFieldKeyMap = {
created_at: 'files.created_at',
size: 'files.size',
@ -344,7 +330,8 @@ const builtInMcpDescriptionKeyMap: Record<BuiltinMCPServerName, string> = {
[BuiltinMCPServerNames.difyKnowledge]: 'settings.mcp.builtinServersDescriptions.dify_knowledge',
[BuiltinMCPServerNames.python]: 'settings.mcp.builtinServersDescriptions.python',
[BuiltinMCPServerNames.didiMCP]: 'settings.mcp.builtinServersDescriptions.didi_mcp',
[BuiltinMCPServerNames.browser]: 'settings.mcp.builtinServersDescriptions.browser'
[BuiltinMCPServerNames.browser]: 'settings.mcp.builtinServersDescriptions.browser',
[BuiltinMCPServerNames.nowledgeMem]: 'settings.mcp.builtinServersDescriptions.nowledge_mem'
} as const
export const getBuiltInMcpServerDescriptionLabel = (key: string): string => {

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Using auto-detected Git Bash",
"autoDiscoveredHint": "Auto-discovered",
"clear": {
"button": "Clear custom path"
},
@ -39,6 +40,7 @@
"error": {
"description": "Git Bash is required to run agents on Windows. The agent cannot function without it. Please install Git for Windows from",
"recheck": "Recheck Git Bash Installation",
"required": "Git Bash path is required on Windows",
"title": "Git Bash Required"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "Selected file is not a valid Git Bash executable (bash.exe).",
"title": "Select Git Bash executable"
},
"success": "Git Bash detected successfully!"
"placeholder": "Select bash.exe path",
"success": "Git Bash detected successfully!",
"tooltip": "Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Enter your message here, send with {{key}} - @ select path, / select command"
@ -544,14 +548,23 @@
"more": "Assistant Settings",
"prompt": "Prompt Settings",
"reasoning_effort": {
"auto": "Auto",
"auto_description": "Flexibly determine reasoning effort",
"default": "Default",
"default_description": "Depend on the model's default behavior, without any configuration.",
"high": "High",
"high_description": "High level reasoning",
"label": "Reasoning effort",
"low": "Low",
"low_description": "Low level reasoning",
"medium": "Medium",
"medium_description": "Medium level reasoning",
"minimal": "Minimal",
"minimal_description": "Minimal reasoning",
"off": "Off",
"xhigh": "Extra High"
"off_description": "Disable reasoning",
"xhigh": "Extra High",
"xhigh_description": "Extra high level reasoning"
},
"regular_phrases": {
"add": "Add Phrase",
@ -3218,24 +3231,43 @@
},
"content": "Export some data, including chat logs and settings. Please note that the backup process may take some time. Thank you for your patience.",
"lan": {
"auto_close_tip": "Auto-closing in {{seconds}} seconds...",
"confirm_close_message": "File transfer is in progress. Closing will interrupt the transfer. Are you sure you want to force close?",
"confirm_close_title": "Confirm Close",
"connected": "Connected",
"connection_failed": "Connection failed",
"content": "Please ensure your computer and phone are on the same network for LAN transfer. Open the Cherry Studio App to scan this QR code.",
"content": "Please ensure your computer and phone are on the same network for LAN transfer.",
"device_list_title": "Local network devices",
"discovered_devices": "Discovered devices",
"error": {
"file_too_large": "File too large, maximum 500MB supported",
"init_failed": "Initialization failed",
"invalid_file_type": "Only ZIP files are supported",
"no_file": "No file selected",
"no_ip": "Unable to get IP address",
"not_connected": "Please complete handshake first",
"send_failed": "Failed to send file"
},
"force_close": "Force Close",
"generating_qr": "Generating QR code...",
"noZipSelected": "No compressed file selected",
"scan_qr": "Please scan QR code with your phone",
"selectZip": "Select a compressed file",
"sendZip": "Begin data recovery",
"file_transfer": {
"cancelled": "Transfer cancelled",
"failed": "File transfer failed: {{message}}",
"progress": "Sending... {{progress}}%",
"success": "File sent successfully"
},
"handshake": {
"button": "Handshake",
"failed": "Handshake failed: {{message}}",
"in_progress": "Handshaking...",
"success": "Handshake completed with {{device}}",
"test_message_received": "Received pong from {{device}}",
"test_message_sent": "Sent hello world test payload"
},
"idle_hint": "Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "IP addresses",
"last_seen": "Last seen at {{time}}",
"metadata": "Metadata",
"no_connection_warning": "Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "No LAN peers found yet",
"scan_devices": "Scan devices",
"scanning_hint": "Scanning your local network for Cherry Studio peers...",
"send_file": "Send File",
"status": {
"completed": "Transfer completed",
"connected": "Connected",
@ -3244,9 +3276,11 @@
"error": "Connection error",
"initializing": "Initializing connection...",
"preparing": "Preparing transfer...",
"sending": "Transferring {{progress}}%",
"waiting_qr_scan": "Please scan QR code to connect"
"sending": "Transferring {{progress}}%"
},
"status_badge_idle": "Idle",
"status_badge_scanning": "Scanning",
"stop_scan": "Stop scan",
"title": "LAN transmission",
"transfer_progress": "Transfer progress"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Automatically install MCP service (beta)",
"memory": "Persistent memory implementation based on a local knowledge graph. This enables the model to remember user-related information across different conversations. Requires configuring the MEMORY_FILE_PATH environment variable.",
"no": "No description",
"nowledge_mem": "Requires Nowledge Mem app running locally. Keeps AI chats, tools, notes, agents, and files in private memory on your computer. Download from https://mem.nowledge.co/",
"python": "Execute Python code in a secure sandbox environment. Run Python with Pyodide, supporting most standard libraries and scientific computing packages",
"sequentialthinking": "A MCP server implementation that provides tools for dynamic and reflective problem solving through structured thinking processes"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "使用自动检测的 Git Bash",
"autoDiscoveredHint": "自动发现",
"clear": {
"button": "清除自定义路径"
},
@ -39,6 +40,7 @@
"error": {
"description": "在 Windows 上运行智能体需要 Git Bash。没有它智能体无法运行。请从以下地址安装 Git for Windows",
"recheck": "重新检测 Git Bash 安装",
"required": "在 Windows 上需要配置 Git Bash 路径",
"title": "需要 Git Bash"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "选择的文件不是有效的 Git Bash 可执行文件bash.exe。",
"title": "选择 Git Bash 可执行文件"
},
"success": "成功检测到 Git Bash"
"placeholder": "选择 bash.exe 路径",
"success": "成功检测到 Git Bash",
"tooltip": "在 Windows 上运行智能体需要 Git Bash。如果未安装请从 git-scm.com 下载安装。"
},
"input": {
"placeholder": "在这里输入消息,按 {{key}} 发送 - @ 选择路径, / 选择命令"
@ -544,14 +548,23 @@
"more": "助手设置",
"prompt": "提示词设置",
"reasoning_effort": {
"auto": "自动",
"auto_description": "灵活决定推理力度",
"default": "默认",
"default_description": "依赖模型默认行为,不作任何配置",
"high": "沉思",
"high_description": "高强度推理",
"label": "思维链长度",
"low": "浮想",
"low_description": "低强度推理",
"medium": "斟酌",
"medium_description": "中强度推理",
"minimal": "微念",
"minimal_description": "最小程度的思考",
"off": "关闭",
"xhigh": "穷究"
"off_description": "禁用推理",
"xhigh": "穷究",
"xhigh_description": "超高强度推理"
},
"regular_phrases": {
"add": "添加短语",
@ -3218,24 +3231,43 @@
},
"content": "导出部分数据,包括聊天记录、设置。请注意,备份过程可能需要一些时间,感谢您的耐心等待。",
"lan": {
"auto_close_tip": "{{seconds}} 秒后自动关闭...",
"confirm_close_message": "文件正在传输中,关闭将中断传输。确定要强制关闭吗?",
"confirm_close_title": "确认关闭",
"connected": "连接成功",
"connection_failed": "连接失败",
"content": "请确保电脑和手机处于同一网络以使用局域网传输。请打开 Cherry Studio App 扫描此二维码。",
"content": "请确保电脑和手机处于同一网络以使用局域网传输。",
"device_list_title": "局域网设备列表",
"discovered_devices": "已发现的设备",
"error": {
"file_too_large": "文件过大,最大支持 500MB",
"init_failed": "初始化失败",
"invalid_file_type": "仅支持 ZIP 文件",
"no_file": "未选择文件",
"no_ip": "无法获取 IP 地址",
"not_connected": "请先完成握手连接",
"send_failed": "发送文件失败"
},
"force_close": "强制关闭",
"generating_qr": "正在生成二维码...",
"noZipSelected": "未选择压缩文件",
"scan_qr": "请使用手机扫码连接",
"selectZip": "选择压缩文件",
"sendZip": "开始恢复数据",
"file_transfer": {
"cancelled": "传输已取消",
"failed": "文件发送失败: {{message}}",
"progress": "发送中... {{progress}}%",
"success": "文件发送成功"
},
"handshake": {
"button": "握手测试",
"failed": "握手失败:{{message}}",
"in_progress": "正在握手...",
"success": "已与 {{device}} 建立握手",
"test_message_received": "已收到 {{device}} 的 pong 响应",
"test_message_sent": "已发送 hello world 测试数据"
},
"idle_hint": "扫描已暂停。开始扫描以发现局域网中的 Cherry Studio 设备。",
"ip_addresses": "IP 地址",
"last_seen": "最后活动:{{time}}",
"metadata": "元数据",
"no_connection_warning": "请在 Cherry Studio 移动端打开局域网传输",
"no_devices": "尚未发现局域网设备",
"scan_devices": "扫描设备",
"scanning_hint": "正在扫描局域网中的 Cherry Studio 设备...",
"send_file": "发送文件",
"status": {
"completed": "传输完成",
"connected": "连接成功",
@ -3244,9 +3276,11 @@
"error": "连接出错",
"initializing": "正在初始化连接...",
"preparing": "准备传输中...",
"sending": "传输中 {{progress}}%",
"waiting_qr_scan": "请扫描二维码连接"
"sending": "传输中 {{progress}}%"
},
"status_badge_idle": "空闲",
"status_badge_scanning": "扫描中",
"stop_scan": "停止扫描",
"title": "局域网传输",
"transfer_progress": "传输进度"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "自动安装 MCP 服务(测试版)",
"memory": "基于本地知识图谱的持久性记忆基础实现。这使得模型能够在不同对话间记住用户的相关信息。需要配置 MEMORY_FILE_PATH 环境变量。",
"no": "无描述",
"nowledge_mem": "需要本地运行 Nowledge Mem 应用。将 AI 对话、工具、笔记、智能体和文件保存在本地计算机的私有记忆中。请从 https://mem.nowledge.co/ 下载",
"python": "在安全的沙盒环境中执行 Python 代码。使用 Pyodide 运行 Python支持大多数标准库和科学计算包",
"sequentialthinking": "一个 MCP 服务器实现,提供了通过结构化思维过程进行动态和反思性问题解决的工具"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "使用自動偵測的 Git Bash",
"autoDiscoveredHint": "自動發現",
"clear": {
"button": "清除自訂路徑"
},
@ -39,6 +40,7 @@
"error": {
"description": "在 Windows 上執行 Agent 需要 Git Bash。沒有它 Agent 無法運作。請從以下網址安裝 Git for Windows",
"recheck": "重新偵測 Git Bash 安裝",
"required": "在 Windows 上需要設定 Git Bash 路徑",
"title": "需要 Git Bash"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "選擇的檔案不是有效的 Git Bash 可執行檔bash.exe。",
"title": "選擇 Git Bash 可執行檔"
},
"success": "成功偵測到 Git Bash"
"placeholder": "選擇 bash.exe 路徑",
"success": "成功偵測到 Git Bash",
"tooltip": "在 Windows 上執行 Agent 需要 Git Bash。如未安裝請從 git-scm.com 下載安裝。"
},
"input": {
"placeholder": "在這裡輸入您的訊息,使用 {{key}} 傳送 - @ 選擇路徑,/ 選擇命令"
@ -544,14 +548,23 @@
"more": "助手設定",
"prompt": "提示詞設定",
"reasoning_effort": {
"auto": "自動",
"auto_description": "彈性決定推理投入的心力",
"default": "預設",
"default_description": "依賴模型的預設行為,無需任何配置。",
"high": "盡力思考",
"high_description": "高級推理",
"label": "思維鏈長度",
"low": "稍微思考",
"low_description": "低階推理",
"medium": "正常思考",
"medium_description": "中等程度推理",
"minimal": "最少思考",
"minimal_description": "最少推理",
"off": "關閉",
"xhigh": "極力思考"
"off_description": "禁用推理",
"xhigh": "極力思考",
"xhigh_description": "超高階推理"
},
"regular_phrases": {
"add": "新增短語",
@ -3218,24 +3231,43 @@
},
"content": "匯出部分資料,包括聊天記錄與設定。請注意,備份過程可能需要一些時間,感謝耐心等候。",
"lan": {
"auto_close_tip": "將於 {{seconds}} 秒後自動關閉...",
"confirm_close_message": "檔案傳輸正在進行中。關閉將會中斷傳輸。您確定要強制關閉嗎?",
"confirm_close_title": "確認關閉",
"connected": "已連線",
"connection_failed": "連線失敗",
"content": "請確保電腦和手機處於同一網路以使用區域網路傳輸。請開啟 Cherry Studio App 掃描此 QR 碼。",
"content": "請確保電腦和手機處於同一網路以使用區域網路傳輸。",
"device_list_title": "區域網路裝置",
"discovered_devices": "已發現的裝置",
"error": {
"file_too_large": "檔案過大,僅支援最大 500MB",
"init_failed": "初始化失敗",
"invalid_file_type": "僅支援 ZIP 檔案",
"no_file": "未選擇檔案",
"no_ip": "無法取得 IP 位址",
"not_connected": "請先完成握手",
"send_failed": "無法傳送檔案"
},
"force_close": "強制關閉",
"generating_qr": "正在產生 QR 碼...",
"noZipSelected": "未選取壓縮檔案",
"scan_qr": "請使用手機掃描 QR 碼",
"selectZip": "選擇壓縮檔案",
"sendZip": "開始還原資料",
"file_transfer": {
"cancelled": "傳輸已取消",
"failed": "檔案傳輸失敗:{{message}}",
"progress": "傳送中... {{progress}}%",
"success": "檔案傳送成功"
},
"handshake": {
"button": "握手",
"failed": "握手失敗:{{message}}",
"in_progress": "握手中...",
"success": "已與 {{device}} 完成握手",
"test_message_received": "收到來自 {{device}} 的 pong",
"test_message_sent": "已送出 hello world 測試封包"
},
"idle_hint": "掃描已暫停。開始掃描以尋找區域網路中的 Cherry Studio 裝置。",
"ip_addresses": "IP 位址",
"last_seen": "上次看到:{{time}}",
"metadata": "中繼資料",
"no_connection_warning": "請在 Cherry Studio 行動裝置開啟區域網路傳輸",
"no_devices": "尚未找到區域網路節點",
"scan_devices": "掃描裝置",
"scanning_hint": "正在掃描區域網路中的 Cherry Studio 裝置...",
"send_file": "傳送檔案",
"status": {
"completed": "傳輸完成",
"connected": "已連線",
@ -3244,9 +3276,11 @@
"error": "連線錯誤",
"initializing": "正在初始化連線...",
"preparing": "正在準備傳輸...",
"sending": "傳輸中 {{progress}}%",
"waiting_qr_scan": "請掃描 QR 碼以連線"
"sending": "傳輸中 {{progress}}%"
},
"status_badge_idle": "閒置",
"status_badge_scanning": "掃描中",
"stop_scan": "停止掃描",
"title": "區域網路傳輸",
"transfer_progress": "傳輸進度"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "自動安裝 MCP 服務(測試版)",
"memory": "基於本機知識圖譜的持久性記憶基礎實做。這使得模型能夠在不同對話間記住使用者的相關資訊。需要設定 MEMORY_FILE_PATH 環境變數。",
"no": "無描述",
"nowledge_mem": "需要本機執行 Nowledge Mem 應用程式。將 AI 對話、工具、筆記、代理和檔案保存在電腦上的私人記憶體中。請從 https://mem.nowledge.co/ 下載",
"python": "在安全的沙盒環境中執行 Python 程式碼。使用 Pyodide 執行 Python支援大多數標準函式庫和科學計算套件",
"sequentialthinking": "一個 MCP 伺服器實做,提供了透過結構化思維過程進行動態和反思性問題解決的工具"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Automatisch ermitteltes Git Bash wird verwendet",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Benutzerdefinierten Pfad löschen"
},
@ -39,6 +40,7 @@
"error": {
"description": "Git Bash ist erforderlich, um Agents unter Windows auszuführen. Der Agent kann ohne es nicht funktionieren. Bitte installieren Sie Git für Windows von",
"recheck": "Überprüfe die Git Bash-Installation erneut",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Git Bash erforderlich"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "Die ausgewählte Datei ist keine gültige Git Bash ausführbare Datei (bash.exe).",
"title": "Git Bash ausführbare Datei auswählen"
},
"success": "Git Bash erfolgreich erkannt!"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Git Bash erfolgreich erkannt!",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Gib hier deine Nachricht ein, senden mit {{key}} @ Pfad auswählen, / Befehl auswählen"
@ -544,14 +548,23 @@
"more": "Assistenteneinstellungen",
"prompt": "Prompt-Einstellungen",
"reasoning_effort": {
"auto": "Auto",
"auto_description": "Denkaufwand flexibel bestimmen",
"default": "Standard",
"default_description": "Vom Standardverhalten des Modells abhängen, ohne Konfiguration.",
"high": "Tiefes Nachdenken",
"high_description": "Ganzheitliches Denken",
"label": "Gedankenkettenlänge",
"low": "Spontan",
"low_description": "Geringfügige Argumentation",
"medium": "Überlegt",
"medium_description": "Denken auf mittlerem Niveau",
"minimal": "Minimal",
"minimal_description": "Minimales Denken",
"off": "Aus",
"xhigh": "Extra hoch"
"off_description": "Denken deaktivieren",
"xhigh": "Extra hoch",
"xhigh_description": "Extra hohes Denkvermögen"
},
"regular_phrases": {
"add": "Phrase hinzufügen",
@ -3218,24 +3231,43 @@
},
"content": "Exportieren Sie einige Daten, einschließlich Chat-Protokollen und Einstellungen. Bitte beachten Sie, dass der Sicherungsvorgang einige Zeit in Anspruch nehmen kann. Vielen Dank für Ihre Geduld.",
"lan": {
"auto_close_tip": "Automatisches Schließen in {{seconds}} Sekunden...",
"confirm_close_message": "Dateiübertragung läuft. Beim Schließen wird die Übertragung unterbrochen. Möchten Sie wirklich das Schließen erzwingen?",
"confirm_close_title": "Schließen bestätigen",
"connected": "Verbunden",
"connection_failed": "Verbindung fehlgeschlagen",
"content": "Bitte stelle sicher, dass sich dein Computer und dein Telefon im selben Netzwerk befinden, um eine LAN-Übertragung durchzuführen. Öffne die Cherry Studio App, um diesen QR-Code zu scannen.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Initialisierung fehlgeschlagen",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Keine Datei ausgewählt",
"no_ip": "IP-Adresse kann nicht abgerufen werden",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Fehler beim Senden der Datei"
},
"force_close": "Erzwungenes Schließen",
"generating_qr": "QR-Code wird generiert...",
"noZipSelected": "Keine komprimierte Datei ausgewählt",
"scan_qr": "Bitte scannen Sie den QR-Code mit Ihrem Telefon.",
"selectZip": "Wählen Sie eine komprimierte Datei",
"sendZip": "Datenwiederherstellung beginnen",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Übertragung abgeschlossen",
"connected": "Verbunden",
@ -3244,9 +3276,11 @@
"error": "Verbindungsfehler",
"initializing": "Verbindung wird initialisiert...",
"preparing": "Übertragung wird vorbereitet...",
"sending": "Übertrage {{progress}}%",
"waiting_qr_scan": "Bitte QR-Code scannen, um zu verbinden"
"sending": "Übertrage {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "LAN-Übertragung",
"transfer_progress": "Übertragungsfortschritt"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "MCP-Service automatisch installieren (Beta-Version)",
"memory": "MCP-Server mit persistenter Erinnerungsbasis auf lokalem Wissensgraphen, der Informationen über verschiedene Dialoge hinweg speichert. MEMORY_FILE_PATH-Umgebungsvariable muss konfiguriert werden",
"no": "Keine Beschreibung",
"nowledge_mem": "Erfordert lokal laufende Nowledge Mem App. Speichert KI-Chats, Tools, Notizen, Agenten und Dateien in einem privaten Speicher auf Ihrem Computer. Download unter https://mem.nowledge.co/",
"python": "Python-Code in einem sicheren Sandbox-Umgebung ausführen. Verwendung von Pyodide für Python, Unterstützung für die meisten Standardbibliotheken und wissenschaftliche Pakete",
"sequentialthinking": "MCP-Server-Implementierung mit strukturiertem Denkprozess, der dynamische und reflektierende Problemlösungen ermöglicht"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Χρησιμοποιείται αυτόματα εντοπισμένο Git Bash",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Διαγραφή προσαρμοσμένης διαδρομής"
},
@ -39,6 +40,7 @@
"error": {
"description": "Το Git Bash απαιτείται για την εκτέλεση πρακτόρων στα Windows. Ο πράκτορας δεν μπορεί να λειτουργήσει χωρίς αυτό. Παρακαλούμε εγκαταστήστε το Git για Windows από",
"recheck": "Επανέλεγχος Εγκατάστασης του Git Bash",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Απαιτείται Git Bash"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "Το επιλεγμένο αρχείο δεν είναι έγκυρο εκτελέσιμο Git Bash (bash.exe).",
"title": "Επιλογή εκτελέσιμου Git Bash"
},
"success": "Το Git Bash εντοπίστηκε με επιτυχία!"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Το Git Bash εντοπίστηκε με επιτυχία!",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Εισάγετε το μήνυμά σας εδώ, στείλτε με {{key}} - @ επιλέξτε διαδρομή, / επιλέξτε εντολή"
@ -544,14 +548,23 @@
"more": "Ρυθμίσεις Βοηθού",
"prompt": "Ρυθμίσεις προκαλύμματος",
"reasoning_effort": {
"auto": "Αυτοκίνητο",
"auto_description": "Ευέλικτος καθορισμός της προσπάθειας συλλογισμού",
"default": "Προεπιλογή",
"default_description": "Εξαρτηθείτε από την προεπιλεγμένη συμπεριφορά του μοντέλου, χωρίς καμία διαμόρφωση.",
"high": "Μεγάλο",
"high_description": "Υψηλού επιπέδου συλλογισμός",
"label": "Μήκος λογισμικού αλυσίδας",
"low": "Μικρό",
"low_description": "Χαμηλού επιπέδου συλλογιστική",
"medium": "Μεσαίο",
"medium_description": "Αιτιολόγηση μεσαίου επιπέδου",
"minimal": "ελάχιστος",
"minimal_description": "Ελάχιστος συλλογισμός",
"off": "Απενεργοποίηση",
"xhigh": "Εξαιρετικά Υψηλή"
"off_description": "Απενεργοποίηση λογικής",
"xhigh": "Εξαιρετικά Υψηλή",
"xhigh_description": "Εξαιρετικά υψηλού επιπέδου συλλογισμός"
},
"regular_phrases": {
"add": "Προσθήκη φράσης",
@ -3218,24 +3231,43 @@
},
"content": "Εξαγωγή μέρους των δεδομένων, συμπεριλαμβανομένων των ιστορικών συνομιλιών και των ρυθμίσεων. Σημειώστε ότι η διαδικασία δημιουργίας αντιγράφων ασφαλείας ενδέχεται να διαρκέσει κάποιο χρονικό διάστημα, ευχαριστούμε για την υπομονή σας.",
"lan": {
"auto_close_tip": "Αυτόματο κλείσιμο σε {{seconds}} δευτερόλεπτα...",
"confirm_close_message": "Η μεταφορά αρχείων είναι σε εξέλιξη. Το κλείσιμο θα διακόψει τη μεταφορά. Είστε σίγουροι ότι θέλετε να κλείσετε βίαια;",
"confirm_close_title": "Επιβεβαίωση Κλεισίματος",
"connected": "Συνδεδεμένος",
"connection_failed": "Η σύνδεση απέτυχε",
"content": "Βεβαιωθείτε ότι ο υπολογιστής και το κινητό βρίσκονται στο ίδιο δίκτυο για να χρησιμοποιήσετε τη μεταφορά LAN. Ανοίξτε την εφαρμογή Cherry Studio και σαρώστε αυτόν τον κωδικό QR.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Η αρχικοποίηση απέτυχε",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Κανένα αρχείο δεν επιλέχθηκε",
"no_ip": "Αδυναμία λήψης διεύθυνσης IP",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Αποτυχία αποστολής αρχείου"
},
"force_close": "Κλείσιμο με βία",
"generating_qr": "Δημιουργία κώδικα QR...",
"noZipSelected": "Δεν επιλέχθηκε συμπιεσμένο αρχείο",
"scan_qr": "Παρακαλώ σαρώστε τον κωδικό QR με το τηλέφωνό σας",
"selectZip": "Επιλέξτε συμπιεσμένο αρχείο",
"sendZip": "Έναρξη ανάκτησης δεδομένων",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Η μεταφορά ολοκληρώθηκε",
"connected": "Συνδεδεμένος",
@ -3244,9 +3276,11 @@
"error": "Σφάλμα σύνδεσης",
"initializing": "Αρχικοποίηση σύνδεσης...",
"preparing": "Προετοιμασία μεταφοράς...",
"sending": "Μεταφορά {{progress}}%",
"waiting_qr_scan": "Παρακαλώ σαρώστε τον κωδικό QR για σύνδεση"
"sending": "Μεταφορά {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "Μεταφορά τοπικού δικτύου",
"transfer_progress": "Πρόοδος μεταφοράς"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Αυτόματη εγκατάσταση υπηρεσίας MCP (προβολή)",
"memory": "Βασική υλοποίηση μόνιμης μνήμης με βάση τοπικό γράφημα γνώσης. Αυτό επιτρέπει στο μοντέλο να θυμάται πληροφορίες σχετικές με τον χρήστη ανάμεσα σε διαφορετικές συνομιλίες. Απαιτείται η ρύθμιση της μεταβλητής περιβάλλοντος MEMORY_FILE_PATH.",
"no": "Χωρίς περιγραφή",
"nowledge_mem": "[to be translated]:Requires Nowledge Mem app running locally. Keeps AI chats, tools, notes, agents, and files in private memory on your computer. Download from https://mem.nowledge.co/",
"python": "Εκτελέστε κώδικα Python σε ένα ασφαλές περιβάλλον sandbox. Χρησιμοποιήστε το Pyodide για να εκτελέσετε Python, υποστηρίζοντας την πλειονότητα των βιβλιοθηκών της τυπικής βιβλιοθήκης και των πακέτων επιστημονικού υπολογισμού",
"sequentialthinking": "ένας εξυπηρετητής MCP που υλοποιείται, παρέχοντας εργαλεία για δυναμική και αναστοχαστική επίλυση προβλημάτων μέσω δομημένων διαδικασιών σκέψης"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Usando Git Bash detectado automáticamente",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Borrar ruta personalizada"
},
@ -39,6 +40,7 @@
"error": {
"description": "Se requiere Git Bash para ejecutar agentes en Windows. El agente no puede funcionar sin él. Instale Git para Windows desde",
"recheck": "Volver a verificar la instalación de Git Bash",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Git Bash Requerido"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "El archivo seleccionado no es un ejecutable válido de Git Bash (bash.exe).",
"title": "Seleccionar ejecutable de Git Bash"
},
"success": "¡Git Bash detectado con éxito!"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "¡Git Bash detectado con éxito!",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Introduce tu mensaje aquí, envía con {{key}} - @ seleccionar ruta, / seleccionar comando"
@ -544,14 +548,23 @@
"more": "Configuración del Asistente",
"prompt": "Configuración de Palabras Clave",
"reasoning_effort": {
"auto": "Automóvil",
"auto_description": "Determinar flexiblemente el esfuerzo de razonamiento",
"default": "Por defecto",
"default_description": "Depender del comportamiento predeterminado del modelo, sin ninguna configuración.",
"high": "Largo",
"high_description": "Razonamiento de alto nivel",
"label": "Longitud de Cadena de Razonamiento",
"low": "Corto",
"low_description": "Razonamiento de bajo nivel",
"medium": "Medio",
"medium_description": "Razonamiento de nivel medio",
"minimal": "minimal",
"minimal_description": "Razonamiento mínimo",
"off": "Apagado",
"xhigh": "Extra Alta"
"off_description": "Deshabilitar razonamiento",
"xhigh": "Extra Alta",
"xhigh_description": "Razonamiento de extra alto nivel"
},
"regular_phrases": {
"add": "Agregar frase",
@ -3218,24 +3231,43 @@
},
"content": "Exportar parte de los datos, incluidos los registros de chat y la configuración. Tenga en cuenta que el proceso de copia de seguridad puede tardar un tiempo; gracias por su paciencia.",
"lan": {
"auto_close_tip": "Cierre automático en {{seconds}} segundos...",
"confirm_close_message": "La transferencia de archivos está en progreso. Cerrar interrumpirá la transferencia. ¿Estás seguro de que quieres forzar el cierre?",
"confirm_close_title": "Confirmar Cierre",
"connected": "Conectado",
"connection_failed": "Conexión fallida",
"content": "Asegúrate de que el ordenador y el móvil estén en la misma red para usar la transferencia por LAN. Abre la aplicación Cherry Studio y escanea este código QR.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Falló la inicialización",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Ningún archivo seleccionado",
"no_ip": "No se puede obtener la dirección IP",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Error al enviar el archivo"
},
"force_close": "Cerrar forzosamente",
"generating_qr": "Generando código QR...",
"noZipSelected": "No se ha seleccionado ningún archivo comprimido",
"scan_qr": "Por favor, escanea el código QR con tu teléfono",
"selectZip": "Seleccionar archivo comprimido",
"sendZip": "Comenzar la recuperación de datos",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Transferencia completada",
"connected": "Conectado",
@ -3244,9 +3276,11 @@
"error": "Error de conexión",
"initializing": "Inicializando conexión...",
"preparing": "Preparando transferencia...",
"sending": "Transfiriendo {{progress}}%",
"waiting_qr_scan": "Por favor, escanea el código QR para conectarte"
"sending": "Transfiriendo {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "Transferencia de red local",
"transfer_progress": "Progreso de transferencia"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Instalación automática del servicio MCP (versión beta)",
"memory": "Implementación básica de memoria persistente basada en un grafo de conocimiento local. Esto permite que el modelo recuerde información relevante del usuario entre diferentes conversaciones. Es necesario configurar la variable de entorno MEMORY_FILE_PATH.",
"no": "sin descripción",
"nowledge_mem": "[to be translated]:Requires Nowledge Mem app running locally. Keeps AI chats, tools, notes, agents, and files in private memory on your computer. Download from https://mem.nowledge.co/",
"python": "Ejecuta código Python en un entorno sandbox seguro. Usa Pyodide para ejecutar Python, compatible con la mayoría de las bibliotecas estándar y paquetes de cálculo científico.",
"sequentialthinking": "Una implementación de servidor MCP que proporciona herramientas para la resolución dinámica y reflexiva de problemas mediante un proceso de pensamiento estructurado"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Utilisation de Git Bash détecté automatiquement",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Effacer le chemin personnalisé"
},
@ -39,6 +40,7 @@
"error": {
"description": "Git Bash est requis pour exécuter des agents sur Windows. L'agent ne peut pas fonctionner sans. Veuillez installer Git pour Windows depuis",
"recheck": "Revérifier l'installation de Git Bash",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Git Bash requis"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "Le fichier sélectionné n'est pas un exécutable Git Bash valide (bash.exe).",
"title": "Sélectionner l'exécutable Git Bash"
},
"success": "Git Bash détecté avec succès !"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Git Bash détecté avec succès !",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Entrez votre message ici, envoyez avec {{key}} - @ sélectionner le chemin, / sélectionner la commande"
@ -544,14 +548,23 @@
"more": "Paramètres de l'assistant",
"prompt": "Paramètres de l'invite",
"reasoning_effort": {
"auto": "Auto",
"auto_description": "Déterminer de manière flexible l'effort de raisonnement",
"default": "Par défaut",
"default_description": "Dépendre du comportement par défaut du modèle, sans aucune configuration.",
"high": "Long",
"high_description": "Raisonnement de haut niveau",
"label": "Longueur de la chaîne de raisonnement",
"low": "Court",
"low_description": "Raisonnement de bas niveau",
"medium": "Moyen",
"medium_description": "Raisonnement de niveau moyen",
"minimal": "minimal",
"minimal_description": "Réflexion minimale",
"off": "Off",
"xhigh": "Très élevée"
"off_description": "Désactiver le raisonnement",
"xhigh": "Très élevée",
"xhigh_description": "Raisonnement de très haut niveau"
},
"regular_phrases": {
"add": "Добавить фразу",
@ -3218,24 +3231,43 @@
},
"content": "Exporter une partie des données, incluant les historiques de discussion et les paramètres. Veuillez noter que le processus de sauvegarde peut prendre un certain temps ; merci pour votre patience.",
"lan": {
"auto_close_tip": "Fermeture automatique dans {{seconds}} secondes...",
"confirm_close_message": "Le transfert de fichier est en cours. Fermer interrompra le transfert. Êtes-vous sûr de vouloir forcer la fermeture ?",
"confirm_close_title": "Confirmer la fermeture",
"connected": "Connecté",
"connection_failed": "Échec de la connexion",
"content": "Assurez-vous que l'ordinateur et le téléphone sont connectés au même réseau pour utiliser le transfert en réseau local. Ouvrez l'application Cherry Studio et scannez ce code QR.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Échec de l'initialisation",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Aucun fichier sélectionné",
"no_ip": "Impossible d'obtenir l'adresse IP",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Échec de l'envoi du fichier"
},
"force_close": "Fermer de force",
"generating_qr": "Génération du code QR...",
"noZipSelected": "Aucun fichier compressé sélectionné",
"scan_qr": "Veuillez scanner le code QR avec votre téléphone",
"selectZip": "Sélectionner le fichier compressé",
"sendZip": "Commencer la restauration des données",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Transfert terminé",
"connected": "Connecté",
@ -3244,9 +3276,11 @@
"error": "Erreur de connexion",
"initializing": "Initialisation de la connexion...",
"preparing": "Préparation du transfert...",
"sending": "Transfert {{progress}} %",
"waiting_qr_scan": "Veuillez scanner le code QR pour vous connecter"
"sending": "Transfert {{progress}} %"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "Transmission en réseau local",
"transfer_progress": "Progression du transfert"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Installation automatique du service MCP (version bêta)",
"memory": "Implémentation de base de mémoire persistante basée sur un graphe de connaissances local. Cela permet au modèle de se souvenir des informations relatives à l'utilisateur entre différentes conversations. Nécessite la configuration de la variable d'environnement MEMORY_FILE_PATH.",
"no": "sans description",
"nowledge_mem": "[to be translated]:Requires Nowledge Mem app running locally. Keeps AI chats, tools, notes, agents, and files in private memory on your computer. Download from https://mem.nowledge.co/",
"python": "Exécutez du code Python dans un environnement bac à sable sécurisé. Utilisez Pyodide pour exécuter Python, prenant en charge la plupart des bibliothèques standard et des packages de calcul scientifique.",
"sequentialthinking": "Un serveur MCP qui fournit des outils permettant une résolution dynamique et réflexive des problèmes à travers un processus de pensée structuré"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "自動検出されたGit Bashを使用中",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "カスタムパスをクリア"
},
@ -39,6 +40,7 @@
"error": {
"description": "Windowsでエージェントを実行するにはGit Bashが必要です。これがないとエージェントは動作しません。以下からGit for Windowsをインストールしてください。",
"recheck": "Git Bashのインストールを再確認してください",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Git Bashが必要です"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "選択されたファイルは有効なGit Bash実行ファイルbash.exeではありません。",
"title": "Git Bash実行ファイルを選択"
},
"success": "Git Bashが正常に検出されました"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Git Bashが正常に検出されました",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "メッセージをここに入力し、{{key}}で送信 - @でパスを選択、/でコマンドを選択"
@ -544,14 +548,23 @@
"more": "アシスタント設定",
"prompt": "プロンプト設定",
"reasoning_effort": {
"auto": "自動",
"auto_description": "推論にかける労力を柔軟に調整する",
"default": "デフォルト",
"default_description": "設定なしで、モデルの既定の動作に依存する。",
"high": "最大限の思考",
"high_description": "高度な推論",
"label": "思考連鎖の長さ",
"low": "少しの思考",
"low_description": "低レベル推論",
"medium": "普通の思考",
"medium_description": "中レベル推論",
"minimal": "最小限の思考",
"minimal_description": "最小限の推論",
"off": "オフ",
"xhigh": "超高"
"off_description": "推論を無効にする",
"xhigh": "超高",
"xhigh_description": "超高度な推論"
},
"regular_phrases": {
"add": "プロンプトを追加",
@ -3218,24 +3231,43 @@
},
"content": "一部のデータ、チャット履歴や設定をエクスポートします。バックアップには時間がかかる場合がありますので、しばらくお待ちください。",
"lan": {
"auto_close_tip": "{{seconds}}秒後に自動的に閉じます...",
"confirm_close_message": "ファイル転送が進行中です。閉じると転送が中断されます。強制終了してもよろしいですか?",
"confirm_close_title": "閉じることを確認",
"connected": "接続済み",
"connection_failed": "接続に失敗しました",
"content": "コンピューターとスマートフォンが同じネットワークに接続されていることを確認し、ローカルエリアネットワーク転送を使用してください。Cherry Studioアプリを開き、このQRコードをスキャンしてください。",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "初期化に失敗しました",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "ファイルが選択されていません",
"no_ip": "IPアドレスを取得できません",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "ファイルの送信に失敗しました"
},
"force_close": "強制終了",
"generating_qr": "QRコードを生成中...",
"noZipSelected": "圧縮ファイルが選択されていません",
"scan_qr": "携帯電話でQRコードをスキャンしてください",
"selectZip": "圧縮ファイルを選択",
"sendZip": "データの復元を開始します",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "転送完了",
"connected": "接続済み",
@ -3244,9 +3276,11 @@
"error": "接続エラー",
"initializing": "接続を初期化中...",
"preparing": "転送準備中...",
"sending": "転送中 {{progress}}%",
"waiting_qr_scan": "QRコードをスキャンして接続してください"
"sending": "転送中 {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "LAN転送",
"transfer_progress": "転送進行"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "MCPサービスの自動インストールベータ版",
"memory": "ローカルのナレッジグラフに基づく永続的なメモリの基本的な実装です。これにより、モデルは異なる会話間でユーザーの関連情報を記憶できるようになります。MEMORY_FILE_PATH 環境変数の設定が必要です。",
"no": "説明なし",
"nowledge_mem": "Nowledge Mem アプリをローカルで実行する必要があります。AI チャット、ツール、ート、エージェント、ファイルをコンピューター上のプライベートメモリに保存します。https://mem.nowledge.co/ からダウンロードしてください",
"python": "安全なサンドボックス環境でPythonコードを実行します。Pyodideを使用してPythonを実行し、ほとんどの標準ライブラリと科学計算パッケージをサポートしています。",
"sequentialthinking": "構造化された思考プロセスを通じて動的かつ反省的な問題解決を行うためのツールを提供するMCPサーバーの実装"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Usando Git Bash detectado automaticamente",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Limpar caminho personalizado"
},
@ -39,6 +40,7 @@
"error": {
"description": "O Git Bash é necessário para executar agentes no Windows. O agente não pode funcionar sem ele. Por favor, instale o Git para Windows a partir de",
"recheck": "Reverificar a Instalação do Git Bash",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Git Bash Necessário"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "O arquivo selecionado não é um executável válido do Git Bash (bash.exe).",
"title": "Selecionar executável do Git Bash"
},
"success": "Git Bash detectado com sucesso!"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Git Bash detectado com sucesso!",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Digite sua mensagem aqui, envie com {{key}} - @ selecionar caminho, / selecionar comando"
@ -544,14 +548,23 @@
"more": "Configurações do Assistente",
"prompt": "Configurações de Prompt",
"reasoning_effort": {
"auto": "Automóvel",
"auto_description": "Determinar flexivelmente o esforço de raciocínio",
"default": "Padrão",
"default_description": "Depender do comportamento padrão do modelo, sem qualquer configuração.",
"high": "Longo",
"high_description": "Raciocínio de alto nível",
"label": "Comprimento da Cadeia de Raciocínio",
"low": "Curto",
"low_description": "Raciocínio de baixo nível",
"medium": "Médio",
"medium_description": "Raciocínio de nível médio",
"minimal": "mínimo",
"minimal_description": "Raciocínio mínimo",
"off": "Desligado",
"xhigh": "Extra Alta"
"off_description": "Desabilitar raciocínio",
"xhigh": "Extra Alta",
"xhigh_description": "Raciocínio de altíssimo nível"
},
"regular_phrases": {
"add": "Adicionar Frase",
@ -3218,24 +3231,43 @@
},
"content": "Exportar parte dos dados, incluindo registros de conversas e configurações. Observe que o processo de backup pode demorar um pouco; agradecemos sua paciência.",
"lan": {
"auto_close_tip": "Fechando automaticamente em {{seconds}} segundos...",
"confirm_close_message": "Transferência de arquivo em andamento. Fechar irá interromper a transferência. Tem certeza de que deseja forçar o fechamento?",
"confirm_close_title": "Confirmar Fechamento",
"connected": "Conectado",
"connection_failed": "Falha na conexão",
"content": "Certifique-se de que o computador e o telefone estejam na mesma rede para usar a transferência via LAN. Abra o aplicativo Cherry Studio e escaneie este código QR.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Falha na inicialização",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Nenhum arquivo selecionado",
"no_ip": "Incapaz de obter endereço IP",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Falha ao enviar arquivo"
},
"force_close": "Forçar Fechamento",
"generating_qr": "Gerando código QR...",
"noZipSelected": "Nenhum arquivo de compressão selecionado",
"scan_qr": "Por favor, escaneie o código QR com o seu telefone",
"selectZip": "Selecionar arquivo compactado",
"sendZip": "Iniciar recuperação de dados",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Transferência concluída",
"connected": "Conectado",
@ -3244,9 +3276,11 @@
"error": "Erro de conexão",
"initializing": "Inicializando conexão...",
"preparing": "Preparando transferência...",
"sending": "Transferindo {{progress}}%",
"waiting_qr_scan": "Por favor, escaneie o código QR para conectar"
"sending": "Transferindo {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "transmissão de rede local",
"transfer_progress": "Progresso da transferência"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Instalação automática do serviço MCP (beta)",
"memory": "Implementação base de memória persistente baseada em grafos de conhecimento locais. Isso permite que o modelo lembre informações relevantes do utilizador entre diferentes conversas. É necessário configurar a variável de ambiente MEMORY_FILE_PATH.",
"no": "sem descrição",
"nowledge_mem": "Requer a aplicação Nowledge Mem em execução localmente. Mantém conversas de IA, ferramentas, notas, agentes e ficheiros numa memória privada no seu computador. Transfira de https://mem.nowledge.co/",
"python": "Executar código Python num ambiente sandbox seguro. Utilizar Pyodide para executar Python, suportando a maioria das bibliotecas padrão e pacotes de computação científica",
"sequentialthinking": "Uma implementação de servidor MCP que fornece ferramentas para resolução dinâmica e reflexiva de problemas através de um processo de pensamento estruturado"
},

View File

@ -32,6 +32,7 @@
},
"gitBash": {
"autoDetected": "Используется автоматически обнаруженный Git Bash",
"autoDiscoveredHint": "[to be translated]:Auto-discovered",
"clear": {
"button": "Очистить пользовательский путь"
},
@ -39,6 +40,7 @@
"error": {
"description": "Для запуска агентов в Windows требуется Git Bash. Без него агент не может работать. Пожалуйста, установите Git для Windows с",
"recheck": "Повторная проверка установки Git Bash",
"required": "[to be translated]:Git Bash path is required on Windows",
"title": "Требуется Git Bash"
},
"found": {
@ -51,7 +53,9 @@
"invalidPath": "Выбранный файл не является допустимым исполняемым файлом Git Bash (bash.exe).",
"title": "Выберите исполняемый файл Git Bash"
},
"success": "Git Bash успешно обнаружен!"
"placeholder": "[to be translated]:Select bash.exe path",
"success": "Git Bash успешно обнаружен!",
"tooltip": "[to be translated]:Git Bash is required to run agents on Windows. Install from git-scm.com if not available."
},
"input": {
"placeholder": "Введите ваше сообщение здесь, отправьте с помощью {{key}} — @ выбрать путь, / выбрать команду"
@ -544,14 +548,23 @@
"more": "Настройки ассистента",
"prompt": "Настройки промптов",
"reasoning_effort": {
"auto": "Авто",
"auto_description": "Гибко определяйте усилие на рассуждение",
"default": "По умолчанию",
"default_description": "Полагаться на поведение модели по умолчанию, без какой-либо конфигурации.",
"high": "Стараюсь думать",
"high_description": "Высокоуровневое рассуждение",
"label": "Настройки размышлений",
"low": "Меньше думать",
"low_description": "Низкоуровневое рассуждение",
"medium": "Среднее",
"medium_description": "Средний уровень рассуждения",
"minimal": "минимальный",
"minimal_description": "Минимальное рассуждение",
"off": "Выключить",
"xhigh": "Сверхвысокое"
"off_description": "Отключить рассуждение",
"xhigh": "Сверхвысокое",
"xhigh_description": "Высочайший уровень рассуждений"
},
"regular_phrases": {
"add": "Добавить подсказку",
@ -3218,24 +3231,43 @@
},
"content": "Экспорт части данных, включая историю чатов и настройки. Обратите внимание, процесс резервного копирования может занять некоторое время, благодарим за ваше терпение.",
"lan": {
"auto_close_tip": "Автоматическое закрытие через {{seconds}} секунд...",
"confirm_close_message": "Передача файла в процессе. Закрытие прервет передачу. Вы уверены, что хотите принудительно закрыть?",
"confirm_close_title": "Подтвердить закрытие",
"connected": "Подключено",
"connection_failed": "Соединение не удалось",
"content": "Убедитесь, что компьютер и телефон подключены к одной сети, чтобы использовать локальную передачу. Откройте приложение Cherry Studio и отсканируйте этот QR-код.",
"device_list_title": "[to be translated]:Local network devices",
"discovered_devices": "[to be translated]:Discovered devices",
"error": {
"file_too_large": "[to be translated]:File too large, maximum 500MB supported",
"init_failed": "Инициализация не удалась",
"invalid_file_type": "[to be translated]:Only ZIP files are supported",
"no_file": "Файл не выбран",
"no_ip": "Не удалось получить IP-адрес",
"not_connected": "[to be translated]:Please complete handshake first",
"send_failed": "Не удалось отправить файл"
},
"force_close": "Принудительное закрытие",
"generating_qr": "Генерация QR-кода...",
"noZipSelected": "Архив не выбран",
"scan_qr": "Пожалуйста, отсканируйте QR-код с помощью вашего телефона",
"selectZip": "Выберите архив",
"sendZip": "Начать восстановление данных",
"file_transfer": {
"cancelled": "[to be translated]:Transfer cancelled",
"failed": "[to be translated]:File transfer failed: {{message}}",
"progress": "[to be translated]:Sending... {{progress}}%",
"success": "[to be translated]:File sent successfully"
},
"handshake": {
"button": "[to be translated]:Handshake",
"failed": "[to be translated]:Handshake failed: {{message}}",
"in_progress": "[to be translated]:Handshaking...",
"success": "[to be translated]:Handshake completed with {{device}}",
"test_message_received": "[to be translated]:Received pong from {{device}}",
"test_message_sent": "[to be translated]:Sent hello world test payload"
},
"idle_hint": "[to be translated]:Scan paused. Start scanning to find Cherry Studio peers on your LAN.",
"ip_addresses": "[to be translated]:IP addresses",
"last_seen": "[to be translated]:Last seen at {{time}}",
"metadata": "[to be translated]:Metadata",
"no_connection_warning": "[to be translated]:Please open LAN Transfer on Cherry Studio mobile",
"no_devices": "[to be translated]:No LAN peers found yet",
"scan_devices": "[to be translated]:Scan devices",
"scanning_hint": "[to be translated]:Scanning your local network for Cherry Studio peers...",
"send_file": "[to be translated]:Send File",
"status": {
"completed": "Перевод завершён",
"connected": "Подключено",
@ -3244,9 +3276,11 @@
"error": "Ошибка подключения",
"initializing": "Инициализация соединения...",
"preparing": "Подготовка передачи...",
"sending": "Передача {{progress}}%",
"waiting_qr_scan": "Пожалуйста, отсканируйте QR-код для подключения"
"sending": "Передача {{progress}}%"
},
"status_badge_idle": "[to be translated]:Idle",
"status_badge_scanning": "[to be translated]:Scanning",
"stop_scan": "[to be translated]:Stop scan",
"title": "Передача по локальной сети",
"transfer_progress": "Прогресс передачи"
},
@ -3926,6 +3960,7 @@
"mcp_auto_install": "Автоматическая установка службы MCP (бета-версия)",
"memory": "реализация постоянной памяти на основе локального графа знаний. Это позволяет модели запоминать информацию о пользователе между различными диалогами. Требуется настроить переменную среды MEMORY_FILE_PATH.",
"no": "без описания",
"nowledge_mem": "Требуется запущенное локально приложение Nowledge Mem. Хранит чаты ИИ, инструменты, заметки, агентов и файлы в приватной памяти на вашем компьютере. Скачать можно на https://mem.nowledge.co/",
"python": "Выполняйте код Python в безопасной песочнице. Запускайте Python с помощью Pyodide, поддерживается большинство стандартных библиотек и пакетов для научных вычислений",
"sequentialthinking": "MCP серверная реализация, предоставляющая инструменты для динамического и рефлексивного решения проблем посредством структурированного мыслительного процесса"
},

View File

@ -6,7 +6,8 @@ import {
MdiLightbulbOn30,
MdiLightbulbOn50,
MdiLightbulbOn80,
MdiLightbulbOn90
MdiLightbulbOn90,
MdiLightbulbQuestion
} from '@renderer/components/Icons/SVGIcon'
import { QuickPanelReservedSymbol, useQuickPanel } from '@renderer/components/QuickPanel'
import {
@ -18,7 +19,6 @@ import {
MODEL_SUPPORTED_OPTIONS
} from '@renderer/config/models'
import { useAssistant } from '@renderer/hooks/useAssistant'
import { getReasoningEffortOptionsLabel } from '@renderer/i18n/label'
import type { ToolQuickPanelApi } from '@renderer/pages/home/Inputbar/types'
import type { Model, ThinkingOption } from '@renderer/types'
import { Tooltip } from 'antd'
@ -88,19 +88,48 @@ const ThinkingButton: FC<Props> = ({ quickPanel, model, assistantId }): ReactEle
[updateAssistantSettings, assistant.enableWebSearch, model, t]
)
const reasoningEffortOptionLabelMap = {
default: t('assistants.settings.reasoning_effort.default'),
none: t('assistants.settings.reasoning_effort.off'),
minimal: t('assistants.settings.reasoning_effort.minimal'),
high: t('assistants.settings.reasoning_effort.high'),
low: t('assistants.settings.reasoning_effort.low'),
medium: t('assistants.settings.reasoning_effort.medium'),
auto: t('assistants.settings.reasoning_effort.auto'),
xhigh: t('assistants.settings.reasoning_effort.xhigh')
} as const satisfies Record<ThinkingOption, string>
const reasoningEffortDescriptionMap = {
default: t('assistants.settings.reasoning_effort.default_description'),
none: t('assistants.settings.reasoning_effort.off_description'),
minimal: t('assistants.settings.reasoning_effort.minimal_description'),
low: t('assistants.settings.reasoning_effort.low_description'),
medium: t('assistants.settings.reasoning_effort.medium_description'),
high: t('assistants.settings.reasoning_effort.high_description'),
xhigh: t('assistants.settings.reasoning_effort.xhigh_description'),
auto: t('assistants.settings.reasoning_effort.auto_description')
} as const satisfies Record<ThinkingOption, string>
const panelItems = useMemo(() => {
// Create UI options from the options defined in the table
return supportedOptions.map((option) => ({
level: option,
label: getReasoningEffortOptionsLabel(option),
description: '',
label: reasoningEffortOptionLabelMap[option],
description: reasoningEffortDescriptionMap[option],
icon: ThinkingIcon({ option }),
isSelected: currentReasoningEffort === option,
action: () => onThinkingChange(option)
}))
}, [currentReasoningEffort, supportedOptions, onThinkingChange])
}, [
supportedOptions,
reasoningEffortOptionLabelMap,
reasoningEffortDescriptionMap,
currentReasoningEffort,
onThinkingChange
])
const isThinkingEnabled = currentReasoningEffort !== undefined && currentReasoningEffort !== 'none'
const isThinkingEnabled =
currentReasoningEffort !== undefined && currentReasoningEffort !== 'none' && currentReasoningEffort !== 'default'
const disableThinking = useCallback(() => {
onThinkingChange('none')
@ -197,8 +226,9 @@ const ThinkingIcon = (props: { option?: ThinkingOption; isFixedReasoning?: boole
case 'none':
IconComponent = MdiLightbulbOffOutline
break
case 'default':
default:
IconComponent = MdiLightbulbOffOutline
IconComponent = MdiLightbulbQuestion
break
}
}
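A standalone sketch (illustration only, not project code) of the "as const satisfies Record<ThinkingOption, string>" pattern used in the label and description maps above, with a simplified local ThinkingOption union assumed for the example: once 'default' joins the union, any map that omits a key stops compiling, which is what keeps these maps in sync with the option set.

// Sketch with an assumed, simplified ThinkingOption union.
type ThinkingOption = 'default' | 'none' | 'minimal' | 'low' | 'medium' | 'high' | 'xhigh' | 'auto'

const labelMap = {
  default: 'Default',
  none: 'Off',
  minimal: 'Minimal',
  low: 'Low',
  medium: 'Medium',
  high: 'High',
  xhigh: 'Extra high',
  auto: 'Auto'
  // Dropping any key above (or adding an unknown one) is a compile-time error,
  // so extending ThinkingOption forces every such map to be updated.
} as const satisfies Record<ThinkingOption, string>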

View File

@ -11,7 +11,7 @@ import { NutstoreIcon } from '@renderer/components/Icons/NutstoreIcons'
import { HStack } from '@renderer/components/Layout'
import ListItem from '@renderer/components/ListItem'
import BackupPopup from '@renderer/components/Popups/BackupPopup'
import ExportToPhoneLanPopup from '@renderer/components/Popups/ExportToPhoneLanPopup'
import LanTransferPopup from '@renderer/components/Popups/LanTransferPopup'
import RestorePopup from '@renderer/components/Popups/RestorePopup'
import { useTheme } from '@renderer/context/ThemeProvider'
import { useKnowledgeFiles } from '@renderer/hooks/useKnowledgeFiles'
@ -628,11 +628,12 @@ const DataSettings: FC = () => {
<SettingRow>
<SettingRowTitle>{t('settings.data.export_to_phone.title')}</SettingRowTitle>
<HStack gap="5px" justifyContent="space-between">
<Button onClick={ExportToPhoneLanPopup.show} icon={<WifiOutlined size={14} />}>
<Button onClick={LanTransferPopup.show} icon={<WifiOutlined size={14} />}>
{t('settings.data.export_to_phone.lan.title')}
</Button>
</HStack>
</SettingRow>
<SettingDivider />
</SettingGroup>
<SettingGroup theme={theme}>
<SettingTitle>{t('settings.data.data.title')}</SettingTitle>

View File

@ -38,7 +38,8 @@ export const DEFAULT_ASSISTANT_SETTINGS = {
enableTopP: false,
// It would gracefully fallback to prompt if not supported by model.
toolUseMode: 'function',
customParameters: []
customParameters: [],
reasoning_effort: 'default'
} as const satisfies AssistantSettings
export function getDefaultAssistant(): Assistant {
@ -186,7 +187,7 @@ export const getAssistantSettings = (assistant: Assistant): AssistantSettings =>
streamOutput: assistant?.settings?.streamOutput ?? true,
toolUseMode: assistant?.settings?.toolUseMode ?? 'function',
defaultModel: assistant?.defaultModel ?? undefined,
reasoning_effort: assistant?.settings?.reasoning_effort ?? undefined,
reasoning_effort: assistant?.settings?.reasoning_effort ?? 'default',
customParameters: assistant?.settings?.customParameters ?? []
}
}

View File

@ -67,7 +67,7 @@ const persistedReducer = persistReducer(
{
key: 'cherry-studio',
storage,
version: 186,
version: 187,
blacklist: ['runtime', 'messages', 'messageBlocks', 'tabs', 'toolPermissions'],
migrate
},

View File

@ -183,6 +183,16 @@ export const builtinMCPServers: BuiltinMCPServer[] = [
provider: 'CherryAI',
installSource: 'builtin',
isTrusted: true
},
{
id: nanoid(),
name: BuiltinMCPServerNames.nowledgeMem,
reference: 'https://mem.nowledge.co/',
type: 'inMemory',
isActive: false,
provider: 'Nowledge',
installSource: 'builtin',
isTrusted: true
}
] as const

View File

@ -3038,6 +3038,20 @@ const migrateConfig = {
logger.error('migrate 186 error', error as Error)
return state
}
},
'187': (state: RootState) => {
try {
state.assistants.assistants.forEach((assistant) => {
if (assistant.settings && assistant.settings.reasoning_effort === undefined) {
assistant.settings.reasoning_effort = 'default'
}
})
logger.info('migrate 187 success')
return state
} catch (error) {
logger.error('migrate 187 error', error as Error)
return state
}
}
}
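A minimal test sketch (an assumption, not the repository's actual test file; vitest and a simplified state shape are assumed) of the behaviour migration '187' is meant to guarantee: reasoning_effort is backfilled with 'default' only where it was undefined, and explicit values are left untouched.

// Sketch: a stand-in for migrateConfig['187'] over a simplified, assumed state shape.
import { describe, expect, it } from 'vitest'

type Settings = { reasoning_effort?: string }
type FakeState = { assistants: { assistants: { settings?: Settings }[] } }

const migrate187 = (state: FakeState): FakeState => {
  state.assistants.assistants.forEach((assistant) => {
    if (assistant.settings && assistant.settings.reasoning_effort === undefined) {
      assistant.settings.reasoning_effort = 'default'
    }
  })
  return state
}

describe('migrate 187 (sketch)', () => {
  it('backfills missing reasoning_effort and keeps explicit values', () => {
    const state: FakeState = {
      assistants: { assistants: [{ settings: {} }, { settings: { reasoning_effort: 'high' } }, {}] }
    }
    const next = migrate187(state)
    expect(next.assistants.assistants[0]?.settings?.reasoning_effort).toBe('default')
    expect(next.assistants.assistants[1]?.settings?.reasoning_effort).toBe('high')
    expect(next.assistants.assistants[2]?.settings).toBeUndefined()
  })
})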

View File

@ -108,7 +108,7 @@ const ThinkModelTypes = [
'deepseek_hybrid'
] as const
export type ReasoningEffortOption = NonNullable<OpenAI.ReasoningEffort> | 'auto'
export type ReasoningEffortOption = NonNullable<OpenAI.ReasoningEffort> | 'auto' | 'default'
export type ThinkingOption = ReasoningEffortOption
export type ThinkingModelType = (typeof ThinkModelTypes)[number]
export type ThinkingOptionConfig = Record<ThinkingModelType, ThinkingOption[]>
@ -120,6 +120,8 @@ export function isThinkModelType(type: string): type is ThinkingModelType {
}
export const EFFORT_RATIO: EffortRatio = {
// 'default' is not expected to be used.
default: 0,
none: 0.01,
minimal: 0.05,
low: 0.05,
@ -140,12 +142,11 @@ export type AssistantSettings = {
streamOutput: boolean
defaultModel?: Model
customParameters?: AssistantSettingCustomParameters[]
reasoning_effort?: ReasoningEffortOption
/** Preserve the reasoning effort from the last use, for restoring it later.
*
* TODO: currently reasoning_effort === undefined
* / cache
*
reasoning_effort: ReasoningEffortOption
/**
* Preserve the effective reasoning effort (not 'default') from the last use of a thinking model which supports thinking control,
* and restore it when switching back from a non-thinking or fixed reasoning model.
* FIXME: It should be managed by external cache service instead of being stored in the assistant
*/
reasoning_effort_cache?: ReasoningEffortOption
qwenThinkMode?: boolean
@ -750,7 +751,8 @@ export const BuiltinMCPServerNames = {
difyKnowledge: '@cherry/dify-knowledge',
python: '@cherry/python',
didiMCP: '@cherry/didi-mcp',
browser: '@cherry/browser'
browser: '@cherry/browser',
nowledgeMem: '@cherry/nowledge-mem'
} as const
export type BuiltinMCPServerName = (typeof BuiltinMCPServerNames)[keyof typeof BuiltinMCPServerNames]
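A hypothetical sketch (names and shapes are assumptions, not project code) of how the reasoning_effort_cache documented above can work now that 'default' exists: the effective effort is cached when leaving a model that supports thinking control and restored when switching back, so undefined no longer has to double as a sentinel.

// Sketch with assumed types; cacheEffort/restoreEffort are illustrative names only.
type ReasoningEffortOption = 'default' | 'none' | 'minimal' | 'low' | 'medium' | 'high' | 'xhigh' | 'auto'

interface EffortState {
  reasoning_effort: ReasoningEffortOption
  reasoning_effort_cache?: ReasoningEffortOption
}

// Hypothetical: called when switching away from a model that supports thinking control.
function cacheEffort(state: EffortState): EffortState {
  return state.reasoning_effort !== 'default'
    ? { ...state, reasoning_effort_cache: state.reasoning_effort }
    : state
}

// Hypothetical: called when switching back to a model that supports thinking control.
function restoreEffort(state: EffortState): EffortState {
  return state.reasoning_effort_cache
    ? { ...state, reasoning_effort: state.reasoning_effort_cache }
    : state
}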

218
yarn.lock
View File

@ -4451,6 +4451,13 @@ __metadata:
languageName: node
linkType: hard
"@leichtgewicht/ip-codec@npm:^2.0.1":
version: 2.0.5
resolution: "@leichtgewicht/ip-codec@npm:2.0.5"
checksum: 10c0/14a0112bd59615eef9e3446fea018045720cd3da85a98f801a685a818b0d96ef2a1f7227e8d271def546b2e2a0fe91ef915ba9dc912ab7967d2317b1a051d66b
languageName: node
linkType: hard
"@lezer/common@npm:^1.0.0, @lezer/common@npm:^1.0.2, @lezer/common@npm:^1.0.3, @lezer/common@npm:^1.1.0, @lezer/common@npm:^1.2.0, @lezer/common@npm:^1.2.1":
version: 1.2.3
resolution: "@lezer/common@npm:1.2.3"
@ -7243,13 +7250,6 @@ __metadata:
languageName: node
linkType: hard
"@socket.io/component-emitter@npm:~3.1.0":
version: 3.1.2
resolution: "@socket.io/component-emitter@npm:3.1.2"
checksum: 10c0/c4242bad66f67e6f7b712733d25b43cbb9e19a595c8701c3ad99cbeb5901555f78b095e24852f862fffb43e96f1d8552e62def885ca82ae1bb05da3668fd87d7
languageName: node
linkType: hard
"@standard-schema/spec@npm:^1.0.0":
version: 1.0.0
resolution: "@standard-schema/spec@npm:1.0.0"
@ -8267,7 +8267,7 @@ __metadata:
languageName: node
linkType: hard
"@types/cors@npm:^2.8.12, @types/cors@npm:^2.8.19":
"@types/cors@npm:^2.8.19":
version: 2.8.19
resolution: "@types/cors@npm:2.8.19"
dependencies:
@ -8824,15 +8824,6 @@ __metadata:
languageName: node
linkType: hard
"@types/node@npm:>=10.0.0":
version: 24.3.1
resolution: "@types/node@npm:24.3.1"
dependencies:
undici-types: "npm:~7.10.0"
checksum: 10c0/99b86fc32294fcd61136ca1f771026443a1e370e9f284f75e243b29299dd878e18c193deba1ce29a374932db4e30eb80826e1049b9aad02d36f5c30b94b6f928
languageName: node
linkType: hard
"@types/node@npm:^18.11.18":
version: 18.19.86
resolution: "@types/node@npm:18.19.86"
@ -10190,6 +10181,7 @@ __metadata:
archiver: "npm:^7.0.1"
async-mutex: "npm:^0.5.0"
axios: "npm:^1.7.3"
bonjour-service: "npm:^1.3.0"
browser-image-compression: "npm:^2.0.2"
chardet: "npm:^2.1.0"
check-disk-space: "npm:3.4.0"
@ -10278,7 +10270,6 @@ __metadata:
pdf-lib: "npm:^1.17.1"
pdf-parse: "npm:^1.1.1"
proxy-agent: "npm:^6.5.0"
qrcode.react: "npm:^4.2.0"
react: "npm:^19.2.0"
react-dom: "npm:^19.2.0"
react-error-boundary: "npm:^6.0.0"
@ -10310,7 +10301,6 @@ __metadata:
selection-hook: "npm:^1.0.12"
sharp: "npm:^0.34.3"
shiki: "npm:^3.12.0"
socket.io: "npm:^4.8.1"
strict-url-sanitise: "npm:^0.0.1"
string-width: "npm:^7.2.0"
striptags: "npm:^3.2.0"
@ -10373,16 +10363,6 @@ __metadata:
languageName: node
linkType: hard
"accepts@npm:~1.3.4":
version: 1.3.8
resolution: "accepts@npm:1.3.8"
dependencies:
mime-types: "npm:~2.1.34"
negotiator: "npm:0.6.3"
checksum: 10c0/3a35c5f5586cfb9a21163ca47a5f77ac34fa8ceb5d17d2fa2c0d81f41cbd7f8c6fa52c77e2c039acc0f4d09e71abdc51144246900f6bef5e3c4b333f77d89362
languageName: node
linkType: hard
"acorn-jsx@npm:^5.3.2":
version: 5.3.2
resolution: "acorn-jsx@npm:5.3.2"
@ -11021,13 +11001,6 @@ __metadata:
languageName: node
linkType: hard
"base64id@npm:2.0.0, base64id@npm:~2.0.0":
version: 2.0.0
resolution: "base64id@npm:2.0.0"
checksum: 10c0/6919efd237ed44b9988cbfc33eca6f173a10e810ce50292b271a1a421aac7748ef232a64d1e6032b08f19aae48dce6ee8f66c5ae2c9e5066c82b884861d4d453
languageName: node
linkType: hard
"basic-ftp@npm:^5.0.2":
version: 5.0.5
resolution: "basic-ftp@npm:5.0.5"
@ -11143,6 +11116,16 @@ __metadata:
languageName: node
linkType: hard
"bonjour-service@npm:^1.3.0":
version: 1.3.0
resolution: "bonjour-service@npm:1.3.0"
dependencies:
fast-deep-equal: "npm:^3.1.3"
multicast-dns: "npm:^7.2.5"
checksum: 10c0/5721fd9f9bb968e9cc16c1e8116d770863dd2329cb1f753231de1515870648c225142b7eefa71f14a5c22bc7b37ddd7fdeb018700f28a8c936d50d4162d433c7
languageName: node
linkType: hard
"boolbase@npm:^1.0.0":
version: 1.0.0
resolution: "boolbase@npm:1.0.0"
@ -11246,7 +11229,7 @@ __metadata:
languageName: node
linkType: hard
"buffer-equal-constant-time@npm:1.0.1":
"buffer-equal-constant-time@npm:^1.0.1":
version: 1.0.1
resolution: "buffer-equal-constant-time@npm:1.0.1"
checksum: 10c0/fb2294e64d23c573d0dd1f1e7a466c3e978fe94a4e0f8183937912ca374619773bef8e2aceb854129d2efecbbc515bbd0cc78d2734a3e3031edb0888531bbc8e
@ -12254,7 +12237,7 @@ __metadata:
languageName: node
linkType: hard
"cookie@npm:^0.7.1, cookie@npm:~0.7.2":
"cookie@npm:^0.7.1":
version: 0.7.2
resolution: "cookie@npm:0.7.2"
checksum: 10c0/9596e8ccdbf1a3a88ae02cf5ee80c1c50959423e1022e4e60b91dd87c622af1da309253d8abdb258fb5e3eacb4f08e579dc58b4897b8087574eee0fd35dfa5d2
@ -12291,7 +12274,7 @@ __metadata:
languageName: node
linkType: hard
"cors@npm:^2.8.5, cors@npm:~2.8.5":
"cors@npm:^2.8.5":
version: 2.8.5
resolution: "cors@npm:2.8.5"
dependencies:
@ -12961,18 +12944,6 @@ __metadata:
languageName: node
linkType: hard
"debug@npm:~4.3.1, debug@npm:~4.3.2, debug@npm:~4.3.4":
version: 4.3.7
resolution: "debug@npm:4.3.7"
dependencies:
ms: "npm:^2.1.3"
peerDependenciesMeta:
supports-color:
optional: true
checksum: 10c0/1471db19c3b06d485a622d62f65947a19a23fbd0dd73f7fd3eafb697eec5360cde447fb075919987899b1a2096e85d35d4eb5a4de09a57600ac9cf7e6c8e768b
languageName: node
linkType: hard
"decamelize@npm:1.2.0":
version: 1.2.0
resolution: "decamelize@npm:1.2.0"
@ -13361,6 +13332,15 @@ __metadata:
languageName: node
linkType: hard
"dns-packet@npm:^5.2.2":
version: 5.6.1
resolution: "dns-packet@npm:5.6.1"
dependencies:
"@leichtgewicht/ip-codec": "npm:^2.0.1"
checksum: 10c0/8948d3d03063fb68e04a1e386875f8c3bcc398fc375f535f2b438fad8f41bf1afa6f5e70893ba44f4ae884c089247e0a31045722fa6ff0f01d228da103f1811d
languageName: node
linkType: hard
"doctrine@npm:3.0.0":
version: 3.0.0
resolution: "doctrine@npm:3.0.0"
@ -13929,30 +13909,6 @@ __metadata:
languageName: node
linkType: hard
"engine.io-parser@npm:~5.2.1":
version: 5.2.3
resolution: "engine.io-parser@npm:5.2.3"
checksum: 10c0/ed4900d8dbef470ab3839ccf3bfa79ee518ea8277c7f1f2759e8c22a48f64e687ea5e474291394d0c94f84054749fd93f3ef0acb51fa2f5f234cc9d9d8e7c536
languageName: node
linkType: hard
"engine.io@npm:~6.6.0":
version: 6.6.4
resolution: "engine.io@npm:6.6.4"
dependencies:
"@types/cors": "npm:^2.8.12"
"@types/node": "npm:>=10.0.0"
accepts: "npm:~1.3.4"
base64id: "npm:2.0.0"
cookie: "npm:~0.7.2"
cors: "npm:~2.8.5"
debug: "npm:~4.3.1"
engine.io-parser: "npm:~5.2.1"
ws: "npm:~8.17.1"
checksum: 10c0/845761163f8ea7962c049df653b75dafb6b3693ad6f59809d4474751d7b0392cbf3dc2730b8a902ff93677a91fd28711d34ab29efd348a8a4b49c6b0724021ab
languageName: node
linkType: hard
"enhanced-resolve@npm:^5.18.3":
version: 5.18.3
resolution: "enhanced-resolve@npm:5.18.3"
@ -17178,24 +17134,24 @@ __metadata:
languageName: node
linkType: hard
"jwa@npm:^2.0.0":
version: 2.0.0
resolution: "jwa@npm:2.0.0"
"jwa@npm:^2.0.1":
version: 2.0.1
resolution: "jwa@npm:2.0.1"
dependencies:
buffer-equal-constant-time: "npm:1.0.1"
buffer-equal-constant-time: "npm:^1.0.1"
ecdsa-sig-formatter: "npm:1.0.11"
safe-buffer: "npm:^5.0.1"
checksum: 10c0/6baab823b93c038ba1d2a9e531984dcadbc04e9eb98d171f4901b7a40d2be15961a359335de1671d78cb6d987f07cbe5d350d8143255977a889160c4d90fcc3c
checksum: 10c0/ab3ebc6598e10dc11419d4ed675c9ca714a387481466b10e8a6f3f65d8d9c9237e2826f2505280a739cf4cbcf511cb288eeec22b5c9c63286fc5a2e4f97e78cf
languageName: node
linkType: hard
"jws@npm:^4.0.0":
version: 4.0.0
resolution: "jws@npm:4.0.0"
version: 4.0.1
resolution: "jws@npm:4.0.1"
dependencies:
jwa: "npm:^2.0.0"
jwa: "npm:^2.0.1"
safe-buffer: "npm:^5.0.1"
checksum: 10c0/f1ca77ea5451e8dc5ee219cb7053b8a4f1254a79cb22417a2e1043c1eb8a569ae118c68f24d72a589e8a3dd1824697f47d6bd4fb4bebb93a3bdf53545e721661
checksum: 10c0/6be1ed93023aef570ccc5ea8d162b065840f3ef12f0d1bb3114cade844de7a357d5dc558201d9a65101e70885a6fa56b17462f520e6b0d426195510618a154d0
languageName: node
linkType: hard
@ -19131,7 +19087,7 @@ __metadata:
languageName: node
linkType: hard
"mime-types@npm:^2.1.12, mime-types@npm:^2.1.35, mime-types@npm:~2.1.34":
"mime-types@npm:^2.1.12, mime-types@npm:^2.1.35":
version: 2.1.35
resolution: "mime-types@npm:2.1.35"
dependencies:
@ -19535,6 +19491,18 @@ __metadata:
languageName: node
linkType: hard
"multicast-dns@npm:^7.2.5":
version: 7.2.5
resolution: "multicast-dns@npm:7.2.5"
dependencies:
dns-packet: "npm:^5.2.2"
thunky: "npm:^1.0.2"
bin:
multicast-dns: cli.js
checksum: 10c0/5120171d4bdb1577764c5afa96e413353bff530d1b37081cb29cccc747f989eb1baf40574fe8e27060fc1aef72b59c042f72b9b208413de33bcf411343c69057
languageName: node
linkType: hard
"mustache@npm:^4.2.0":
version: 4.2.0
resolution: "mustache@npm:4.2.0"
@ -19618,13 +19586,6 @@ __metadata:
languageName: node
linkType: hard
"negotiator@npm:0.6.3":
version: 0.6.3
resolution: "negotiator@npm:0.6.3"
checksum: 10c0/3ec9fd413e7bf071c937ae60d572bc67155262068ed522cf4b3be5edbe6ddf67d095ec03a3a14ebf8fc8e95f8e1d61be4869db0dbb0de696f6b837358bd43fc2
languageName: node
linkType: hard
"negotiator@npm:^1.0.0":
version: 1.0.0
resolution: "negotiator@npm:1.0.0"
@ -21314,15 +21275,6 @@ __metadata:
languageName: node
linkType: hard
"qrcode.react@npm:^4.2.0":
version: 4.2.0
resolution: "qrcode.react@npm:4.2.0"
peerDependencies:
react: ^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0
checksum: 10c0/68c691d130e5fda2f57cee505ed7aea840e7d02033100687b764601f9595e1116e34c13876628a93e1a5c2b85e4efc27d30b2fda72e2050c02f3e1c4e998d248
languageName: node
linkType: hard
"qs@npm:^6.14.0":
version: 6.14.0
resolution: "qs@npm:6.14.0"
@ -23599,41 +23551,6 @@ __metadata:
languageName: node
linkType: hard
"socket.io-adapter@npm:~2.5.2":
version: 2.5.5
resolution: "socket.io-adapter@npm:2.5.5"
dependencies:
debug: "npm:~4.3.4"
ws: "npm:~8.17.1"
checksum: 10c0/04a5a2a9c4399d1b6597c2afc4492ab1e73430cc124ab02b09e948eabf341180b3866e2b61b5084cb899beb68a4db7c328c29bda5efb9207671b5cb0bc6de44e
languageName: node
linkType: hard
"socket.io-parser@npm:~4.2.4":
version: 4.2.4
resolution: "socket.io-parser@npm:4.2.4"
dependencies:
"@socket.io/component-emitter": "npm:~3.1.0"
debug: "npm:~4.3.1"
checksum: 10c0/9383b30358fde4a801ea4ec5e6860915c0389a091321f1c1f41506618b5cf7cd685d0a31c587467a0c4ee99ef98c2b99fb87911f9dfb329716c43b587f29ca48
languageName: node
linkType: hard
"socket.io@npm:^4.8.1":
version: 4.8.1
resolution: "socket.io@npm:4.8.1"
dependencies:
accepts: "npm:~1.3.4"
base64id: "npm:~2.0.0"
cors: "npm:~2.8.5"
debug: "npm:~4.3.2"
engine.io: "npm:~6.6.0"
socket.io-adapter: "npm:~2.5.2"
socket.io-parser: "npm:~4.2.4"
checksum: 10c0/acf931a2bb235be96433b71da3d8addc63eeeaa8acabd33dc8d64e12287390a45f1e9f389a73cf7dc336961cd491679741b7a016048325c596835abbcc017ca9
languageName: node
linkType: hard
"socks-proxy-agent@npm:^8.0.3, socks-proxy-agent@npm:^8.0.5":
version: 8.0.5
resolution: "socks-proxy-agent@npm:8.0.5"
@ -24421,6 +24338,13 @@ __metadata:
languageName: node
linkType: hard
"thunky@npm:^1.0.2":
version: 1.1.0
resolution: "thunky@npm:1.1.0"
checksum: 10c0/369764f39de1ce1de2ba2fa922db4a3f92e9c7f33bcc9a713241bc1f4a5238b484c17e0d36d1d533c625efb00e9e82c3e45f80b47586945557b45abb890156d2
languageName: node
linkType: hard
"tiktok-video-element@npm:^0.1.0":
version: 0.1.1
resolution: "tiktok-video-element@npm:0.1.1"
@ -25123,13 +25047,6 @@ __metadata:
languageName: node
linkType: hard
"undici-types@npm:~7.10.0":
version: 7.10.0
resolution: "undici-types@npm:7.10.0"
checksum: 10c0/8b00ce50e235fe3cc601307f148b5e8fb427092ee3b23e8118ec0a5d7f68eca8cee468c8fc9f15cbb2cf2a3797945ebceb1cbd9732306a1d00e0a9b6afa0f635
languageName: node
linkType: hard
"undici@npm:6.21.2":
version: 6.21.2
resolution: "undici@npm:6.21.2"
@ -26157,21 +26074,6 @@ __metadata:
languageName: node
linkType: hard
"ws@npm:~8.17.1":
version: 8.17.1
resolution: "ws@npm:8.17.1"
peerDependencies:
bufferutil: ^4.0.1
utf-8-validate: ">=5.0.2"
peerDependenciesMeta:
bufferutil:
optional: true
utf-8-validate:
optional: true
checksum: 10c0/f4a49064afae4500be772abdc2211c8518f39e1c959640457dcee15d4488628620625c783902a52af2dd02f68558da2868fd06e6fd0e67ebcd09e6881b1b5bfe
languageName: node
linkType: hard
"xlsx@https://cdn.sheetjs.com/xlsx-0.20.2/xlsx-0.20.2.tgz":
version: 0.20.2
resolution: "xlsx@https://cdn.sheetjs.com/xlsx-0.20.2/xlsx-0.20.2.tgz"