cherry-studio/packages/ai-sdk-provider
SuYao 981bb9f451
fix: update deepseek logic to match deepseek v3.2 (#11648)
* fix: update deepseek dependency to version 1.0.31 and improve provider creation logging

* chore

* feat: DeepSeek official hybrid inference

* fix: deepseek-v3.2-speciale tool use and reasoning

* fix: add support for fixed reasoning models and update related logic

* refactor: simplify logic

* feat: aihubmix

* all system_providers

* feat: cherryin

* temp fix

* fix: address PR review feedback for DeepSeek v3.2 implementation

- Add default case in buildCherryInProviderOptions to fallback to genericProviderOptions
- Add clarifying comment for switch fall-through in reasoning.ts
- Add comprehensive test coverage for isFixedReasoningModel (negative cases)
- Add test coverage for new provider whitelist (deepseek, cherryin, new-api, aihubmix, sophnet, dmxapi)
- Add test coverage for isDeepSeekHybridInferenceModel prefix patterns
- Verify function calling logic works correctly via regex matching after removing provider-based checks
- Use includes() for deepseek-chat matching to support potential variants

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: remove unnecessary fall-through case for unknown providers in getReasoningEffort

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-12-04 19:13:51 +08:00
src fix: update deepseek logic to match deepseek v3.2 (#11648) 2025-12-04 19:13:51 +08:00
package.json fix: update deepseek logic to match deepseek v3.2 (#11648) 2025-12-04 19:13:51 +08:00
README.md feat: add @cherrystudio/ai-sdk-provider package and integrate (#10715) 2025-11-12 18:16:27 +08:00
tsconfig.json feat: add @cherrystudio/ai-sdk-provider package and integrate (#10715) 2025-11-12 18:16:27 +08:00
tsdown.config.ts feat: add @cherrystudio/ai-sdk-provider package and integrate (#10715) 2025-11-12 18:16:27 +08:00

@cherrystudio/ai-sdk-provider

CherryIN provider bundle for the Vercel AI SDK.
It exposes the CherryIN OpenAI-compatible entrypoints and dynamically routes Anthropic and Gemini model ids to the corresponding CherryIN upstream endpoints.
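
Conceptually, the routing is a simple check on the model id prefix. The sketch below is illustrative only — the routeByModelId helper and the exact prefix list are assumptions, not the package's actual implementation:

// Illustrative sketch of prefix-based routing (assumed behaviour, not the real code)
type Upstream = 'anthropic' | 'gemini' | 'openai-compatible'

function routeByModelId(modelId: string): Upstream {
  if (modelId.startsWith('claude-')) return 'anthropic' // served via anthropicBaseURL
  if (modelId.startsWith('gemini-')) return 'gemini' // served via geminiBaseURL
  return 'openai-compatible' // everything else goes to the default baseURL
}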

Installation

npm install ai @cherrystudio/ai-sdk-provider @ai-sdk/anthropic @ai-sdk/google @ai-sdk/openai
# or
yarn add ai @cherrystudio/ai-sdk-provider @ai-sdk/anthropic @ai-sdk/google @ai-sdk/openai

Note: This package requires the peer dependencies ai, @ai-sdk/anthropic, @ai-sdk/google, and @ai-sdk/openai to be installed alongside it.

Usage

import { generateText } from 'ai'
import { createCherryIn, cherryIn } from '@cherrystudio/ai-sdk-provider'

const cherryInProvider = createCherryIn({
  apiKey: process.env.CHERRYIN_API_KEY,
  // optional overrides:
  // baseURL: 'https://open.cherryin.net/v1',
  // anthropicBaseURL: 'https://open.cherryin.net/anthropic',
  // geminiBaseURL: 'https://open.cherryin.net/gemini/v1beta',
})

// Chat models will auto-route based on the model id prefix:
const openaiModel = cherryInProvider.chat('gpt-4o-mini')
const anthropicModel = cherryInProvider.chat('claude-3-5-sonnet-latest')
const geminiModel = cherryInProvider.chat('gemini-2.0-pro-exp')

const { text } = await generateText({
  model: openaiModel,
  prompt: 'Hello CherryIN!'
})
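
The routed models are ordinary AI SDK language models, so streaming should work the same way via streamText (the prompt below is just an example):

import { streamText } from 'ai'

const result = streamText({
  model: anthropicModel,
  prompt: 'Write a haiku about cherries.'
})

// Consume the token stream as it arrives
for await (const chunk of result.textStream) {
  process.stdout.write(chunk)
}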

The provider also exposes completion, responses, embedding, image, transcription, and speech helpers aligned with the upstream APIs.
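
For example, an embedding call might look like the sketch below, reusing the cherryInProvider instance from the Usage section. The embedding helper name and the model id are assumptions based on the upstream OpenAI provider's shape — check the package's exported types for the exact entrypoints:

import { embed } from 'ai'

// Assumed helper name and model id; verify against the package's type definitions.
const { embedding } = await embed({
  model: cherryInProvider.embedding('text-embedding-3-small'),
  value: 'Cherries are in season.'
})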

See the AI SDK documentation for details on configuring custom providers.