
Add deepseek to fallback_context_limit_for_model with 1M context #87

@zociwhite

Description

deepseek-v4-flash (DeepSeek V4) supports a 1M-token context window, but fallback_context_limit_for_model() in
src/provider/models.rs has no hardcoded fallback for DeepSeek models.

DeepSeek models currently fall through to get_cached_context_limit() or DEFAULT_CONTEXT_LIMIT = 200_000, so users
connecting directly to the DeepSeek API are capped at 200k tokens.

Suggested Fix

In src/provider/models.rs, add the following branch before the final else:

if model.starts_with("deepseek") {
    return Some(1_000_000);
}
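For context, here is a minimal sketch of how the patched function might look. The function name and the DeepSeek branch come from this issue; the other branches and the exact signature are assumptions for illustration, not the actual jcode source.

```rust
// Hypothetical sketch of fallback_context_limit_for_model in
// src/provider/models.rs. Only the deepseek branch is from the issue;
// the surrounding structure is assumed.
fn fallback_context_limit_for_model(model: &str) -> Option<u64> {
    // ... existing hardcoded fallbacks for other model families ...

    // Proposed addition: DeepSeek models (e.g. deepseek-v4-flash)
    // support a 1M-token context window.
    if model.starts_with("deepseek") {
        return Some(1_000_000);
    }

    // No hardcoded fallback; callers fall through to
    // get_cached_context_limit() or DEFAULT_CONTEXT_LIMIT.
    None
}

fn main() {
    // With the branch in place, deepseek models get the 1M limit
    // instead of the 200_000 default.
    assert_eq!(
        fallback_context_limit_for_model("deepseek-v4-flash"),
        Some(1_000_000)
    );
    assert_eq!(fallback_context_limit_for_model("some-other-model"), None);
    println!("ok");
}
```

A prefix match on "deepseek" covers future model names too, which matches how the other families in this function appear to be handled.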

Environment

- jcode v0.11.1 (1f622e6b)
- Provider: deepseek (direct, not OpenRouter)
- Model: deepseek-v4-flash
- Actual model context: 1,000,000 tokens
