mirror of
https://github.com/farion1231/cc-switch.git
synced 2026-05-06 22:01:44 +08:00
Responses conversions still use promptCacheKey, but chat completions now stay a pure shape transform. This keeps Claude -> chat requests aligned with providers that do not understand the field, and keeps stream checks consistent with production behavior.

Constraint: Issue #1919 requires removing prompt_cache_key from Claude -> OpenAI Chat requests
Rejected: Add a runtime toggle for chat injection | requested behavior is unconditional removal
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Keep promptCacheKey limited to Claude -> Responses conversions unless a provider-specific contract is proven
Tested: cargo test anthropic_to_openai
Tested: cargo test anthropic_to_responses_with_cache_key
Tested: cargo test transform_claude_request_for_api_format_responses
Not-tested: Full src-tauri test suite
Related: #1919
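
The split described above can be sketched as two conversion functions: the chat path drops prompt_cache_key unconditionally, while the Responses path carries it forward as promptCacheKey. This is a minimal illustration with a plain string map, not the actual cc-switch transforms; the function names and the flat `Request` type are assumptions for the sketch, and the real code in src-tauri operates on full request structures.

```rust
use std::collections::BTreeMap;

// Simplified stand-in for a request body: just top-level string fields.
type Request = BTreeMap<String, String>;

// Claude -> OpenAI Chat Completions: a pure shape transform.
// Per Issue #1919, prompt_cache_key is removed unconditionally so the
// request stays valid for providers that do not understand the field.
fn claude_to_chat(claude: &Request) -> Request {
    let mut out = claude.clone();
    out.remove("prompt_cache_key");
    out
}

// Claude -> Responses: the cache key is kept, surfaced as promptCacheKey.
fn claude_to_responses(claude: &Request) -> Request {
    let mut out = claude.clone();
    if let Some(key) = out.remove("prompt_cache_key") {
        out.insert("promptCacheKey".to_string(), key);
    }
    out
}

fn main() {
    let mut req = Request::new();
    req.insert("model".into(), "claude-example".into());
    req.insert("prompt_cache_key".into(), "cache-123".into());

    let chat = claude_to_chat(&req);
    let responses = claude_to_responses(&req);

    // Chat request no longer carries the cache key in any form.
    assert!(!chat.contains_key("prompt_cache_key"));
    assert!(!chat.contains_key("promptCacheKey"));
    // Responses request carries it under the camelCase name.
    assert_eq!(
        responses.get("promptCacheKey").map(String::as_str),
        Some("cache-123")
    );
    println!("chat keys: {:?}", chat.keys().collect::<Vec<_>>());
}
```

Keeping the chat path a pure shape transform (no conditional injection) matches the rejected-alternative note above: a runtime toggle was ruled out because the requested behavior is unconditional removal.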