Dex Miller 5566be2b4b Stop sending prompt cache keys on Claude chat conversions (#2003)
Responses conversions still use promptCacheKey, but chat completions conversions are now a pure shape transform. This keeps Claude -> chat requests compatible with providers that do not understand the field, and keeps stream checks consistent with production behavior.
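A minimal sketch of the intended split, assuming hypothetical type and field names (not the actual cc-switch code): the Chat Completions target has no cache-key field at all, so the Claude -> chat path cannot emit one, while the Responses path still forwards it.

```rust
// Hypothetical request shapes; field names are illustrative assumptions.
#[derive(Debug, Clone)]
struct ClaudeRequest {
    model: String,
    messages: Vec<String>,
    prompt_cache_key: Option<String>,
}

#[derive(Debug)]
struct ChatRequest {
    model: String,
    messages: Vec<String>,
    // Intentionally no cache-key field: providers that reject unknown
    // fields accept the converted request as-is.
}

#[derive(Debug)]
struct ResponsesRequest {
    model: String,
    input: Vec<String>,
    prompt_cache_key: Option<String>,
}

// Claude -> Chat: pure shape transform, cache key is dropped by construction.
fn claude_to_chat(req: &ClaudeRequest) -> ChatRequest {
    ChatRequest {
        model: req.model.clone(),
        messages: req.messages.clone(),
    }
}

// Claude -> Responses: still carries the cache key through.
fn claude_to_responses(req: &ClaudeRequest) -> ResponsesRequest {
    ResponsesRequest {
        model: req.model.clone(),
        input: req.messages.clone(),
        prompt_cache_key: req.prompt_cache_key.clone(),
    }
}

fn main() {
    let req = ClaudeRequest {
        model: "claude-x".into(),
        messages: vec!["hello".into()],
        prompt_cache_key: Some("abc".into()),
    };
    let resp = claude_to_responses(&req);
    assert_eq!(resp.prompt_cache_key.as_deref(), Some("abc"));
    let _chat = claude_to_chat(&req); // ChatRequest cannot carry the key
    println!("ok");
}
```

Encoding the removal in the target type, rather than filtering a field at runtime, matches the unconditional-removal decision recorded in the trailers below.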

Constraint: Issue #1919 requires removing prompt_cache_key from Claude -> OpenAI Chat requests
Rejected: Add a runtime toggle for chat injection | requested behavior is unconditional removal
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Keep promptCacheKey limited to Claude -> Responses conversions unless a provider-specific contract is proven
Tested: cargo test anthropic_to_openai
Tested: cargo test anthropic_to_responses_with_cache_key
Tested: cargo test transform_claude_request_for_api_format_responses
Not-tested: Full src-tauri test suite
Related: #1919
2026-04-13 10:22:55 +08:00