feat: add Claude Opus 4.7 support across all providers (#12287) #12301
roomote-v0[bot] wants to merge 3 commits into main
Conversation
Add `claude-opus-4-7` model definitions and references to:
- Anthropic direct API types and handler (1M context, prompt caching, extended thinking)
- OpenRouter model sets and fetcher maxTokens override
- Vercel AI Gateway prompt caching and vision+tools sets
- WebView 1M context beta tier pricing check
Fixed Bedrock implementation to match the working version from PR #12288: corrected model ID to `anthropic.claude-opus-4-7`.
…port
- Rename model ID from `anthropic.claude-opus-4-7-v1` to `anthropic.claude-opus-4-7` to match the correct Bedrock model identifier (per PR #12288)
- Update `BedrockAdditionalModelFields` to support the adaptive thinking type
- Add `BEDROCK_ADAPTIVE_THINKING_ONLY_MODEL_IDS` for Opus 4.7
- Opus 4.7 uses `thinking.type: "adaptive"` with `output_config.effort` instead of `"enabled"` with `budget_tokens` (which returns a 400 error)
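The adaptive-vs-enabled switch the commit describes can be sketched roughly as follows. This is an illustration only, not the PR's actual code: `buildThinkingConfig`, the `BedrockThinkingConfig` shape, and the default effort value are assumptions; only the model ID, the set-constant name, and the two payload variants come from the commit message.

```typescript
// Two payload variants: legacy "enabled" thinking with a token budget, and
// the adaptive thinking Opus 4.7 requires (effort hint in output_config).
type BedrockThinkingConfig =
	| { thinking: { type: "enabled"; budget_tokens: number } }
	| { thinking: { type: "adaptive" }; output_config: { effort: string } };

// Models that reject "enabled" + budget_tokens with a 400 error.
const BEDROCK_ADAPTIVE_THINKING_ONLY_MODEL_IDS = new Set(["anthropic.claude-opus-4-7"]);

function buildThinkingConfig(
	modelId: string,
	budgetTokens: number,
	effort: string = "high", // hypothetical default
): BedrockThinkingConfig {
	if (BEDROCK_ADAPTIVE_THINKING_ONLY_MODEL_IDS.has(modelId)) {
		return { thinking: { type: "adaptive" }, output_config: { effort } };
	}
	return { thinking: { type: "enabled", budget_tokens: budgetTokens } };
}
```

A set-membership gate like this keeps the 400-error case out of the request path for Opus 4.7 while leaving older models on the budget-token variant.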
@roomote Also update the description on this PR to reflect the fixes?
Updated the PR description to reflect the fixes from the latest commit: corrected Bedrock model ID (`anthropic.claude-opus-4-7`) and the adaptive thinking change.
Thanks for pushing this forward; really appreciate having 4.7 land across all providers rather than just Bedrock. A couple of pieces of feedback from someone running Opus 4.7 through a third-party Anthropic-compatible gateway (Kiro-backed CPAP):

Anthropic provider changes look correct; I reverse-checked the edits.

Small suggestion (non-blocking): every new Opus/Sonnet release touches 4–5 parallel switch/if blocks in this file (1M beta, prompt cache, extended thinking, tier pricing). Consider consolidating these into a single capability set per model:

```typescript
"claude-opus-4-7": {
	// ... existing fields ...
	capabilities: {
		supports1MContext: true,
		supportsPromptCache: true,
		supportsExtendedThinking: true,
	},
}
```

Then the handler can check the capability flags instead of maintaining parallel model-ID sets.

On the Bedrock fix: seeing the adaptive thinking work (`thinking.type: "adaptive"` with `output_config.effort`) is great.

LGTM on the Anthropic/OpenRouter/Vercel parts from my side; looking forward to getting this merged.
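For illustration, the handler side of a consolidation like the one suggested in the review might look like this. Everything here is hypothetical (the actual code uses parallel model-ID sets, and `ModelCapabilities`, `supportsPromptCache`, and the `maxTokens` value shown are placeholder names for the sketch):

```typescript
// Hypothetical per-model capability record replacing parallel ID sets.
interface ModelCapabilities {
	supports1MContext?: boolean;
	supportsPromptCache?: boolean;
	supportsExtendedThinking?: boolean;
}

interface ModelInfo {
	maxTokens: number;
	capabilities?: ModelCapabilities;
}

const anthropicModels: Record<string, ModelInfo> = {
	"claude-opus-4-7": {
		maxTokens: 128_000,
		capabilities: {
			supports1MContext: true,
			supportsPromptCache: true,
			supportsExtendedThinking: true,
		},
	},
};

// One lookup replaces a membership test against a separate model-ID set;
// unknown models and missing flags both fall through to false.
function supportsPromptCache(modelId: string): boolean {
	return anthropicModels[modelId]?.capabilities?.supportsPromptCache ?? false;
}
```

The upside of this shape is that adding a new model touches exactly one entry instead of 4–5 blocks; the trade-off is a one-time migration of the existing set constants.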
Thanks @hpylsy, I agree it would be cleaner to use a capability hash or something similar to organise these config variants. The models will likely continue to expose and evolve more capabilities over time, so it is probably worthwhile.
Closes: #12287
Supersedes #12288 by extending Claude Opus 4.7 support beyond just Bedrock to all providers.
Changes
Bedrock (from #12288):
- Add `anthropic.claude-opus-4-7` model entry to `bedrockModels`
- Add to `BEDROCK_1M_CONTEXT_MODEL_IDS` and `BEDROCK_GLOBAL_INFERENCE_MODEL_IDS`
- Rename model ID from `anthropic.claude-opus-4-7-v1` to `anthropic.claude-opus-4-7` to match the correct Bedrock identifier
- Use `thinking.type: "adaptive"` with `output_config.effort` instead of `"enabled"` with `budget_tokens`
- Add `BEDROCK_ADAPTIVE_THINKING_ONLY_MODEL_IDS` constant for models requiring this thinking mode
- Update `BedrockAdditionalModelFields` interface to support the adaptive thinking type

Anthropic Direct API:
- Add `claude-opus-4-7` model entry to `anthropicModels` with 128K maxTokens, $5/$25 pricing, 1M context tiers
- Add 1M context tier pricing check in `getModel()`

OpenRouter:
- Add `anthropic/claude-opus-4.7` to prompt caching and reasoning budget model sets

Vercel AI Gateway:
- Add `anthropic/claude-opus-4.7` to prompt caching and vision+tools model sets

WebView UI:
- Add `claude-opus-4-7` to 1M context beta tier pricing check in `useSelectedModel`

Test Procedure
Interactively review PR in Roo Code Cloud