[Fix] GLM-5-Turbo is missing from the Z.ai provider#51
Conversation
Codecov Report: ✅ All modified and coverable lines are covered by tests.
edelauna left a comment:
🔥 - nice looks good - had one comment related to tech debt, but we can also look at it in another PR or as part of a holistic review.
```ts
	expect(model.info.reasoningEffort).toBe("medium")
	expect(model.info.preserveReasoning).toBe(true)
	})
})
```
Nit / consistency, not a blocker: the new behavior tests below verify the thinking: { type: "enabled" | "disabled" } payload only against international_coding. This matches the pre-existing pattern (the GLM-4.7 behavior tests at lines 440/470/501 are also international-only), so this PR isn't introducing a gap — just continuing it.
createStreamWithThinking doesn't branch on zaiApiLine and the OpenAI client is mocked at the SDK level, so adding china_coding versions would exercise effectively the same code modulo baseURL. Worth doing only if you want to lock in symmetry against a future change that adds api-line-specific branching.
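To make the point concrete, here is a minimal sketch of why per-api-line duplicates would exercise the same code path. All names here (`buildThinkingPayload`, `ModelInfo.supportsReasoning`) are illustrative assumptions, not the repository's actual helpers; only the `thinking: { type: "enabled" | "disabled" }` payload shape comes from the review discussion above:

```typescript
// Hypothetical sketch: the thinking payload depends only on the model's
// reasoning capability and the requested mode, never on which Z.ai API
// line (i.e. which baseURL) the client was constructed with.
type ZaiApiLine = "international_coding" | "china_coding"

interface ModelInfo {
	supportsReasoning: boolean
}

// Assumed shape of the request fragment under test.
function buildThinkingPayload(info: ModelInfo, enableThinking: boolean) {
	if (!info.supportsReasoning) return undefined
	return { thinking: { type: enableThinking ? "enabled" : "disabled" } }
}

// Both api lines yield byte-identical payloads, so a china_coding copy of
// the behavior test only locks in symmetry against future branching.
const lines: ZaiApiLine[] = ["international_coding", "china_coding"]
for (const line of lines) {
	const payload = buildThinkingPayload({ supportsReasoning: true }, true)
	console.log(line, JSON.stringify(payload))
}
```

If api-line-specific branching is ever introduced, a parameterized test (e.g. iterating over both line values) would be the cheap way to add the symmetric coverage.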
I see some merge conflicts here. Working on them now...
Force-pushed 90109f6 to 0bdf62d
Force-pushed 0bdf62d to 6bd8e76
Issue: #31
What problem this solves
Fixes an issue where users configuring the Z.ai provider could not select glm-5-turbo, even though Z.AI publishes that model for both its international and mainland API lines.
Why this change was made
This adds the missing GLM-5-Turbo model metadata to the shared Z.ai model tables and extends the existing provider coverage so the model follows the same thinking-mode behavior as the other reasoning-capable GLM entries. The change stays scoped to Z.ai model registration and provider verification instead of reworking model discovery.
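As a rough illustration of "adding the model to the shared tables", the sketch below shows one plausible shape. The schema and field names are assumptions (not Roo Code's real types); the `reasoningEffort: "medium"` and `preserveReasoning: true` values are taken from the test expectations quoted in the review above, and `maxTokens` is a placeholder:

```typescript
// Illustrative only: assumed schema for a shared Z.ai model table entry.
interface ZaiModelInfo {
	maxTokens: number
	supportsReasoning: boolean
	reasoningEffort: "low" | "medium" | "high"
	preserveReasoning: boolean
}

const internationalCodingModels: Record<string, ZaiModelInfo> = {
	"glm-5-turbo": {
		maxTokens: 8192, // placeholder value, not from the PR
		supportsReasoning: true,
		reasoningEffort: "medium",
		preserveReasoning: true,
	},
}

// Registering the same entry for the mainland line keeps both
// entrypoints in sync, which is what makes the model selectable on each.
const chinaCodingModels: Record<string, ZaiModelInfo> = {
	...internationalCodingModels,
}

console.log(Object.keys(chinaCodingModels))
```

Sharing one entry between both tables (rather than duplicating it) is one way to avoid the two api lines drifting apart.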
User impact
Users can now choose GLM-5-Turbo in Roo Code when using the Z.ai provider. Roo Code exposes the model on both supported Z.ai entrypoints and sends the correct default or disabled thinking state for requests.