Summary
In OpenCode v1.14.24, cache accounting appears to be broken for the new DeepSeek model/provider. After running sessions with a large context, opencode stats still reports Cache Read: 0 and Cache Write: 0, while input token usage is very high.
I am not sure whether the root cause is (a) caching not being enabled for the new DeepSeek model, (b) DeepSeek's cache usage fields not being parsed, or (c) the stats aggregation ignoring provider cache tokens. From the user's perspective, cache usage is always reported as zero.
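To make the second hypothesis concrete: DeepSeek's OpenAI-compatible responses report cache activity through prompt_cache_hit_tokens and prompt_cache_miss_tokens in the usage object. Below is a minimal sketch of the normalization the stats pipeline would need; the function name and the hit→Cache Read / miss→Cache Write mapping are my assumptions, not OpenCode's actual code.

```python
# Hypothetical sketch, NOT OpenCode's implementation: normalize a DeepSeek
# usage payload into the counters that `opencode stats` displays.
# Field names prompt_cache_hit_tokens / prompt_cache_miss_tokens come from
# DeepSeek's context-caching documentation; the mapping is an assumption.

def extract_cache_usage(usage: dict) -> dict:
    """Map a provider `usage` payload onto opencode-style counters."""
    return {
        "input": usage.get("prompt_tokens", 0),
        "output": usage.get("completion_tokens", 0),
        # Tokens served from the provider-side prompt cache.
        "cache_read": usage.get("prompt_cache_hit_tokens", 0),
        # Tokens that missed the cache and were newly written to it.
        "cache_write": usage.get("prompt_cache_miss_tokens", 0),
    }

# Example payload shaped like a DeepSeek chat.completions response:
usage = {
    "prompt_tokens": 304_300,
    "completion_tokens": 1_200,
    "prompt_cache_hit_tokens": 290_000,
    "prompt_cache_miss_tokens": 14_300,
}
print(extract_cache_usage(usage))
```

If the aggregation only reads Anthropic-style field names (e.g. cache_read_input_tokens), these DeepSeek fields would be silently dropped, which would produce exactly the zeros shown below.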
Environment
- OpenCode version: 1.14.24
- Provider/model: OpenAICompatible / new DeepSeek v4 models
- Command used for verification: opencode stats
Current behavior
opencode stats shows high input token usage but zero cache activity:
Sessions: 2
Messages: 19
Days: 1
Total Cost: $1.08
Avg Cost/Day: $1.08
Avg Tokens/Session: 304.3K
Median Tokens/Session: 304.3K
Input: 599.0K
Output: 2.4K
Cache Read: 0
Cache Write: 0
Tool usage from the same report:
read 25 (62.5%)
bash 8 (20.0%)
grep 4 (10.0%)
skill 1 ( 2.5%)
task 1 ( 2.5%)
glob 1 ( 2.5%)
Expected behavior
When DeepSeek supports prompt/context caching and returns cache usage information, OpenCode should report cache usage correctly:
- Cache Read should increase when cached context is reused;
- Cache Write should increase when cacheable context is written;
- token/cost accounting should reflect provider cache usage;
- if caching is unsupported for this provider/model, the stats output should make that explicit instead of silently showing zero cache activity.
Steps to reproduce
- Use OpenCode 1.14.24.
- Configure DeepSeek and select the new DeepSeek model.
- Run one or more sessions with repeated large project context/tool reads.
- Run: opencode stats
- Observe that Input is high, but Cache Read and Cache Write remain 0.
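To separate a provider problem from an OpenCode accounting problem, it may help to query DeepSeek directly and inspect the usage block. This check is my suggestion, not part of the original report; it assumes DEEPSEEK_API_KEY is set and jq is installed.

```shell
# Hypothetical diagnostic: ask DeepSeek directly and print the usage object.
# If prompt_cache_hit_tokens appears here but `opencode stats` still shows
# Cache Read: 0, the bug is in OpenCode's accounting rather than the provider.
curl -s https://api.deepseek.com/chat/completions \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"deepseek-chat","messages":[{"role":"user","content":"hi"}]}' \
  | jq '.usage'
```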
Why this matters
This makes it difficult to verify whether DeepSeek prompt caching is active. It can also make the stats report misleading, because users may see high token/cost usage but no indication that caching is unavailable, disabled, or not being accounted for.
Screenshot
