# Feature Request

## Problem
The `diffs` plugin currently injects its agent guidance via `prependContext` in the `before_prompt_build` hook:

```ts
api.on("before_prompt_build", async () => ({
  prependContext: DIFFS_AGENT_GUIDANCE,
}));
```
`prependContext` prepends the guidance text to each individual user-message turn. Because this text is constant yet appears in the "new" portion of every prompt, it is re-sent and billed as fresh input tokens on every single turn; it cannot be cached at the system-prompt level by providers such as Anthropic.

For a 20-turn session, the `DIFFS_AGENT_GUIDANCE` string (~175 tokens) is charged 20 times as new input, even though its content never changes.
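To make the overhead concrete, here is a back-of-envelope calculation using the figures above (~175 tokens, 20 turns; both are estimates, not measurements):

```typescript
// Approximate cost of re-sending static guidance on every turn.
const guidanceTokens = 175; // estimated size of DIFFS_AGENT_GUIDANCE
const turns = 20;           // example session length

// Via prependContext: billed as fresh input on every turn.
const uncachedTokens = guidanceTokens * turns; // 3500 tokens

// As part of a cacheable system-prompt prefix: billed at the full
// input rate once, then at the provider's cheaper cache-read rate.
const fullRateTokens = guidanceTokens; // 175 tokens
```

The gap grows linearly with session length, so long agent sessions pay the most.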
## Proposed Solution

Add a `prependSystemContext` (or `appendSystemContext`) return key to the `before_prompt_build` hook contract. Content returned via this key would be appended to the agent's system prompt rather than injected into each user turn.

The `diffs` plugin could then use:

```ts
api.on("before_prompt_build", async () => ({
  prependSystemContext: DIFFS_AGENT_GUIDANCE,
}));
```
This would allow the guidance to be cached as part of the system prompt prefix, eliminating the per-turn token overhead.
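A rough sketch of what the extended hook result and its handling might look like; the key names follow the proposal above, and the interface and `applyToSystemPrompt` helper are hypothetical, not the plugin API's actual contract:

```typescript
// Hypothetical shape of the before_prompt_build return value.
interface BeforePromptBuildResult {
  prependContext?: string;       // existing: injected into each user turn
  prependSystemContext?: string; // proposed: appended to the system prompt
}

// Sketch of how a prompt builder could apply the new key, keeping the
// static guidance inside the cacheable system-prompt prefix.
function applyToSystemPrompt(
  basePrompt: string,
  result: BeforePromptBuildResult,
): string {
  return result.prependSystemContext
    ? `${basePrompt}\n\n${result.prependSystemContext}`
    : basePrompt;
}
```

Because the appended text is constant across turns, the whole system prompt remains a stable prefix that providers can cache.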
## Impact
- Reduces input token cost for any session where the diffs plugin is active
- Particularly beneficial for long sessions or agents using expensive models (e.g. `claude-sonnet-4-6`)
- The pattern would also be useful for other plugins that inject static guidance today or in the future
## Alternatives Considered

- Adding an agent-level `systemPromptSuffix` config (checked the schema; not currently supported)
- Manually disabling the plugin injection and copying the guidance into `SOUL.md`/`TOOLS.md`; fragile, and loses the automatic guidance on tool updates
Thanks for the great project!