fix(tools): route web_search requests through HTTP proxy env vars #16897
Open
battman21 wants to merge 3 commits into openclaw:main from
Conversation
Comment on lines +46 to +52
```ts
function resolveSearchDispatcher(): ProxyAgent | undefined {
  const proxyUrl = resolveSearchProxyUrl();
  if (!proxyUrl) {
    return undefined;
  }
  return new ProxyAgent(proxyUrl);
}
```
Contributor
New ProxyAgent allocated on every search request
resolveSearchDispatcher() is called at each of the three fetch sites, and each call creates a new ProxyAgent(proxyUrl) — meaning every search request spins up a fresh connection pool. The Zalo proxy (extensions/zalo/src/proxy.ts) avoids this by caching the agent per URL. Consider caching the agent at module level (lazy singleton) so the same connection pool is reused across requests:
Suggested change
```diff
-function resolveSearchDispatcher(): ProxyAgent | undefined {
-  const proxyUrl = resolveSearchProxyUrl();
-  if (!proxyUrl) {
-    return undefined;
-  }
-  return new ProxyAgent(proxyUrl);
-}
+let _cachedDispatcher: ProxyAgent | undefined;
+let _cachedProxyUrl: string | undefined;
+
+function resolveSearchDispatcher(): ProxyAgent | undefined {
+  const proxyUrl = resolveSearchProxyUrl();
+  if (!proxyUrl) {
+    return undefined;
+  }
+  if (_cachedDispatcher && _cachedProxyUrl === proxyUrl) {
+    return _cachedDispatcher;
+  }
+  _cachedDispatcher = new ProxyAgent(proxyUrl);
+  _cachedProxyUrl = proxyUrl;
+  return _cachedDispatcher;
+}
```
Prompt To Fix With AI
This is a comment left during a code review.
Path: src/agents/tools/web-search.ts
Line: 46:52
Comment:
**New `ProxyAgent` allocated on every search request**
`resolveSearchDispatcher()` is called at each of the three fetch sites, and each call creates a `new ProxyAgent(proxyUrl)` — meaning every search request spins up a fresh connection pool. The Zalo proxy (`extensions/zalo/src/proxy.ts`) avoids this by caching the agent per URL. Consider caching the agent at module level (lazy singleton) so the same connection pool is reused across requests:
```suggestion
let _cachedDispatcher: ProxyAgent | undefined;
let _cachedProxyUrl: string | undefined;

function resolveSearchDispatcher(): ProxyAgent | undefined {
  const proxyUrl = resolveSearchProxyUrl();
  if (!proxyUrl) {
    return undefined;
  }
  if (_cachedDispatcher && _cachedProxyUrl === proxyUrl) {
    return _cachedDispatcher;
  }
  _cachedDispatcher = new ProxyAgent(proxyUrl);
  _cachedProxyUrl = proxyUrl;
  return _cachedDispatcher;
}
```
How can I resolve this? If you propose a fix, please make it concise.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Summary
Fixes #8534
Related: #15869
The `web_search` tool uses bare `fetch()`, which doesn't respect HTTP proxy environment variables (`HTTP_PROXY`, `HTTPS_PROXY`, `http_proxy`, `https_proxy`). This causes "fetch failed" errors in environments behind a proxy (WSL2, corporate networks, China).

Node.js 22's built-in `fetch()` doesn't automatically pick up proxy env vars; it needs an explicit `dispatcher` from undici's `ProxyAgent`.

Fix: When proxy env vars are set, create an undici `ProxyAgent` and pass it as the `dispatcher` to all `fetch()` calls (Brave Search, Perplexity, and Grok/xAI).

Test plan
- `pnpm lint` passes
- `pnpm build` succeeds (type-check + bundle)

🤖 Generated with Claude Code
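The env-var lookup described above can be sketched as follows. This is a minimal illustration only: the function name matches the PR's `resolveSearchProxyUrl`, but the exact precedence order among the four variables is an assumption, not taken from the diff.

```typescript
// Hypothetical sketch: return the first non-empty proxy env var, assuming
// HTTPS_PROXY takes precedence over HTTP_PROXY and uppercase over lowercase
// (the PR's actual ordering may differ).
function resolveSearchProxyUrl(
  env: Record<string, string | undefined> = process.env as Record<string, string | undefined>,
): string | undefined {
  for (const name of ["HTTPS_PROXY", "https_proxy", "HTTP_PROXY", "http_proxy"]) {
    const url = env[name];
    if (url && url.trim() !== "") {
      return url;
    }
  }
  return undefined;
}
```

Returning `undefined` (rather than an empty string) when no variable is set lets callers like `resolveSearchDispatcher()` skip `ProxyAgent` construction entirely.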
Greptile Summary
Adds HTTP proxy support to the `web_search` tool by reading `HTTPS_PROXY`/`HTTP_PROXY` environment variables and passing an undici `ProxyAgent` as `dispatcher` to all three fetch call sites (Brave Search, Perplexity, Grok/xAI). Includes a lazy-cached singleton pattern so the `ProxyAgent` is reused across requests.

- All three search providers (`runPerplexitySearch`, `runGrokSearch`, `runWebSearch`) are updated consistently with the same dispatcher pattern
- The `ProxyAgent` is cached at module level and only recreated if the proxy URL changes, avoiding per-request connection pool allocation
- When no proxy is configured, the `dispatcher` is `undefined` and not spread into the options, leaving behavior unchanged
- Follows the proxy handling already used elsewhere in the repo (`src/telegram/proxy.ts`, `extensions/zalo/src/proxy.ts`)

Confidence Score: 4/5

The `as RequestInit` cast is the standard approach for passing `dispatcher` to Node 22's fetch. No logic or security issues found.

Last reviewed commit: 49b915d
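The "not spread into the options" behavior the summary refers to can be sketched like this. The helper name and the `FetchOptions` shape are illustrative (in the PR the dispatcher would be an undici `ProxyAgent`, not `unknown`):

```typescript
// Sketch of conditionally attaching a dispatcher to fetch options. When no
// proxy is configured, the base options are returned untouched, so fetch()
// keeps its default direct-connection behavior.
type FetchOptions = { headers?: Record<string, string>; dispatcher?: unknown };

function withDispatcher(base: FetchOptions, dispatcher?: unknown): FetchOptions {
  return dispatcher ? { ...base, dispatcher } : base;
}
```

A call site would then do something like `fetch(url, withDispatcher({ headers }, dispatcher) as RequestInit)`, which is where the `as RequestInit` cast mentioned above comes in: `dispatcher` is an undici extension not present in the standard `RequestInit` type.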