
Request to improve the performance of parseHeaders #2491

@ywave620

Description


Bug Description

parseHeaders in utils.js seems too slow.

Reproducible By

const http = require('http');
const { Pool } = require('undici');

const dispatcher = new Pool('http://127.0.0.1:8090' /* a 4-worker nginx */, {
  pipelining: 30,
  connections: 100,
});

const proxy = http.createServer((req, res) => {
  dispatcher.request({ // the constructor of RequestHandler in undici/lib/api/api-request.js
    method: req.method,
    path: req.url,
    headers: req.headers,
    body: req,
  }, (err, { statusCode, headers, body /** a Readable */ }) => {
    if (err) {
      throw err
    }
    res.writeHead(statusCode, headers);
    pipe(body, res);
  });
});

proxy.listen(8080); // arbitrary port for the proxy

function pipe(src, dst) {
  src.on('readable', () => {
    let chunk;
    while (null !== (chunk = src.read(65536))) {
      dst.write(chunk);
    }
  });
  src.on('end', () => {
    dst.end()
  });
}
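For context, the hot path in question takes the raw header name/value Buffers from the parser and lowercases every name. A simplified sketch of that pattern (not undici's exact implementation) shows where the toString() and toLowerCase() costs come from, since both run once per header on every response:

```javascript
// Simplified sketch of the hot path being profiled (not undici's exact
// code): each header name is converted to a string and lowercased,
// allocating on every response.
function parseHeadersNaive (rawHeaders /* Buffer[]: name, value, name, value, ... */) {
  const headers = {}
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const key = rawHeaders[i].toString('latin1').toLowerCase()
    headers[key] = rawHeaders[i + 1].toString('utf8')
  }
  return headers
}
```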

Expected Behavior

Less CPU time spent in parseHeaders

Logs & Screenshots

[screenshot: CPU profile showing time spent in parseHeaders]

Environment

Node 22
Linux

Additional context

I'm opening this issue because I've spotted an optimization for lowercasing headers in Node's built-in HTTP module: https://github.com/nodejs/node/blob/2e458d973638d01fcb6a0d7d611e0120a94f4d35/lib/_http_incoming.js#L279C3-L279C3
Basically, it matches known header fields and uses a cached string when found. Maybe we can adopt the same trick here to eliminate the toString() and toLowerCase() overhead in parseHeaders.
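A minimal sketch of that cached-string trick, adapted to the Buffer-based names undici's parser hands over (the header list and helper name here are illustrative, not undici's actual API): common names, in the casings servers typically send, map directly to an interned lowercase string, so the hot path skips toLowerCase() entirely and only falls back to it for unknown headers.

```javascript
// Hypothetical sketch: precompute a lookup from well-known header
// names (common casings) to their cached lowercase form.
const knownHeaders = new Map()
for (const name of [
  'Content-Type', 'Content-Length', 'Connection', 'Date',
  'Transfer-Encoding', 'Server', 'Cache-Control', 'ETag'
]) {
  const lower = name.toLowerCase()
  knownHeaders.set(name, lower)  // typical wire casing
  knownHeaders.set(lower, lower) // already-lowercase casing
}

function lowerCaseHeaderName (buf) {
  const raw = buf.toString('latin1')
  // Known header: return the cached string without lowercasing.
  // Unknown header: fall back to the slow path.
  return knownHeaders.get(raw) ?? raw.toLowerCase()
}
```

Node's matchKnownFields goes further (it matches on the raw bytes and handles flow-control headers specially), but even a Map-based cache like this avoids the per-header toLowerCase() allocation for the overwhelmingly common names.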

Metadata

Labels: bug (Something isn't working)