Releases: honojs/node-server

v2.0.0

21 Apr 00:26

We're excited to release the second major version of the Hono Node.js adapter 🎉🎉🎉

The Hono Node.js adapter is now up to 2.3x faster

v2 of the Hono Node.js adapter reaches up to 2.3x the throughput of v1. That's the peak number, measured on the body-parsing scenario of bun-http-framework-benchmark; the other scenarios (Ping and Query) get a smaller but real boost too.

Install or upgrade with:

npm i @hono/node-server@latest

v2

The Node.js adapter is going through a major version bump to v2. That said, the public API stays the same; the headline of this release is the large performance improvement described above.

What does the Node.js adapter do?

A quick refresher on what the Node.js adapter actually does: it exists so that Hono applications can run on Node.js. Hono is built on the Web Standards APIs, but you cannot serve those directly from Node.js. The adapter bridges the Web Standards APIs and the Node.js APIs, which is what lets a Hono app (and, more generally, any Web-Standards-style app) run on top of Node.js.

If you write the following code and run node ./index.js, a server starts up on localhost:3000. And it really is plain Node.js underneath.

import { Hono } from 'hono'
import { serve } from '@hono/node-server'

const app = new Hono()
app.get('/', (c) => c.text('Hello World!'))

serve(app)

The early performance story

The very first implementation of the Node.js adapter looked roughly like this in pseudocode:

export const getRequestListener = (fetchCallback: FetchCallback) => {
  return async (incoming: IncomingMessage, outgoing: ServerResponse) => {
    const method = incoming.method || 'GET'
    const url = `http://${incoming.headers.host}${incoming.url}`

    // ... build headerRecord from incoming.headers ...

    const init = {
      method: method,
      headers: headerRecord,
    }

    // fetchCallback is the Hono application's `fetch` method
    const res = await fetchCallback(new Request(url, init))

    // ... build resHeaderRecord from res.headers ...

    const buffer = await res.arrayBuffer()
    outgoing.writeHead(res.status, resHeaderRecord)
    outgoing.end(new Uint8Array(buffer))
  }
}

So the flow was:

  • a request comes in as an IncomingMessage
  • it gets converted into a Request object and handed to the app
  • the Response returned by the app is written back to the outgoing ServerResponse

In diagram form:

IncomingMessage => Request => app => Response => ServerResponse

This is, frankly, inefficient. So whenever Hono went head-to-head with other Node.js frameworks, we kept losing; all we could do was shrug and say "well, it's slow on Node.js."

Introducing LightweightRequest / LightweightResponse

The huge step forward that fixed this was a legendary PR from @usualoma:

#95

It made things up to 2.7x faster.

[Screenshot: benchmark results]

I previously wrote about this in detail in this post:

https://zenn.dev/yusukebe/articles/7ac501716ae1f7?locale=en

In short, the trick is wonderfully simple. It just follows the golden rule of performance tuning: don't do work you don't have to do. Lightweight versions of Request and Response are constructed and used first, and that path is fast. Only when something actually needs the contents of the Request, e.g. when you call req.json(), does a real new Request() get instantiated under the hood and used from then on. The result is fast, and behavior stays correct.
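
This lazy pattern can be sketched in a few lines. Everything below is illustrative (LazyRequest is a hypothetical name, not the adapter's actual class), but it shows the shape of the trick:

```typescript
// Illustrative sketch of lazy Request instantiation: cheap fields are served
// straight from the raw data; a real Request is built only on body access.
class LazyRequest {
  #real: Request | undefined

  constructor(
    private incoming: { method: string; url: string; body: string }
  ) {}

  // Fast path: no Request object is ever allocated for these.
  get method(): string {
    return this.incoming.method
  }

  get url(): string {
    return this.incoming.url
  }

  // Slow path: the first body access materializes a real Request,
  // which is then cached and reused for subsequent calls.
  async json(): Promise<unknown> {
    this.#real ??= new Request(this.incoming.url, {
      method: this.incoming.method,
      body: this.incoming.body,
    })
    return this.#real.json()
  }
}
```

Handlers that never read the body stay entirely on the cheap path; the cost of new Request() is paid only by the requests that actually need it.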

…but body parsing was still slow

"Fast" here was for a very simple "Hello World" benchmark β€” a GET that just returns text.

There are many ways to benchmark, but the one we tend to reach for is this:

https://github.com/SaltyAom/bun-http-framework-benchmark

It tests three scenarios: Ping, Query, and Body. Let's pit Hono against the major Node.js frameworks:

[Screenshot: benchmark results for Hono and other Node.js frameworks]

As you can see, the Body case is very slow. The handler being measured is essentially this:

import { Hono } from 'hono'
import { serve } from '@hono/node-server'

const app = new Hono()

app.post('/json', async (c) => {
  const data = await c.req.json()
  return c.json(data)
})

serve(app)

c.req.json() is the slow part. The reason is well understood: inside the Node.js adapter, when json() is called the LightweightRequest path can't be used, so a real new Request() ends up being constructed.

perf: optimize request body reading

The 2.3x figure above comes from one PR specifically: #301 by @mgcrea.

The PR bundles a few changes, but the key one is "optimize request body reading". Quoting from the PR description:

The fix overrides text(), json(), arrayBuffer(), and blob() on the request prototype to read directly from the Node.js IncomingMessage using event-based I/O.

In other words, in the json() case above, we no longer convert into a Request at all; we read the body straight off the Node.js APIs. A classic fast path. That alone gives a large jump in body-parsing throughput.
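
A minimal sketch of that event-based read, under the assumption that we have the raw IncomingMessage at hand (the real PR wires this into the request prototype; readBody and json here are illustrative helpers, not the adapter's actual functions):

```typescript
import type { IncomingMessage } from 'node:http'

// Collect the raw body straight off the Node.js stream: no Request object
// is ever constructed just to read the body.
const readBody = (incoming: IncomingMessage): Promise<Buffer> =>
  new Promise((resolve, reject) => {
    const chunks: Buffer[] = []
    incoming.on('data', (chunk: Buffer) => chunks.push(chunk))
    incoming.on('end', () => resolve(Buffer.concat(chunks)))
    incoming.on('error', reject)
  })

// json() then becomes, roughly:
const json = async (incoming: IncomingMessage): Promise<unknown> =>
  JSON.parse((await readBody(incoming)).toString('utf8'))
```

Nothing Web-Standards-shaped is allocated on this path; the bytes go straight from the socket into JSON.parse.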

The same PR also includes two other tuning improvements:

  • URL construction fast-path β€” skip building a URL object except in edge cases
  • buildOutgoingHttpHeaders optimization β€” skip the set-cookie header comparison when there are no cookies

v2 ships several other performance PRs as well: newHeadersFromIncoming and signal fast-paths, Response fast-paths and responseViaCache improvements, method-key caching, a regex-based buildUrl rewrite, and more (see the full list below). They all add up, but #301 is by far the largest single contributor, which is why it gets the spotlight here.

v2 performance

Now let's measure the final v2 build.

First, let's compare against the v1 Node.js adapter; dev here is v2. Body improves by 2.3x, and the other scenarios get faster too:

[Screenshot: v1 vs. v2 benchmark results]

Next, the same comparison against other frameworks. With the Body score jumping, Hono passes Koa and Fastify and takes first place:

[Screenshot: framework comparison results]

Caution

Updated: The h3 entry in the earlier framework comparison was an older snapshot. Its successor, srvx, now ships a FastResponse mode, and in srvx's own benchmark (h3js/srvx/test/bench-node), srvx-fast (≈68,560 req/sec) beats hono-fast (≈59,477 req/sec). The methodology differs from the benchmarks above, but to be upfront: with FastResponse enabled, srvx is faster than Hono v2 in that setup.

Breaking changes

There are two breaking changes in v2.

Dropped support for Node.js v18

Node.js v18 reached end-of-life, so v2 requires Node.js v20 or later.

Removed the Vercel adapter

The Vercel adapter (@hono/node-server/vercel) has been removed. It is no longer needed for Vercel's modern runtimes, so the recommendation is to deploy without it.

If you still need the previous behavior, the old adapter was a one-liner on top of getRequestListener and you can write the same thing in your own project:

import type { Hono } from 'hono'
import { getRequestListener } from '@hono/node-server'

export const handle = (app: Hono) => {
  return getRequestListener(app.fetch)
}

Then use it the same way you used handle from @hono/node-server/vercel before.
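
For instance, a minimal API route using that helper might look like this (the file layout and import path are illustrative, not prescribed):

```typescript
// api/index.ts (illustrative path): drop-in replacement for the removed
// @hono/node-server/vercel import, using the handle helper shown above.
import { Hono } from 'hono'
import { handle } from '../src/handle' // hypothetical location of the helper

const app = new Hono()
app.get('/api', (c) => c.text('Hello from Vercel!'))

export default handle(app)
```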

All changes

A full list of what landed in PR #316.

Performance

  • perf: optimize request body reading and URL construction (#301) by @mgcrea
  • perf: optimize buildOutgoingHttpHeaders for the common case (#301) by @mgcrea
  • perf(request): cache method key (#319) by @yusukebe
  • perf(url): mark host with port : as safe host (#320) by @yusukebe
  • perf(request): optimize newHeadersFromIncoming and signal fast-path (#332) by @GavinMeierSonos
  • perf(response, listener): Response fast-paths and responseViaCache improvements (#333) by @GavinMeierSonos
  • perf: replace Uint8Array lookup tables with regex in buildUrl (#345) by @usualoma

Fixes & refactors

  • fix: more strictly determine when new URL() should be used (#310) by @usualoma
  • refactor: improved compatibility with the Web Standard Request object (#311) by @usualoma
  • fix(request): return an error object instead of throwing (#318) by @usualoma
  • refactor: improve handling of null body in response (#341) by @usualoma
  • fix: ensure close handler is attached for Blob/ReadableStream cacheable responses (#342) by @usualoma
  • fix: improve `Response.js...

v2.0.0-rc.2

17 Apr 00:49

Pre-release

v1.19.14

13 Apr 01:20

What's Changed

  • fix: add custom inspect to lightweight Request/Response to prevent TypeError on console.log by @usualoma in #340

Full Changelog: v1.19.13...v1.19.14

v1.19.13

07 Apr 03:56

Security Fix

Fixed an issue in Serve Static Middleware where inconsistent handling of repeated slashes (//) between the router and static file resolution could allow middleware to be bypassed. Users of Serve Static Middleware are encouraged to upgrade to this version.

See GHSA-92pp-h63x-v22m for details.

v1.19.12

30 Mar 08:31

What's Changed

Full Changelog: v1.19.11...v1.19.12

v2.0.0-rc.1

17 Mar 10:48
cfc08b3

Pre-release
chore: ignore claude setting (#314)

v1.19.11

05 Mar 02:24

What's Changed

  • fix: do not overwrite Content-Length in the fast path pattern if Content-Length already exists. by @usualoma in #309

Full Changelog: v1.19.10...v1.19.11

v1.19.10

03 Mar 10:35

Security Fix

Fixed an authorization bypass in Serve Static Middleware caused by inconsistent URL decoding (%2F handling) between the router and static file resolution. Users of Serve Static Middleware are encouraged to upgrade to this version.

See GHSA-wc8c-qw6v-h7f6 for details.

v1.19.9

13 Jan 21:45

What's Changed

  • fix(globals): Stop overwriting global.fetch by @usualoma in #295

Full Changelog: v1.19.8...v1.19.9

v1.19.8

09 Jan 10:09

What's Changed

  • docs: add guide for listening to UNIX domain socket by @TransparentLC in #292
  • fix(serve-static): Use Readable.toWeb in serveStatic by @otya128 in #293

Full Changelog: v1.19.7...v1.19.8