
feat: export Headers class object to globalThis#667

Merged
saghul merged 4 commits into saghul:master from ErosZy:polyfill-headers-export
Oct 21, 2024

Conversation

@ErosZy (Contributor) commented Oct 21, 2024

I wrote a simple implementation to test the performance of the Hono framework's hello-world example and found that the Headers class object is missing, so I exported it to globalThis. The benchmark code is:

// server.js
// you should `npm i hono --save`
// and bundle this script with esbuild
import getopts from "tjs:getopts";
import ffi from "tjs:ffi";
import { Hono } from "hono";

let l = null;

const app = new Hono();

app.get("/", (c) => c.text("Hello Hono on LLRT!"));

async function handleConnection(conn) {
  const buf = new Uint8Array(1024);
  const nread = await conn.read(buf);
  // decode only the bytes actually read
  const requestString = ffi.bufferToString(buf.subarray(0, nread));
  const requestLines = requestString.split("\r\n");
  const requestLine = requestLines[0].split(" ");
  const headers = {};

  for (let i = 1; i < requestLines.length; i++) {
    const line = requestLines[i];
    if (line) {
      const [key, ...valueParts] = line.split(": ");
      headers[key] = valueParts.join(": ");
    }
  }

  const method = requestLine[0];
  const path = requestLine[1];
  const protocol = requestLine[2];
  const url = `http://localhost:${l.localAddress.port}${path}`;
  const request = new Request(url, {
    method,
    path,
    protocol,
    headers,
  });

  const response = await app.fetch(request);
  const body = await response.text();

  let responseHeaders = "";

  // Headers is directly iterable; Object.entries() would return an
  // empty array here because header entries are not own properties
  for (const [key, value] of response.headers) {
    responseHeaders += `${key}: ${value}\r\n`;
  }

  const httpResponse = `HTTP/1.1 ${response.status} ${response.statusText}\r\n${responseHeaders}\r\n${body}`;

  conn.setNoDelay(true);
  await conn.write(new TextEncoder().encode(httpResponse));
  conn.close();
}

const options = getopts(tjs.args.slice(2), {
  alias: {
    listen: "l",
    port: "p",
  },
  default: {
    listen: "127.0.0.1",
    port: 1234,
  },
});

l = await tjs.listen("tcp", options.listen, options.port);

console.log(`Listening on ${l.localAddress.ip}:${l.localAddress.port}`);

for await (let conn of l) {
  handleConnection(conn);
  conn = undefined;
}

You should see the following output, and you can then benchmark with wrk:

> ./build/tjs run ./dist/bundle.js
Listening on 127.0.0.1:1234
> wrk -c 10 -d 5s -t 1 http://localhost:1234
Running 5s test @ http://localhost:1234
  1 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.98ms    1.18ms  18.48ms   87.13%
    Req/Sec     2.43k   353.78     2.70k    84.00%
  12079 requests in 5.00s, 424.65KB read
  Socket errors: connect 0, read 12079, write 0, timeout 0
Requests/sec:   2413.41
Transfer/sec:     84.85KB

@saghul (Owner) left a comment:

Good catch!

@ErosZy force-pushed the polyfill-headers-export branch from bd56cab to 147e96c on October 21, 2024 at 15:50
@ErosZy (Contributor, Author) commented Oct 21, 2024

@saghul Could you please re-run the CI? Most of the failures appear to be caused by network errors; I didn't encounter any related problems when I tried it locally.

@saghul saghul merged commit 9c65cd3 into saghul:master Oct 21, 2024
