perf(http2): avoid response header reserialization #5085
Open
trivikr wants to merge 1 commit into nodejs:main from
Conversation
Codecov Report
❌ Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #5085 +/- ##
=======================================
Coverage 93.13% 93.14%
=======================================
Files 110 110
Lines 36104 36111 +7
=======================================
+ Hits 33624 33634 +10
+ Misses 2480 2477 -3

☔ View full report in Codecov by Sentry.
metcoder95 approved these changes — Apr 24, 2026
Assisted-by: openai:gpt-5.4
Signed-off-by: Kamat, Trivikram <16024985+trivikr@users.noreply.github.com>

Force-pushed from 516ad84 to 1272384
trivikr (Member, Author)
I ran the existing benchmarks.

main:

$ benchmarks> npm run bench:h2
...
[bench:run:h2] 17429794.56
[bench:run:h2] 14705542.4
[bench:run:h2] 10973610.666666666
[bench:run:h2] 8226152.96
[bench:run:h2] 7671800
[bench:run:h2] ┌─────────┬─────────────────────┬─────────┬────────────────────┬────────────┬─────────────────────────┬─────────────────────────┐
[bench:run:h2] │ (index) │ Tests │ Samples │ Result │ Tolerance │ Difference with Slowest │ Difference with slowest │
[bench:run:h2] ├─────────┼─────────────────────┼─────────┼────────────────────┼────────────┼─────────────────────────┼─────────────────────────┤
[bench:run:h2] │ 0 │ 'undici - dispatch' │ 0 │ 'Errored' │ 'N/A' │ 'N/A' │ │
[bench:run:h2] │ 1 │ 'native - http2' │ 50 │ '5737.30 req/sec' │ '± 2.93 %' │ │ '-' │
[bench:run:h2] │ 2 │ 'undici - fetch' │ 20 │ '6800.16 req/sec' │ '± 2.96 %' │ │ '+ 18.53 %' │
[bench:run:h2] │ 3 │ 'undici - pipeline' │ 30 │ '9112.77 req/sec' │ '± 2.75 %' │ │ '+ 58.83 %' │
[bench:run:h2] │ 4 │ 'undici - request' │ 25 │ '12156.35 req/sec' │ '± 2.76 %' │ │ '+ 111.88 %' │
[bench:run:h2] │ 5 │ 'undici - stream' │ 20 │ '13034.75 req/sec' │ '± 2.87 %' │ │ '+ 127.19 %' │
[bench:run:h2] └─────────┴─────────────────────┴─────────┴────────────────────┴────────────┴─────────────────────────┴─────────────────────────┘

This branch:

$ benchmarks> npm run bench:h2
...
[bench:run:h2] 16233540.654545454
[bench:run:h2] 14560871.68
[bench:run:h2] 11115562.971428571
[bench:run:h2] 8099668.8
[bench:run:h2] 7945346.4
[bench:run:h2] ┌─────────┬─────────────────────┬─────────┬────────────────────┬────────────┬─────────────────────────┬─────────────────────────┐
[bench:run:h2] │ (index) │ Tests │ Samples │ Result │ Tolerance │ Difference with Slowest │ Difference with slowest │
[bench:run:h2] ├─────────┼─────────────────────┼─────────┼────────────────────┼────────────┼─────────────────────────┼─────────────────────────┤
[bench:run:h2] │ 0 │ 'undici - dispatch' │ 0 │ 'Errored' │ 'N/A' │ 'N/A' │ │
[bench:run:h2] │ 1 │ 'native - http2' │ 55 │ '6160.09 req/sec' │ '± 2.97 %' │ │ '-' │
[bench:run:h2] │ 2 │ 'undici - fetch' │ 25 │ '6867.72 req/sec' │ '± 2.72 %' │ │ '+ 11.49 %' │
[bench:run:h2] │ 3 │ 'undici - pipeline' │ 35 │ '8996.40 req/sec' │ '± 2.65 %' │ │ '+ 46.04 %' │
[bench:run:h2] │ 4 │ 'undici - request' │ 20 │ '12346.18 req/sec' │ '± 2.89 %' │ │ '+ 100.42 %' │
[bench:run:h2] │ 5 │ 'undici - stream' │ 20 │ '12585.98 req/sec' │ '± 2.63 %' │ │ '+ 104.32 %' │
[bench:run:h2] └─────────┴─────────────────────┴─────────┴────────────────────┴────────────┴─────────────────────────┴─────────────────────────┘
trivikr (Member, Author)
I tested with the following temporary harness, which shows a 5-7% improvement (80 rounds x 200 parallel requests):

'use strict'
const http2 = require('node:http2')
const { once } = require('node:events')
const { performance } = require('node:perf_hooks')
const { H2CClient } = require('..')
const runs = parseInt(process.env.RUNS, 10) || 5
const rounds = parseInt(process.env.ROUNDS, 10) || 80
const parallel = parseInt(process.env.PARALLEL, 10) || 200
const warmupRounds = parseInt(process.env.WARMUP_ROUNDS, 10) || 5
const body = Buffer.from('ok')
function formatNumber (n) {
return n.toLocaleString('en-US', { maximumFractionDigits: 2 })
}
function makeServer () {
const server = http2.createServer({
settings: {
maxConcurrentStreams: parallel
}
})
server.on('stream', (stream) => {
stream.respond({
':status': 200,
'content-type': 'text/plain',
'content-length': body.length
})
stream.end(body)
})
return server
}
function makeRoundRequests (client) {
const requests = new Array(parallel)
for (let i = 0; i < parallel; ++i) {
requests[i] = client.request({
path: '/',
method: 'GET'
}).then(({ body }) => body.dump())
}
return Promise.all(requests)
}
async function timeRound (client) {
const start = performance.now()
await makeRoundRequests(client)
return performance.now() - start
}
async function closeClient (client) {
await new Promise((resolve, reject) => {
client.close((err) => {
if (err) {
reject(err)
return
}
resolve()
})
})
}
async function main () {
const server = makeServer()
server.listen(0, '127.0.0.1')
await once(server, 'listening')
const origin = `http://127.0.0.1:${server.address().port}`
const clients = new Array(runs)
for (let i = 0; i < runs; ++i) {
clients[i] = new H2CClient(origin, {
maxConcurrentStreams: parallel,
pipelining: parallel
})
}
try {
for (let i = 0; i < runs; ++i) {
for (let j = 0; j < warmupRounds; ++j) {
await makeRoundRequests(clients[i])
}
}
const results = Array.from({ length: runs }, () => ({
elapsed: 0,
requests: 0
}))
for (let round = 0; round < rounds; ++round) {
for (let run = 0; run < runs; ++run) {
const elapsed = await timeRound(clients[run])
results[run].elapsed += elapsed
results[run].requests += parallel
}
}
const rows = results.map((result, i) => {
const reqSec = result.requests / (result.elapsed / 1000)
return {
Run: i + 1,
Rounds: rounds,
Requests: result.requests,
'Elapsed (ms)': formatNumber(result.elapsed),
'Req/sec': formatNumber(reqSec)
}
})
const avgReqSec = results.reduce((total, result) => {
return total + result.requests / (result.elapsed / 1000)
}, 0) / results.length
console.log(`${runs} interleaved runs, ${rounds} rounds x ${parallel} parallel .request() calls`)
console.table(rows)
console.log(`Average of ${runs}: ${formatNumber(avgReqSec)} req/sec`)
} finally {
await Promise.all(clients.map(closeClient))
await new Promise((resolve) => server.close(resolve))
}
}
main().catch((err) => {
console.error(err)
process.exitCode = 1
})

main:

$ benchmarks> node benchmarks/h2-request-harness.js
5 interleaved runs, 80 rounds x 200 parallel .request() calls
┌─────────┬─────┬────────┬──────────┬──────────────┬─────────────┐
│ (index) │ Run │ Rounds │ Requests │ Elapsed (ms) │ Req/sec │
├─────────┼─────┼────────┼──────────┼──────────────┼─────────────┤
│ 0 │ 1 │ 80 │ 16000 │ '369.18' │ '43,339.47' │
│ 1 │ 2 │ 80 │ 16000 │ '366.99' │ '43,598.06' │
│ 2 │ 3 │ 80 │ 16000 │ '364.91' │ '43,846.41' │
│ 3 │ 4 │ 80 │ 16000 │ '362.38' │ '44,152.61' │
│ 4 │ 5 │ 80 │ 16000 │ '373.23' │ '42,868.86' │
└─────────┴─────┴────────┴──────────┴──────────────┴─────────────┘
Average of 5: 43,561.08 req/sec

PR branch:

$ benchmarks> node benchmarks/h2-request-harness.js
5 interleaved runs, 80 rounds x 200 parallel .request() calls
┌─────────┬─────┬────────┬──────────┬──────────────┬─────────────┐
│ (index) │ Run │ Rounds │ Requests │ Elapsed (ms) │ Req/sec │
├─────────┼─────┼────────┼──────────┼──────────────┼─────────────┤
│ 0 │ 1 │ 80 │ 16000 │ '346.15' │ '46,222.63' │
│ 1 │ 2 │ 80 │ 16000 │ '350.39' │ '45,662.89' │
│ 2 │ 3 │ 80 │ 16000 │ '345.96' │ '46,248.54' │
│ 3 │ 4 │ 80 │ 16000 │ '331.03' │ '48,334.55' │
│ 4 │ 5 │ 80 │ 16000 │ '336.09' │ '47,605.81' │
└─────────┴─────┴────────┴──────────┴──────────────┴─────────────┘
Average of 5: 46,814.88 req/sec
metcoder95 approved these changes — Apr 26, 2026
This relates to...
N/A
Rationale
HTTP/2 responses currently pass header objects through parts of the dispatcher stack, while the responseHeaders: 'raw' path still expects name/value arrays. That mismatch causes unnecessary header reserialization and makes the raw-header path inconsistent across H1 and H2 callers.

Changes

- parseRawHeaders() to accept nullish values and plain header objects, and normalize them into the existing flat raw-header array format.
- … responseHeaders: 'raw' is requested.
- DispatchController.rawHeaders / rawTrailers can be IncomingHttpHeaders.

Features
N/A
Bug Fixes
Fixed responseHeaders: 'raw' for HTTP/2 responses so callers receive normalized raw headers without relying on H1-style buffer pairs.

Breaking Changes and Deprecations
N/A
Status
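For context, the object-to-flat-array normalization described under Changes can be sketched roughly as below. This is a simplified illustration under assumed semantics, not undici's actual parseRawHeaders() implementation; normalizeRawHeaders is a hypothetical name used only for this sketch.

```javascript
'use strict'

// Hypothetical sketch: accept a nullish value, an H1-style flat
// raw-header array, or a plain H2 header object, and return a flat
// [name, value, name, value, ...] array. Not undici's actual
// parseRawHeaders() implementation.
function normalizeRawHeaders (headers) {
  if (headers == null) return []
  if (Array.isArray(headers)) return headers // already flat raw headers
  const raw = []
  for (const [name, value] of Object.entries(headers)) {
    if (Array.isArray(value)) {
      // Multi-value headers (e.g. set-cookie) expand into repeated pairs.
      for (const v of value) raw.push(name, String(v))
    } else {
      raw.push(name, String(value))
    }
  }
  return raw
}

console.log(normalizeRawHeaders({ ':status': 200, 'content-type': 'text/plain' }))
// [ ':status', '200', 'content-type', 'text/plain' ]
```

Normalizing once at the H2 boundary like this, instead of converting per consumer, is the kind of repeated reserialization the PR title refers to avoiding.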