Description
Runtime
node
Runtime version
20.19.1
Module version
21.1.0
Last module version without issue
No response
Used with
No response
Any other relevant information
No response
What are you trying to achieve or the steps to reproduce?
In our application we sometimes require the upload of report data that exceeds the default 1MB limit Hapi sets for payload size.
While we could simply raise this limit, it exists for sensible reasons, and we only want to raise it when absolutely required.
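For reference, raising the limit per route would look roughly like the sketch below, using hapi's documented `route.options.payload.maxBytes` option (the path, handler, and 5MB figure are illustrative, not our actual config):

```javascript
// Hypothetical report route with a raised payload limit. hapi's default
// for payload.maxBytes is 1048576 bytes (1MB).
const reportRoute = {
  method: 'POST',
  path: '/api/reports', // hypothetical endpoint
  options: {
    payload: {
      maxBytes: 5 * 1024 * 1024, // 5MB, up from the 1MB default
    },
  },
  handler: (request, h) => h.response({ ok: true }),
};
```

This is exactly the blanket workaround we want to avoid, since it lifts the limit even for uncompressed uploads.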
The reports we need to upload respond well to compression: using the CompressionStream API in the browser to gzip them turns a 1.6MB payload into ~0.5MB.
When we send this compressed body to our API, we still trigger Hapi's default 1MB limit. We hit the limit regardless of whether we read the compressed body in full on the client beforehand (to determine its length and let the browser set a Content-Length header), or use duplex: half and stream the compression output directly (so that no Content-Length header is present at all).
This seems like a calculation mistake on Hapi's part.
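The buffered variant of our client-side flow can be sketched as follows (the endpoint path and function names are hypothetical; CompressionStream, Blob, Response, and fetch are standard web globals, also available in Node 18+):

```javascript
// Gzip a report with the CompressionStream API and drain the result
// into a single buffer, so the body has a known length up front.
async function compressReport(reportJson) {
  const gzipped = new Blob([reportJson])
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  // Response.arrayBuffer() collects the whole compressed stream.
  return new Response(gzipped).arrayBuffer();
}

// Upload the compressed body. With an ArrayBuffer body, the browser
// sets Content-Length itself from the buffer's byte length.
async function uploadReport(reportJson) {
  const body = await compressReport(reportJson);
  return fetch('/api/reports', { // hypothetical endpoint
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip', // body bytes are gzip-compressed
    },
    body,
  });
}
```

The streaming variant instead passes the compression stream directly as the fetch body with duplex: 'half', in which case no Content-Length header is sent at all; both variants trigger the 413.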
What was the result you got?
A 413 error response:

```json
{
  "statusCode": 413,
  "error": "Request Entity Too Large",
  "message": "Payload content length greater than maximum allowed: 1048576"
}
```
What result did you expect?
Success; the (compressed) Content-Length is only 492836 bytes.