Supports gzip compression in the browser environment #4162
Conversation
Codecov Report
@@ Coverage Diff @@
## main #4162 +/- ##
==========================================
- Coverage 92.29% 92.28% -0.02%
==========================================
Files 329 328 -1
Lines 9370 9381 +11
Branches 1991 1995 +4
==========================================
+ Hits 8648 8657 +9
- Misses 722 724 +2
@@ -61,14 +61,16 @@
     "access": "public"
   },
   "dependencies": {
-    "@opentelemetry/core": "1.17.0"
+    "@opentelemetry/core": "1.17.0",
+    "pako": "^2.1.0"
Is pako only needed for the web, or also for Node.js?
As far as I know, Node.js has gzip support built in.
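To illustrate the reviewer's point: on Node.js the built-in `zlib` module already provides gzip, so no extra dependency is needed there. A minimal sketch (the payload shape is just an illustrative placeholder):

```typescript
// Node.js ships gzip in the built-in zlib module; pako is only needed
// where zlib is unavailable (i.e. browsers).
import { gzipSync, gunzipSync } from "node:zlib";

// Hypothetical OTLP-like payload, purely for illustration.
const payload = Buffer.from(JSON.stringify({ resourceSpans: [] }));

const compressed = gzipSync(payload);
const roundTripped = gunzipSync(compressed);

console.log(roundTripped.equals(payload)); // true
```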
For the web there's also the Compression Streams API, which could avoid the extra bundle size of pako (if browser support matches expectations).
}).forEach(([k, v]) => {
  xhr.setRequestHeader(k, v);
});

xhr.send(body);
Architectural design opinion: I don't think loading up code with a bunch of options is good, especially if you want to do this for web use.
- For browsers, if-s don't get shaken out at bundle time (static analysis doesn't work that deep; it's mainly if (false)-s done by search-and-replace plugins). So if someone decides not to use a feature, its code is still included in the bundle as dead weight.
- It ties the exporter down to a specific implementation, and if there's anything JS loves, it's a billion ways to do the same thing (e.g. pako / deflate / new hotness lib of the month / built-in Node.js gzip / Compression Streams API / maybe someone wants brotli at some point, which makes the first point doubly relevant).

I think it would be better to provide hooks configuration (analogous to middleware app.use on the server side) for exporters, where callback functions could manipulate the request based on the user's preferences/needs. That is much more easily tree-shaken (only the functions actually used get bundled), e.g.:
import pako from 'pako';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

// This is actually where using the fetch API internally would allow for a more
// JS-standards-based API (Request object).
// Also, all of this is totally inspired by the ky library; gotta throw some
// respect to well-designed libs.
function compressUsingPako(req: Request) {
  return new Request(req, {
    body: pako.gzip(req.body),
    headers: {
      'Content-Encoding': 'gzip'
    }
  });
}

const exporter = new OTLPTraceExporter({
  url: 'https://somewhere',
  hooks: [
    compressUsingPako,
    // Hooks could find many other use cases around changes to the request
    signRequest,
    attachBatchId,
  ]
});
One important consideration, IMO, is also the CPU overhead the compression step adds during export. Have you looked at this, and would you be able to share some numbers?
This PR is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.
Which problem is this PR solving?
Supports gzip compression in the browser environment
Short description of the changes
Implemented support for gzip compression in the browser environment for otlp-exporter-base (previously only supported in the Node.js environment), using pako for the gzip compression capability.
Using gzip compression can greatly reduce transmission and bandwidth costs. Combined with the pb (protobuf) protocol, the improvement is even greater. Testing showed a 70% to 80% reduction in payload size compared to the uncompressed baseline (based on sample data; the actual compression ratio varies with the characteristics of the data).
(Screenshots comparing payload sizes: none / pb / pb+gzip.)
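The bandwidth saving can be reproduced on any repetitive telemetry-like payload. A hedged sketch (assumptions: Node's built-in `zlib` is used here purely to demonstrate the ratio, since pako produces equivalent gzip output in the browser; the span shape and the exact ratio are illustrative, not the PR's measurements):

```typescript
import { gzipSync } from "node:zlib";

// Telemetry payloads repeat keys and values heavily, so gzip does well on them.
// This span shape is a made-up example, not the actual OTLP schema.
const spans = Array.from({ length: 200 }, (_, i) => ({
  name: "HTTP GET",
  traceId: "0af7651916cd43dd8448eb211c80319c",
  attributes: { "http.method": "GET", "http.status_code": 200, index: i },
}));

const raw = Buffer.from(JSON.stringify(spans));
const gzipped = gzipSync(raw);

// The printed ratio depends entirely on the payload; expect large savings
// on repetitive data like the above.
console.log(
  `raw=${raw.length}B gzip=${gzipped.length}B ` +
  `saved=${(100 * (1 - gzipped.length / raw.length)).toFixed(0)}%`
);
```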
Type of change
How Has This Been Tested?
Checklist: