
Allow BatchSpanProcessor to send early when a full batch is ready #4164

Conversation

RafalSumislawski

Which problem is this PR solving?

The BatchSpanProcessor waits scheduledDelayMillis (5000 ms by default) since the arrival of the first span, or since the last export, before exporting. Each export sends at most one batch of maxExportBatchSize spans (512 by default), so throughput is effectively capped at maxExportBatchSize / scheduledDelayMillis ≈ 102 spans per second with the defaults. If spans are produced faster than that, the surplus builds up in the queue until it overflows and spans start being dropped, which is observable through logs like: Dropped 2576 spans because maxQueueSize reached
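
For reference, these are the configuration knobs involved, shown with their documented defaults (the ConsoleSpanExporter is just a stand-in; any exporter behaves the same way):

  import { BatchSpanProcessor, ConsoleSpanExporter } from '@opentelemetry/sdk-trace-base';

  const processor = new BatchSpanProcessor(new ConsoleSpanExporter(), {
    maxQueueSize: 2048,         // spans beyond this are dropped
    maxExportBatchSize: 512,    // at most this many spans per export call
    scheduledDelayMillis: 5000, // one export per interval, regardless of queue depth
  });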

Short description of the changes

For comparison, Java's BatchSpanProcessor also uses a configured delay and batch size, but it sends a batch as soon as a sufficient number of spans is enqueued (https://github.com/open-telemetry/opentelemetry-java/blob/main/sdk/trace/src/main/java/io/opentelemetry/sdk/trace/export/BatchSpanProcessor.java#L244). The delay therefore doesn't limit the maximum throughput.

This PR implements similar logic in JS's BatchSpanProcessor.
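
A minimal sketch of the idea, assuming internals loosely modeled on BatchSpanProcessorBase (the names _finishedSpans, _flushOneBatch, and _maybeStartTimer are illustrative stand-ins, not the PR's actual diff):

  import { ReadableSpan } from '@opentelemetry/sdk-trace-base';

  // Illustrative sketch only; assumed member names, not the actual change set.
  class EarlyFlushBatchProcessor {
    private _finishedSpans: ReadableSpan[] = [];
    constructor(
      private _maxQueueSize: number,
      private _maxExportBatchSize: number,
      private _flushOneBatch: () => Promise<void>,
      private _maybeStartTimer: () => void,
    ) {}

    onEnd(span: ReadableSpan): void {
      if (this._finishedSpans.length >= this._maxQueueSize) {
        return; // queue full: the span is dropped, as before
      }
      this._finishedSpans.push(span);
      if (this._finishedSpans.length >= this._maxExportBatchSize) {
        // A full batch is ready: export now instead of waiting out
        // the remainder of scheduledDelayMillis.
        void this._flushOneBatch().then(() => this._maybeStartTimer());
      } else {
        this._maybeStartTimer(); // otherwise arm the usual delay timer
      }
    }
  }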

Type of change

  • Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

  • I've adapted the relevant unit tests to the new behaviour
  • I've run a Node.js lambda that produces ~10k spans in ~20s: spans were dropped before the change, and none were dropped after it (see the repro sketch below)

Checklist:

  • Followed the style guidelines of this project
  • Unit tests have been added
  • Documentation has been updated

RafalSumislawski requested a review from a team on September 27, 2023 11:06
linux-foundation-easycla bot commented Sep 27, 2023

CLA Signed

The committers listed above are authorized under a signed CLA.

  • ✅ login: RafalSumislawski / name: Rafał Sumisławski (6cbbce3)

Flarna (Member) commented Sep 27, 2023

There seems to be a related issue (#3094) and some similar PRs: #3458 and #3828.

fyi @seemk @dyladan

codecov bot commented Sep 27, 2023

Codecov Report

Merging #4164 (13c66b8) into main (2499708) will increase coverage by 1.73%.
The diff coverage is 93.54%.

❗ Current head 13c66b8 differs from pull request most recent head 6cbbce3. Consider uploading reports for the commit 6cbbce3 to get more accurate results

@@            Coverage Diff             @@
##             main    #4164      +/-   ##
==========================================
+ Coverage   90.52%   92.25%   +1.73%     
==========================================
  Files         159      329     +170     
  Lines        3757     9388    +5631     
  Branches      835     1997    +1162     
==========================================
+ Hits         3401     8661    +5260     
- Misses        356      727     +371     
Files                                                  Coverage Δ
...dk-trace-base/src/export/BatchSpanProcessorBase.ts  92.36% <93.54%> (-0.56%) ⬇️

... and 176 files with indirect coverage changes

github-actions bot commented Nov 27, 2023

This PR is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.

github-actions bot added the stale label on Nov 27, 2023
github-actions bot commented Dec 11, 2023

This PR was closed because it has been stale for 14 days with no activity.

github-actions bot closed this on Dec 11, 2023