
Add loss generating token counts #1610

Merged: 12 commits into mosaicml:main on Oct 27, 2024
Conversation

@dakinggg (Collaborator) commented Oct 22, 2024:

Takes advantage of the new functionality in Composer to weight microbatches by loss-generating tokens rather than just total tokens. See the Composer PR (mosaicml/composer#3677) for more details and manual testing.

Note: this needs a Composer release and version bump (CI won't pass until that happens).
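For context, a minimal sketch (my own illustration, not the code added in this PR) of how loss-generating tokens differ from total tokens in a collated finetuning batch, assuming the usual Hugging Face convention that ignored label positions are set to -100 and that attention_mask marks real (non-pad) tokens:

```python
import torch

# Hypothetical helper for illustration only; not the function this PR adds.
def count_batch_tokens(batch: dict[str, torch.Tensor]) -> dict[str, int]:
    # Total tokens: everything the model attends to, prompts included.
    total = int(batch['attention_mask'].sum().item())
    # Loss-generating tokens: label positions not masked out with -100
    # (prompt and padding positions are typically masked and contribute
    # nothing to the loss).
    loss_generating = int((batch['labels'] != -100).sum().item())
    return {'total': total, 'loss_generating': loss_generating}
```

Weighting microbatches by the loss-generating count rather than the total count matters most for finetuning data, where prompt and padding positions can make up the bulk of a batch.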

@dakinggg marked this pull request as ready for review October 24, 2024 06:29
@dakinggg requested a review from a team as a code owner October 24, 2024 06:29
@mvpatel2000 (Collaborator) left a comment:


will you hold the PR here until release?

@dakinggg (Collaborator, Author):

@mvpatel2000 yeah, CI won't pass until release

@dakinggg changed the title from "Loss gen tokens" to "Add loss generating token counts" on Oct 24, 2024
Review thread on llmfoundry/data/utils.py (outdated, resolved)
@dakinggg merged commit 874c30a into mosaicml:main Oct 27, 2024
9 checks passed
```diff
@@ -1310,9 +1321,11 @@ def build_from_hf(
         raise NotImplementedError()

     batch_collated = dl.dataloader.collate_fn(batch_tokenized) # type: ignore
-    actual_token_count = dl.get_num_tokens_in_batch(batch_collated)
+    actual_total_token_count = dl.get_num_tokens_in_batch(batch_collated, token_type='total')
```


i might be missing something, but how can we pass in token_type here when it's not in the function definition here? https://github.com/mosaicml/llm-foundry/pull/1610/files#diff-9568d89aed75ca69416abe2a592c6bb9732129049a62c34e4e9263c18495a236R99

@dakinggg (Collaborator, Author):

The function being called here is actually defined on the DataSpec class in Composer (https://github.com/mosaicml/composer/blob/28756dd52e96371689b764cb72c336406460ad35/composer/core/data_spec.py#L301). The DataSpec takes in a function from the user and uses it.

@dakinggg (Collaborator, Author):

Part of the reason for doing it this way was to maintain backwards compatibility with any existing user-defined get_num_tokens_in_batch functions out there.
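Putting those two comments together, here is a simplified stand-in (not Composer's actual DataSpec code; see the linked data_spec.py for the real implementation) for how the wrapping and the backwards-compatible fallback might look:

```python
from typing import Any, Callable

# Simplified illustration only; names and details differ from Composer's DataSpec.
class DataSpecSketch:

    def __init__(self, dataloader: Any, get_num_tokens_in_batch: Callable[[Any], Any]):
        self.dataloader = dataloader
        # Store the user-supplied counting function (e.g. the one llm-foundry builds).
        self._user_get_num_tokens_in_batch = get_num_tokens_in_batch

    def get_num_tokens_in_batch(self, batch: Any, token_type: str = 'total') -> int:
        counts = self._user_get_num_tokens_in_batch(batch)
        if isinstance(counts, dict):
            # Newer user functions can return several counts at once,
            # e.g. {'total': ..., 'loss_generating': ...}.
            return counts[token_type]
        # Older user functions that return a single int keep working unchanged.
        return counts
```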

```python
        torch.sum(batch['labels'] != CROSS_ENTROPY_IGNORE_INDEX).item(),
    )

    # Subtract one for each example in the batch that starts with a non -100,
    # since that first label position never generates loss once labels are
    # shifted by one inside the model.
```
A contributor commented:

@dakinggg I don't think this subtraction is necessary. Instead you can just do this:

```python
loss_generating_tokens = int(
    torch.sum(batch['labels'][..., 1:] != CROSS_ENTROPY_IGNORE_INDEX).item(),
)
```

*I just came across this PR while looking into how Mosaic's libraries handle the gradient accumulation bug recently discussed on x.com.

@dakinggg (Collaborator, Author):

ah yeah, that should work too :)
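For what it's worth, a quick toy check (mine, not from the PR) that the two counting approaches agree, assuming CROSS_ENTROPY_IGNORE_INDEX is -100:

```python
import torch

CROSS_ENTROPY_IGNORE_INDEX = -100

labels = torch.tensor([
    [-100, -100, 5, 6, 7],   # prompt positions masked out
    [1, 2, 3, -100, -100],   # padding positions masked out
])

# Count all non-ignored labels, then subtract one for each example whose
# first label is not -100 (that position never generates loss once labels
# are shifted by one inside the model).
total = int(torch.sum(labels != CROSS_ENTROPY_IGNORE_INDEX).item())
starts_unmasked = int(torch.sum(labels[:, 0] != CROSS_ENTROPY_IGNORE_INDEX).item())
count_with_subtraction = total - starts_unmasked

# Suggested simplification: drop the first label position up front.
count_with_slice = int(torch.sum(labels[..., 1:] != CROSS_ENTROPY_IGNORE_INDEX).item())

assert count_with_subtraction == count_with_slice == 5
```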
