Fine-tuning time consumption #184

Open
czczccz opened this issue Nov 6, 2024 · 1 comment
czczccz commented Nov 6, 2024

Hi! I found that the model infers extremely fast but takes a long time to fine-tune. Training on a few hundred samples can take about an hour. Is this normal?

@wgifford (Collaborator) commented

Can you provide more information about the dataset you are training on, the hardware, and the general trainer configuration? If the dataset is multivariate with many channels, training will take longer. For example:

See the fine-tuning in this example: https://github.com/ibm-granite-community/granite-timeseries-cookbook/blob/main/recipes/Time_Series/Few-shot_Finetuning_and_Evaluation.ipynb

For that notebook, with batch size = 64, it takes about 26 s per epoch on my MacBook Pro. For reference, this dataset has 12 input channels and 931 records.
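
A minimal sketch of how one might time a fine-tuning epoch, assuming a Hugging Face-style `Trainer` setup like the one used in the cookbook notebook. The `model` and `train_dataset` names are placeholders for whatever your data preparation produces; only the batch size and single-epoch timing mirror the numbers discussed above.

```python
# Timing sketch (assumptions: model and train_dataset are already built,
# e.g. by the data preparation steps in the fine-tuning notebook).
import time
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="./finetune_out",
    per_device_train_batch_size=64,   # batch size used in the example above
    num_train_epochs=1,               # time a single epoch first
    logging_strategy="epoch",
    report_to="none",                 # disable external loggers for a clean timing run
)

trainer = Trainer(
    model=model,                      # placeholder: the pretrained time-series model
    args=training_args,
    train_dataset=train_dataset,      # placeholder: e.g. ~931 records, 12 channels
)

start = time.time()
trainer.train()
print(f"Seconds per epoch: {time.time() - start:.1f}")
```

If one epoch is far slower than expected, the usual suspects are a very small batch size, a large number of input channels, or running on CPU when a GPU/MPS device is available.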
