A comparison with the performance of gradient-based methods (LoRA, p-tuning, prompt tuning, etc.) on the same datasets and models commonly used in the literature (e.g., t5-xxl) would be very useful. For instance, you could compare against the results reported here: https://aclanthology.org/2021.emnlp-main.243.pdf .
It is not clear from your manuscript whether the proposed approach remains competitive with (very) large models (larger than roberta-large), where gradient-based methods are well known to perform very well.
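For context on why these baselines matter at scale, a rough (illustrative, assumed shapes) back-of-the-envelope comparison of trainable parameters for a LoRA adapter versus full fine-tuning of a single weight matrix:

```python
# Illustrative arithmetic only: trainable parameters for one d x d weight
# matrix under full fine-tuning vs. a LoRA adapter of rank r.
# The hidden size and rank below are assumptions, not values from the paper.
d_model = 1024   # hidden size (e.g., roberta-large)
rank = 8         # a typical LoRA rank

full_ft = d_model * d_model        # every entry of W is trainable
lora = 2 * d_model * rank          # A (d x r) plus B (r x d)

print(full_ft)            # 1048576
print(lora)               # 16384
print(lora / full_ft)     # 0.015625 -> ~1.6% of the full parameter count
```

This kind of parameter-count gap is one reason gradient-based PEFT methods are the standard baselines on very large models, which is why the comparison above would be informative.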
Thank you, and congratulations on the very interesting method!
pretidav changed the title from "Comparison with gradient-based methods" to "Comparison with gradient-based methods on large models" on Mar 3, 2023