Recommend two papers #7

Open
realwangjiahao opened this issue Jul 25, 2024 · 0 comments
Thank you for summarizing these articles; it has been very helpful for my research work.
I recently read two papers on using large language models for time series classification. They inspired me a great deal, so I would like to recommend them here.

The first is "InstructTime: Advancing Time Series Classification with Multimodal Language Modeling". It proposes a new method, InstructTime, which uses GPT-2 to encode time series and achieves better classification results. The paper is at https://arxiv.org/abs/2403.12371, and the code is at https://github.com/Mingyue-Cheng/InstructTime.

The second is "CrossTimeNet: Learning Transferable Time Series Classifier with Cross-Domain Pre-training from Language Model". It uses BERT and GPT-2 to encode time series and applies self-supervised pre-training to improve classification performance. The paper is at https://arxiv.org/pdf/2403.12372, and the code is at https://github.com/Mingyue-Cheng/CrossTimeNet.
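For intuition, both papers first turn a continuous time series into discrete tokens so that a pretrained language model (GPT-2 or BERT) can consume it. Below is a minimal sketch of that discretization idea using fixed uniform binning; note this is my own simplified stand-in, since the papers themselves learn the tokenizer rather than using fixed bins:

```python
import numpy as np

def tokenize_series(series, num_bins=256):
    """Quantize a 1-D time series into discrete token ids in [0, num_bins - 1].

    A simplified illustration of the tokenization step that lets a
    language model treat a time series like a sentence of tokens.
    The bin edges here are fixed and uniform; the papers use learned
    discretization instead.
    """
    series = np.asarray(series, dtype=float)
    # Standardize so the fixed bin edges cover typical value ranges.
    series = (series - series.mean()) / (series.std() + 1e-8)
    # 255 interior edges over [-4, 4] yield bin indices 0..255.
    edges = np.linspace(-4.0, 4.0, num_bins - 1)
    return np.digitize(series, edges)

tokens = tokenize_series(np.sin(np.linspace(0.0, 6.28, 100)))
```

The resulting integer sequence can then be fed to the language model's embedding layer in place of word ids.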

These two papers fit under the "PFMs for Time Series" section.

Could you please add the two papers to your summary list? Thanks a lot.
