[time series] Add PatchTST #25927

Merged · 201 commits · Nov 13, 2023
c7d3fc3
Initial commit of PatchTST model classes
Aug 16, 2023
97628ba
Add PatchTSTForPretraining
psinthong Aug 16, 2023
10b7517
update to include classification
wgifford Aug 21, 2023
1935eef
clean up auto files
wgifford Aug 21, 2023
c6195cb
Add PatchTSTForPrediction
psinthong Aug 22, 2023
2d4b02c
Fix relative import
psinthong Aug 22, 2023
ee8c872
Replace original PatchTSTEncoder with ChannelAttentionPatchTSTEncoder
Aug 22, 2023
c657ae8
temporary adding absolute path + add PatchTSTForForecasting class
namctin Aug 25, 2023
fa72e8a
Update base PatchTSTModel + Unittest
Aug 25, 2023
8b1310e
Update ForecastHead to use the config class
Aug 25, 2023
7c09b86
edit cv_random_masking, add mask to model output
namctin Aug 26, 2023
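The masking commits above (`cv_random_masking`, returning the mask with the model output, and restricting the pretraining loss to masked positions) can be sketched as follows. This is an illustrative NumPy version under assumed shapes, not the library's implementation, and the helper name is hypothetical:

```python
import numpy as np

def random_patch_masking(patches, mask_ratio, rng):
    """Illustrative random patch masking (not the library code).

    patches: (num_channels, num_patches, patch_length) array.
    Zeroes out a random mask_ratio fraction of patches per channel
    and returns the boolean mask (True = masked) so the pretraining
    loss can be computed on masked positions only.
    """
    num_channels, num_patches, _ = patches.shape
    num_masked = int(num_patches * mask_ratio)
    masked = patches.copy()
    mask = np.zeros((num_channels, num_patches), dtype=bool)
    for channel in range(num_channels):
        idx = rng.choice(num_patches, size=num_masked, replace=False)
        masked[channel, idx] = 0.0  # masked patches are zeroed out
        mask[channel, idx] = True
    return masked, mask

rng = np.random.default_rng(0)
patches = rng.normal(size=(3, 10, 16))
masked, mask = random_patch_masking(patches, mask_ratio=0.4, rng=rng)
# masked loss: reconstruction error over masked patches only
masked_loss = ((masked - patches) ** 2)[mask].mean()
```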
617db9a
Update configuration_patchtst.py
namctin Aug 26, 2023
484dc00
add masked_loss to the pretraining
namctin Aug 26, 2023
b1ef4af
add PatchEmbeddings
namctin Aug 27, 2023
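The patch-embedding step these commits introduce (later split into a dedicated patchify class) rests on slicing each channel into fixed-length, possibly overlapping patches. A minimal sketch, assuming a univariate series and the `patch_length`/`patch_stride` naming the later commits settle on:

```python
import numpy as np

def patchify(sequence, patch_length, patch_stride):
    """Hypothetical helper: slice a 1-D series into patches.

    For a series of length L, the number of patches is
    (L - patch_length) // patch_stride + 1.
    """
    num_patches = (len(sequence) - patch_length) // patch_stride + 1
    return np.stack(
        [sequence[i * patch_stride : i * patch_stride + patch_length]
         for i in range(num_patches)]
    )

series = np.arange(10.0)
patches = patchify(series, patch_length=4, patch_stride=2)
```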
bc22a87
Update configuration_patchtst.py
namctin Aug 27, 2023
9799a5b
edit loss which considers mask in the pretraining
namctin Aug 28, 2023
78f3173
remove patch_last option
namctin Aug 28, 2023
30819f6
Add commits from internal repo
Aug 28, 2023
2060fb0
Update ForecastHead
psinthong Aug 28, 2023
271b19b
Add model weight initialization + unittest
psinthong Aug 28, 2023
9325a6a
Update PatchTST unittest to use local import
psinthong Aug 29, 2023
0c5deb4
PatchTST integration tests for pretraining and prediction
diepi Aug 29, 2023
00c2af6
Added PatchTSTForRegression + update unittest to include label genera…
psinthong Aug 30, 2023
5802f07
Revert unrelated model test file
psinthong Aug 30, 2023
3a66438
Combine similar output classes
psinthong Aug 30, 2023
00ddf8d
update PredictionHead
namctin Aug 30, 2023
78c26f2
Update configuration_patchtst.py
namctin Aug 30, 2023
5f7c1a0
Add Revin
psinthong Aug 30, 2023
ac8882e
small edit to PatchTSTModelOutputWithNoAttention
namctin Aug 30, 2023
2457e58
Update modeling_patchtst.py
namctin Aug 31, 2023
f1658b2
Updating integration test for forecasting
diepi Aug 31, 2023
43707d7
Fix unittest after class structure changed
psinthong Sep 1, 2023
a69cb59
docstring updates
wgifford Sep 1, 2023
2be37c5
change input_size to num_input_channels
namctin Sep 1, 2023
22adead
more formatting
wgifford Sep 1, 2023
76adaae
Remove some unused params
namctin Sep 2, 2023
1b078c7
Add a comment for pretrained models
psinthong Sep 3, 2023
f718c04
add channel_attention option
namctin Sep 3, 2023
3c09a33
Update PatchTST models to use HF's MultiHeadAttention module
psinthong Sep 5, 2023
3bada03
Update paper + github urls
psinthong Sep 7, 2023
5506592
Fix hidden_state return value
psinthong Sep 7, 2023
bd2a1c5
Update integration test to use PatchTSTForForecasting
psinthong Sep 11, 2023
7641615
Adding dataclass decorator for model output classes
diepi Sep 11, 2023
a14053f
Run fixup script
psinthong Sep 12, 2023
2b704b4
Rename model repos for integration test
psinthong Sep 13, 2023
d46e0c8
edit argument explanation
namctin Sep 13, 2023
5c240dd
change individual option to shared_projection
namctin Sep 13, 2023
b8a186d
Merge branch 'add-patchtst' of https://github.com/namctin/transformer…
namctin Sep 13, 2023
2916ec0
style
kashif Sep 14, 2023
208b83c
Rename integration test + import cleanup
psinthong Sep 14, 2023
ba72907
Fix output_hidden_states return value
psinthong Sep 14, 2023
eb96b02
removed unused mode
kashif Sep 15, 2023
474e981
added std, mean and nops scaler
kashif Sep 19, 2023
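The std, mean, and nops (no-op) scalers added here normalize each input channel over the time dimension before patching. A rough NumPy sketch under assumed `(batch, sequence_length, num_channels)` shapes; function names are illustrative, not the library's:

```python
import numpy as np

def std_scale(values, eps=1e-5):
    # standardize each channel over the time dimension
    loc = values.mean(axis=1, keepdims=True)
    scale = values.std(axis=1, keepdims=True) + eps
    return (values - loc) / scale, loc, scale

def mean_scale(values, eps=1e-10):
    # divide by the mean absolute value; no location shift
    scale = np.abs(values).mean(axis=1, keepdims=True) + eps
    return values / scale, np.zeros_like(scale), scale

batch = np.random.default_rng(1).normal(loc=5.0, size=(2, 32, 3))
scaled, loc, scale = std_scale(batch)
```

Returning `loc` and `scale` lets the model map predictions back to the original scale of each series.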
46e89d6
add initial distributional loss for prediction
kashif Sep 20, 2023
48b8621
fix typo in docs
kashif Sep 25, 2023
e5cf09d
add generate function
kashif Sep 25, 2023
5a7fb30
formatting
kashif Sep 25, 2023
18a43f5
add num_parallel_samples
kashif Sep 25, 2023
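The `generate` function and `num_parallel_samples` commits combine as sample-based prediction from a distribution head. A sketch assuming a Gaussian head for illustration (the actual head is configurable, e.g. Student-T elsewhere in the time-series models):

```python
import numpy as np

def generate(loc, scale, num_parallel_samples, rng):
    """Draw prediction samples from an assumed Gaussian head.

    loc, scale: (batch, prediction_length, num_channels) parameters
    produced by the model. Returns an array of shape
    (batch, num_parallel_samples, prediction_length, num_channels).
    """
    samples = [rng.normal(loc, scale) for _ in range(num_parallel_samples)]
    return np.stack(samples, axis=1)

rng = np.random.default_rng(0)
loc = np.zeros((2, 24, 3))   # (batch, prediction_length, num_channels)
scale = np.ones_like(loc)
samples = generate(loc, scale, num_parallel_samples=100, rng=rng)
```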
b54047b
Fix a typo
namctin Sep 28, 2023
406cf00
copy weighted_average function, edit PredictionHead
namctin Oct 2, 2023
e89477c
edit PredictionHead
namctin Oct 2, 2023
a0815ee
add distribution head to forecasting
namctin Oct 2, 2023
d727ef7
formatting
kashif Oct 4, 2023
de5a55b
Merge branch 'main' into add-patchtst
kashif Oct 4, 2023
e391bd3
Add generate function for forecasting
namctin Oct 4, 2023
a50f6c2
Add generate function to prediction task
namctin Oct 5, 2023
8daf165
formatting
kashif Oct 5, 2023
e2f8fd8
use argsort
kashif Oct 5, 2023
1c8ec9d
add past_observed_mask ordering
kashif Oct 5, 2023
7ebaa61
fix arguments
kashif Oct 5, 2023
f3dca25
docs
kashif Oct 5, 2023
5349bf4
add back test_model_outputs_equivalence test
kashif Oct 5, 2023
eb7f547
formatting
kashif Oct 5, 2023
a1cf42c
cleanup
kashif Oct 5, 2023
6392f99
formatting
kashif Oct 5, 2023
8a91544
use ACT2CLS
kashif Oct 5, 2023
dfbea05
formatting
kashif Oct 5, 2023
1a0c55e
fix add_start_docstrings decorator
kashif Oct 5, 2023
0a4e58b
add distribution head and generate function to regression task
namctin Oct 5, 2023
72a6e1e
add distribution head and generate function to regression task
namctin Oct 5, 2023
9908c6a
fix typos
kashif Oct 5, 2023
91a4c46
add forecast_masking
kashif Oct 6, 2023
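Forecast masking, in contrast to the earlier random masking, always hides the trailing patches of the window so that pretraining mirrors the downstream forecasting task. An illustrative sketch, not the library's implementation:

```python
import numpy as np

def forecast_masking(patches, num_forecast_patches):
    """Hide the last num_forecast_patches patches of every channel."""
    masked = patches.copy()
    mask = np.zeros(patches.shape[:2], dtype=bool)  # (channels, patches)
    masked[:, -num_forecast_patches:] = 0.0
    mask[:, -num_forecast_patches:] = True
    return masked, mask

patches = np.random.default_rng(0).normal(size=(3, 10, 16))
masked, mask = forecast_masking(patches, num_forecast_patches=2)
```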
17c60a7
fixed tests
kashif Oct 6, 2023
0eabe39
Merge branch 'main' into add-patchtst
kashif Oct 6, 2023
a61ac77
use set_seed
kashif Oct 6, 2023
de7fb9e
fix doc test
kashif Oct 6, 2023
0fd0ce7
formatting
kashif Oct 6, 2023
cb52b6f
Update docs/source/en/model_doc/patchtst.md
kashif Oct 7, 2023
3daec96
better var names
kashif Oct 8, 2023
c82022d
rename PatchTSTTranspose
kashif Oct 9, 2023
687e3c8
fix argument names and docs string
kashif Oct 9, 2023
5469748
remove compute_num_patches and unused class
kashif Oct 9, 2023
d5c8359
remove assert
kashif Oct 9, 2023
a25d433
renamed to PatchTSTMasking
kashif Oct 9, 2023
db96ed8
use num_labels for classification
kashif Oct 9, 2023
b6d3b4e
use num_labels
kashif Oct 9, 2023
ca648ef
use default num_labels from super class
kashif Oct 9, 2023
e56de11
move model_type after docstring
kashif Oct 9, 2023
fcfa103
renamed PatchTSTForMaskPretraining
kashif Oct 9, 2023
cd0133f
bs -> batch_size
kashif Oct 9, 2023
bc2bf31
more review fixes
kashif Oct 9, 2023
b8a8231
use hidden_state
kashif Oct 9, 2023
8c3ab7f
rename encoder layer and block class
kashif Oct 9, 2023
2553965
remove commented seed_number
namctin Oct 9, 2023
85538b1
edit docstring
namctin Oct 9, 2023
c36370d
Add docstring
namctin Oct 9, 2023
fe3f4d4
formatting
kashif Oct 10, 2023
11feb7c
use past_observed_mask
kashif Oct 10, 2023
3af8567
doc suggestion
kashif Oct 10, 2023
ed5b26b
make fix-copies
kashif Oct 10, 2023
ddffd71
use Args:
kashif Oct 10, 2023
ccdd013
add docstring
namctin Oct 10, 2023
c993a50
add docstring
namctin Oct 11, 2023
ddc7521
change some variable names and add PatchTST before some class names
namctin Oct 11, 2023
0d7d92d
formatting
kashif Oct 11, 2023
2381994
fix argument types
kashif Oct 11, 2023
e79f0fd
fix tests
kashif Oct 11, 2023
b61bec0
change x variable to patch_input
namctin Oct 11, 2023
e908862
format
namctin Oct 11, 2023
25e669b
formatting
kashif Oct 12, 2023
2ef1564
Merge branch 'main' into add-patchtst
kashif Oct 18, 2023
b9c01ff
fix-copies
kashif Oct 18, 2023
9955e4f
Update tests/models/patchtst/test_modeling_patchtst.py
kashif Oct 19, 2023
5dbe619
move loss to forward
kashif Oct 19, 2023
099b76c
Update src/transformers/models/patchtst/modeling_patchtst.py
kashif Oct 19, 2023
b9c935f
Update src/transformers/models/patchtst/modeling_patchtst.py
kashif Oct 19, 2023
b8d59f8
Update src/transformers/models/patchtst/modeling_patchtst.py
kashif Oct 19, 2023
b7c04c7
Update src/transformers/models/patchtst/modeling_patchtst.py
kashif Oct 19, 2023
c920eee
Update src/transformers/models/patchtst/modeling_patchtst.py
kashif Oct 19, 2023
6642ab9
formatting
kashif Oct 19, 2023
7869767
fix a bug when pre_norm is set to True
namctin Oct 19, 2023
2fb7417
output_hidden_states is set to False as default
namctin Oct 19, 2023
9168ca2
set pre_norm=True as default
namctin Oct 19, 2023
7829a57
format docstring
namctin Oct 20, 2023
5cc98cb
format
kashif Oct 20, 2023
3a09a1e
output_hidden_states is None by default
kashif Oct 20, 2023
1777fc3
add missing docs
kashif Oct 20, 2023
21803e0
better var names
kashif Oct 20, 2023
87068ca
docstring: remove default to False in output_hidden_states
namctin Oct 20, 2023
4c8c7d0
change labels name to target_values in regression task
namctin Oct 23, 2023
dccbc31
format
kashif Oct 23, 2023
3d12866
fix tests
kashif Oct 23, 2023
c489972
change to forecast_mask_ratios and random_mask_ratio
namctin Oct 23, 2023
6f0170d
Merge branch 'add-patchtst' of https://github.com/namctin/transformer…
namctin Oct 23, 2023
6318cd3
change mask names
namctin Oct 23, 2023
6734b65
change future_values to target_values param in the prediction class
namctin Oct 23, 2023
8a7f2a0
remove nn.Sequential and make PatchTSTBatchNorm class
namctin Oct 23, 2023
f23ff20
black
kashif Oct 24, 2023
d6eebdb
fix argument name for prediction
kashif Oct 24, 2023
61b9da5
add output_attentions option
namctin Oct 24, 2023
2b84706
add output_attentions to PatchTSTEncoder
namctin Oct 25, 2023
0be6440
formatting
kashif Oct 25, 2023
8972a92
Add attention output option to all classes
namctin Oct 25, 2023
a2ff8ef
Remove PatchTSTEncoderBlock
namctin Oct 25, 2023
d11ea0e
create PatchTSTEmbedding class
namctin Oct 25, 2023
93b88cf
use config in PatchTSTPatchify
namctin Oct 26, 2023
8175505
Use config in PatchTSTMasking class
namctin Oct 26, 2023
63684aa
add channel_attn_weights
namctin Oct 26, 2023
e8faa8b
Add PatchTSTScaler class
namctin Oct 26, 2023
6389fbf
add output_attentions arg to test function
namctin Oct 26, 2023
dd5e25d
format
kashif Oct 26, 2023
705beb2
Merge branch 'huggingface:main' into add-patchtst
namctin Oct 26, 2023
b07c55f
Update doc with image patchtst.md
kashif Oct 27, 2023
395b044
Merge branch 'main' into add-patchtst
kashif Oct 30, 2023
b63cb28
Merge branch 'main' into add-patchtst
kashif Nov 6, 2023
546f3e2
fix-copies
kashif Nov 6, 2023
e7c687e
rename Forecast <-> Prediction
kashif Nov 6, 2023
609a9d3
change name of a few parameters to match with PatchTSMixer.
namctin Nov 6, 2023
7f05610
Remove *ForForecasting class to match with other time series models.
namctin Nov 6, 2023
9807142
make style
kashif Nov 6, 2023
3b8a306
Remove PatchTSTForForecasting in the test
namctin Nov 6, 2023
d438e29
Merge branch 'add-patchtst' of https://github.com/namctin/transformer…
namctin Nov 6, 2023
ff45a20
remove PatchTSTForForecastingOutput class
namctin Nov 6, 2023
abc64c0
change test_forecast_head to test_prediction_head
namctin Nov 6, 2023
6b3fb30
style
kashif Nov 7, 2023
69897e3
fix docs
kashif Nov 7, 2023
42ec43b
fix tests
kashif Nov 7, 2023
f451c05
change num_labels to num_targets
namctin Nov 7, 2023
131454d
Remove PatchTSTTranspose
namctin Nov 8, 2023
58dd1ec
remove arguments in PatchTSTMeanScaler
namctin Nov 8, 2023
9a69973
remove arguments in PatchTSTStdScaler
namctin Nov 8, 2023
7ab8d59
add config as an argument to all the scaler classes
namctin Nov 9, 2023
4caa376
reformat
namctin Nov 9, 2023
03d27f7
Add norm_eps for batchnorm and layernorm
namctin Nov 9, 2023
03e3220
reformat.
namctin Nov 9, 2023
a8dc48a
reformat
namctin Nov 9, 2023
d002bac
edit docstring
namctin Nov 9, 2023
97d75e6
update docstring
namctin Nov 9, 2023
49232db
change variable name pooling to pooling_type
namctin Nov 9, 2023
3684320
fix output_hidden_states as tuple
namctin Nov 9, 2023
b818036
fix bug when calling PatchTSTBatchNorm
namctin Nov 10, 2023
fb5f490
change stride to patch_stride
namctin Nov 10, 2023
f45baef
create PatchTSTPositionalEncoding class and restructure the PatchTSTE…
namctin Nov 10, 2023
32f11dc
formatting
kashif Nov 10, 2023
dcfd201
initialize scalers with configs
kashif Nov 10, 2023
ede1060
Merge branch 'main' into add-patchtst
kashif Nov 10, 2023
5ed7a9f
edit output_hidden_states
namctin Nov 13, 2023
5f695df
Merge branch 'add-patchtst' of https://github.com/namctin/transformer…
namctin Nov 13, 2023
01294fd
style
kashif Nov 13, 2023
9bf4074
fix forecast_mask_patches doc string
kashif Nov 13, 2023
621341d
Merge branch 'main' into add-patchtst
kashif Nov 13, 2023
1 change: 1 addition & 0 deletions README.md
@@ -435,6 +435,7 @@ Current number of checkpoints: ![](https://img.shields.io/endpoint?url=https://h
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (from Meta AI) released with the paper [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) by Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al.
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (from Google AI) released with the paper [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby.
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (from Google AI) released with the paper [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) by Matthias Minderer, Alexey Gritsenko, Neil Houlsby.
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (from IBM) released with the paper [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/abs/2211.14730) by Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam.
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (from Google) released with the paper [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu.
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (from Google) released with the paper [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347) by Jason Phang, Yao Zhao, and Peter J. Liu.
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (from Deepmind) released with the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira.
1 change: 1 addition & 0 deletions README_es.md
@@ -410,6 +410,7 @@ Número actual de puntos de control: ![](https://img.shields.io/endpoint?url=htt
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (from Meta AI) released with the paper [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) by Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al.
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (from Google AI) released with the paper [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby.
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (from Google AI) released with the paper [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) by Matthias Minderer, Alexey Gritsenko, Neil Houlsby.
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (from IBM) released with the paper [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf) by Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam.
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (from Google) released with the paper [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu.
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (from Google) released with the paper [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347) by Jason Phang, Yao Zhao, and Peter J. Liu.
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (from Deepmind) released with the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira.
1 change: 1 addition & 0 deletions README_hd.md
@@ -382,6 +382,7 @@ conda install -c huggingface transformers
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (from Meta AI) released with the paper [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) by Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al.
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (Google AI से) साथ में कागज [विज़न ट्रांसफॉर्मर्स के साथ सिंपल ओपन-वोकैबुलरी ऑब्जेक्ट डिटेक्शन](https:/ /arxiv.org/abs/2205.06230) मैथियास मिंडरर, एलेक्सी ग्रिट्सेंको, ऑस्टिन स्टोन, मैक्सिम न्यूमैन, डिर्क वीसेनबोर्न, एलेक्सी डोसोवित्स्की, अरविंद महेंद्रन, अनुराग अर्नब, मुस्तफा देहघानी, ज़ुओरन शेन, जिओ वांग, ज़ियाओहुआ झाई, थॉमस किफ़, और नील हॉल्सबी द्वारा पोस्ट किया गया।
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (Google AI से) Matthias Minderer, Alexey Gritsenko, Neil Houlsby. द्वाराअनुसंधान पत्र [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) के साथ जारी किया गया
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (IBM से) Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam. द्वाराअनुसंधान पत्र [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf) के साथ जारी किया गया
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (from Google) released with the paper [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu.
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (Google की ओर से) साथ में दिया गया पेपर [लंबे इनपुट सारांश के लिए ट्रांसफ़ॉर्मरों को बेहतर तरीके से एक्सटेंड करना](https://arxiv .org/abs/2208.04347) जेसन फांग, याओ झाओ, पीटर जे लियू द्वारा।
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (दीपमाइंड से) साथ में पेपर [पर्सीवर आईओ: संरचित इनपुट और आउटपुट के लिए एक सामान्य वास्तुकला] (https://arxiv.org/abs/2107.14795) एंड्रयू जेगल, सेबेस्टियन बोरग्यूड, जीन-बैप्टिस्ट अलायराक, कार्ल डोर्श, कैटलिन इओनेस्कु, डेविड द्वारा डिंग, स्कंद कोप्पुला, डैनियल ज़ोरान, एंड्रयू ब्रॉक, इवान शेलहैमर, ओलिवियर हेनाफ, मैथ्यू एम। बोट्विनिक, एंड्रयू ज़िसरमैन, ओरिओल विनियल्स, जोआओ कैरेरा द्वारा पोस्ट किया गया।
1 change: 1 addition & 0 deletions README_ja.md
@@ -444,6 +444,7 @@ Flax、PyTorch、TensorFlowをcondaでインストールする方法は、それ
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (Meta AI から) Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al から公開された研究論文: [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068)
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (Google AI から) Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby から公開された研究論文: [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230)
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (Google AI から) Matthias Minderer, Alexey Gritsenko, Neil Houlsby. から公開された研究論文 [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683)
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (IBM から) Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam. から公開された研究論文 [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf)
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (Google から) Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu から公開された研究論文: [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777)
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (Google から) Jason Phang, Yao Zhao, and Peter J. Liu から公開された研究論文: [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347)
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (Deepmind から) Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira から公開された研究論文: [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795)
1 change: 1 addition & 0 deletions README_ko.md
@@ -359,6 +359,7 @@ Flax, PyTorch, TensorFlow 설치 페이지에서 이들을 conda로 설치하는
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (Meta AI 에서) Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al 의 [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) 논문과 함께 발표했습니다.
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (Google AI 에서) Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby 의 [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) 논문과 함께 발표했습니다.
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (Google AI 에서 제공)은 Matthias Minderer, Alexey Gritsenko, Neil Houlsby.의 [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683)논문과 함께 발표했습니다.
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (IBM 에서 제공)은 Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam.의 [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf)논문과 함께 발표했습니다.
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (Google 에서) Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu 의 [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) 논문과 함께 발표했습니다.
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (Google 에서) Jason Phang, Yao Zhao, Peter J. Liu 의 [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347) 논문과 함께 발표했습니다.
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (Deepmind 에서) Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira 의 [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) 논문과 함께 발표했습니다.
1 change: 1 addition & 0 deletions README_zh-hans.md
@@ -383,6 +383,7 @@ conda install -c huggingface transformers
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (来自 Meta AI) 伴随论文 [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) 由 Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al 发布。
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (来自 Google AI) 伴随论文 [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) 由 Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby 发布。
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (来自 Google AI) 伴随论文 [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) 由 Matthias Minderer, Alexey Gritsenko, Neil Houlsby 发布。
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (来自 IBM) 伴随论文 [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf) 由 Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam 发布。
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (来自 Google) 伴随论文 [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) 由 Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu 发布。
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (来自 Google) 伴随论文 [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347) 由 Jason Phang, Yao Zhao, Peter J. Liu 发布。
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (来自 Deepmind) 伴随论文 [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) 由 Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira 发布。
1 change: 1 addition & 0 deletions README_zh-hant.md
@@ -395,6 +395,7 @@ conda install -c huggingface transformers
1. **[OPT](https://huggingface.co/docs/transformers/master/model_doc/opt)** (from Meta AI) released with the paper [OPT: Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) by Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen et al.
1. **[OWL-ViT](https://huggingface.co/docs/transformers/model_doc/owlvit)** (from Google AI) released with the paper [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby.
1. **[OWLv2](https://huggingface.co/docs/transformers/main/model_doc/owlv2)** (from Google AI) released with the paper [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) by Matthias Minderer, Alexey Gritsenko, Neil Houlsby.
1. **[PatchTST](https://huggingface.co/docs/transformers/main/model_doc/patchtst)** (from IBM) released with the paper [A Time Series is Worth 64 Words: Long-term Forecasting with Transformers](https://arxiv.org/pdf/2211.14730.pdf) by Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam.
1. **[Pegasus](https://huggingface.co/docs/transformers/model_doc/pegasus)** (from Google) released with the paper [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/abs/1912.08777) by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu.
1. **[PEGASUS-X](https://huggingface.co/docs/transformers/model_doc/pegasus_x)** (from Google) released with the paper [Investigating Efficiently Extending Transformers for Long Input Summarization](https://arxiv.org/abs/2208.04347) by Jason Phang, Yao Zhao, Peter J. Liu.
1. **[Perceiver IO](https://huggingface.co/docs/transformers/model_doc/perceiver)** (from Deepmind) released with the paper [Perceiver IO: A General Architecture for Structured Inputs & Outputs](https://arxiv.org/abs/2107.14795) by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira.
2 changes: 2 additions & 0 deletions docs/source/en/_toctree.yml
@@ -739,6 +739,8 @@
title: Autoformer
- local: model_doc/informer
title: Informer
- local: model_doc/patchtst
title: PatchTST
- local: model_doc/time_series_transformer
title: Time Series Transformer
title: Time series models
1 change: 1 addition & 0 deletions docs/source/en/index.md
@@ -210,6 +210,7 @@ Flax), PyTorch, and/or TensorFlow.
| [OPT](model_doc/opt) | ✅ | ✅ | ✅ |
| [OWL-ViT](model_doc/owlvit) | ✅ | ❌ | ❌ |
| [OWLv2](model_doc/owlv2) | ✅ | ❌ | ❌ |
| [PatchTST](model_doc/patchtst) | ✅ | ❌ | ❌ |
| [Pegasus](model_doc/pegasus) | ✅ | ✅ | ✅ |
| [PEGASUS-X](model_doc/pegasus_x) | ✅ | ❌ | ❌ |
| [Perceiver](model_doc/perceiver) | ✅ | ❌ | ❌ |