diff --git a/README.md b/README.md
index cff1adbc..06b3d32d 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@ The current Python versions supported are 3.9, 3.10, 3.11.
 ## Initial Setup
 First clone the repository:
 ```bash
-git clone git@github.com:IBM/tsfm.git
-cd tsfm
+git clone git@github.com:ibm-granite/granite-tsfm.git
+cd granite-tsfm
 ```
 
@@ -24,9 +24,9 @@ pip install ".[notebooks]"
 ```
 
 ### Notebooks links
-- Getting started with `PatchTSMixer` [[Try it out]](https://github.com/IBM/tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_getting_started.ipynb)
-- Transfer learning with `PatchTSMixer` [[Try it out]](https://github.com/IBM/tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_transfer.ipynb)
-- Transfer learning with `PatchTST` [[Try it out]](https://github.com/IBM/tsfm/blob/main/notebooks/hfdemo/patch_tst_transfer.ipynb)
+- Getting started with `PatchTSMixer` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_getting_started.ipynb)
+- Transfer learning with `PatchTSMixer` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_transfer.ipynb)
+- Transfer learning with `PatchTST` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tst_transfer.ipynb)
 - Getting started with `TinyTimeMixer (TTM)` [Try it out](notebooks/hfdemo/ttm_getting_started.ipynb)
 
 ## 📗 Google Colab
@@ -40,14 +40,12 @@ The demo presented at NeurIPS 2023 is available in `tsfmhfdemos`. This demo requ
 pip install ".[demos]"
 ```
 
-
 ## Issues
-If you encounter an issue with this project, you are welcome to submit a [bug report](https://github.com/IBM/TSFM/issues).
+If you encounter an issue with this project, you are welcome to submit a [bug report](https://github.com/ibm-granite/granite-tsfm/issues).
 Before opening a new issue, please search for similar issues. It's possible that someone has already reported it.
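With the hunks above applied, the README's setup flow would look roughly like the sketch below. The clone URL and the `pip install` extras come from the diff itself; the `cd granite-tsfm` step is an assumption that the clone directory follows the new repository name (requires network access and SSH keys configured for GitHub):

```bash
# Clone the renamed repository; the default clone directory is granite-tsfm,
# not tsfm as in the old instructions (assumption based on the new repo name)
git clone git@github.com:ibm-granite/granite-tsfm.git
cd granite-tsfm

# Install with the notebook extras to run the linked demo notebooks
pip install ".[notebooks]"
```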
# Notice
This repository is intended to make it easier to use and demonstrate the IBM Research TSFM components that have been made available in the [Hugging Face transformers library](https://huggingface.co/docs/transformers/main/en/index). As we continue to develop these capabilities, we will update the code here.