Commit: Merge branch 'main' into ttm_cleanup

wgifford committed Aug 6, 2024
2 parents c71957f + df23b5b commit 846ba3f
Showing 19 changed files with 3,878 additions and 8 deletions.
42 changes: 42 additions & 0 deletions .github/workflows/inference-services-test.yml
@@ -0,0 +1,42 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: Tsfminference Service Tests

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11"]

    steps:
    - uses: actions/checkout@v3
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v3
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install deps into a virtual environment
      run: |
        python -m venv .venv
        source .venv/bin/activate
        python -m pip install --upgrade pip
        pip install poetry
        cd services/inference
        poetry install -n --with dev
    - name: Test local server tsfminference service with pytest
      run: |
        source .venv/bin/activate
        cd services/inference
        make start_service_local
        sleep 20
        pytest tests
        make stop_service_local || true
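
The same sequence can be reproduced outside of CI. A minimal local sketch, assuming Python 3.10 or 3.11, GNU make, and a POSIX shell, mirroring the workflow steps above:

```sh
# set up an isolated environment (mirrors the CI job above)
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
pip install poetry

# install the inference service together with its dev dependencies
cd services/inference
poetry install -n --with dev

# start the service, give it time to come up, run the tests, then shut it down
make start_service_local
sleep 20
pytest tests
make stop_service_local || true
```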
2 changes: 1 addition & 1 deletion .github/workflows/python-package.yml
@@ -36,4 +36,4 @@ jobs:
make quality
- name: Test with pytest
run: |
-pytest
+pytest tests
2 changes: 1 addition & 1 deletion Makefile
@@ -1,7 +1,7 @@
# Adapted from HF Transformers: https://github.com/huggingface/transformers/tree/main
.PHONY: quality style

-check_dirs := tests tsfm_public tsfmhfdemos notebooks
+check_dirs := tests tsfm_public tsfmhfdemos notebooks services


# this target runs checks on all files
12 changes: 6 additions & 6 deletions README.md
@@ -1,8 +1,8 @@
# TSFM: Time Series Foundation Models
-Public notebooks and utilities for working with Time Series Foundation Models (TSFM)
+Public notebooks, utilities, and serving components for working with Time Series Foundation Models (TSFM).

The core TSFM time series models have been made available on Hugging Face -- details can be found
-[here](wiki.md).
+[here](wiki.md). Information on the services component can be found [here](services/inference/README.md).


# Python Version
@@ -27,14 +27,14 @@ pip install ".[notebooks]"
- Getting started with `PatchTSMixer` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_getting_started.ipynb)
- Transfer learning with `PatchTSMixer` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_transfer.ipynb)
- Transfer learning with `PatchTST` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tst_transfer.ipynb)
-- Getting started with `TinyTimeMixer (TTM)` [Try it out](notebooks/hfdemo/ttm_getting_started.ipynb)
+- Getting started with `TinyTimeMixer (TTM)` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb)

## 📗 Google Colab
Run the TTM tutorial in Google Colab, and quickly build a forecasting application with pre-trained TSFM models.
-- [TTM Colab Tutorial](https://colab.research.google.com/github/IBM/tsfm/blob/tutorial/notebooks/tutorial/ttm_tutorial.ipynb)
+- [TTM Colab Tutorial](https://colab.research.google.com/github/IBM/tsfm/blob/main/notebooks/tutorial/ttm_tutorial.ipynb)

## 💻 Demos Installation
-The demo presented at NeurIPS 2023 is available in `tsfmhfdemos`. This demo requires you to have pre-trained and finetuned models in place (we plan to release these at later date). To install the requirements use `pip`:
+The demo presented at NeurIPS 2023 is available in `tsfmhfdemos`. This demo requires you to have pre-trained and finetuned models in place (we plan to release these at a later date). To install the requirements use `pip`:

```bash
pip install ".[demos]"
@@ -46,7 +46,7 @@ Before opening a new issue, please search for similar issues. It's possible that


# Notice
-The intention of this repository is to make it easier to use and demonstrate IBM Research TSFM components that have been made available in the [Hugging Face transformers library](https://huggingface.co/docs/transformers/main/en/index). As we continue to develop these capabilities we will update the code here.
+The intention of this repository is to make it easier to use and demonstrate Granite TimeSeries components that have been made available in the [Hugging Face transformers library](https://huggingface.co/docs/transformers/main/en/index). As we continue to develop these capabilities we will update the code here.


IBM Public Repository Disclosure: All content in this repository including code has been provided by IBM under the associated open source software license and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to the level of quality nor security, and will not be maintaining this code going forward.
5 changes: 5 additions & 0 deletions services/inference/.dockerignore
@@ -0,0 +1,5 @@
**
!tsfminference
!poetry.lock
!pyproject.toml

41 changes: 41 additions & 0 deletions services/inference/Dockerfile
@@ -0,0 +1,41 @@
# based on https://github.com/opendatahub-io/caikit-tgis-serving/blob/main/Dockerfile

FROM registry.access.redhat.com/ubi9/ubi-minimal:latest AS builder

RUN microdnf -y update && \
microdnf -y install \
git shadow-utils python3.11-pip python-wheel && \
pip3.11 install --no-cache-dir --upgrade pip wheel && \
microdnf clean all

ENV POETRY_VIRTUALENVS_IN_PROJECT=1

RUN mkdir /inference
COPY tsfminference/* /inference/tsfminference/
COPY pyproject.toml /inference/
COPY poetry.lock /inference/
WORKDIR /inference
RUN pip3.11 install poetry && poetry install

FROM registry.access.redhat.com/ubi9/ubi-minimal:latest AS deploy
RUN microdnf -y update && \
microdnf -y install \
shadow-utils python3.11 && \
microdnf clean all

WORKDIR /inference

COPY --from=builder /inference /inference

ENV VIRTUAL_ENV=/inference/.venv
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
ENV HF_HOME=/tmp

RUN groupadd --system tsfminference --gid 1001 && \
adduser --system --uid 1001 --gid 0 --groups tsfminference \
--create-home --home-dir /inference --shell /sbin/nologin \
--comment "tsfminference User" tsfminference

USER tsfminference

CMD ["python", "-m", "uvicorn","tsfminference.main:app", "--host", "0.0.0.0", "--port", "8000" ]
30 changes: 30 additions & 0 deletions services/inference/Makefile
@@ -0,0 +1,30 @@
CONTAINER_BUILDER ?= docker

# starts the tsfminference service (used mainly for test cases)
start_service_local:
	python -m tsfminference.main &
	sleep 10
stop_service_local:
	pkill -f 'python.*tsfminference.*'
	sleep 10

image:
	$(CONTAINER_BUILDER) build -t tsfminference -f Dockerfile .

start_service_image: image
	$(CONTAINER_BUILDER) run -p 8000:8000 -d --rm --name tsfmserver tsfminference
	sleep 10
stop_service_image:
	$(CONTAINER_BUILDER) stop tsfmserver

test_local: start_service_local
	pytest tests
	$(MAKE) stop_service_local

test_image: start_service_image
	pytest tests
	$(MAKE) stop_service_image




77 changes: 77 additions & 0 deletions services/inference/README.md
@@ -0,0 +1,77 @@
# TSFM Services



This component provides RESTful services for the tsfm-granite class of time series foundation models. At present it can serve the following models:

* https://huggingface.co/ibm-granite/granite-timeseries-ttm-v1
* https://huggingface.co/ibm-granite/granite-timeseries-patchtst
* https://huggingface.co/ibm-granite/granite-timeseries-patchtsmixer



## Prerequisites:

* GNU make
* python 3.10 or 3.11
* poetry (`pip install poetry`)
* zsh or bash (zsh is preferred)

_Note that our primary target environment for services deployment is x86_64 Linux. You may encounter hiccups if you try to use this in a different environment. If so, please file an issue. Some of our developers do use a Mac, so you're likely to find a quick resolution. None of our developers use native Windows, however._

## Known issues:

* The `pkill` statements in the Makefile may not work properly on macOS. This will be apparent if you have leftover processes after running test-related make targets. Please help us put OS-specific checks into our Makefile to handle these cases by filing a PR; a rough sketch of one possible check is shown below.
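
One possible shape for such a check, shown here only as a rough sketch (the macOS branch is a placeholder and has not been validated):

```sh
# sketch of an OS-aware stop step; the Darwin branch is a placeholder
# and the right pattern for macOS still needs to be worked out
case "$(uname -s)" in
    Darwin)
        pkill -f 'tsfminference' || true
        ;;
    *)
        pkill -f 'python.*tsfminference.*' || true
        ;;
esac
```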

### Installation

```sh
pip install poetry && poetry install --with dev
```

### Testing using a local server instance

```sh
make test_local
```

### Creating an image

_You must have either docker or podman installed on your system for this to
work. You must also have proper permissions on your system to build images._

```sh
CONTAINER_BUILDER=<docker|podman> make image
# e.g., CONTAINER_BUILDER=docker make image
```

After a successful build you should have a local image named `tsfminference:latest`:

```sh
(py311) ➜ tsfm-services git:(revised-build-system) ✗ docker images | grep tsfminference | head -n 1
tsfminference latest df592dcb0533 46 seconds ago 1.49GB
# some of the numeric and hash values on your machine could be different
```
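
If you prefer to run the container directly rather than through the make targets, the following sketch mirrors what `make start_service_image` and `make stop_service_image` do (shown with `docker`; substitute `podman` if that is your container builder):

```sh
# start the freshly built image, publishing the service on port 8000
docker run -p 8000:8000 -d --rm --name tsfmserver tsfminference

# ... and stop it again when you are done
docker stop tsfmserver
```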

### Testing using the built image

```sh
make test_image
```

### Viewing the OpenAPI 3.x specification and swagger page

```sh
make start_service_local
```

Then open your browser to http://127.0.0.1:8000
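
The service is run with uvicorn (see the Dockerfile); if the underlying app is a FastAPI application, the interactive docs and the raw OpenAPI document are usually also exposed at FastAPI's default paths. A quick command-line check, assuming those defaults apply here:

```sh
# fetch the OpenAPI 3.x document from the locally running instance
# (assumes the FastAPI default path /openapi.json; adjust if the app mounts it elsewhere)
curl -s http://127.0.0.1:8000/openapi.json | python -m json.tool | head -n 25
```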

To stop the server run:

```sh
# may not work properly on a Mac
# if not, kill the uvicorn process manually.
make stop_service_local
```
