Bug
fwhigh committed Jan 31, 2019
1 parent 1705fd9 commit f79a8a7
Showing 3 changed files with 17 additions and 6 deletions.
19 changes: 15 additions & 4 deletions README.md
@@ -2,6 +2,17 @@

Predictive Models in Production

## Workflow

This is the regular workflow once you've got the whole system set up.

1. Edit code and commit to GitHub.
1. Build the Docker image locally.
1. Push the Docker image to the AWS ECR repository (command below).
1. Manually upload the latest model to AWS S3 under s3://<your-S3-bucket>/models/staging/YYYYMMDD/, where YYYYMMDD is today's date.
1. Deploy the AWS Elastic Beanstalk API with `eb deploy`. This picks up the S3 model you just uploaded.
1. Run the AWS Batch training job. This builds a new model (again) and restarts the EB application (again).
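A dry-run sketch of steps 2–6. Every concrete name below (account ID, region, repo, bucket, model file, Batch job names) is a placeholder assumption, and the `run` helper only prints each command, so the sequence can be read end to end without touching AWS.

```shell
# All values below are hypothetical; substitute your own.
ACCOUNT_ID=123456789012
REGION=us-east-1
REPO=pmip
BUCKET=my-predictive-models
ECR="$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO"
STAMP=$(date +%Y%m%d)                # today's date as YYYYMMDD

run() { echo "+ $*"; }               # dry-run helper: print, don't execute

run docker build -t "$REPO" .                                  # step 2
run docker tag "$REPO:latest" "$ECR:latest"                    # step 3
run docker push "$ECR:latest"
run aws s3 cp model.pkl \
    "s3://$BUCKET/models/staging/$STAMP/"                      # step 4 (model.pkl is hypothetical)
run eb deploy                                                  # step 5
run aws batch submit-job --job-name train \
    --job-queue train-queue --job-definition train-job         # step 6
```

Drop the `run` prefix to execute the commands for real once the names match your setup.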

## Local dev

### Pro tips
@@ -30,13 +41,13 @@ Then open [http://localhost:8888](http://localhost:8888) to run Jupyter.
If you need to enter into the container's shell, do this.

```bash
-ENVIRONMENT=dev BUCKET=fwhigh-predictive-models bash scripts/run_training_container.sh -
+ENVIRONMENT=dev BUCKET=$BUCKET bash scripts/run_training_container.sh -
```

### Train a model programmatically

```bash
-ENVIRONMENT=dev BUCKET=fwhigh-predictive-models bash scripts/run_training_container.sh scripts/train.sh
+ENVIRONMENT=dev BUCKET=$BUCKET bash scripts/run_training_container.sh scripts/train.sh
```

### Pushing the new Docker image to production for the training and Flask API services
@@ -64,7 +75,7 @@ ENVIRONMENT=dev bash scripts/run_api_container.sh "python -m pmip.routes"
Run the Flask API locally.

```bash
-ENVIRONMENT=dev BUCKET=fwhigh-predictive-models bash scripts/run_api_container.sh
+ENVIRONMENT=dev BUCKET=$BUCKET bash scripts/run_api_container.sh
```

Drop into the Flask API container.
@@ -76,7 +87,7 @@ ENVIRONMENT=dev bash scripts/run_api_container.sh -
### Push the Lambda API to Lambda

```bash
-BUCKET=fwhigh-predictive-models serverless deploy --region $([ -z "$AWS_DEFAULT_REGION" ] && aws configure get region || echo "$AWS_DEFAULT_REGION")
+BUCKET=$BUCKET serverless deploy --region $([ -z "$AWS_DEFAULT_REGION" ] && aws configure get region || echo "$AWS_DEFAULT_REGION")
```
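The region fallback embedded in that command can be isolated as a small helper for readability; this is the same expression rewritten, and the `aws configure get region` branch assumes the AWS CLI is installed and configured.

```shell
# Use $AWS_DEFAULT_REGION when set; otherwise fall back to the
# region stored in the AWS CLI config (requires `aws` on PATH).
region() {
  [ -z "$AWS_DEFAULT_REGION" ] && aws configure get region || echo "$AWS_DEFAULT_REGION"
}
```

With it, the deploy line could read `serverless deploy --region $(region)`.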

Run it
2 changes: 1 addition & 1 deletion pmip/routes.py
@@ -23,7 +23,7 @@ def possible_types(value):
)
elif os.getenv('ENVIRONMENT', '') in ['staging', 'prod']:
latest_model_id = get_latest_s3_dateint(
-        datadir='models',
+        datadir=f'models/{os.getenv("ENVIRONMENT")}',
bucket=os.getenv('BUCKET')
)
model = load_from_s3_and_unpickle(
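The `get_latest_s3_dateint` helper isn't shown in this diff, but the selection it implies is simple: YYYYMMDD strings sort lexicographically in chronological order, so the latest prefix is just the maximum. A hedged shell sketch (the function name and plain-prefix input are assumptions, not the project's actual implementation):

```shell
# Pick the newest YYYYMMDD entry from a newline-separated list of
# date prefixes; lexicographic order equals date order for YYYYMMDD.
latest_dateint() { sort | tail -n 1; }

printf '%s\n' 20190115 20190131 20190130 | latest_dateint   # → 20190131
```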
2 changes: 1 addition & 1 deletion scripts/get_training_data.sh
@@ -8,4 +8,4 @@ echo Writing it to $DIR

aws s3 cp --recursive $S3_DIR/ $DIR/
cd $DIR
-unzip -f *.zip
+unzip *.zip

(`unzip -f` only freshens files that already exist on disk, so in a clean data directory it extracts nothing; dropping the flag extracts the archive unconditionally.)
