This is the code that powers the API for opentestdata.org, the free and open database of automated test fixture data. The canonical publicly hosted instance of this API is at api.opentestdata.org (it is an API server, so loading that URL in a web browser will not show anything interesting; see the API docs section below).
This project is very much in early development. Before you submit a PR, talk to us. We might be working on it already.
If you're interested in contributing, check out the list of open issues.
Follow the development instructions below to set up an instance of this API server.
API docs are served by this API server at the `/ui` URL. You can find them hosted publicly on the canonical instance at api.opentestdata.org/ui.
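For example, once you have a local instance set up (see the development instructions below), the docs should be reachable on the default port:

```sh
# Run the development server (default port 5000), then browse the docs
make serve
# then open http://localhost:5000/ui in a browser
```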
At a high level, the basic tech stack looks like this:
- Python (programming language)
- Flask (HTTP server)
- SQLAlchemy (ORM)
- OpenAPI (API specification)
In production, the site runs on Google Cloud services.
- Clone the repo, and navigate into it
- Ensure you have Python 3.7+, Pip, and Pipenv
- Ensure MySQL is installed and running, then run the following script to set up the development and test databases: `mysql -u <USER> -p<PASS> < api/db/init-dev.sql`
- Run `make dev` to get dependencies set up (an example first-time session is sketched after this list). This includes MySQL client dependencies, which build native bindings; if you installed mysql/openssl via Homebrew, you may need to `export LDFLAGS=-L/usr/local/opt/openssl/lib` before running `make dev` so that it can find OpenSSL.
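Putting the steps above together, a first-time setup might look like the following session. This is only a sketch: `<REPO_URL>` and the checkout directory are placeholders, and `<USER>`/`<PASS>` are your local MySQL credentials.

```sh
# Sketch of a first-time local setup (placeholders in angle brackets)
git clone <REPO_URL>
cd <checkout-directory>

# one-time: create the development and test databases
mysql -u <USER> -p<PASS> < api/db/init-dev.sql

# install dev dependencies; LDFLAGS is only needed for Homebrew mysql/openssl
export LDFLAGS=-L/usr/local/opt/openssl/lib
make dev
```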
Anytime you check out new code, you should:
- Re-run `make dev`
- Re-run `make migrate` to make sure your db is upgraded with any new schemas
- Re-run `make test` to ensure all the tests pass before you begin work (this sequence is sketched below)
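For example, a typical refresh after pulling new code (a sketch of the three steps above):

```sh
# Refresh a working copy after checking out new code
make dev       # pick up any dependency changes
make migrate   # apply any new schema migrations to your local db
make test      # confirm the tests pass before starting work
```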
Anytime you are about to submit a pull request, you should:
- Check if a database schema migration is required (and if so, generate, verify, and commit it)
- Re-run `make test` to ensure tests pass before you commit
- Re-run `make lint` to ensure code style conforms to the standard (a typical pre-PR run is sketched below)
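For example, a typical pre-PR check might look like this (a sketch; `make migrate-new` is only needed if your change touches the schema):

```sh
# Before opening a pull request
make migrate-new   # only if the schema changed: generate a migration to review and commit
make test          # all tests must pass
make lint          # style must be clean
```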
Command | Action |
---|---|
`make dev` | Install dev deps |
`make reqs` | Generate requirements files from pipenv, for use with e.g. AppEngine |
`make serve` | Run the development server on the default port of 5000 |
`make migrate-init` | Initialize the migration system |
`make migrate-new` | Create a new migration file to review and commit |
`make migrate-rev` | Create an empty migration revision file to be filled out manually (useful when writing a custom migration script, for example to add options to an enum field type) |
`make migrate` | Update the database based on all the current migration files |
`make migrate-history` | Show the migration history |
`make test` | Run the API tests using the test database (requires that the db initialization script has been run) |
`make lint` | Run the lint script to ensure you aren't going to commit any style errors |
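As an example of how the migration targets above fit together, a typical schema-change workflow might look like this (a sketch; the model edits themselves happen in your editor):

```sh
# Typical schema-change workflow
make migrate-new        # autogenerate a migration from the model changes
# review/edit the generated migration file and commit it
make migrate            # apply pending migrations to your local database
make migrate-history    # confirm the new revision shows up
make test               # ensure the tests pass against the migrated schema
```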
You probably won't have access to production services, but if you do:
- In production, we run on Google AppEngine, in a Flask-like environment. Assume you have the gcloud client ready.
- `app.yaml` defines the AppEngine config.
- `app_secrets.yaml` is not checked into git but contains secret environment variables used in production.
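Getting the gcloud client ready typically looks something like the following; this is a sketch, and `<PROJECT_ID>` is a placeholder for whatever value OTD_PROJECT_ID should have:

```sh
# Sketch: prepare gcloud and the environment for production work
gcloud auth login
gcloud config set project <PROJECT_ID>
export OTD_PROJECT_ID=<PROJECT_ID>
# app_secrets.yaml must also be present in the repo root (it is not checked into git)
```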
Command | Action |
---|---|
`make deploy` | Deploy the app to AppEngine. Requires the OTD_PROJECT_ID env var to be set correctly, and the app_secrets.yaml file to be in the root of the repo. This file contains the environment variables used to connect to the production database. |
`FLASK_ENV=prod_migration make migrate` | Migrate the production database. Must have the correct db data set in app_secrets.yaml and have access to make a connection to the prod db. |
Note that care must be taken regarding DB migrations, including using a phased deploy strategy. If a breaking change is made to a DB schema, code must first be deployed that can handle either version of the schema. Then the migration can be run, and only after that can new code be deployed that is responsible only for handling the new version of the schema.
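Concretely, a breaking schema change would roll out in three phases, for example (a sketch using the commands above):

```sh
# Phase 1: deploy code that can handle BOTH the old and the new schema
OTD_PROJECT_ID=<PROJECT_ID> make deploy

# Phase 2: run the migration against the production database
FLASK_ENV=prod_migration make migrate

# Phase 3: deploy code that only handles the new schema
OTD_PROJECT_ID=<PROJECT_ID> make deploy
```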