- Coding style
  - Pre-commit
    - We integrate pre-commit into our framework to keep the coding style consistent. If you run `git commit` inside the docker container, it will check whether your code complies with our style requirements. If you run `git commit` in another environment, you may need to run the following commands first:

      ```sh
      cd card_segmentation/
      pip3 install pre-commit==2.6.0
      pre-commit install --install-hooks
      ```
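    - The checks themselves are defined in `.pre-commit-config.yaml` at the repo root. As a rough illustration only, such a config might look like the sketch below; the hooks and versions here are assumptions, not necessarily what this repo pins:

      ```yaml
      # Hypothetical .pre-commit-config.yaml — the hooks this repo actually uses may differ.
      repos:
        - repo: https://github.com/pre-commit/pre-commit-hooks
          rev: v3.2.0
          hooks:
            - id: trailing-whitespace
            - id: end-of-file-fixer
        - repo: https://github.com/psf/black
          rev: 19.10b0
          hooks:
            - id: black
      ```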
- Environment setup with docker
  - Install docker, docker-compose, and nvidia-docker2
  - Clone the repo:

    ```sh
    git clone https://github.com/NickLi0605/card_segmentation.git
    ```

  - Run docker-compose:

    ```sh
    cd card_segmentation
    docker-compose up
    ```

  - Note:
    - This docker image starts jupyter notebook automatically.
    - You may need to replace the volume yourself in `docker-compose.yml`; see the sketch right after this list.
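The volume mapping lives under the service's `volumes:` key in `docker-compose.yml`. A minimal sketch of the part to edit — the service name and paths below are assumptions, so match them to the actual file:

```yaml
# Hypothetical excerpt of docker-compose.yml; service name and paths are assumptions.
services:
  card-segmentation:
    volumes:
      - /absolute/path/on/your/host:/workspace  # replace the host side with your own path
```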
- Environment setup with pyenv (without docker)
  - Install pyenv
  - Create a virtual env for python with pyenv:

    ```sh
    pyenv install 3.8.3
    pyenv virtualenv 3.8.3 <venv_name>  # create the virtual env
    pyenv local <venv_name>             # apply the virtualenv to the project
    ```

  - Install dependencies:

    ```sh
    pip install -r requirement.txt
    ```

  - Enable pre-commit hooks:

    ```sh
    pre-commit install
    ```
- Must have
  - Construct environment with docker
  - Download dataset
    - Download scripts for midv2019 and midv500
    - Convert to COCO format
    - How to download (a quick verification sketch follows this list):

      Note: this repo is forked from here and adds download links for midv2019

      ```sh
      git clone https://github.com/AlexLi0605/midv500
      python run.py --dataset_dir "path to store dataset" --convert_to_coco
      ```
  - Split dataset into training / validation / testing, with reasons for the split (a split sketch follows this list)
  - Basic training & inference code with a pre-trained model (a training sketch follows this list)
  - Apply data augmentation with albumentations (an augmentation sketch follows this list)
  - Benchmark different models on (a benchmark sketch follows this list):
    - Inference time
    - Model size
    - Memory usage
  - Demo video
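For the dataset item above, a quick way to verify the COCO conversion is to load the annotations with pycocotools. A minimal sketch — the annotation filename is an assumption, so point it at whatever `run.py` actually writes:

```python
# Hypothetical sanity check of the converted COCO annotations.
from pycocotools.coco import COCO

# Assumed output path; use the file run.py produced on your machine.
coco = COCO("path_to_dataset/annotations.json")
print(f"{len(coco.getImgIds())} images, {len(coco.getAnnIds())} annotations")
print("categories:", [c["name"] for c in coco.loadCats(coco.getCatIds())])
```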
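For the split item, one common approach is a fixed-seed random split at the image level so results stay reproducible. A minimal sketch with scikit-learn; the 80/10/10 ratios are an assumption, not the project's chosen split:

```python
# Hypothetical 80/10/10 image-level split; ratios and seed are assumptions.
from sklearn.model_selection import train_test_split

image_ids = list(range(1000))  # stand-in for the real COCO image ids
train_ids, rest_ids = train_test_split(image_ids, test_size=0.2, random_state=42)
val_ids, test_ids = train_test_split(rest_ids, test_size=0.5, random_state=42)
print(len(train_ids), len(val_ids), len(test_ids))  # 800 100 100
```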
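For the training & inference item, if detectron2 is the framework (the nice-to-have list suggests evaluating it), fine-tuning from a pre-trained Mask R-CNN might look like the sketch below; the dataset name, paths, and class count are assumptions:

```python
# Hypothetical detectron2 fine-tuning sketch; dataset name, paths, and class count are assumptions.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer

# Register the converted COCO dataset (assumed paths).
register_coco_instances("cards_train", {}, "annotations/train.json", "images/")

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
# Start from COCO pre-trained weights.
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("cards_train",)
cfg.DATASETS.TEST = ()
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1  # assumed: a single "card" class
cfg.SOLVER.MAX_ITER = 1000

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```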
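For the augmentation item, albumentations applies the same spatial transforms to an image and its segmentation mask. A minimal sketch; the specific transforms are assumptions:

```python
# Hypothetical albumentations pipeline; the transform choices are assumptions.
import albumentations as A
import numpy as np

transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.RandomBrightnessContrast(p=0.2),
    A.Rotate(limit=15, p=0.5),
])

image = np.zeros((256, 256, 3), dtype=np.uint8)  # stand-in for a real image
mask = np.zeros((256, 256), dtype=np.uint8)      # stand-in for its mask
out = transform(image=image, mask=mask)  # the mask receives the same spatial transforms
aug_image, aug_mask = out["image"], out["mask"]
```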
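For the benchmark item, the three metrics can each be measured with a few lines of PyTorch. A minimal sketch; torchvision's resnet18 is only a placeholder for the models under comparison:

```python
# Hypothetical benchmark sketch; resnet18 is a placeholder model.
import time
import torch
import torchvision

model = torchvision.models.resnet18().eval()
x = torch.randn(1, 3, 224, 224)

# Model size: total bytes of all parameters.
size_mb = sum(p.numel() * p.element_size() for p in model.parameters()) / 1e6
print(f"model size: {size_mb:.1f} MB")

# Inference time: average over several runs after a warm-up pass.
with torch.no_grad():
    model(x)  # warm-up
    start = time.perf_counter()
    for _ in range(10):
        model(x)
print(f"inference time: {(time.perf_counter() - start) / 10 * 1000:.1f} ms")

# Memory usage: peak GPU memory during one forward pass, if CUDA is available.
if torch.cuda.is_available():
    torch.cuda.reset_peak_memory_stats()
    with torch.no_grad():
        model.cuda()(x.cuda())
    print(f"peak GPU memory: {torch.cuda.max_memory_allocated() / 1e6:.1f} MB")
```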
- Nice to have
  - Refactor others' code if we use it
  - Analyze the pros and cons of detectron2
  - Consider the issues the model may face and how to solve them
  - Ideas about model deployment, data collection, data saving, and automatic model tuning