Commit

Merge branch 'master' of github.com:catalyst-team/catalyst
Scitator committed May 7, 2020
2 parents 224004b + 9c29095 commit ba6da28
Showing 121 changed files with 1,665 additions and 677 deletions.
5 changes: 3 additions & 2 deletions CHANGELOG.md
@@ -22,7 +22,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

-

## [20.05] - YYYY-MM-DD
## [20.05] - 2020-05-07

### Added

@@ -46,6 +46,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Github actions CI was updated ([#754](https://github.com/catalyst-team/catalyst/pull/754))
- Changed default `num_epochs` to 1 for `State` ([#756](https://github.com/catalyst-team/catalyst/pull/756))
- Changed `state.batch_in`/`state.batch_out` to `state.input`/`state.output` ([#763](https://github.com/catalyst-team/catalyst/pull/763))
- Moved `torchvision` dependency from `catalyst` to `catalyst[cv]` ([#738](https://github.com/catalyst-team/catalyst/pull/738))

### Removed

@@ -64,7 +65,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed Dockerfile dependency ([#780](https://github.com/catalyst-team/catalyst/pull/780))


## [20.04] - 2020-04-21
## [20.04] - 2020-04-06

### Added

2 changes: 1 addition & 1 deletion Makefile
@@ -1,7 +1,7 @@
.PHONY: check-docs docker docker-fp16 docker-dev docker-dev-fp16 install-from-source clean

check-docs:
	bash ./bin/codestyle/_check_docs.sh
	bash ./bin/tests/check_docs.sh

docker: ./requirements/
	echo building $${REPO_NAME:-catalyst-base}:$${TAG:-latest} ...
2 changes: 1 addition & 1 deletion bin/tests/check_dl_core.sh
@@ -13,7 +13,7 @@ rm -rf ./tests/logs
# (set -e; for f in tests/_tests_scripts/*.py; do PYTHONPATH=./catalyst:${PYTHONPATH} python "$f"; done)
(set -e; for f in tests/_tests_scripts/core_*.py; do PYTHONPATH=./catalyst:${PYTHONPATH} python "$f"; done)
(set -e; for f in tests/_tests_scripts/dl_*.py; do PYTHONPATH=./catalyst:${PYTHONPATH} python "$f"; done)
(set -e; for f in tests/_tests_scripts/z_*.py; do PYTHONPATH=./catalyst:${PYTHONPATH} python "$f"; done)
#(set -e; for f in tests/_tests_scripts/z_*.py; do PYTHONPATH=./catalyst:${PYTHONPATH} python "$f"; done)


################################ pipeline 99 ################################
6 changes: 3 additions & 3 deletions bin/tests/check_dl_core_callbacks.sh
@@ -39,7 +39,7 @@ function check_checkpoints {


################################ pipeline 00 ################################
# checking dafult parameters of checkpoint and one stage
# checking default parameters of checkpoint and one stage
LOG_MSG='pipeline 00'
echo ${LOG_MSG}

@@ -295,7 +295,7 @@ rm -rf ${LOGDIR}


################################ pipeline 09 ################################
# checking with one checkpoint and two stages
# checking with one checkpoint and two stages
# with different ''load_on_stage_end'' options
LOG_MSG='pipeline 09'
echo ${LOG_MSG}
@@ -325,7 +325,7 @@ rm -rf ${LOGDIR}


################################ pipeline 10 ################################
# checking with three checkpoints and two stages
# checking with three checkpoints and two stages
# with different ''load_on_stage_end'' options
LOG_MSG='pipeline 10'
echo ${LOG_MSG}
191 changes: 191 additions & 0 deletions bin/tests/check_dl_core_settings.sh
@@ -0,0 +1,191 @@
#!/usr/bin/env bash

# Cause the script to exit if a single command fails
set -eo pipefail -v

pip uninstall -r requirements/requirements-contrib.txt -y
pip uninstall -r requirements/requirements-cv.txt -y
pip uninstall -r requirements/requirements-ecosystem.txt -y
pip uninstall -r requirements/requirements-ml.txt -y
pip uninstall -r requirements/requirements-nlp.txt -y
pip install -r requirements/requirements.txt

################################ pipeline 00 ################################
# checking catalyst-core loading (default)
cat <<EOT > .catalyst
[catalyst]
contrib_required = false
cv_required = false
ml_required = false
nlp_required = false
EOT

python -c """
from catalyst.contrib.dl import callbacks
from catalyst.contrib import utils
try:
callbacks.AlchemyLogger
except (AttributeError, ImportError):
pass # Ok
else:
raise AssertionError('\'ImportError\' expected')
"""


################################ pipeline 01 ################################
# checking catalyst-contrib dependencies loading
cat <<EOT > .catalyst
[catalyst]
contrib_required = true
cv_required = false
ml_required = false
nlp_required = false
EOT

# check that the imports fail if the requirements are not installed
python -c """
from catalyst.tools import settings
assert settings.use_lz4 == False and settings.use_pyarrow == False
try:
    from catalyst.contrib.dl.callbacks import AlchemyLogger, VisdomLogger
except ImportError:
    pass  # Ok
else:
    raise AssertionError('\'ImportError\' expected')
"""

pip install -r requirements/requirements-contrib.txt
pip install -r requirements/requirements-ecosystem.txt

python -c """
from catalyst.contrib.dl.callbacks import AlchemyLogger, VisdomLogger
"""


################################ pipeline 02 ################################
# checking catalyst-cv dependencies loading
cat <<EOT > .catalyst
[catalyst]
contrib_required = false
cv_required = true
ml_required = false
nlp_required = false
EOT

# check that the imports fail if the requirements are not installed
python -c """
from catalyst.tools import settings
assert settings.use_libjpeg_turbo == False
try:
    from catalyst.contrib.data import cv as cv_data
    from catalyst.contrib.dl.callbacks import InferMaskCallback
    from catalyst.contrib.models import cv as cv_models
    from catalyst.contrib.utils import imread, imwrite
    from catalyst.data.__main__ import COMMANDS
    assert not (
        'process-images' in COMMANDS
        or 'project-embeddings' in COMMANDS
    )
except (ImportError, AssertionError):
    pass  # Ok
else:
    raise AssertionError('\'ImportError\' or \'AssertionError\' expected')
"""

pip install -r requirements/requirements-cv.txt

python -c """
from catalyst.contrib.data import cv as cv_data
from catalyst.contrib.dl.callbacks import InferMaskCallback
from catalyst.contrib.models import cv as cv_models
from catalyst.contrib.utils import imread, imwrite
from catalyst.data.__main__ import COMMANDS
assert (
'process-images' in COMMANDS
and 'process-images' in COMMANDS
and 'project-embeddings' in COMMANDS
)
"""


################################ pipeline 03 ################################
# checking catalyst-ml dependencies loading
cat <<EOT > .catalyst
[catalyst]
contrib_required = false
cv_required = false
ml_required = true
nlp_required = false
EOT

# check that the imports fail if the requirements are not installed
python -c """
try:
    from catalyst.contrib.__main__ import COMMANDS
    assert not (
        'check-index-model' in COMMANDS or 'create-index-model' in COMMANDS
    )
except (ImportError, AssertionError):
    pass  # Ok
else:
    raise AssertionError('\'ImportError\' or \'AssertionError\' expected')
"""

pip install -r requirements/requirements-ml.txt

python -c """
from catalyst.contrib.__main__ import COMMANDS
assert 'check-index-model' in COMMANDS and 'create-index-model' in COMMANDS
"""


################################ pipeline 04 ################################
# checking catalyst-nlp dependencies loading
cat <<EOT > .catalyst
[catalyst]
contrib_required = false
cv_required = false
ml_required = false
nlp_required = true
EOT

# check that the imports fail if the requirements are not installed
python -c """
try:
    from catalyst.contrib.data import nlp as nlp_data
    from catalyst.contrib.models import nlp as nlp_models
    from catalyst.contrib.utils import tokenize_text, process_bert_output
    from catalyst.contrib.__main__ import COMMANDS as CONTRIB_SCRIPTS
    from catalyst.data.__main__ import COMMANDS
    assert 'text2embedding' not in COMMANDS
except (ImportError, AssertionError):
    pass  # Ok
else:
    raise AssertionError('\'ImportError\' or \'AssertionError\' expected')
"""

pip install -r requirements/requirements-nlp.txt

python -c """
from catalyst.contrib.data import nlp as nlp_data
from catalyst.contrib.models import nlp as nlp_models
from catalyst.contrib.utils import tokenize_text, process_bert_output
from catalyst.data.__main__ import COMMANDS
assert 'text2embedding' in COMMANDS
"""


################################ pipeline 99 ################################
rm .catalyst
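
The new check_dl_core_settings.sh script above toggles each optional dependency group through a `.catalyst` file and then asserts that the corresponding imports fail or succeed. A minimal sketch of the pattern it exercises, assuming only what this commit shows (the `settings.nmslib_required` flag comes from the catalyst/contrib/__main__.py change below; the warning text is illustrative):

# Sketch only: guard an optional dependency behind a `.catalyst` flag,
# mirroring the contrib/__main__.py change in this commit.
import logging

from catalyst.tools import settings

logger = logging.getLogger(__name__)

try:
    import nmslib  # optional dependency from the catalyst[ml] extra
except ImportError:
    # warn only when the `.catalyst` config marks the extra as required
    if settings.nmslib_required:
        logger.warning(
            "nmslib is not available; run `pip install nmslib`."
        )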
File renamed without changes.
4 changes: 2 additions & 2 deletions catalyst/contrib/__main__.py
@@ -1,9 +1,9 @@
from argparse import ArgumentParser, RawTextHelpFormatter
from collections import OrderedDict
import logging
import os

from catalyst.contrib.scripts import collect_env, find_thresholds
from catalyst.tools import settings

logger = logging.getLogger(__name__)

@@ -18,7 +18,7 @@
    COMMANDS["check-index-model"] = check_index_model
    COMMANDS["create-index-model"] = create_index_model
except ImportError as ex:
    if os.environ.get("USE_NMSLIB", "0") == "1":
    if settings.nmslib_required:
        logger.warning(
            "nmslib not available, to install nmslib,"
            " run `pip install nmslib`."
1 change: 1 addition & 0 deletions catalyst/contrib/data/cv/__init__.py
@@ -1,4 +1,5 @@
# flake8: noqa

from .mixins import *
from .reader import *
from .transforms import *
92 changes: 92 additions & 0 deletions catalyst/contrib/data/cv/reader.py
@@ -0,0 +1,92 @@
from typing import Tuple, Union

from catalyst import utils
from catalyst.data.reader import ReaderSpec


class ImageReader(ReaderSpec):
    """Image reader abstraction. Reads images from a ``csv`` dataset."""

    def __init__(
        self,
        input_key: str,
        output_key: str,
        rootpath: str = None,
        grayscale: bool = False,
    ):
        """
        Args:
            input_key (str): key to use from annotation dict
            output_key (str): key to use to store the result
            rootpath (str): path to images dataset root directory
                (so you can use relative paths in annotations)
            grayscale (bool): flag if you need to work only
                with grayscale images
        """
        super().__init__(input_key, output_key)
        self.rootpath = rootpath
        self.grayscale = grayscale

    def __call__(self, element):
        """Reads a row from your annotations dict with filename and
        converts it to an image.

        Args:
            element: elem in your dataset

        Returns:
            np.ndarray: Image
        """
        image_name = str(element[self.input_key])
        img = utils.imread(
            image_name, rootpath=self.rootpath, grayscale=self.grayscale
        )

        output = {self.output_key: img}
        return output


class MaskReader(ReaderSpec):
    """Mask reader abstraction. Reads masks from a ``csv`` dataset."""

    def __init__(
        self,
        input_key: str,
        output_key: str,
        rootpath: str = None,
        clip_range: Tuple[Union[int, float], Union[int, float]] = (0, 1),
    ):
        """
        Args:
            input_key (str): key to use from annotation dict
            output_key (str): key to use to store the result
            rootpath (str): path to images dataset root directory
                (so you can use relative paths in annotations)
            clip_range (Tuple[Union[int, float], Union[int, float]]):
                lower and upper interval edges; image values outside
                the interval are clipped to the interval edges
        """
        super().__init__(input_key, output_key)
        self.rootpath = rootpath
        self.clip = clip_range

    def __call__(self, element):
        """Reads a row from your annotations dict with filename and
        converts it to a mask.

        Args:
            element: elem in your dataset.

        Returns:
            np.ndarray: Mask
        """
        mask_name = str(element[self.input_key])
        mask = utils.mimread(
            mask_name, rootpath=self.rootpath, clip_range=self.clip
        )

        output = {self.output_key: mask}
        return output


__all__ = ["ImageReader", "MaskReader"]
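
A hypothetical usage sketch of the readers defined above; the annotation keys, file names, and rootpath values are illustrative only:

# Each reader maps one row of an annotation dict to a numpy image/mask.
from catalyst.contrib.data.cv import ImageReader, MaskReader

image_reader = ImageReader(
    input_key="image", output_key="features", rootpath="data/images"
)
mask_reader = MaskReader(
    input_key="mask", output_key="targets", rootpath="data/masks"
)

row = {"image": "sample.jpg", "mask": "sample_mask.png"}
sample = {**image_reader(row), **mask_reader(row)}
# sample == {"features": <image ndarray>, "targets": <mask ndarray>}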
1 change: 1 addition & 0 deletions catalyst/contrib/data/dataset/README.md
@@ -0,0 +1 @@
This subpackage was borrowed from [torchvision](https://github.com/pytorch/vision).
8 changes: 8 additions & 0 deletions catalyst/contrib/data/dataset/__init__.py
@@ -0,0 +1,8 @@
# flake8: noqa

from catalyst.contrib.data.dataset.mnist import MNIST
from catalyst.contrib.data.dataset.transforms import (
    Compose,
    Normalize,
    ToTensor,
)
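
A sketch of how the borrowed dataset and transforms could be combined; it assumes the MNIST constructor keeps torchvision's `(root, train, download, transform)` signature, which this diff does not show:

from catalyst.contrib.data.dataset import MNIST, Compose, Normalize, ToTensor

# standard MNIST mean/std values; illustrative, not taken from this diff
transform = Compose([ToTensor(), Normalize((0.1307,), (0.3081,))])
dataset = MNIST("./data", train=True, download=True, transform=transform)
image, target = dataset[0]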