diff --git a/.gitignore b/.gitignore
index 5d1f98c0..cf5d0c0c 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,4 +1,7 @@
 ./data
+wandb/
+lightning_logs/
+*.pth
 # Byte-compiled / optimized / DLL files
 __pycache__/
 *.py[cod]
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index e92ff835..a2ae7607 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -1,4 +1,13 @@
 repos:
+- repo: local
+  hooks:
+  - id: update-requirements-txt
+    name: update-requirements-txt
+    description: 'Generate requirements.txt based on poetry.lock'
+    entry: poetry
+    args: [export, --format, requirements.txt, --output, requirements.txt]
+    language: system
+    pass_filenames: false
 - repo: https://github.com/pre-commit/pre-commit-hooks
   rev: v4.4.0
   hooks:
diff --git a/README.md b/README.md
old mode 100755
new mode 100644
index 83c77f7c..c95d2692
--- a/README.md
+++ b/README.md
@@ -1,450 +1,115 @@
-## Requirements
+# What is InnoFW?
-
-- Linux|Windows operating system
-- 30 GB+ of storage, 4 GB+ RAM
-- Python 3.8+, <3.10
-- Poetry 1.1+
+InnoFW is a configuration-based machine learning framework that helps people get started with developing machine learning solutions. InnoFW is easy to pick up and play, and easy to master.
-## Section 1. Configuring Loss Function
+- define configuration files for models, datasets, optimizers, losses, metrics, etc. and interchange them with one another
+- rely on a unified and intuitive code structure
+- pass arguments through a powerful CLI
+- train models on multiple GPUs by passing a flag
+- select loggers: TensorBoard, ClearML, Wandb and ... (upcoming)
+- work easily with S3-like storages
-Let's take a look at the following .yaml file
-
-```yaml
+# Why use InnoFW?
-
-task:
-  - image-segmentation
+This framework serves as a template for project members to start working on a problem. Machine learning engineers can enjoy the ease of integrating a new model, and software developers can value the unified and documented API for model training and inference.
-
-implementations:
-  torch: # framework name
-    JaccardLoss: # name of the loss function(can any name)
-      weight: 0.3 # weight of the loss function(can be any float)
-      object: # can be `function`
-        # path to the class/function(can be a local path or a installed library code)
-        _target_: pytorch_toolbelt.losses.JaccardLoss
-        mode: binary # additional argument for the class
-    BinaryFocalLoss: # another loss, structure is similar to above described loss
-      weight: 0.7
-      object:
-        _target_: pytorch_toolbelt.losses.BinaryFocalLoss
-```
-
-`task` denotes the type of the problem these loss functions were designed to be used.
+> Please note that the project is at an early development stage.
-
-`implementations` contains information on how to instantiate the loss functions for different frameworks.
-Inner level is for framework names. Here we can use `torch`, `sklearn`, `xgboost` etc.
-Inside of the framework level we have the names of the objects. Names are later used during logging.
-You are free to select any name.
+# The project is based on
+1. pytorch lightning
+2. hydra
+3. pydantic
+4. sklearn
+5. torch
-
-Latter if we go inside the "name's level" we will have two fields: weight, object/function.
-Weight is used to specify the weight of the loss function.
-
-#### Object/Function:
+# How It Works
+InnoFW uses hydra to provide a configuration structure that is suitable for most machine learning projects.
-
-**TL;DR**
+1. Create an experiment config file in the folder ```config/experiments/``` based on ```config/experiments/template.yaml```, for example:
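+   A minimal sketch of such a file, mirroring the fields of ```config/experiments/template.yaml``` (the `faster_rcnn` / `detection_wheat` overrides are just the template's defaults, and the project name, task and hyperparameter values are illustrative placeholders; swap them for your own configs from ```config/models/``` and ```config/datasets/```):
+   ```yaml
+   # @package _global_
+   defaults:
+     - override /models: faster_rcnn        # any model config from config/models/
+     - override /datasets: detection_wheat  # any dataset config from config/datasets/
+
+   project: "my_project"    # illustrative project name
+   task: "image-detection"  # one of the standard task names
+   random_seed: 42
+   epochs: 10               # illustrative training settings
+   batch_size: 8
+   weights_freq: 1          # checkpoint saving frequency
+   ```
+   The `defaults` overrides tell Hydra which model and dataset configs to compose into the experiment.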
+2. Once you have defined your configuration file, you can start training your model:
+   ```python train.py experiment=yolov5_cars```
+3. InnoFW checks the configuration file for consistency of the individual modules (model, dataset, loss, optimizer, etc.) and, if everything is fine, selects an adapter. The adapter is responsible for starting the training, testing, validation and inference pipeline.
+4. The model is trained and its checkpoints are saved.
-- if code to be instantiated is a function then name this field `function`
-- if code to be instantiated is an object then name this field `object`
+# Quick start
-
----
-Here we are choosing the type of the code we want to instantiate.
-
-It can be an `object` of a class or a `function`.
-As functions cannot be instantiated right away without arguments.
-We need to instantiate function later in the code when we receive arguments.
-
-Under the hood:
-
-    object - gets instantiated
-    function - gets wrapped into a lambda function
-
-this allows us to have the same interface for both objects and functions later on.
-
-Example:
-
-In the following snippet we initialize the loss object `BinaryFocalLoss`
-
-```python
-from pytorch_toolbelt.losses import BinaryFocalLoss
-import torch
-
-criterion = BinaryFocalLoss()
-
-pred = torch.tensor([0.0, 0.0, 0.0])
-target = torch.tensor([1, 1, 1])
-
-pred.unsqueeze_(0)
-target.unsqueeze_(0)
-
-loss1 = criterion(pred, target)
-```
-
-In the following snippet we initialize the function `binary_cross_entropy` and pass arguments right away.
-
-```python
-import torch
-import torch.nn.functional as F
-
-pred = torch.tensor([0.0, 0.0, 0.0])
-target = torch.tensor([1, 1, 1])
-
-pred.unsqueeze_(0)
-target.unsqueeze_(0)
-
-loss1 = F.binary_cross_entropy(pred, target)
-```
-
-## Section 2. How to add your dataset?
-
-Now we will consider adding your custom dataset into the framework.
-
-1. Split your data into two folders: train and test.
-2. Make sure that you have the corresponding datamodule to process your data. All the available datamodules stored in
-   `innofw/core/datamodules/`. Each datamodule has a `task` and `framework` attributes*. Pair of `task` and `framework` can
-   be duplicated, in this case difference is
-   in the data retrieval logic, select one that is more suitable for your problem.
-    1. In case you have not found suitable datamodule then write your own. Refer
-       to [section 2.2](#Section-2.2.-Writing-own-datamodule).
-3. Create a configuration file in config/datasets/[dataset_name].yaml\
-   Dataset config file should be structured as follows:
-   ```yaml
-
-   task:
-     - [dedicated task]
-
-   name: [name of the dataset]
-   description: [specify dataset description]
-
-   markup_info: [specify markup information]
-   date_time: [specify date]
-
-   _target_: innofw.core.datamodules.[submodule].[submodule].[class_name]
-
-   # =============== Data Paths ================= #
-   # use one of the following:
-
-   # ====== 1. local data ====== #
-   train:
-     source: /path/to/file/or/folder
-   test:
-     source: /path/to/file/or/folder
-   # ====== 2.
remote data ====== # - train: - source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/train.zip - target: folder/to/extract/train/ - test: - source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/test.zip - target: folder/to/extract/test/ - # ================================== # - - # some datamodules require additional arguments - # look for them in the documentation of each datamodule - # arguments passed in the following way: - arg1: value1 # here arg1 - name of the argument, value1 - value for the arg1 - arg2: value2 - # ... same for other datamodule arguments +1. install python 3.8-3.9 to your system +2. clone project + ```git clone https://github.com/InnopolisUni/innofw.git``` +3. create virtual env + ```python -m venv venv``` +4. install packages + ```pip install -r requirements.txt``` -4. To run prediction on new data you should create an inference datamodule configuration file. Configuration file is - alike to file created in 3. - ```yaml - - task: - - [dedicated task] - - name: [name of the dataset] - description: [specify dataset description] - - markup_info: [specify markup information] - date_time: [specify date] - - _target_: innofw.core.datamodules.[submodule].[submodule].[class_name] - - # =============== Data Paths ================= # - # use one of the following: - - # ====== 1. local data ====== # - infer: - source: /path/to/file/or/folder - # ====== 2. remote data ====== # - infer: - source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/infer.zip - target: folder/to/extract/infer/ - # ================================== # - - # some datamodules require additional arguments - # look for them in the documentation of each datamodule - # arguments passed in the following way: - arg1: value1 # here arg1 - name of the argument, value1 - value for the arg1 - arg2: value2 - # ... same for other datamodule arguments - - -* \* `task` refers to the problem type where this datamodule is used. `framework` refers to the framework type where - this datamodule is used - -### Section 2.2. Writing own datamodule - -Datamodule is a class which has the following responsibilities: -1. creation of data loaders for each dataset type: train, test, val and infer. -2. dataset setting up(e.g. downloading, preprocessing, creating additional files etc.) -3. model predictions saving - formatting the predictions provided by a model - - - -For now all of our data modules inherit from following two classes: `PandasDataModule`, `BaseLightningDataModule` - -[//]: # (features of each datamodule) -PandasDataModule is suitable for tasks with input provided as table. The class provides the data by first uploading it into RAM. -BaseLightningDataModule is suitable for tasks where notion of 'batches' is reasonable for the data and the model. - -``` -from innofw.core.datamodules.lightning_datamodules.base import ( - BaseLightningDataModule, -) - - -class DataModule(BaseLightningDataModule): - def setup(self, *args, **kwargs): - pass - - def train_dataloader(self): - pass - - def val_dataloader(self): - pass - - def test_dataloader(self): - pass -``` - -Where each dataloader utilizes the dataset(similar term as -[torch's Dataset](https://pytorch.org/tutorials/beginner/basics/data_tutorial.html)) - -## Section 3. How to train a new model? 
- -### Pytorch model - -[//]: # (train a third-party model) - -[//]: # (link in the conf file) - - -[//]: # (train your own model) -If you have written your own model, for instance this dummy model: - -```python -import torch.nn as nn - - -class MNISTClassifier(nn.Module): - def __init__(self, hidden_dim: int = 100): - super().__init__() - self.layers = nn.Sequential( - nn.Linear(28 * 28, hidden_dim), - nn.Linear(hidden_dim, 10) - ) - def forward(self, x): - return self.layers(x) -``` +For full guide on installation using poetry or docker follow this [documentation page](google.com) -And you would like to add train it. Then you should do the following: - -1. add `task` and `framework` parameters - ```python - import torch.nn as nn - - - class MNISTClassifier(nn.Module): - task = ['image-classification'] - framework = ['torch'] - # rest of the code is the same - ``` - - **standard list of tasks:** - - image-classification - - image-segmentation - - image-detection - - table-regression - - table-classification - - table-clustering - ... - - **standard list of frameworks:** - - torch - - sklearn - - xgboost + -2. add the file with model to `innofw/core/models/torch/architectures/[task]/file_with_nn_module.py` -3. make sure dictionary in `get_default` in `innofw/utils/defaults.py` contains a mapping between your - task and a lightning module - if `task` has no corresponding `pytorch_lightning.LightningModule` add new implementation in this - folder `innofw/core/models/torch/lightning_modules/[task].py`. +# Covered Tasks: +- semantic segmentation +- image classification +- object detection +- tabular data regression +- tabular data classification +- tabular data clustering +- one-shot learning +- anomaly detection in time series - > for more information on lightning modules - visit [official documentation](https://pytorch-lightning.readthedocs.io/en/latest/common/lightning_module.html) -4. make sure you have suitable dataset class for your model. Refer to [chapter 2]() +# Models List: +- yolov5(+seg) +- segmentation-models.pytorch models +- sklearn +- torchvision's resnet18 +- lstm +- one-shot learning +- biobert -5. add configuration file to your model. +# FAQ +1. can I use pip over poetry? + Yes, there is a requirements.txt file - in `config/models/[model_name].yaml` define a `_target_` field and arguments for your model. +2. - For example: - ```yaml - _target_: innofw.core.models.torch.architectures.classification.MNISTClassifier - hidden_dim: 256 +# Troubleshooting +1. poetry does not install packages + try ``` + sudo -E env "PATH=$PATH" poetry install + ``` +2. poetry does not install packages -Now you are able to train and test your model! 😊 - -## Section 4. Start training, testing and inference - -1. Make sure you have needed working dataset configuration. See Section 2. -2. Make sure you have needed working model configuration file. See Section 3. -3. Write an experiment file - - For instance file in folder `config/experiments` named `KA_130722_yolov5.yaml` with contents: - ```yaml - - # @package _global_ - defaults: - - override /models: [model_config_name] - - override /datasets: [dataset_config_name] - - project: [project_name] - task: [task_name] - seed: 42 - epochs: 300 - batch_size: 4 - weights_path: /path/to/store/weights - weights_freq: 1 # weights saving frequency - - ckpt_path: /path/to/saved/model/weights.pt - ``` -4. Launch training - -```shell -python train.py experiments=KA_130722_yolov5.yaml -``` - -5. 
Launch testing - -```shell -python test.py experiments=KA_130722_yolov5.yaml -``` - -6. Launch inference - -```shell -python infer.py experiments=KA_130722_yolov5.yaml -``` - -## Section 5. Training on GPU - -[//]: # (#devices:) - -[//]: # () - -[//]: # (# - 0) - -[//]: # () - -[//]: # (#gpus:) - -[//]: # () - -[//]: # (# - 0 # 1) - -[//]: # () - -[//]: # (# - 2 # 3) - -[//]: # () - -[//]: # (# accelerator: cpu # цпу) - -[//]: # () - -[//]: # (# accelerator: gpu #) - -[//]: # () - -[//]: # (#gpus: -1 # все гпу) - -[//]: # () - -[//]: # (#) - -[//]: # () - -[//]: # (#) - -[//]: # () - -[//]: # (#accelerator: gpu) - -[//]: # (#devices: 2 # найдет свободные гпу) - -[//]: # (#accelerator: gpu # gpu, cpu) - -[//]: # (#devices: 2 # If the devices flag is not defined,) - -[//]: # () - -[//]: # (# it will assume devices to be "auto" and fetch the auto_device_count from the accelerator. [1]) - -[//]: # () - -[//]: # (#auto_select_gpus: True # False # [1]) - -[//]: # () - -References: - -1. https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html - -[//]: # () - -[//]: # (# 2.) - -[//]: # (## What is the difference between a wrapper and an adapter?) - -[//]: # () - -[//]: # (case: Wrapper) - -[//]: # (cfg -> MODEL -> Wrapper -> wrapped_model .fit) - -[//]: # (.test) - -[//]: # (.val) - -[//]: # (.predict # under the hood calls) - -[//]: # () - -[//]: # (case: Adapter) - -[//]: # (cfg -> adapted_model .fit # under the hood calls train function of underlying library/framework) +# Contributing +1. fork the framework +2. make a commit to your fork +3. make a pull request -[//]: # (.test # under the hood calls train function of underlying library/framework) -[//]: # (.val # under the hood calls train function of underlying library/framework) +We welcome any contribution from typo fixes to integration of new libraries with models, optimizers, augmentations etc. -[//]: # (.predict # under the hood calls train function of underlying library/framework) -## Section 6. Versioning rules -1) Framework versions must be specified in X.Y.Z format, where: -‒ X – older version (updates in case of big changes); -‒ Y – younger version (updates in case of small changes); -‒ Z - tiny changes (updates in case of tiny changes). -2) When one of the numbers is increased, all numbers after it must be set to zero. -3) Backward compatibility in software must be maintained in all versions with the same older version. +# Upcoming +1. models from mmsegmentation +2. models from huggingface +3. refactored datamodule +4. lion optimizer +inspirations: +1. lightning flash +2. ludwig +3. catalyst diff --git a/README_old.md b/README_old.md new file mode 100755 index 00000000..83c77f7c --- /dev/null +++ b/README_old.md @@ -0,0 +1,450 @@ +## Requirements + +- Linux|Windows operating system +- 30 GB+ of storage, 4 GB+ RAM +- Python 3.8+, <3.10 +- Poetry 1.1+ + +## Section 1. 
Configuring Loss Function + +Let's take a look at the following .yaml file + +```yaml + +task: + - image-segmentation + +implementations: + torch: # framework name + JaccardLoss: # name of the loss function(can any name) + weight: 0.3 # weight of the loss function(can be any float) + object: # can be `function` + # path to the class/function(can be a local path or a installed library code) + _target_: pytorch_toolbelt.losses.JaccardLoss + mode: binary # additional argument for the class + BinaryFocalLoss: # another loss, structure is similar to above described loss + weight: 0.7 + object: + _target_: pytorch_toolbelt.losses.BinaryFocalLoss +``` + +`task` denotes the type of the problem these loss functions were designed to be used. + +`implementations` contains information on how to instantiate the loss functions for different frameworks. + +Inner level is for framework names. Here we can use `torch`, `sklearn`, `xgboost` etc. +Inside of the framework level we have the names of the objects. Names are later used during logging. +You are free to select any name. + +Latter if we go inside the "name's level" we will have two fields: weight, object/function. +Weight is used to specify the weight of the loss function. + +#### Object/Function: + +**TL;DR** + +- if code to be instantiated is a function then name this field `function` +- if code to be instantiated is an object then name this field `object` + + +--- +Here we are choosing the type of the code we want to instantiate. + +It can be an `object` of a class or a `function`. +As functions cannot be instantiated right away without arguments. +We need to instantiate function later in the code when we receive arguments. + +Under the hood: + + object - gets instantiated + function - gets wrapped into a lambda function + +this allows us to have the same interface for both objects and functions later on. + +Example: + +In the following snippet we initialize the loss object `BinaryFocalLoss` + +```python +from pytorch_toolbelt.losses import BinaryFocalLoss +import torch + +criterion = BinaryFocalLoss() + +pred = torch.tensor([0.0, 0.0, 0.0]) +target = torch.tensor([1, 1, 1]) + +pred.unsqueeze_(0) +target.unsqueeze_(0) + +loss1 = criterion(pred, target) +``` + +In the following snippet we initialize the function `binary_cross_entropy` and pass arguments right away. + +```python +import torch +import torch.nn.functional as F + +pred = torch.tensor([0.0, 0.0, 0.0]) +target = torch.tensor([1, 1, 1]) + +pred.unsqueeze_(0) +target.unsqueeze_(0) + +loss1 = F.binary_cross_entropy(pred, target) +``` + +## Section 2. How to add your dataset? + +Now we will consider adding your custom dataset into the framework. + +1. Split your data into two folders: train and test. +2. Make sure that you have the corresponding datamodule to process your data. All the available datamodules stored in + `innofw/core/datamodules/`. Each datamodule has a `task` and `framework` attributes*. Pair of `task` and `framework` can + be duplicated, in this case difference is + in the data retrieval logic, select one that is more suitable for your problem. + 1. In case you have not found suitable datamodule then write your own. Refer + to [section 2.2](#Section-2.2.-Writing-own-datamodule). +3. 
Create a configuration file in config/datasets/[dataset_name].yaml\ + Dataset config file should be structured as follows: + ```yaml + + task: + - [dedicated task] + + name: [name of the dataset] + description: [specify dataset description] + + markup_info: [specify markup information] + date_time: [specify date] + + _target_: innofw.core.datamodules.[submodule].[submodule].[class_name] + + # =============== Data Paths ================= # + # use one of the following: + + # ====== 1. local data ====== # + train: + source: /path/to/file/or/folder + test: + source: /path/to/file/or/folder + # ====== 2. remote data ====== # + train: + source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/train.zip + target: folder/to/extract/train/ + test: + source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/test.zip + target: folder/to/extract/test/ + # ================================== # + + # some datamodules require additional arguments + # look for them in the documentation of each datamodule + # arguments passed in the following way: + arg1: value1 # here arg1 - name of the argument, value1 - value for the arg1 + arg2: value2 + # ... same for other datamodule arguments + +4. To run prediction on new data you should create an inference datamodule configuration file. Configuration file is + alike to file created in 3. + ```yaml + + task: + - [dedicated task] + + name: [name of the dataset] + description: [specify dataset description] + + markup_info: [specify markup information] + date_time: [specify date] + + _target_: innofw.core.datamodules.[submodule].[submodule].[class_name] + + # =============== Data Paths ================= # + # use one of the following: + + # ====== 1. local data ====== # + infer: + source: /path/to/file/or/folder + # ====== 2. remote data ====== # + infer: + source: https://api.blackhole.ai.innopolis.university/public-datasets/folder/infer.zip + target: folder/to/extract/infer/ + # ================================== # + + # some datamodules require additional arguments + # look for them in the documentation of each datamodule + # arguments passed in the following way: + arg1: value1 # here arg1 - name of the argument, value1 - value for the arg1 + arg2: value2 + # ... same for other datamodule arguments + + +* \* `task` refers to the problem type where this datamodule is used. `framework` refers to the framework type where + this datamodule is used + +### Section 2.2. Writing own datamodule + +Datamodule is a class which has the following responsibilities: +1. creation of data loaders for each dataset type: train, test, val and infer. +2. dataset setting up(e.g. downloading, preprocessing, creating additional files etc.) +3. model predictions saving - formatting the predictions provided by a model + + + +For now all of our data modules inherit from following two classes: `PandasDataModule`, `BaseLightningDataModule` + +[//]: # (features of each datamodule) +PandasDataModule is suitable for tasks with input provided as table. The class provides the data by first uploading it into RAM. +BaseLightningDataModule is suitable for tasks where notion of 'batches' is reasonable for the data and the model. 
+ +``` +from innofw.core.datamodules.lightning_datamodules.base import ( + BaseLightningDataModule, +) + + +class DataModule(BaseLightningDataModule): + def setup(self, *args, **kwargs): + pass + + def train_dataloader(self): + pass + + def val_dataloader(self): + pass + + def test_dataloader(self): + pass +``` + +Where each dataloader utilizes the dataset(similar term as +[torch's Dataset](https://pytorch.org/tutorials/beginner/basics/data_tutorial.html)) + +## Section 3. How to train a new model? + +### Pytorch model + +[//]: # (train a third-party model) + +[//]: # (link in the conf file) + + +[//]: # (train your own model) +If you have written your own model, for instance this dummy model: + +```python +import torch.nn as nn + + +class MNISTClassifier(nn.Module): + def __init__(self, hidden_dim: int = 100): + super().__init__() + self.layers = nn.Sequential( + nn.Linear(28 * 28, hidden_dim), + nn.Linear(hidden_dim, 10) + ) + + def forward(self, x): + return self.layers(x) +``` + +And you would like to add train it. Then you should do the following: + +1. add `task` and `framework` parameters + ```python + import torch.nn as nn + + + class MNISTClassifier(nn.Module): + task = ['image-classification'] + framework = ['torch'] + # rest of the code is the same + ``` + + **standard list of tasks:** + - image-classification + - image-segmentation + - image-detection + - table-regression + - table-classification + - table-clustering + ... + + **standard list of frameworks:** + - torch + - sklearn + - xgboost + + +2. add the file with model to `innofw/core/models/torch/architectures/[task]/file_with_nn_module.py` +3. make sure dictionary in `get_default` in `innofw/utils/defaults.py` contains a mapping between your + task and a lightning module + + if `task` has no corresponding `pytorch_lightning.LightningModule` add new implementation in this + folder `innofw/core/models/torch/lightning_modules/[task].py`. + + > for more information on lightning modules + visit [official documentation](https://pytorch-lightning.readthedocs.io/en/latest/common/lightning_module.html) + +4. make sure you have suitable dataset class for your model. Refer to [chapter 2]() + +5. add configuration file to your model. + + in `config/models/[model_name].yaml` define a `_target_` field and arguments for your model. + + For example: + + ```yaml + _target_: innofw.core.models.torch.architectures.classification.MNISTClassifier + hidden_dim: 256 + ``` + +Now you are able to train and test your model! 😊 + +## Section 4. Start training, testing and inference + +1. Make sure you have needed working dataset configuration. See Section 2. +2. Make sure you have needed working model configuration file. See Section 3. +3. Write an experiment file + + For instance file in folder `config/experiments` named `KA_130722_yolov5.yaml` with contents: + ```yaml + + # @package _global_ + defaults: + - override /models: [model_config_name] + - override /datasets: [dataset_config_name] + + project: [project_name] + task: [task_name] + seed: 42 + epochs: 300 + batch_size: 4 + weights_path: /path/to/store/weights + weights_freq: 1 # weights saving frequency + + ckpt_path: /path/to/saved/model/weights.pt + ``` +4. Launch training + +```shell +python train.py experiments=KA_130722_yolov5.yaml +``` + +5. Launch testing + +```shell +python test.py experiments=KA_130722_yolov5.yaml +``` + +6. Launch inference + +```shell +python infer.py experiments=KA_130722_yolov5.yaml +``` + +## Section 5. 
Training on GPU + +[//]: # (#devices:) + +[//]: # () + +[//]: # (# - 0) + +[//]: # () + +[//]: # (#gpus:) + +[//]: # () + +[//]: # (# - 0 # 1) + +[//]: # () + +[//]: # (# - 2 # 3) + +[//]: # () + +[//]: # (# accelerator: cpu # цпу) + +[//]: # () + +[//]: # (# accelerator: gpu #) + +[//]: # () + +[//]: # (#gpus: -1 # все гпу) + +[//]: # () + +[//]: # (#) + +[//]: # () + +[//]: # (#) + +[//]: # () + +[//]: # (#accelerator: gpu) + +[//]: # (#devices: 2 # найдет свободные гпу) + +[//]: # (#accelerator: gpu # gpu, cpu) + +[//]: # (#devices: 2 # If the devices flag is not defined,) + +[//]: # () + +[//]: # (# it will assume devices to be "auto" and fetch the auto_device_count from the accelerator. [1]) + +[//]: # () + +[//]: # (#auto_select_gpus: True # False # [1]) + +[//]: # () + +References: + +1. https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html + +[//]: # () + +[//]: # (# 2.) + +[//]: # (## What is the difference between a wrapper and an adapter?) + +[//]: # () + +[//]: # (case: Wrapper) + +[//]: # (cfg -> MODEL -> Wrapper -> wrapped_model .fit) + +[//]: # (.test) + +[//]: # (.val) + +[//]: # (.predict # under the hood calls) + +[//]: # () + +[//]: # (case: Adapter) + +[//]: # (cfg -> adapted_model .fit # under the hood calls train function of underlying library/framework) + + +[//]: # (.test # under the hood calls train function of underlying library/framework) + +[//]: # (.val # under the hood calls train function of underlying library/framework) + +[//]: # (.predict # under the hood calls train function of underlying library/framework) + +## Section 6. Versioning rules +1) Framework versions must be specified in X.Y.Z format, where: +‒ X – older version (updates in case of big changes); +‒ Y – younger version (updates in case of small changes); +‒ Z - tiny changes (updates in case of tiny changes). +2) When one of the numbers is increased, all numbers after it must be set to zero. +3) Backward compatibility in software must be maintained in all versions with the same older version. 
+ + diff --git a/config/experiments/template.yaml b/config/experiments/template.yaml new file mode 100644 index 00000000..bfb882bf --- /dev/null +++ b/config/experiments/template.yaml @@ -0,0 +1,12 @@ +# @package _global_ +defaults: + - override /models: faster_rcnn + - override /datasets: detection_wheat + + +project: "template" +task: "none" +random_seed: 42 +epochs: 1 +batch_size: 2 +weights_freq: 1 diff --git a/requirements.txt b/requirements.txt index 4f64fd84..6ecd69c3 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,6 +1,8 @@ absl-py==1.3.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:34995df9bd7a09b3b8749e230408f5a2a2dd7a68a0d33c12a3d0cb15a041a507 \ --hash=sha256:463c38a08d2e4cef6c498b76ba5bd4858e4c6ef51da1a5a1f27139a022e20248 +aeronet==0.0.18 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:298edb95e105e9bf3f7c3c5ddbfe0d2c9b9b9c602105d22498d8e1675b323f00 affine==2.3.1 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:d676de66157ad6af99ffd94e0f54e89dfc35b0fb7252ead2ed0ad2dca431bdd0 \ --hash=sha256:de17839ff05e965580870c3b15e14cefd7992fa05dba9202a0879bbed0c171e4 @@ -103,15 +105,15 @@ altair==4.2.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:d87d9372e63b48cd96b2a6415f0cf9457f50162ab79dc7a31cd7e024dd840026 antlr4-python3-runtime==4.9.3 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:f224469b4168294902bb1efa80a8bf7855f24c99aef99cbefc1bcd3cce77881b +appdirs==1.4.4 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41 \ + --hash=sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128 appnope==0.1.3 ; python_version >= "3.8" and python_version < "3.10" and sys_platform == "darwin" \ --hash=sha256:02bd91c4de869fbb1e1c50aafc4098827a7a54ab2f39d9dcba6c9547ed920e24 \ --hash=sha256:265a455292d0bd8a72453494fa24df5a11eb18373a60c7c0430889f22548605e asttokens==2.1.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:1b28ed85e254b724439afc783d4bee767f780b936c3fe8b3275332f42cf5f561 \ --hash=sha256:4aa76401a151c8cc572d906aad7aea2a841780834a19d780f4321c0fe1b54635 -astunparse==1.6.3 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:5ad93a8456f0d084c3456d059fd9a92cce667963232cbf763eac3bc5b7940872 \ - --hash=sha256:c2652417f2c8b5bb325c885ae329bdf3f86424075c4fd1a128674bc6fba4b8e8 async-timeout==4.0.2 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15 \ --hash=sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c @@ -275,6 +277,9 @@ deepchem==2.6.1 ; python_version >= "3.8" and python_version < "3.10" \ dill==0.3.5.1 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:33501d03270bbe410c72639b350e941882a8b0fd55357580fbc873fba0c59302 \ --hash=sha256:d75e41f3eff1eee599d738e76ba8f4ad98ea229db8b085318aa2b3333a208c86 +docker-pycreds==0.4.0 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:6ce3270bcaf404cc4c3e27e4b6c70d3521deae82fb508767870fdbf772d584d4 \ + --hash=sha256:7266112468627868005106ec19cd0d722702d2b7d5912a28e19b826c3d37af49 efficientnet-pytorch==0.6.3 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:6667459336893e9bf6367de3788ba449fed97f65da3b6782bf2204b6273a319f entrypoints==0.4 ; python_version >= "3.8" and python_version < "3.10" \ @@ -283,14 +288,14 @@ entrypoints==0.4 ; 
python_version >= "3.8" and python_version < "3.10" \ executing==1.2.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:0314a69e37426e3608aada02473b4161d4caf5a4b244d1d0c48072b8fee7bacc \ --hash=sha256:19da64c18d2d851112f09c287f8d3dbbdf725ab0e569077efb6cdcbd3497c107 +fastcore==1.5.27 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:79dffaa3de96066e4d7f2b8793f1a8a9468c82bc97d3d48ec002de34097b2a9f \ + --hash=sha256:c6b66b35569d17251e25999bafc7d9bcdd6446c1e710503c08670c3ff1eef271 filelock==3.8.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:55447caa666f2198c5b6b13a26d2084d26fa5b115c00d065664b2124680c4edc \ --hash=sha256:617eb4e5eedc82fc5f47b6d61e4d11cb837c56cb4544e39081099fa17ad109d4 fire==0.4.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:c5e2b8763699d1142393a46d0e3e790c5eb2f0706082df8f647878842c216a62 -flatbuffers==22.10.26 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:8698aaa635ca8cf805c7d8414d4a4a8ecbffadca0325fa60551cb3ca78612356 \ - --hash=sha256:e36d5ba7a5e9483ff0ec1d238fdc3011c866aab7f8ce77d5e9d445ac12071d84 fonttools==4.38.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:2bb244009f9bf3fa100fc3ead6aeb99febe5985fa20afbfbaa2f8946c2fbdaf1 \ --hash=sha256:820466f43c8be8c3009aef8b87e785014133508f0de64ec469e4efb643ae54fb @@ -357,9 +362,6 @@ frozenlist==1.3.1 ; python_version >= "3.8" and python_version < "3.10" \ fsspec[http]==2022.10.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:6b7c6ab3b476cdf17efcfeccde7fca28ef5a48f73a71010aaceec5fc15bf9ebf \ --hash=sha256:cb6092474e90487a51de768170f3afa50ca8982c26150a59072b16433879ff1d -gast==0.4.0 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:40feb7b8b8434785585ab224d1568b857edb18297e5a3047f1ba012bc83b42c1 \ - --hash=sha256:b7adcdd5adbebf1adf17378da5ba3f543684dbec47b1cda1f3997e573cd542c4 gitdb==4.0.9 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd \ --hash=sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa @@ -372,10 +374,6 @@ google-auth-oauthlib==0.4.6 ; python_version >= "3.8" and python_version < "3.10 google-auth==2.13.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:9352dd6394093169157e6971526bab9a2799244d68a94a4a609f0dd751ef6f5e \ --hash=sha256:99510e664155f1a3c0396a076b5deb6367c52ea04d280152c85ac7f51f50eb42 -google-pasta==0.2.0 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:4612951da876b1a10fe3960d7226f0c7682cf901e16ac06e473b267a5afa8954 \ - --hash=sha256:b32482794a366b5366a32c92a9a9201b107821889935a02b3e51f6b432ea84ed \ - --hash=sha256:c9f2c8dfc8f96d0d5808299920721be30c9eec37f2389f28904f454565c8a16e graphviz==0.20.1 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:587c58a223b51611c0cf461132da386edd896a029524ca61a1462b880bf97977 \ --hash=sha256:8c58f14adaa3b947daf26c19bc1e98c4e0702cdc31cf99153e6f06904d492bf8 @@ -485,11 +483,15 @@ joblib==1.2.0 ; python_version >= "3.8" and python_version < "3.10" \ jsonschema==4.16.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:165059f076eff6971bae5b742fc029a7b4ef3f9bcf04c14e4776a7605de14b23 \ --hash=sha256:9e74b8f9738d6a946d70705dc692b74b5429cd0960d58e79ffecfc43b2221eb9 +keras-applications==1.0.8 ; python_version >= "3.8" and python_version < "3.10" \ + 
--hash=sha256:5579f9a12bcde9748f4a12233925a59b93b73ae6947409ff34aa2ba258189fe5 \ + --hash=sha256:df4323692b8c1174af821bf906f1e442e63fa7589bf0f1230a0b6bdc5a810c95 keras-preprocessing==1.1.2 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:7b82029b130ff61cc99b55f3bd27427df4838576838c5b2f65940e4fcec99a7b \ --hash=sha256:add82567c50c8bc648c14195bf544a5ce7c1f76761536956c3d2978970179ef3 -keras==2.10.0 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:26a6e2c2522e7468ddea22710a99b3290493768fc08a39e75d1173a0e3452fdf +keras==2.2.4 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:794d0c92c6c4122f1f0fcf3a7bc2f49054c6a54ddbef8d8ffafca62795d760b6 \ + --hash=sha256:90b610a3dbbf6d257b20a079eba3fdf2eed2158f64066a7c6f7227023fd60bc9 kiwisolver==1.4.4 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:02f79693ec433cb4b5f51694e8477ae83b3205768a6fb48ffba60549080e295b \ --hash=sha256:03baab2d6b4a54ddbb43bba1a3a2d1627e82d205c5cf8f4c924dc49284b87166 \ @@ -559,17 +561,12 @@ kiwisolver==1.4.4 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:f6cb459eea32a4e2cf18ba5fcece2dbdf496384413bc1bae15583f19e567f3b2 \ --hash=sha256:f8ad8285b01b0d4695102546b342b493b3ccc6781fc28c8c6a1bb63e95d22f09 \ --hash=sha256:f9f39e2f049db33a908319cf46624a569b36983c7c78318e9726a4cb8923b26c -libclang==14.0.6 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:206d2789e4450a37d054e63b70451a6fc1873466397443fa13de2b3d4adb2796 \ - --hash=sha256:2e4303e04517fcd11173cb2e51a7070eed71e16ef45d4e26a82c5e881cac3d27 \ - --hash=sha256:5dd3c6fca1b007d308a4114afa8e4e9d32f32b2572520701d45fcc626ac5cd6c \ - --hash=sha256:7b06fc76bd1e67c8b04b5719bf2ac5d6a323b289b245dfa9e468561d99538188 \ - --hash=sha256:8791cf3c3b087c373a6d61e9199da7a541da922c9ddcfed1122090586b996d6e \ - --hash=sha256:9052a8284d8846984f6fa826b1d7460a66d3b23a486d782633b42b6e3b418789 \ - --hash=sha256:cfb0e892ebb5dff6bd498ab5778adb8581f26a00fd8347b3c76c989fe2fd04f7 \ - --hash=sha256:e2add1703129b2abe066fb1890afa880870a89fd6ab4ec5d2a7a8dc8d271677e \ - --hash=sha256:e429853939423f276a25140b0b702442d7da9a09e001c05e48df888336947614 \ - --hash=sha256:ea03c12675151837660cdd5dce65bd89320896ac3421efef43a36678f113ce95 +lovely-numpy==0.2.8 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:08de46fc872fc993c812ee9ddda2af4fc5f44263ed276b89afdb5518cb635f40 \ + --hash=sha256:3102f19cacd1ea209e0670c2f146cd74d005a74ad37dacd7071fd1e1945a9a4f +lovely-tensors==0.1.14 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:4e606640c5d26a90ceb81ae8744e31fcaa281e766b49cebb72bb8fd45c705e1b \ + --hash=sha256:db8428876774fcc6e64f7224eef4621568c7e2e95e4117ba553b9b017f6184f6 lxml==4.9.1 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:04da965dfebb5dac2619cb90fcf93efdb35b3c6994fea58a157a834f2f94b318 \ --hash=sha256:0538747a9d7827ce3e16a8fdd201a99e661c7dee3c96c885d8ecba3c35d1032c \ @@ -821,7 +818,7 @@ munch==2.5.0 ; python_version >= "3.8" and python_version < "3.10" \ networkx==2.8.7 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:15cdf7f7c157637107ea690cabbc488018f8256fa28242aed0fb24c93c03a06d \ --hash=sha256:815383fd52ece0a7024b5fd8408cc13a389ea350cd912178b82eed8b96f82cd3 -numpy==1.23.4 ; python_version < "3.10" and python_version >= "3.8" \ +numpy==1.23.4 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:0fe563fc8ed9dc4474cbf70742673fc4391d70f4363f917599a7fa99f042d5a8 \ 
--hash=sha256:12ac457b63ec8ded85d85c1e17d85efd3c2b0967ca39560b307a35a6703a4735 \ --hash=sha256:2341f4ab6dba0834b685cce16dad5f9b6606ea8a00e6da154f5dbded70fdc4dd \ @@ -874,9 +871,6 @@ opencv-python==4.6.0.66 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:dbdc84a9b4ea2cbae33861652d25093944b9959279200b7ae0badd32439f74de \ --hash=sha256:e6e448b62afc95c5b58f97e87ef84699e6607fe5c58730a03301c52496005cae \ --hash=sha256:f482e78de6e7b0b060ff994ffd859bddc3f7f382bb2019ef157b0ea8ca8712f5 -opt-einsum==3.3.0 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:2455e59e3947d3c275477df7f5205b30635e266fe6dc300e3d9f9646bfcea147 \ - --hash=sha256:59f6475f77bbc37dcf7cd748519c0ec60722e91e63ca114e68821c0c54a46549 packaging==21.3 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb \ --hash=sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522 @@ -911,6 +905,8 @@ pandas==1.5.1 ; python_version >= "3.8" and python_version < "3.10" \ parso==0.8.3 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0 \ --hash=sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75 +pathtools==0.1.2 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:7c35c5421a39bb82e58018febd90e3b6e5db34c5443aaaf742b3f33d4655f1c0 patool==1.12 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:3f642549c9a78f5b8bef1af92df385b521d360520d1f34e4dba3fd1dee2a21bc \ --hash=sha256:e3180cf8bfe13bedbcf6f5628452fca0c2c84a3b5ae8c2d3f55720ea04cb1097 @@ -982,6 +978,9 @@ pillow==9.3.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:e6ea6b856a74d560d9326c0f5895ef8050126acfdc7ca08ad703eb0081e82b74 \ --hash=sha256:ebf2029c1f464c59b8bdbe5143c79fa2045a581ac53679733d3a91d400ff9efb \ --hash=sha256:f1ff2ee69f10f13a9596480335f406dd1f70c3650349e2be67ca3139280cade0 +pip==22.3.1 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:65fd48317359f3af8e593943e6ae1506b66325085ea64b706a998c6e83eeaf38 \ + --hash=sha256:908c78e6bc29b676ede1c4d57981d490cb892eb45cd8c214ab6298125119e077 pkgutil-resolve-name==1.3.10 ; python_version >= "3.8" and python_version < "3.9" \ --hash=sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174 \ --hash=sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e @@ -1407,12 +1406,58 @@ rich==12.6.0 ; python_version >= "3.8" and python_version < "3.10" \ rsa==4.9 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7 \ --hash=sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21 +rtree==1.0.1 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:004e131b570dc360a49e7f3b60e7bc6517943a54df056587964d1cb903889e7e \ + --hash=sha256:015df09e1bc55ddf7c88799bf1515d058cd0ee78eacf4cd443a32876d3b3a863 \ + --hash=sha256:0d68a81ad419d5c2ea5fecc677e6c178666c057e2c7b24100a6c48392196f1e9 \ + --hash=sha256:11d16f51cf9205cd6995af36e24efe8f184270f667fb49bb69b09fc46b97e7d4 \ + --hash=sha256:157207191aebdacbbdbb369e698cfbfebce53bc97114e96c8af5bed3126475f1 \ + --hash=sha256:16900ee02cf5c198a42b03635268a80f606aa102f3f7618b89f75023d406da1c \ + --hash=sha256:18ce7e4d04b85c48f2d364835620b3b20e38e199639746e7b12f07a2303e18ff \ + --hash=sha256:1a213e5d385278ca7668bc5b27083f8d6e39996a9bd59b6528f3a30009dae4ed \ + 
--hash=sha256:1a94e2f4bf74bd202ea8b67ea3d7c71e763ad41f79be1d6b72aa2c8d5a8e92c4 \ + --hash=sha256:1e894112cef4de6c518bdea0b43eada65f12888c3645cc437c3a677aa023039f \ + --hash=sha256:222121699c303a64065d849bf7038b1ecabc37b65c7fa340bedb38ef0e805429 \ + --hash=sha256:273ee61783de3a1664e5f868feebf5eea4629447137751bfa4087b0f82093082 \ + --hash=sha256:296203e933b6ec0dd07f6a7456c4f1492def95b6993f20cc61c92b0fee0aecc5 \ + --hash=sha256:2ee7165e9872a026ccb868c021711eba39cedf7d1820763c9de52d5324691a92 \ + --hash=sha256:3573cbb0de872f54d0a0c29596a84e8ac3939c47ca3bece4a82e92775730a0d0 \ + --hash=sha256:50b658a6707f215a0056d52e9f83a97148c0af62dea07cf29b3789a2c429e78a \ + --hash=sha256:57128293dd625cb1f07726f32208097953e8854d70ab1fc55d6858733618b9ed \ + --hash=sha256:582854252b8fd5c8472478af060635434931fb55edd269bac128cbf2eef43620 \ + --hash=sha256:5b20f69e040a05503b22297af223f336fe7047909b57e4b207b98292f33a229f \ + --hash=sha256:62f38020af47b765adc6b0bc7c4e810c6c3d1eab44ba339b592ff25a4c0dc0a7 \ + --hash=sha256:656b148589c0b5bab4a7db4d033634329f42a5feaac10ca40aceeca109d83c1f \ + --hash=sha256:6792de0e3c2fd3ad7e069445027603bec7a47000432f49c80246886311f4f152 \ + --hash=sha256:698de8ce6c62e159d93b35bacf64bcf3619077b5367bc88cd2cff5e0bc36169b \ + --hash=sha256:6ce4a6fdb63254a4c1efebe7a4f7a59b1c333c703bde4ae715d9ad88c833e10b \ + --hash=sha256:6db6a0a93e41594ffc14b053f386dd414ab5a82535bbd9aedafa6ac8dc0650d8 \ + --hash=sha256:77908cd7acdd519a731979ebf5baff8afd102109c2f52864c1e6ee75d3ea2d87 \ + --hash=sha256:784efa6b7be9e99b33613ae8495931032689441eabb6120c9b3eb91188c33794 \ + --hash=sha256:7b2c15f9373ba314c83a8df5cb6d99b4e3af23c376c6b1317add995432dd0970 \ + --hash=sha256:7e3d5f0e7b28250afbb290ab88b49aa0f121c9714d0da2080581783690347507 \ + --hash=sha256:8de99f28af0f1783eefb80918959903b4b18112f6a12b48f296ecb162804e69d \ + --hash=sha256:93c5e0bf31e76b4f92a6eec3d2891e938408774c75a8ed6ac3d2c8db04a2be33 \ + --hash=sha256:9855b8f11cdad99c56eb361b7b632a4fbd3d8cbe3f2081426b445f0cfb7fdca9 \ + --hash=sha256:ad9912faeddb1ddcec5e26b33089166d58a107af6862d8b7f1bb2b7c0002ab39 \ + --hash=sha256:b31fd22d214160859d038da7cb2aaa27acb71efc24a7bcc75c84b5e502721549 \ + --hash=sha256:b54057e8a8ad92c1d8e9eaa5cf32aad70dde454abbf9b638e9d6024520a52c02 \ + --hash=sha256:becd711fe97c2e09b1b7969e83080a3c8012bce2d30f6db879aade255fcba5c1 \ + --hash=sha256:c2973b76f61669a85e160b4ad09879c4089fc0e3f20fd99adf161ca298fe8374 \ + --hash=sha256:c5fb3671a8d440c24b1dd29ec621d4345ced7185e26f02abe98e85a6629fcb50 \ + --hash=sha256:c6e29e5eb3083ad12ac5c1ce6e37465ea3428d894d3466cc9c9e2ee4bf768e53 \ + --hash=sha256:cfa8cffec5cb9fed494c4bb335ebdb69b3c26178b0b685f67f79296c6b3d800c \ + --hash=sha256:d5abe5a19d943a88bea14901970e4c53e4579fc2662404cdea6163bf4c04d49a \ + --hash=sha256:e4335e131a58952635560a003458011d97f9ea6f3c010dc24906050b42ee2c03 \ + --hash=sha256:e7ca5d743f6a1dc62653dfac8ee7ce2e1ba91be7cf97916a7f60b7cbe48fb48d \ + --hash=sha256:e898d7409ab645c25e06d4e058f99271182601d70b2887aba3351bf08e09a0c6 \ + --hash=sha256:f5120da3a1b96f3a7a17dd6af0afdd4e6f3cc9baa87e9ee0a272882f01f980bb s3transfer==0.6.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:06176b74f3a15f61f1b4f25a1fc29a4429040b7647133a463da8fa5bd28d5ecd \ --hash=sha256:2ed07d3866f523cc561bf4a00fc5535827981b117dd7876f036b0c1aca42c947 -sahi==0.11.10 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:7d3b038edfcbeeb393e07d1717bb0fc9d29767a197b9fe5f14b2a9f67c64a503 \ - --hash=sha256:b9b483e7adf4e7a2d9ad95e787a90ac62e7e391b0dcd4585014031820eadda7d +sahi==0.11.11 ; 
python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:30be5bbebc7d8cc56488b1edcec91e92f102ff7648f36c034dba580b3812fd0b \ + --hash=sha256:ee4be3dc79a1d64a3fdd5a2277c2bcfc46103ba4d184c6a1290a19ac11b79de5 scikit-image==0.19.3 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:03779a7e1736fdf89d83c0ba67d44110496edd736a3bfce61a2b5177a1c8a099 \ --hash=sha256:0b0a199157ce8487c77de4fde0edc0b42d6d42818881c11f459262351d678b2d \ @@ -1496,9 +1541,82 @@ selfies==2.1.1 ; python_version >= "3.8" and python_version < "3.10" \ semver==2.13.0 ; python_version >= "3.8" and python_version < "3.10" \ --hash=sha256:ced8b23dceb22134307c1b8abfa523da14198793d9787ac838e70e29e77458d4 \ --hash=sha256:fa0fe2722ee1c3f57eac478820c3a5ae2f624af8264cbdf9000c980ff7f75e3f -setuptools-scm==7.0.5 ; python_version >= "3.8" and python_version < "3.10" \ - --hash=sha256:031e13af771d6f892b941adb6ea04545bbf91ebc5ce68c78aaf3fff6e1fb4844 \ - --hash=sha256:7930f720905e03ccd1e1d821db521bff7ec2ac9cf0ceb6552dd73d24a45d3b02 +sentry-sdk==1.14.0 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:273fe05adf052b40fd19f6d4b9a5556316807246bd817e5e3482930730726bb0 \ + --hash=sha256:72c00322217d813cf493fe76590b23a757e063ff62fec59299f4af7201dd4448 +setproctitle==1.3.2 ; python_version >= "3.8" and python_version < "3.10" \ + --hash=sha256:1c5d5dad7c28bdd1ec4187d818e43796f58a845aa892bb4481587010dc4d362b \ + --hash=sha256:1c8d9650154afaa86a44ff195b7b10d683c73509d085339d174e394a22cccbb9 \ + --hash=sha256:1f0cde41857a644b7353a0060b5f94f7ba7cf593ebde5a1094da1be581ac9a31 \ + --hash=sha256:1f29b75e86260b0ab59adb12661ef9f113d2f93a59951373eb6d68a852b13e83 \ + --hash=sha256:1fa1a0fbee72b47dc339c87c890d3c03a72ea65c061ade3204f285582f2da30f \ + --hash=sha256:1ff863a20d1ff6ba2c24e22436a3daa3cd80be1dfb26891aae73f61b54b04aca \ + --hash=sha256:265ecbe2c6eafe82e104f994ddd7c811520acdd0647b73f65c24f51374cf9494 \ + --hash=sha256:288943dec88e178bb2fd868adf491197cc0fc8b6810416b1c6775e686bab87fe \ + --hash=sha256:2a97d51c17d438cf5be284775a322d57b7ca9505bb7e118c28b1824ecaf8aeaa \ + --hash=sha256:2e3ac25bfc4a0f29d2409650c7532d5ddfdbf29f16f8a256fc31c47d0dc05172 \ + --hash=sha256:2fbd8187948284293f43533c150cd69a0e4192c83c377da837dbcd29f6b83084 \ + --hash=sha256:37ece938110cab2bb3957e3910af8152ca15f2b6efdf4f2612e3f6b7e5459b80 \ + --hash=sha256:4058564195b975ddc3f0462375c533cce310ccdd41b80ac9aed641c296c3eff4 \ + --hash=sha256:4749a2b0c9ac52f864d13cee94546606f92b981b50e46226f7f830a56a9dc8e1 \ + --hash=sha256:4bba3be4c1fabf170595b71f3af46c6d482fbe7d9e0563999b49999a31876f77 \ + --hash=sha256:4d8938249a7cea45ab7e1e48b77685d0f2bab1ebfa9dde23e94ab97968996a7c \ + --hash=sha256:5194b4969f82ea842a4f6af2f82cd16ebdc3f1771fb2771796e6add9835c1973 \ + --hash=sha256:55ce1e9925ce1765865442ede9dca0ba9bde10593fcd570b1f0fa25d3ec6b31c \ + --hash=sha256:570d255fd99c7f14d8f91363c3ea96bd54f8742275796bca67e1414aeca7d8c3 \ + --hash=sha256:587c7d6780109fbd8a627758063d08ab0421377c0853780e5c356873cdf0f077 \ + --hash=sha256:589be87172b238f839e19f146b9ea47c71e413e951ef0dc6db4218ddacf3c202 \ + --hash=sha256:5b932c3041aa924163f4aab970c2f0e6b4d9d773f4d50326e0ea1cd69240e5c5 \ + --hash=sha256:5fb4f769c02f63fac90989711a3fee83919f47ae9afd4758ced5d86596318c65 \ + --hash=sha256:630f6fe5e24a619ccf970c78e084319ee8be5be253ecc9b5b216b0f474f5ef18 \ + --hash=sha256:65d884e22037b23fa25b2baf1a3316602ed5c5971eb3e9d771a38c3a69ce6e13 \ + --hash=sha256:6c877691b90026670e5a70adfbcc735460a9f4c274d35ec5e8a43ce3f8443005 \ + 
--hash=sha256:710e16fa3bade3b026907e4a5e841124983620046166f355bbb84be364bf2a02 \ + --hash=sha256:7a55fe05f15c10e8c705038777656fe45e3bd676d49ad9ac8370b75c66dd7cd7 \ + --hash=sha256:7aa0aac1711fadffc1d51e9d00a3bea61f68443d6ac0241a224e4d622489d665 \ + --hash=sha256:7f0bed90a216ef28b9d227d8d73e28a8c9b88c0f48a082d13ab3fa83c581488f \ + --hash=sha256:7f2719a398e1a2c01c2a63bf30377a34d0b6ef61946ab9cf4d550733af8f1ef1 \ + --hash=sha256:7fe9df7aeb8c64db6c34fc3b13271a363475d77bc157d3f00275a53910cb1989 \ + --hash=sha256:88486e6cce2a18a033013d17b30a594f1c5cb42520c49c19e6ade40b864bb7ff \ + --hash=sha256:8e4f8f12258a8739c565292a551c3db62cca4ed4f6b6126664e2381acb4931bf \ + --hash=sha256:8ff3c8cb26afaed25e8bca7b9dd0c1e36de71f35a3a0706b5c0d5172587a3827 \ + --hash=sha256:9124bedd8006b0e04d4e8a71a0945da9b67e7a4ab88fdad7b1440dc5b6122c42 \ + --hash=sha256:92c626edc66169a1b09e9541b9c0c9f10488447d8a2b1d87c8f0672e771bc927 \ + --hash=sha256:a149a5f7f2c5a065d4e63cb0d7a4b6d3b66e6e80f12e3f8827c4f63974cbf122 \ + --hash=sha256:a47d97a75fd2d10c37410b180f67a5835cb1d8fdea2648fd7f359d4277f180b9 \ + --hash=sha256:a499fff50387c1520c085a07578a000123f519e5f3eee61dd68e1d301659651f \ + --hash=sha256:a8e0881568c5e6beff91ef73c0ec8ac2a9d3ecc9edd6bd83c31ca34f770910c4 \ + --hash=sha256:ab45146c71ca6592c9cc8b354a2cc9cc4843c33efcbe1d245d7d37ce9696552d \ + --hash=sha256:b2c9cb2705fc84cb8798f1ba74194f4c080aaef19d9dae843591c09b97678e98 \ + --hash=sha256:b34baef93bfb20a8ecb930e395ccd2ae3268050d8cf4fe187de5e2bd806fd796 \ + --hash=sha256:b617f12c9be61e8f4b2857be4a4319754756845dbbbd9c3718f468bbb1e17bcb \ + --hash=sha256:b9fb97907c830d260fa0658ed58afd48a86b2b88aac521135c352ff7fd3477fd \ + --hash=sha256:bae283e85fc084b18ffeb92e061ff7ac5af9e183c9d1345c93e178c3e5069cbe \ + --hash=sha256:c2c46200656280a064073447ebd363937562debef329482fd7e570c8d498f806 \ + --hash=sha256:c8a09d570b39517de10ee5b718730e171251ce63bbb890c430c725c8c53d4484 \ + --hash=sha256:c91b9bc8985d00239f7dc08a49927a7ca1ca8a6af2c3890feec3ed9665b6f91e \ + --hash=sha256:ca58cd260ea02759238d994cfae844fc8b1e206c684beb8f38877dcab8451dfc \ + --hash=sha256:d7d17c8bd073cbf8d141993db45145a70b307385b69171d6b54bcf23e5d644de \ + --hash=sha256:dad42e676c5261eb50fdb16bdf3e2771cf8f99a79ef69ba88729aeb3472d8575 \ + --hash=sha256:db684d6bbb735a80bcbc3737856385b55d53f8a44ce9b46e9a5682c5133a9bf7 \ + --hash=sha256:de3a540cd1817ede31f530d20e6a4935bbc1b145fd8f8cf393903b1e02f1ae76 \ + --hash=sha256:e00c9d5c541a2713ba0e657e0303bf96ddddc412ef4761676adc35df35d7c246 \ + --hash=sha256:e1aafc91cbdacc9e5fe712c52077369168e6b6c346f3a9d51bf600b53eae56bb \ + --hash=sha256:e425be62524dc0c593985da794ee73eb8a17abb10fe692ee43bb39e201d7a099 \ + --hash=sha256:e43f315c68aa61cbdef522a2272c5a5b9b8fd03c301d3167b5e1343ef50c676c \ + --hash=sha256:e49ae693306d7624015f31cb3e82708916759d592c2e5f72a35c8f4cc8aef258 \ + --hash=sha256:e5c50e164cd2459bc5137c15288a9ef57160fd5cbf293265ea3c45efe7870865 \ + --hash=sha256:e8579a43eafd246e285eb3a5b939e7158073d5087aacdd2308f23200eac2458b \ + --hash=sha256:e85e50b9c67854f89635a86247412f3ad66b132a4d8534ac017547197c88f27d \ + --hash=sha256:e932089c35a396dc31a5a1fc49889dd559548d14cb2237adae260382a090382e \ + --hash=sha256:f0452282258dfcc01697026a8841258dd2057c4438b43914b611bccbcd048f10 \ + --hash=sha256:f4bfc89bd33ebb8e4c0e9846a09b1f5a4a86f5cb7a317e75cc42fee1131b4f4f \ + --hash=sha256:fa2f50678f04fda7a75d0fe5dd02bbdd3b13cbe6ed4cf626e4472a7ccf47ae94 \ + --hash=sha256:faec934cfe5fd6ac1151c02e67156c3f526e82f96b24d550b5d51efa4a5527c6 \ + 
--hash=sha256:fcd3cf4286a60fdc95451d8d14e0389a6b4f5cebe02c7f2609325eb016535963 \
+    --hash=sha256:fe8a988c7220c002c45347430993830666e55bc350179d91fcee0feafe64e1d4 \
+    --hash=sha256:fed18e44711c5af4b681c2b3b18f85e6f0f1b2370a28854c645d636d5305ccd8 \
+    --hash=sha256:ffc61a388a5834a97953d6444a2888c24a05f2e333f9ed49f977a87bb1ad4761
 setuptools==65.6.0 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:6211d2f5eddad8757bd0484923ca7c0a6302ebc4ab32ea5e94357176e0ca0840 \
     --hash=sha256:d1eebf881c6114e51df1664bc2c9133d022f78d12d5f4f665b9191f084e2862d
@@ -1515,6 +1633,7 @@ shapely==1.8.5.post1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:38f0fbbcb8ca20c16451c966c1f527cc43968e121c8a048af19ed3e339a921cd \
     --hash=sha256:4728666fff8cccc65a07448cae72c75a8773fea061c3f4f139c44adc429b18c3 \
     --hash=sha256:48dcfffb9e225c0481120f4bdf622131c8c95f342b00b158cdbe220edbbe20b6 \
+    --hash=sha256:4b47bb6f9369e8bf3e6dbd33e6a25a47ee02b2874792a529fe04a49bf8bc0df6 \
     --hash=sha256:532a55ee2a6c52d23d6f7d1567c8f0473635f3b270262c44e1b0c88096827e22 \
     --hash=sha256:5d7f85c2d35d39ff53c9216bc76b7641c52326f7e09aaad1789a3611a0f812f2 \
     --hash=sha256:65b21243d8f6bcd421210daf1fabb9de84de2c04353c5b026173b88d17c1a581 \
@@ -1525,6 +1644,7 @@ shapely==1.8.5.post1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:783bad5f48e2708a0e2f695a34ed382e4162c795cb2f0368b39528ac1d6db7ed \
     --hash=sha256:78fb9d929b8ee15cfd424b6c10879ce1907f24e05fb83310fc47d2cd27088e40 \
     --hash=sha256:84010db15eb364a52b74ea8804ef92a6a930dfc1981d17a369444b6ddec66efd \
+    --hash=sha256:89164e7a9776a19e29f01369a98529321994e2e4d852b92b7e01d4d9804c55bf \
     --hash=sha256:8d086591f744be483b34628b391d741e46f2645fe37594319e0a673cc2c26bcf \
     --hash=sha256:8e59817b0fe63d34baedaabba8c393c0090f061917d18fc0bcc2f621937a8f73 \
     --hash=sha256:99a2f0da0109e81e0c101a2b4cd8412f73f5f299e7b5b2deaf64cd2a100ac118 \
@@ -1539,6 +1659,7 @@ shapely==1.8.5.post1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:c2822111ddc5bcfb116e6c663e403579d0fe3f147d2a97426011a191c43a7458 \
     --hash=sha256:c6a9a4a31cd6e86d0fbe8473ceed83d4fe760b19d949fb557ef668defafea0f6 \
     --hash=sha256:d048f93e42ba578b82758c15d8ae037d08e69d91d9872bca5a1895b118f4e2b0 \
+    --hash=sha256:d8a2b2a65fa7f97115c1cd989fe9d6f39281ca2a8a014f1d4904c1a6e34d7f25 \
     --hash=sha256:e9c30b311de2513555ab02464ebb76115d242842b29c412f5a9aa0cac57be9f6 \
     --hash=sha256:ec14ceca36f67cb48b34d02d7f65a9acae15cd72b48e303531893ba4a960f3ea \
     --hash=sha256:ef3be705c3eac282a28058e6c6e5503419b250f482320df2172abcbea642c831
@@ -1571,42 +1692,6 @@ tensorboard-plugin-wit==1.8.1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:ff26bdd583d155aa951ee3b152b3d0cffae8005dc697f72b44a8e8c2a77a8cbe
 tensorboard==2.10.1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:fb9222c1750e2fa35ef170d998a1e229f626eeced3004494a8849c88c15d8c1c
-tensorflow-estimator==2.10.0 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:f324ea17cd57f16e33bf188711d5077e6b2e5f5a12c328d6e01a07b23888edcd
-tensorflow-io-gcs-filesystem==0.27.0 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:043008e51e920028b7c564795d82d2487b0baf6bdb23cb9d84796c4a8fcab668 \
-    --hash=sha256:152f4c20e5341d486df35f7ce9751a441ed89b43c1036491cd2b30a742fbe20a \
-    --hash=sha256:1ad97ef862c1fb3f7ba6fe3cb5de25cb41d1c55121deaf00c590a5726a7afe88 \
-    --hash=sha256:244754af85090d3fdd67c0b160bce8509e9a43fefccb295e3c9b72df21d9db61 \
-    --hash=sha256:3e510134375ed0467d1d90cd80b762b68e93b429fe7b9b38a953e3fe4306536f \
-    --hash=sha256:4cc906a12bbd788be071e2dab333f953e82938b77f93429e55ad4b4bfd77072a \
-    --hash=sha256:564a7de156650cac9e1e361dabd6b5733a4ef31f5f11ef5eebf7fe694128334f \
-    --hash=sha256:5c809435233893c0df80dce3d10d310885c86dcfb08ca9ebb55e0fcb8a4e13ac \
-    --hash=sha256:8d2c01ba916866204b70f96103bbaa24655b1e7b416b399e49dce893a7835aa7 \
-    --hash=sha256:9cf6a8efc35a04a8c3d5ec4c6b6e4931a6bc8d4e1f9d9aa0bad5fd272941c886 \
-    --hash=sha256:b3a0ebfeac11507f6fc96162b8b22010b7d715bb0848311e54ef18d88f07014a \
-    --hash=sha256:babca2a12755badd1517043f9d633823533fbd7b463d7d36e9e6179b246731dc \
-    --hash=sha256:c22c71ee80f131b2d55d53a3c66a910156004c2dcba976cabd8deeb5e236397a \
-    --hash=sha256:e21842a0a7c906525884bdbdc6d82bcfec98c6da5bafe7bfc89fd7253fcab5cf \
-    --hash=sha256:ed17c281a28df9ab0547cdf166e885208d2a43db0f0f8fbe66addc4e23ee36ff \
-    --hash=sha256:f7d24da555e2a1fe890b020b1953819ad990e31e63088a77ce87b7ffa67a7aaf
-tensorflow==2.10.0 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:0701da16a3d6d34763cd9ced6467cee24c02c9abf0d1a48ba59ea5a8d0421cec \
-    --hash=sha256:0a3b58d90fadb5bdf81a964bea73bb89019a9d1e9ac12de75375c8f65e0d7570 \
-    --hash=sha256:25e1e898bc1df521af9a8bfe0e511124379a6414083234ec67c6ab212ad12b2f \
-    --hash=sha256:487918f4074685e213ba247387faab34933df76939134008441cb9d3e2c95cab \
-    --hash=sha256:4b542af76d93c43e9d24dcb69888793831e434dc781c9533ee07f928fce84a15 \
-    --hash=sha256:5806d4645bce5eb415863d757b5f056364b9d1cfa2c34f711f69d46cac605eee \
-    --hash=sha256:60d5b4fbbb7a1304d96352372fa032e861e98bb3f23aced7ce53bc475a2df97d \
-    --hash=sha256:64cc999ae83ddd891083141d3e5d718e3d799501a1b56c544f2ca648a8396c3e \
-    --hash=sha256:741a74278f471dc21991a6c7dc802d454d42fd39515900c6363b8c38a898fb0f \
-    --hash=sha256:8773858cbf37aaad444b07605d29f5b2d8f7cd1ecbf1cce2777931b96884589c \
-    --hash=sha256:9f4677e9ab7104e73710a94ff5d2ed4b335378dcd2ac7402a68c31802a680911 \
-    --hash=sha256:c588a1f34d9db51ea856aff07da9aa877c1d1d109336eee2c3bbb16dabd3f605 \
-    --hash=sha256:d9b19b5120c0b393d9e2fc72561cfa3a454ef7f1ac649d8ad0dcc98817a086a4 \
-    --hash=sha256:d9f711c5ff04333355c83eb96ca2e1db57c9663c6fa01d68b5953a040a602a3c \
-    --hash=sha256:e129114dc529e63af9c419b5917b3407d0d26a4c8b73e114f601a175a7eb0477 \
-    --hash=sha256:e85f89bc23c62d4243fad70bac902f00a234b33da8b91e2967eeef0f4b75b1e3
 termcolor==2.1.0 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:91dd04fdf661b89d7169cefd35f609b19ca931eb033687eaa647cef1ff177c49 \
     --hash=sha256:b80df54667ce4f48c03fe35df194f052dc27a541ebbf2544e4d6b47b5d6949c4
@@ -1659,9 +1744,6 @@ tokenizers==0.13.1 ; python_version >= "3.8" and python_version < "3.10" \
 toml==0.10.2 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b \
     --hash=sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f
-tomli==2.0.1 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc \
-    --hash=sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f
 toolz==0.12.0 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:2059bd4148deb1884bb0eb770a3cde70e7f954cfbbdc2285f1f2de01fd21eb6f \
     --hash=sha256:88c570861c440ee3f2f6037c4654613228ff40c93a6c25e0eba70d17282c6194
@@ -1686,9 +1768,9 @@ torch==1.12.1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:e9c8f4a311ac29fc7e8e955cfb7733deb5dbe1bdaabf5d4af2765695824b7e0d \
     --hash=sha256:f00c721f489089dc6364a01fd84906348fe02243d0af737f944fddb36003400d \
     --hash=sha256:f3b52a634e62821e747e872084ab32fbcb01b7fa7dbb7471b6218279f02a178a
-torchmetrics==0.10.1 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:e69fae7c6597ba505753f48e0f4ae1e73abde56c28fdc0c3baae826ec8c1d213 \
-    --hash=sha256:e892ecd413e6bf63950329d1317c70f697d81d0f7e386152238062e322c8f1f3
+torchmetrics==0.11.1 ; python_version >= "3.8" and python_version < "3.10" \
+    --hash=sha256:9987d7c21b081cceef246a72be1ce25bf29c842764f59dda54f59e3b4cd1970b \
+    --hash=sha256:de2e9feb3316f798ab08b318302ff04e764f47e691f0847f780044279fa176ca
 torchvision==0.13.1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:0298bae3b09ac361866088434008d82b99d6458fe8888c8df90720ef4b347d44 \
     --hash=sha256:08f592ea61836ebeceb5c97f4d7a813b9d7dc651bbf7ce4401563ccfae6a21fc \
@@ -1747,6 +1829,9 @@ urlpath==1.2.0 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:e54c0c82db4894a7217772150bdbc01413794576996e7834f81d67f22359c9d0
 validators==0.20.0 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:24148ce4e64100a2d5e267233e23e7afeb55316b47d30faae7eb6e7292bc226a
+wandb==0.13.9 ; python_version >= "3.8" and python_version < "3.10" \
+    --hash=sha256:0a17365ce1f18306ce7a7f16b943094fac7284bb85f4e52c0685705602f9e307 \
+    --hash=sha256:b8752e5287aca9f8192eca7be352882975973cd3cd0c88815930498fd357569d
 watchdog==2.1.9 ; python_version >= "3.8" and python_version < "3.10" and platform_system != "Darwin" \
     --hash=sha256:083171652584e1b8829581f965b9b7723ca5f9a2cd7e20271edf264cfd7c1412 \
     --hash=sha256:117ffc6ec261639a0209a3252546b12800670d4bf5f84fbd355957a0595fe654 \
@@ -1782,71 +1867,6 @@ werkzeug==2.2.2 ; python_version >= "3.8" and python_version < "3.10" \
 wheel==0.37.1 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:4bdcd7d840138086126cd09254dc6195fb4fc6f01c050a1d7236f2630db1d22a \
     --hash=sha256:e9a504e793efbca1b8e0e9cb979a249cf4a0a7b5b8c9e8b65a5e39d49529c1c4
-wrapt==1.14.1 ; python_version >= "3.8" and python_version < "3.10" \
-    --hash=sha256:00b6d4ea20a906c0ca56d84f93065b398ab74b927a7a3dbd470f6fc503f95dc3 \
-    --hash=sha256:01c205616a89d09827986bc4e859bcabd64f5a0662a7fe95e0d359424e0e071b \
-    --hash=sha256:02b41b633c6261feff8ddd8d11c711df6842aba629fdd3da10249a53211a72c4 \
-    --hash=sha256:07f7a7d0f388028b2df1d916e94bbb40624c59b48ecc6cbc232546706fac74c2 \
-    --hash=sha256:11871514607b15cfeb87c547a49bca19fde402f32e2b1c24a632506c0a756656 \
-    --hash=sha256:1b376b3f4896e7930f1f772ac4b064ac12598d1c38d04907e696cc4d794b43d3 \
-    --hash=sha256:21ac0156c4b089b330b7666db40feee30a5d52634cc4560e1905d6529a3897ff \
-    --hash=sha256:257fd78c513e0fb5cdbe058c27a0624c9884e735bbd131935fd49e9fe719d310 \
-    --hash=sha256:2b39d38039a1fdad98c87279b48bc5dce2c0ca0d73483b12cb72aa9609278e8a \
-    --hash=sha256:2cf71233a0ed05ccdabe209c606fe0bac7379fdcf687f39b944420d2a09fdb57 \
-    --hash=sha256:2fe803deacd09a233e4762a1adcea5db5d31e6be577a43352936179d14d90069 \
-    --hash=sha256:3232822c7d98d23895ccc443bbdf57c7412c5a65996c30442ebe6ed3df335383 \
-    --hash=sha256:34aa51c45f28ba7f12accd624225e2b1e5a3a45206aa191f6f9aac931d9d56fe \
-    --hash=sha256:36f582d0c6bc99d5f39cd3ac2a9062e57f3cf606ade29a0a0d6b323462f4dd87 \
-    --hash=sha256:380a85cf89e0e69b7cfbe2ea9f765f004ff419f34194018a6827ac0e3edfed4d \
-    --hash=sha256:40e7bc81c9e2b2734ea4bc1aceb8a8f0ceaac7c5299bc5d69e37c44d9081d43b \
-    --hash=sha256:43ca3bbbe97af00f49efb06e352eae40434ca9d915906f77def219b88e85d907 \
-    --hash=sha256:4fcc4649dc762cddacd193e6b55bc02edca674067f5f98166d7713b193932b7f \
-    --hash=sha256:5a0f54ce2c092aaf439813735584b9537cad479575a09892b8352fea5e988dc0 \
-    --hash=sha256:5a9a0d155deafd9448baff28c08e150d9b24ff010e899311ddd63c45c2445e28 \
-    --hash=sha256:5b02d65b9ccf0ef6c34cba6cf5bf2aab1bb2f49c6090bafeecc9cd81ad4ea1c1 \
-    --hash=sha256:60db23fa423575eeb65ea430cee741acb7c26a1365d103f7b0f6ec412b893853 \
-    --hash=sha256:642c2e7a804fcf18c222e1060df25fc210b9c58db7c91416fb055897fc27e8cc \
-    --hash=sha256:6a9a25751acb379b466ff6be78a315e2b439d4c94c1e99cb7266d40a537995d3 \
-    --hash=sha256:6b1a564e6cb69922c7fe3a678b9f9a3c54e72b469875aa8018f18b4d1dd1adf3 \
-    --hash=sha256:6d323e1554b3d22cfc03cd3243b5bb815a51f5249fdcbb86fda4bf62bab9e164 \
-    --hash=sha256:6e743de5e9c3d1b7185870f480587b75b1cb604832e380d64f9504a0535912d1 \
-    --hash=sha256:709fe01086a55cf79d20f741f39325018f4df051ef39fe921b1ebe780a66184c \
-    --hash=sha256:7b7c050ae976e286906dd3f26009e117eb000fb2cf3533398c5ad9ccc86867b1 \
-    --hash=sha256:7d2872609603cb35ca513d7404a94d6d608fc13211563571117046c9d2bcc3d7 \
-    --hash=sha256:7ef58fb89674095bfc57c4069e95d7a31cfdc0939e2a579882ac7d55aadfd2a1 \
-    --hash=sha256:80bb5c256f1415f747011dc3604b59bc1f91c6e7150bd7db03b19170ee06b320 \
-    --hash=sha256:81b19725065dcb43df02b37e03278c011a09e49757287dca60c5aecdd5a0b8ed \
-    --hash=sha256:833b58d5d0b7e5b9832869f039203389ac7cbf01765639c7309fd50ef619e0b1 \
-    --hash=sha256:88bd7b6bd70a5b6803c1abf6bca012f7ed963e58c68d76ee20b9d751c74a3248 \
-    --hash=sha256:8ad85f7f4e20964db4daadcab70b47ab05c7c1cf2a7c1e51087bfaa83831854c \
-    --hash=sha256:8c0ce1e99116d5ab21355d8ebe53d9460366704ea38ae4d9f6933188f327b456 \
-    --hash=sha256:8d649d616e5c6a678b26d15ece345354f7c2286acd6db868e65fcc5ff7c24a77 \
-    --hash=sha256:903500616422a40a98a5a3c4ff4ed9d0066f3b4c951fa286018ecdf0750194ef \
-    --hash=sha256:9736af4641846491aedb3c3f56b9bc5568d92b0692303b5a305301a95dfd38b1 \
-    --hash=sha256:988635d122aaf2bdcef9e795435662bcd65b02f4f4c1ae37fbee7401c440b3a7 \
-    --hash=sha256:9cca3c2cdadb362116235fdbd411735de4328c61425b0aa9f872fd76d02c4e86 \
-    --hash=sha256:9e0fd32e0148dd5dea6af5fee42beb949098564cc23211a88d799e434255a1f4 \
-    --hash=sha256:9f3e6f9e05148ff90002b884fbc2a86bd303ae847e472f44ecc06c2cd2fcdb2d \
-    --hash=sha256:a85d2b46be66a71bedde836d9e41859879cc54a2a04fad1191eb50c2066f6e9d \
-    --hash=sha256:a9a52172be0b5aae932bef82a79ec0a0ce87288c7d132946d645eba03f0ad8a8 \
-    --hash=sha256:aa31fdcc33fef9eb2552cbcbfee7773d5a6792c137b359e82879c101e98584c5 \
-    --hash=sha256:b014c23646a467558be7da3d6b9fa409b2c567d2110599b7cf9a0c5992b3b471 \
-    --hash=sha256:b21bb4c09ffabfa0e85e3a6b623e19b80e7acd709b9f91452b8297ace2a8ab00 \
-    --hash=sha256:b5901a312f4d14c59918c221323068fad0540e34324925c8475263841dbdfe68 \
-    --hash=sha256:b9b7a708dd92306328117d8c4b62e2194d00c365f18eff11a9b53c6f923b01e3 \
-    --hash=sha256:d1967f46ea8f2db647c786e78d8cc7e4313dbd1b0aca360592d8027b8508e24d \
-    --hash=sha256:d52a25136894c63de15a35bc0bdc5adb4b0e173b9c0d07a2be9d3ca64a332735 \
-    --hash=sha256:d77c85fedff92cf788face9bfa3ebaa364448ebb1d765302e9af11bf449ca36d \
-    --hash=sha256:d79d7d5dc8a32b7093e81e97dad755127ff77bcc899e845f41bf71747af0c569 \
-    --hash=sha256:dbcda74c67263139358f4d188ae5faae95c30929281bc6866d00573783c422b7 \
-    --hash=sha256:ddaea91abf8b0d13443f6dac52e89051a5063c7d014710dcb4d4abb2ff811a59 \
-    --hash=sha256:dee0ce50c6a2dd9056c20db781e9c1cfd33e77d2d569f5d1d9321c641bb903d5 \
-    --hash=sha256:dee60e1de1898bde3b238f18340eec6148986da0455d8ba7848d50470a7a32fb \
-    --hash=sha256:e2f83e18fe2f4c9e7db597e988f72712c0c3676d337d8b101f6758107c42425b \
-    --hash=sha256:e3fb1677c720409d5f671e39bac6c9e0e422584e5f518bfd50aa4cbbea02433f \
-    --hash=sha256:ee2b1b1769f6707a8a445162ea16dddf74285c3964f605877a20e38545c3c462 \
-    --hash=sha256:ee6acae74a2b91865910eef5e7de37dc6895ad96fa23603d1d27ea69df545015 \
-    --hash=sha256:ef3f72c9666bba2bab70d2a8b79f2c6d2c1a42a7f7e2b0ec83bb2f9e383950af
 xgboost==1.6.2 ; python_version >= "3.8" and python_version < "3.10" \
     --hash=sha256:1ce15d3292d6ee75be4491ff6463c76ca548b7d77ca70f707cb23ea051e2faf7 \
     --hash=sha256:64f5c7189ccea717c6ac2a10ddf1f825f2be0fb829ed96d9ec3a7588b04b7922 \