Releases: allegroai/clearml
PyPI v1.8.3 - ClearML
Bug Fixes
- Set GCS credentials to `None` if invalid service account credentials are provided (#841, thanks @freddessert!)
- Fix a sync issue when loading deferred configuration
PyPI v1.8.2 - ClearML
New Features and Improvements
- Add `VCS_ENTRY_POINT` environment variable that overrides ClearML's entrypoint auto-detection
Bug Fixes
- Fix all parameters returned from a pipeline being considered strings
- Fix `Task.set_parameters()` not adding the parameter type when a parameter exists but has no type
PyPI v1.8.1 - ClearML
New Features and Improvements
- Raise error on failed uploads (#820, thanks @shpigi!)
- Add hyperdataset examples (#823)
- Change `report_event_flush_threshold` default to 100
- Add `ModelInfo.weights_object()` for callback access to the actual model object being stored (valid for both pre/post save calls, otherwise `None`)
- Support `num_workers` in dataset operations
- Support max connections setting for Azure storage using the `sdk.azure.storage.max_connection` configuration option
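Both options above live in `clearml.conf`; a minimal sketch, assuming the key paths match the dotted names quoted in these notes (`100` mirrors the new default, and `max_connection: 2` is only an illustrative value):

```
sdk {
    development.worker {
        # Events buffered before a report is triggered (new default: 100)
        report_event_flush_threshold: 100
    }
    azure.storage {
        # Illustrative cap on concurrent Azure storage connections
        max_connection: 2
    }
}
```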
Bug Fixes
- Fix `clearml` logger default level cannot be changed (#741)
- Fix Hydra does not use overridden information from ClearML (#751)
- Fix `StorageManager.list("s3://..", with_metadata=True)` doesn't work
- Fix `ModelsList.keys()` is missing
- Fix `CLEARML_DEFERRED_TASK_INIT=1` doesn't work
- Fix default API method does not work when set in configuration
PyPI v1.8.0 - ClearML
New Features and Improvements
- Add tarfile member sanitization to `extractall()` (#803, thanks @TrellixVulnTeam!)
- Add `Task.delete_artifacts()` with `raise_on_errors` argument (#806, thanks @frolovconst!)
- Add CI/CD example (#815, thanks @thepycoder!)
- Limit number of `_serialize` requests when adding a list of links with `add_external_files()` (#813)
- Add support for connecting Enum values as parameters
- Improve Colab integration (store the entire notebook, not just its history)
- Add `clearml.browser_login` to authenticate browser online sessions such as Colab, Jupyter Notebooks, etc.
- Remove `import_bind` from stack traces of import errors
- Add `sdk.development.worker.report_event_flush_threshold` configuration option to control the number of events that triggers a report
- Return a stub object from `Task.init()` if no `clearml.conf` file is found
- Improve manual model uploading example
- Remove deprecated demo server
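As an illustration of what `report_event_flush_threshold` controls, here is a standalone buffering sketch. This is not ClearML's internals, just the general pattern the option tunes: events accumulate in a buffer, and reaching the threshold triggers a report.

```python
# Illustrative sketch of threshold-triggered event flushing; not ClearML's
# implementation, only the buffering pattern the config option controls.
class BufferedReporter:
    def __init__(self, flush_threshold=100):
        self.flush_threshold = flush_threshold
        self._buffer = []
        self.flushed_batches = []  # stands in for "reports sent"

    def add_event(self, event):
        # Buffer the event; hitting the threshold triggers a report
        self._buffer.append(event)
        if len(self._buffer) >= self.flush_threshold:
            self.flush()

    def flush(self):
        # Send whatever is currently buffered as one report
        if self._buffer:
            self.flushed_batches.append(list(self._buffer))
            self._buffer.clear()


reporter = BufferedReporter(flush_threshold=3)
for event in range(7):
    reporter.add_event(event)
# two full batches reported; one event stays buffered until the next flush
```

A lower threshold reports more often (more responsive UI, more API calls); a higher one batches more aggressively.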
Bug Fixes
- Fix passing `compression=ZIP_STORED` (or 0) to `Dataset.upload()` uses `ZIP_DEFLATED` and overrides the user-supplied argument (#812, thanks @doronser!)
- Fix `unique_selector` is not applied properly on batches after the first batch. Remove the default selector value since it does not work for all event types (and it is always specified anyway)
- Fix clearml-init colab detection
- Fix cloning pipelines run with `start_locally()` doesn't work
- Fix there is no way to disable a project's default output URI in development (manual) mode; allow passing `output_uri=False` to disable it
- Fix git remote repository detection when the remote is not "origin"
- Fix reported images might not all be reported when waiting for the task to complete
- Fix `Dataset.get_local_copy()` deletes the source archive if it is stored locally
- Fix too many preview parts inflating the Task object beyond its 16MB limit - set a total preview limit of 320KB
- Fix media preview is created instead of a table preview
- Fix `task.update_output_model()` should always upload local models to a remote server
- Fix broken pip package might mess up requirements detection
PyPI v1.7.2 - ClearML
New Features and Improvements
- Support running a Jupyter notebook inside a git repository (the repository is referenced without uncommitted changes, and the notebook is stored as plain code in the uncommitted changes)
- Add jupyter notebook fail warning
- Allow pipeline steps to return string paths without them being treated as a folder artifact and zipped (#780)
- Remove `future` from Python 3 requirements
Bug Fixes
- Fix exception raised when using `ThreadPool` (#790)
- Fix Pyplot/Matplotlib binding reports incorrect line labels and colors (#791)
- Pipelines
- Jupyter Notebook
  - Fix support for multiple Jupyter servers running on the same machine
  - Fix issue with old/new notebook packages installed
- Fix local cache with access rules disabling partial local access
- Fix `Task.upload_artifact()` fails uploading pandas `DataFrame`
- Fix relative paths in examples (#787, thanks @mendrugory!)
PyPI v1.7.1 - ClearML
New Features and Improvements
- Add callback option for pipeline step retry
Bug Fixes
- Fix Python Fire binding
- Fix Dataset failing to load helper packages should not crash
- Fix `Dataset.get_local_copy()` is allowed for a non-finalized dataset
- Fix `Task.upload_artifact()` does not upload empty lists/tuples
- Fix pipeline retry mechanism interface
- Fix Python <3.5 compatibility
- Fix local cache warning (should be a debug message)
PyPI v1.7.0 - ClearML
New Features and Improvements
- ClearML Data: Support providing list of links
- Upload artifacts with a custom serializer (#689)
- Allow user to specify extension when using custom serializer functions (for artifacts)
- Skip server URL verification in clearml-init wizard process
- When calling `Dataset.get()` without the "alias" field, tell the user that an alias can be used to log the dataset in the UI
- Add mmcv support for logging models
- Add support for Azure and GCP storage in `Task.setup_upload()`
- Support pipeline retrying tasks which are failing on suspected non-stable failures
- Better storage (AWS, GCP) internal load balancing and configurations
- Add `Task.register_abort_callback`
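The custom artifact serializer support can be sketched as follows. The `serialization_function` and `extension` parameter names in the commented-out call are assumptions based on these notes; the JSON round-trip itself is plain Python and runs without ClearML.

```python
import json

# Hypothetical serializer pair for a custom artifact format
def json_serializer(obj):
    # Turn the artifact object into bytes (sorted keys for stable output)
    return json.dumps(obj, sort_keys=True).encode("utf-8")

def json_deserializer(blob):
    # Restore the artifact object from bytes
    return json.loads(blob.decode("utf-8"))

# With a ClearML task this would be wired up roughly as (assumed signature):
# task.upload_artifact(
#     name="train_config",
#     artifact_object={"lr": 0.01, "epochs": 10},
#     extension=".json",                       # custom extension
#     serialization_function=json_serializer,  # custom serializer (#689)
# )

roundtrip = json_deserializer(json_serializer({"lr": 0.01, "epochs": 10}))
```

Supplying your own extension keeps the stored artifact openable by other tools that expect a matching file suffix.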
Bug Fixes
- Allow getting datasets with non-semantic versioning (#776)
- Fix interactive plots (instead of a generated png)
- Fix Python 2.7 support
- Fix clearml datasets list functionality
- Fix `Dataset.init()` modifies the task (moved to `Dataset.create()`)
- Fix failure when uploading large files over HTTPS
- Fix 3D plots created with plt show as 2D plots on the task results page
- Fix uploading files with the project's `default_upload_destination` (#734)
- Fix broken Matplotlib reporting - using a logarithmic scale breaks reporting
- Fix wildcard support in the clearml-data CLI
- Fix `report_histogram` - does not show "horizontal" orientation (#699)
- Fix table reporting - the 'series' argument does not appear in the UI when using `logger.report_table(title, series, iteration...)` (#684)
- Fix artifacts (and models) use the task's original name and not its new name
- Fix very long filenames from S3 can't be downloaded (with `get_local_copy()`)
- Fix overwrite of existing output models on pipeline tasks with `monitor_models` (#758)
PyPI v1.6.4 - ClearML
Bug Fixes
- Fix `APIClient` fails when calling `get_all` endpoints with API 2.20 (affects CLI tools such as `clearml-session`)
PyPI v1.6.3 - ClearML
New Features and Improvements
- Add option to specify an endpoint URL when creating S3 resource service (#679, thanks @AndolsiZied!)
- Add support for providing `ExtraArgs` to boto3 when uploading files, using the `sdk.aws.s3.extra_args` configuration option
- Add support for Server API 2.20
- Add `Task.get_num_enqueued_tasks()` to get the number of tasks enqueued in a specific queue
- Add support for updating model metadata using `Model.set_metadata()`, `Model.get_metadata()`, `Model.get_all_metadata()`, `Model.get_all_metadata_casted()` and `Model.set_all_metadata()`
- Add `Task.get_reported_single_value()`
- Add a retry mechanism for model and artifact uploads
- Pipelines with empty configuration take it from code
- Add support for running pipeline steps on preemptible instances
- Datasets
  - Add description to Datasets
  - Add wildcard support in `clearml-data`
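A `clearml.conf` sketch for the `ExtraArgs` pass-through; the key path follows the note above, and `ServerSideEncryption` is just one example of a boto3 upload `ExtraArgs` key:

```
sdk {
    aws {
        s3 {
            # Forwarded to boto3 upload calls as ExtraArgs
            extra_args: {
                ServerSideEncryption: "AES256"
            }
        }
    }
}
```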
Bug Fixes
- Fix dataset download (#713, thanks @dankirsdot!)
- Fix lock is not released after dataset cache is downloaded (#708, thanks @mralgos!)
- Fix deadlock might occur when using a process pool with a large number of processes (#674)
- Fix 'series' not appearing in the UI when using `logger.report_table()` (#684)
- Fix `Task.init()` docstring to include behavior when executing remotely (#737, thanks @mmiller-max!)
- Fix `KeyError` when running remotely and no params were passed to click (allegroai/clearml-agent#111)
- Fix full path is stored when uploading a single artifact file
- Fix passing non-alphanumeric filename in `sdk.development.detect_with_pip_freeze`
- Fix Python 3.6 and 3.10 support
- Fix mimetype cannot be `None` when uploading to S3
- Pipelines
  - Fix pipeline DAG
  - Add support for pipelines with spot instances
  - Fix pipeline proxy object is always resolved in main pipeline logic
  - Fix pipeline steps with empty configuration should try and take it from code
  - Fix wait for jobs based on local/remote pool frequency
  - Fix `UniformIntegerParameterRange.to_list()` ignores min value
  - Fix pipeline component returning a list of length 1
- Datasets
  - Fix `Dataset.get()` does not respect `auto_create`
  - Fix getting datasets fails with new ClearML Server v1.6
  - Fix datasets can't be queried by project/name alone
  - Fix adding child dataset to older parent dataset without stats
- Fix error when connecting an input model
- Fix deadlocks, including:
  - Change thread Event/Lock to process-fork-safe threading objects
  - Use a file lock instead of a process lock to avoid future deadlocks, since the Python process lock is not process safe (killing a process holding a lock will not release the lock)
- Fix `StorageManager.list()` on a local Windows path
- Fix model not created in the current project
- Fix `keras_tuner_cifar` example raises `DeprecationWarning` and `ValueError`
PyPI v1.6.2 - ClearML
Bug Fixes
- Fix format string construction sometimes causing delayed evaluation errors (#706)