All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to PEP 440 and uses Semantic Versioning.
- The `static-analysis` GitHub Actions workflow now uses `ruff` rather than `flake8` for linting.
- Removed the unapproved user warning implemented in v6.2.0 (see #276). This feature had unintended consequences which broke some processing pipelines that rely on the HyP3 SDK (see #285 for more details).
- Support for Python 3.9 has been removed.
- `Job.priority` attribute
- Unapproved `hyp3-sdk` users receive an error message when connecting to HyP3
- `HyP3.costs` method to retrieve the job cost lookup table, following the addition of the `/costs` API endpoint in HyP3 v6.2.0
- `Batch.total_credit_cost` method to calculate the total credit cost for a batch of jobs
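The cost semantics described above can be sketched locally. This is a hypothetical stand-in, not the SDK's implementation: the `Job` and `Batch` classes here only mirror the `credit_cost` attribute and `total_credit_cost` method named in the entries above.

```python
from dataclasses import dataclass


@dataclass
class Job:
    """Hypothetical stand-in for hyp3_sdk.Job, carrying only a credit cost."""
    job_id: str
    credit_cost: float


class Batch:
    """Hypothetical stand-in for hyp3_sdk.Batch."""

    def __init__(self, jobs=None):
        self.jobs = jobs or []

    def total_credit_cost(self) -> float:
        # The batch's total cost is the sum of each contained job's cost.
        return sum(job.credit_cost for job in self.jobs)


batch = Batch([Job('a', 1.0), Job('b', 5.0), Job('c', 1.0)])
print(batch.total_credit_cost())  # 7.0
```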
This release accommodates changes to the HyP3 API schema introduced in HyP3 v6.0.0.
- `credit_cost` attribute to the `Job` class
- `HyP3.check_credits` method to determine your remaining processing credits
- `HyP3.my_info()`: a new `remaining_credits` field replaces the `quota` field in the return value
- `HyP3.check_quota` may return a float or an integer if the user has processing credits
- `HyP3.check_quota` has been deprecated in favor of `HyP3.check_credits`
- `legacy` option for the `dem_name` argument of `HyP3.prepare_rtc_job()` and `HyP3.submit_rtc_job()`.
- The HyP3 SDK now explicitly supports Python 3.9-3.12
- Added `HyP3.submit_insar_isce_burst_job` and `HyP3.prepare_insar_isce_burst_job` methods for submitting InSAR ISCE burst jobs to HyP3.
- A `pending` method to the `Job` class.
- A `pending` argument to the `Batch.filter_jobs()` method.
- The order of the arguments for `Batch.filter_jobs()`. The new order is `succeeded, pending, running, failed, include_expired`.
- Support for Python 3.8 has been dropped.
- The `running` method of the `Job` class now only returns `True` if the job has status `RUNNING`. Jobs in the `PENDING` state now return `True` when calling the `pending` method of `Job`.
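A minimal local sketch of the status semantics described above. This is a hypothetical stand-in class, not the SDK's `Job`; it only illustrates that `running()` is true solely for `RUNNING` and that `PENDING` jobs report true from `pending()`.

```python
class Job:
    """Hypothetical stand-in for hyp3_sdk.Job, sketching status checks."""

    def __init__(self, status_code: str):
        self.status_code = status_code

    def running(self) -> bool:
        # Only jobs with status RUNNING count as running.
        return self.status_code == 'RUNNING'

    def pending(self) -> bool:
        # PENDING jobs are reported by pending(), not running().
        return self.status_code == 'PENDING'


assert Job('RUNNING').running() and not Job('RUNNING').pending()
assert Job('PENDING').pending() and not Job('PENDING').running()
```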
- Added the `phase_filter_parameter` keyword argument for the `HyP3.submit_insar_job` and `HyP3.prepare_insar_job` methods.
- Removed the `Job.subscription_id` attribute in response to the Subscriptions feature being removed from HyP3.
- The `user_id` parameter has been moved to the end of the `HyP3.find_jobs` parameter list, to avoid introducing breaking changes for users who rely on the order of the parameters.
- The `HyP3.find_jobs` method now includes a `user_id` parameter that allows retrieving jobs for a given user. If not provided, jobs are retrieved for the current user.
- 20 m can now be provided to the `resolution` keyword argument of `hyp3.submit_rtc_job` and `hyp3.prepare_rtc_job`.
- The improved error messages are now displayed regardless of whether the Earthdata credentials were provided by a `.netrc` file.
- Improved error messages when Earthdata user must select Study Area or accept EULA, thanks to @kevinxmorales in #170
- The `hyp3_sdk.TESTING` constant has been removed in favor of mocking objects in unit tests.
- Path to `README.md` in `pyproject.toml` so that there is a package description on PyPI
- `hyp3-sdk` now uses a `src` layout per this recommendation.
- `hyp3-sdk` now only uses `pyproject.toml` for package creation now that `setuptools` recommends not using `setup.py`.
- `hyp3_sdk.util` is now included in the main `hyp3_sdk` API and does not need to be imported separately
- 10 m can now be provided to the `resolution` keyword argument of `hyp3.submit_rtc_job` and `hyp3.prepare_rtc_job`.
- In addition to `power` and `amplitude`, `decibel` can now be provided to the `scale` keyword argument of `hyp3.submit_rtc_job` and `hyp3.prepare_rtc_job`.
- Updated SDK example notebook to only use the ESA S2 naming convention for autoRIFT jobs.
- Added a `processing_times` attribute to the `hyp3_sdk.Job` class to support jobs with multiple processing steps.
- Removed the `processing_time_in_seconds` attribute from the `hyp3_sdk.Job` class.
- `Job` now has a `subscription_id` attribute.
- `Job.expired()` now returns `False` if `expiration_time` is `None`.
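The expiration behavior above can be sketched locally. This is a hypothetical stand-in, not the SDK's `Job`: it only illustrates that a missing `expiration_time` is treated as "not expired".

```python
from datetime import datetime, timezone


class Job:
    """Hypothetical stand-in for hyp3_sdk.Job, sketching expired()."""

    def __init__(self, expiration_time=None):
        self.expiration_time = expiration_time

    def expired(self) -> bool:
        # A job with no expiration time is never considered expired.
        if self.expiration_time is None:
            return False
        return datetime.now(timezone.utc) >= self.expiration_time


assert Job(expiration_time=None).expired() is False
assert Job(datetime(2000, 1, 1, tzinfo=timezone.utc)).expired() is True
```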
- Updated return type for `HyP3.check_quota()` to reflect the case where a user has no quota.
- Added `processing_time_in_seconds` to the `Job` class.
- `hyp3_sdk.asf_search` has been removed and its functionality has been superseded by the `asf_search` Python package, which provides a more comprehensive ASF search experience and is available on conda-forge and PyPI.
- Slicing a `Batch` object will now return a new `Batch` instead of a `list` of jobs
- `Batch` equality now compares the contained jobs and not object identity
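The container behavior described above can be sketched with a hypothetical stand-in `Batch` (not the SDK's implementation): slicing returns a new `Batch` rather than a plain `list`, and equality compares the contained jobs rather than object identity.

```python
class Batch:
    """Hypothetical stand-in for hyp3_sdk.Batch, sketching slicing and equality."""

    def __init__(self, jobs=None):
        self.jobs = jobs or []

    def __getitem__(self, index):
        if isinstance(index, slice):
            # A slice produces a new Batch, not a plain list of jobs.
            return Batch(self.jobs[index])
        return self.jobs[index]

    def __eq__(self, other):
        # Equality compares the contained jobs, not object identity.
        return self.jobs == other.jobs

    def __len__(self):
        return len(self.jobs)


batch = Batch(['job_a', 'job_b', 'job_c'])
assert isinstance(batch[:2], Batch)
assert batch[:2] == Batch(['job_a', 'job_b'])
```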
- Exposed new `include_displacement_maps` parameter for `HyP3.prepare_insar_job` and `HyP3.submit_insar_job`, which will cause both a line-of-sight displacement and a vertical displacement GeoTIFF to be included in the product.
- The `include_los_displacement` parameter of `HyP3.prepare_insar_job` and `HyP3.submit_insar_job` has been deprecated in favor of the `include_displacement_maps` parameter, and will be removed in the future.
- `hyp3_sdk.asf_search` is deprecated and will be removed in future releases. Its functionality has been superseded by the `asf_search` Python package, which provides a more comprehensive ASF search experience and is available on conda-forge and PyPI.
  - instead of `hyp3_sdk.asf_search.get_metadata`, try `asf_search.granule_search` or `asf_search.product_search`
  - instead of `hyp3_sdk.asf_search.get_nearest_neighbors`, try `asf_search.baseline_search.stack_from_product` or `asf_search.baseline_search.stack_from_id`
- Exposed new `apply_water_mask` parameter for InSAR jobs in `HyP3.submit_insar_job()` and `HyP3.prepare_insar_job()`, which sets pixels over coastal waters and large inland waterbodies as invalid for phase unwrapping
- Resolved an issue where `HyP3.find_jobs()` did not correctly filter results when using the `status_code` parameter
- `extract_zipped_product` function to `hyp3_sdk.util`, which will extract zipped HyP3 products
- `chunk` function to `hyp3_sdk.util`, which will split a sequence into small chunks and is particularly useful for submitting large batches
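A chunking helper like the one described above can be sketched as follows. This is a hypothetical sketch; the actual signature and defaults of `hyp3_sdk.util.chunk` may differ.

```python
from typing import Iterator, Sequence


def chunk(itr: Sequence, n: int = 200) -> Iterator[Sequence]:
    """Split a sequence into pieces of at most n items (hypothetical sketch)."""
    for i in range(0, len(itr), n):
        yield itr[i:i + n]


# e.g. splitting a long granule list before submitting jobs in batches
granules = [f'granule_{i}' for i in range(5)]
pieces = [list(c) for c in chunk(granules, n=2)]
print(pieces)  # [['granule_0', 'granule_1'], ['granule_2', 'granule_3'], ['granule_4']]
```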
- HyP3 API URL constants have been renamed to be more descriptive
  - `hyp3_sdk.HyP3_PROD` is now `hyp3_sdk.PROD_API`
  - `hyp3_sdk.HyP3_TEST` is now `hyp3_sdk.TEST_API`
- `Job` class now has a `logs` attribute containing links to job log files
- Added missing container methods
  - batches are now subscriptable: `batch[0]`
  - jobs can be searched for in batches: `job in batch`
  - jobs can be deleted from batches: `del batch[0]`
  - batches can now be reversed using the `reversed()` function
- `find_jobs()` now accepts datetimes with no timezone info and defaults to UTC.
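The timezone handling described above can be sketched with a small helper. This is a hypothetical illustration of the behavior, not the SDK's internal code: a naive datetime is assumed to be UTC before being used in a `find_jobs()`-style query.

```python
from datetime import datetime, timezone


def ensure_utc(dt: datetime) -> datetime:
    """Assume naive datetimes are UTC; convert aware ones (hypothetical sketch)."""
    if dt.tzinfo is None:
        # No timezone info: treat the value as UTC.
        return dt.replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone.utc)


naive = datetime(2021, 1, 1, 12, 0)
assert ensure_utc(naive).tzinfo == timezone.utc
```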
- `FoundZeroJobs` warning from `find_jobs()`
- #92 -- `ImportError` being raised when showing a progress bar, because `ipywidgets` may not always be installed when running in a Jupyter kernel
- Exposed new `include_wrapped_phase` parameter for InSAR jobs in `HyP3.submit_insar_job()` and `HyP3.prepare_insar_job()`
- Exposed new `include_dem` parameter for InSAR jobs in `HyP3.submit_insar_job()` and `HyP3.prepare_insar_job()`
- Exposed new `include_inc_map` parameter for InSAR jobs in `HyP3.submit_insar_job()` and `HyP3.prepare_insar_job()`
- A `dem_name` parameter has been added to `HyP3.submit_rtc_job` and `HyP3.prepare_rtc_job` to control which DEM data set is used for RTC processing
  - `dem_name='copernicus'` will use the Copernicus GLO-30 Public DEM
  - `dem_name='legacy'` will use the DEM with the best coverage from ASF's legacy SRTM/NED data sets
- `HyP3.find_jobs` now supports filtering by `job_type`
- `HyP3.find_jobs` now pages through truncated responses to get all requested jobs
- `hyp3_sdk.exceptions` now includes `ServerError` for exceptions that are a result of system errors.
- `hyp3_sdk.exceptions` now has `HyP3SDKError` as a module base exception, and `HyP3Error` is now specific to errors in the `hyp3` submodule
- `HyP3.find_jobs` argument `status` renamed to `status_code` to be consistent with the api-spec
- `asf_search` module will now raise an `exceptions.ASFSearchError` when it encounters problems and will include the Search API response details
- `HyP3.__init__` now accepts a `prompt=True` (default `False`) keyword argument which will prompt users for their username or password if not provided
- HyP3 prepare and submit methods now include processing options as named parameters
- Exceptions raised for HyP3 errors will include the HyP3 API response details
- `asf_search.get_nearest_neighbors` is no longer dependent on state vector information in CMR
  - now limited to Sentinel-1 granules
  - now raises `ASFSearchError` when the reference granule cannot be found
  - results no longer include `perpendicularBaseline` or `temporalBaseline` fields
- `get_authenticated_session` now correctly throws `AuthenticationError` when no `.netrc` file exists and no credentials are provided
- Methods to prepare jobs for submission to HyP3
  - `HyP3.prepare_autorift_job`
  - `HyP3.prepare_rtc_job`
  - `HyP3.prepare_insar_job`
- `HyP3.watch`, `Job.download_files`, and `Batch.download_files` now display progress bars
- HyP3 `Job` objects provide a better string representation
  ```
  >>> print(job)
  HyP3 RTC_GAMMA job dd884703-cdbf-47ff-848c-de1e2b9917c1
  ```
- HyP3 `Batch` objects
  - are now iterable
  - provide a better string representation
    ```
    >>> print(batch)
    2 HyP3 Jobs: 0 succeeded, 0 failed, 2 running, 0 pending.
    ```
- HyP3 submit methods will always return a `Batch` containing the submitted job(s)
- `HyP3.submit_job_dict` has been renamed to `HyP3.submit_prepared_jobs` and can submit one or more prepared job dictionaries
- `Job.download_files` and `Batch.download_files` will (optionally) create the download location if it doesn't exist
- `Hyp3._get_job_by_id` has been made public and renamed to `Hyp3.get_job_by_id`
- `hyp3_sdk.asf_search` module to find granule(s) metadata, and a granule's nearest neighbors for InSAR
- SDK will attach a `User-Agent` statement like `hyp3_sdk/VERSION` to all API interactions
- Providing a job list to `Batch.__init__()` is now optional; an empty batch will be created if the job list is not provided
- `Batch.__init__()` no longer issues a warning when creating an empty batch
- `HyP3.find_jobs()` will now issue a warning when zero jobs were found
- `Job.download_files` and `Batch.download_files` now default to downloading to the working directory
- Corrected syntax errors in documentation examples
- Correctly specifies the minimum Python version (3.8) in `setup.py`
- `Job.download_files` and `Batch.download_files` now accept strings for location in addition to `pathlib.Path`
- Updated documentation to represent version 0.3.0
- This is a complete refactor of the API, please view updated documentation.
- API responses now return Batch objects if multiple jobs present.
- Job and Batch objects now have the following member functions to help with common tasks
- API can now watch Jobs or Batches for completion
- Jobs are no longer created and then submitted; instead, Jobs are obtained by submitting through the API
- hyp3-sdk has dropped support for python <= 3.7
- typehints and docstrings throughout the SDK for auto-documentation of the API
- Documentation now is mainly contained in The HyP3 Docs and the README just contains quick installation and usage information
- Updated the documentation with mostly minor style and language changes
- `hyp3.get_jobs` now accepts a `name` parameter to search by job name
- Removed space from auth URL that prevented successful sign in
- HyP3 module
  - `HyP3` class which wraps the HyP3 API
  - `Job` class used to define Jobs to submit, created with the `make_[job_type]_job()` factory functions