v2.11.1 #158

Merged · 1 commit · Nov 8, 2024
22 changes: 14 additions & 8 deletions CHANGELOG.md
@@ -1,6 +1,12 @@
 # Changelog
-## [Latest](https://github.com/int-brain-lab/ONE/commits/main) [2.11.0]
-This version deprecates one.alf.files in preperation for replacing with one.alf.path in version 3.
+## [Latest](https://github.com/int-brain-lab/ONE/commits/main) [2.11.1]
+
+### Modified
+
+- HOTFIX: consistent behaviour in OneAlyx.list_datasets when keep_eid_index == True
+
+## [2.11.0]
+This version deprecates one.alf.files in preparation for replacing with one.alf.path in version 3.
 
 ### Modified
 
@@ -24,9 +30,9 @@ This version improves behaviour of loading revisions and loading datasets from l
 - bugfix of spurious error raised when loading dataset with a revision provided
 - default_revisions_only parameter in One.list_datasets filters non-default datasets
 - permit data frame input to One.load_datasets and load precise relative paths provided (instead of default revisions)
-- redundent session_path column has been dropped from the datasets cache table
+- redundant session_path column has been dropped from the datasets cache table
 - bugfix in one.params.setup: suggest previous cache dir if available instead of always the default
-- bugfix in one.params.setup: remove all extrenuous parameters (i.e. TOKEN) when running setup in silent mode
+- bugfix in one.params.setup: remove all extraneous parameters (i.e. TOKEN) when running setup in silent mode
 - warn user to reauthenticate when password is None in silent mode
 - always force authentication when password passed, even when token cached
 - bugfix: negative indexing of paginated response objects now functions correctly
@@ -102,7 +108,7 @@ This version of ONE adds support for Alyx 2.0.0 and pandas 3.0.0 with dataset QC
 
 - One.load_dataset
 - add an option to skip computing hash for existing files when loading datasets `check_hash=False`
-- check filesize before computing hash for performance
+- check file size before computing hash for performance
 
 ## [2.5.5]
 
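A note on the `check_hash=False` entry above: when loading datasets, the hash check for files already on disk can be skipped, with the cheaper file-size comparison still performed first. A minimal usage sketch, assuming a configured ONE instance (the eid and dataset name are hypothetical):

    from one.api import ONE

    one = ONE()
    eid = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'  # hypothetical session ID
    # Skip computing the hash of files that already exist locally;
    # the file size is still compared before any hashing would occur
    spikes_times = one.load_dataset(eid, 'spikes.times.npy', check_hash=False)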
@@ -320,7 +326,7 @@ This version of ONE adds support for Alyx 2.0.0 and pandas 3.0.0 with dataset QC
 
 - RegistrationClient.find_files is now itself a generator method (previously returned a generator)
 - exists kwarg in RegistrationClient.register_files
-- support for loading 'table' attribute as dataframe with extra ALF parts
+- support for loading 'table' attribute as data frame with extra ALF parts
 - bugfix: tag assertion should expect list of tags in cache info
 
 ## [1.17.0]
@@ -361,7 +367,7 @@ This version of ONE adds support for Alyx 2.0.0 and pandas 3.0.0 with dataset QC
 
 ### Modified
 
-- squeeze pandas dataframe on csv_read
+- squeeze pandas data frame on csv_read
 - load_cache now public method; option to load specific remote cache with tag arg
 - fix incorrect error message in one.alf.exceptions.ALFMultipleObjectsFound
 
@@ -537,7 +543,7 @@ This version of ONE adds support for Alyx 2.0.0 and pandas 3.0.0 with dataset QC
 
 ### Modified
 
-- rest command loging includes the whole json field on error
+- rest command logging includes the whole json field on error
 - added silent option to instantiate One on local files
 
 ## [1.6.0]
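For context on the HOTFIX entry at the top of this changelog: `keep_eid_index` controls whether the session eid is retained as an index level in the details data frame returned by `list_datasets`. A sketch of the behaviour the fix makes consistent, assuming a configured ONE instance (the eid is hypothetical):

    from one.api import ONE

    one = ONE()
    eid = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'  # hypothetical session ID
    # By default the eid index level is dropped from the details frame
    dsets = one.list_datasets(eid, details=True)
    assert dsets.index.nlevels == 1
    # With keep_eid_index=True the (eid, id) MultiIndex is preserved; the
    # hotfix makes OneAlyx (remote mode) behave the same as local mode here
    dsets = one.list_datasets(eid, details=True, keep_eid_index=True)
    assert dsets.index.nlevels == 2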
2 changes: 1 addition & 1 deletion one/__init__.py
@@ -1,2 +1,2 @@
 """The Open Neurophysiology Environment (ONE) API."""
-__version__ = '2.11.0'
+__version__ = '2.11.1'
4 changes: 3 additions & 1 deletion one/api.py
@@ -1855,8 +1855,10 @@ def list_datasets(
             return self._cache['datasets'].iloc[0:0] if details else []  # Return empty
         assert set(datasets.index.unique('eid')) == {eid}
         del filters['default_revisions_only']
+        if not keep_eid_index and 'eid' in datasets.index.names:
+            datasets = datasets.droplevel('eid')
         datasets = util.filter_datasets(
-            datasets.droplevel('eid'), assert_unique=False, wildcards=self.wildcards, **filters)
+            datasets, assert_unique=False, wildcards=self.wildcards, **filters)
         # Return only the relative path
         return datasets if details else datasets['rel_path'].sort_values().values.tolist()
 
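A standalone pandas sketch of what the guarded `droplevel` above does; the index names match the cache tables, while the column and values are illustrative:

    import pandas as pd

    # A two-level (eid, id) index like the one on the datasets cache table
    idx = pd.MultiIndex.from_tuples(
        [('eid0', 'dset0'), ('eid0', 'dset1')], names=['eid', 'id'])
    datasets = pd.DataFrame({'rel_path': ['alf/a.npy', 'alf/b.npy']}, index=idx)

    keep_eid_index = False  # mirrors the method's keyword argument
    # The guard only drops the level when the caller did not ask to keep it
    # and the level is actually present, avoiding errors on a second pass
    if not keep_eid_index and 'eid' in datasets.index.names:
        datasets = datasets.droplevel('eid')
    assert list(datasets.index.names) == ['id']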
5 changes: 5 additions & 0 deletions one/tests/test_one.py
@@ -1448,6 +1448,11 @@ def test_list_datasets(self):
         self.assertEqual(183, len(dsets))  # this may change after a BWM release or patch
         self.assertEqual(1, dsets.index.nlevels, 'details data frame should be without eid index')
 
+        # Test keep_eid_index
+        dsets = self.one.list_datasets(
+            self.eid, details=True, query_type='remote', keep_eid_index=True)
+        self.assertEqual(2, dsets.index.nlevels, 'details data frame should be with eid index')
+
         # Test missing eid
         dsets = self.one.list_datasets('FMR019/2021-03-18/008', details=True, query_type='remote')
         self.assertIsInstance(dsets, pd.DataFrame)
15 changes: 8 additions & 7 deletions one/webclient.py
@@ -589,7 +589,7 @@ def _generic_request(self, reqfunction, rest_query, data=None, files=None):
         rest_query = rest_query.replace(self.base_url, '')
         if not rest_query.startswith('/'):
             rest_query = '/' + rest_query
-        _logger.debug(f"{self.base_url + rest_query}, headers: {self._headers}")
+        _logger.debug(f'{self.base_url + rest_query}, headers: {self._headers}')
         headers = self._headers.copy()
         if files is None:
             data = json.dumps(data) if isinstance(data, dict) or isinstance(data, list) else data
@@ -678,8 +678,8 @@ def authenticate(self, username=None, password=None, cache_token=True, force=Fal
             rep = requests.post(self.base_url + '/auth-token', data=credentials)
         except requests.exceptions.ConnectionError:
             raise ConnectionError(
-                f"Can't connect to {self.base_url}.\n" +
-                "Check your internet connections and Alyx database firewall"
+                f'Can\'t connect to {self.base_url}.\n' +
+                'Check your internet connections and Alyx database firewall'
             )
         # Assign token or raise exception on auth error
         if rep.ok:
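An aside on the surrounding method: per the 2.11.0 changelog notes above, passing a password forces re-authentication even when a token is cached. A hedged usage sketch (the URL and credentials are placeholders):

    from one.webclient import AlyxClient

    alyx = AlyxClient(base_url='https://alyx.example.org')  # hypothetical URL
    # force=True requests a fresh auth token instead of reusing a cached one
    alyx.authenticate(username='user', password='secret', force=True)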
@@ -839,6 +839,7 @@ def download_cache_tables(self, source=None, destination=None):
 
     def _validate_file_url(self, url):
         """Asserts that URL matches HTTP_DATA_SERVER parameter.
+
         Currently only one remote HTTP server is supported for a given AlyxClient instance. If
         the URL contains only the relative path part, the full URL is returned.
 
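The docstring above implies two cases: a relative path is completed against the HTTP_DATA_SERVER parameter, while a full URL must match that server or the assertion fails. An illustrative sketch of that contract (this is a private method, and the server URL here is hypothetical):

    # Assuming HTTP_DATA_SERVER == 'https://example.org/data':
    alyx._validate_file_url('path/to/file.npy')
    # -> 'https://example.org/data/path/to/file.npy'
    alyx._validate_file_url('https://example.org/data/path/to/file.npy')
    # -> returned unchanged; a URL on another host raises an assertion error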
@@ -1069,7 +1070,7 @@ def rest(self, url=None, action=None, id=None, data=None, files=None,
             if 'django' in kwargs.keys():
                 kwargs['django'] = kwargs['django'] + ','
             else:
-                kwargs['django'] = ""
+                kwargs['django'] = ''
             kwargs['django'] = f"{kwargs['django']}pk,{id}"
         # otherwise, look for a dictionary of filter terms
         if kwargs:
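The branch above folds a record id into Alyx's `django` filter string, so the two calls below should be roughly equivalent (the endpoint and id are illustrative):

    eid = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'  # hypothetical record pk
    # Explicit django filter on the primary key
    ses = alyx.rest('sessions', 'list', django=f'pk,{eid}')
    # Same effect: rest() appends 'pk,<id>' to the django kwarg as shown above
    ses = alyx.rest('sessions', 'list', id=eid)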
@@ -1135,7 +1136,7 @@ def json_field_write(
         # Prepare data to patch
         patch_dict = {field_name: data}
         # Upload new extended_qc to session
-        ret = self.rest(endpoint, "partial_update", id=uuid, data=patch_dict)
+        ret = self.rest(endpoint, 'partial_update', id=uuid, data=patch_dict)
         return ret[field_name]
 
     def json_field_update(
@@ -1181,7 +1182,7 @@ def json_field_update(
 
         if not isinstance(current, dict):
             _logger.warning(
-                f'Current json field {field_name} does not contains a dict, aborting update'
+                f'Current json field "{field_name}" does not contain a dict, aborting update'
             )
             return current
 
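For reference, `json_field_update` merges a dict into a record's existing JSON field, and the guard above aborts the merge when the stored value is not a dict. A hedged usage sketch (endpoint, uuid and payload are hypothetical, with the call shape inferred from the surrounding code):

    # Merge {'quality': 'good'} into the session's json field; returns the
    # updated field, or the current value unchanged if the merge aborts
    merged = alyx.json_field_update(
        'sessions', uuid='aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
        field_name='json', data={'quality': 'good'})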
@@ -1236,7 +1237,7 @@ def json_field_remove_key(
                 f'{key}: Key not found in endpoint {endpoint} field {field_name}'
             )
             return current
-        _logger.info(f"Removing key from dict: '{key}'")
+        _logger.info(f'Removing key from dict: "{key}"')
         current.pop(key)
         # Re-write contents without removed key
         written = self.json_field_write(