Feat/multiscale tests #25
Conversation
…l coordinate transformations, and for axes vs coordinate transformations
I added validation + tests to ensure that the number of
i think this is ready for review!
this should resolve #33
pyproject.toml (Outdated)
@@ -3,8 +3,8 @@ name = "ome-zarr-models"
 dynamic = ["version"]
 description = "Data models for OME-Zarr"
 readme = "README.md"
-requires-python = ">=3.11"
+requires-python = ">=3.10"
 dependencies = []
Shouldn't we keep this as >=3.11, as per SPEC 0?
ah right, this change needs to be reverted
scale: list[float]

@classmethod
def build(cls, data: Iterable[float]) -> Self:
Could we instead give type a default (type: Literal["scale"] = "scale"), and then if you want to construct one of these you can just do VectorScale(scale=[1, 2, 3])? I think that's a slightly nicer API and avoids us having to write a build method.
the argument against giving the type a default value was that this would allow a JSON object without a type
field to pass validation, even though that JSON object is technically an invalid axis
(I totally agree that the API is nicer if the default is the de-facto default)
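To make the trade-off concrete, here is a minimal sketch of the .build pattern under discussion, using a plain dataclass rather than the project's pydantic models (the VectorScale and build names mirror the snippet above; the rest is illustrative). The point is that keeping the type discriminator required means a JSON object missing the "type" key would fail validation, while .build spares callers from spelling out the redundant value:

```python
from dataclasses import dataclass
from typing import Iterable, Literal


@dataclass
class VectorScale:
    # Required (no default): in a pydantic model this means a dict
    # without a "type" key fails validation; a default would let it pass.
    type: Literal["scale"]
    scale: list[float]

    @classmethod
    def build(cls, data: Iterable[float]) -> "VectorScale":
        # Convenience constructor that fills in the redundant discriminator.
        return cls(type="scale", scale=list(data))


vs = VectorScale.build([1.0, 2.0, 3.0])
```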
@@ -69,3 +92,32 @@ class PathTranslation(Base):

    type: Literal["translation"]
    translation: str


def ndim(transform: VectorScale | VectorTranslation) -> int:
In the spirit of don't repeat yourself, I'm not mad about having ndim defined as both a method on the classes and a function. Was there a particular reason for doing this?
the main reason to have a free-floating function is to support the niche case of using a functional idiom like map or filter on transformation objects, e.g. ndims = map(ndim, transforms), but maybe this is too niche :)
def ensure_transform_dimensionality(
    transforms: tuple[VectorScale | PathScale]
Would be good to extract the type here as a type alias (or type var???) to avoid repetition.
TransformType = tuple[VectorScale | PathScale] | tuple[VectorScale | PathScale, VectorTranslation | PathTranslation]
definitely, we can define + export this union in the transformations module
NUM_TX_MAX = 2


def ensure_transform_dimensionality(
Having read this function, to simplify it I would actually decouple the validation from the spec a bit, and write the validation function to check that an arbitrary list of transforms (that therefore have a .ndim attribute) have the same number of dimensions. So:
- Remove all transforms from the list that are Path transforms
- If >= 2 transforms are left, check they have the same number of dimensions

I think this will make typing/logic a bit nicer, and also make this a bit more re-usable for whatever comes in coordinate transform changes.
good idea! I will implement this
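The two-step check proposed above might look something like this (a stdlib sketch; the real classes expose an .ndim property, here approximated by the length of the scale vector):

```python
from dataclasses import dataclass


@dataclass
class PathScale:
    # Path-based transform: its dimensionality cannot be inspected locally.
    path: str


@dataclass
class VectorScale:
    scale: list[float]


def ensure_transform_dimensionality(transforms: tuple) -> tuple:
    # Step 1: drop path-based transforms, whose ndim is unknowable here.
    vector_tx = [t for t in transforms if not isinstance(t, PathScale)]
    # Step 2: if two or more remain, they must agree on dimensionality.
    ndims = {len(t.scale) for t in vector_tx}
    if len(ndims) > 1:
        raise ValueError(f"Transforms have inconsistent dimensionality: {sorted(ndims)}")
    return transforms


ok = ensure_transform_dimensionality((VectorScale([1.0, 2.0]), PathScale("s0")))
```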
Co-authored-by: David Stansby <[email protected]>
_check_unique = field_validator("multiscales")(_unique_items_validator)


class MultiscaleGroup(GroupSpec[MultiscaleGroupAttrs, ArraySpec | GroupSpec]):
How is this meant to be used? I am a bit confused/not understanding what the purpose of this class is.
this class will model an actual zarr group, i.e. the attributes (which are JSON and ought to contain multiscales etc) and the arrays (which ought to be the ones named by the path attribute in datasets, and which ought to have shapes that are consistent with the number of axes, etc)
this class is also the thing we deserialize a zarr group into
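The group-level consistency check described above can be illustrated procedurally with plain dicts (this is only a sketch of the idea; the real MultiscaleGroup expresses it declaratively via pydantic-zarr's GroupSpec, and the function and variable names here are hypothetical):

```python
def check_group(attrs: dict, members: dict) -> list[str]:
    # Every dataset "path" in the multiscales metadata must name an
    # array that actually exists as a member of the zarr group.
    missing = []
    for ms in attrs.get("multiscales", []):
        for ds in ms.get("datasets", []):
            if ds["path"] not in members:
                missing.append(ds["path"])
    return missing


attrs = {"multiscales": [{"datasets": [{"path": "0"}, {"path": "1"}]}]}
members = {"0": "array", "1": "array"}
missing = check_group(attrs, members)  # → []
```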
…ve num_tx_max variable
Co-authored-by: David Stansby <[email protected]>
I bring in a lot of the tests and validation logic from pydantic-zarr.
A few notable changes I add here:
- .build methods to VectorScale and VectorTranslation that allow you to create the respective transform without explicitly passing in the type as a (redundant) keyword argument
- a _build_transforms function that creates tuple[VectorScale] | tuple[VectorScale, VectorTranslation]; this is a convenience function for making coordinateTransformations metadata.