
Can't open uint16 dataset, Fiji crashes #102

Open
balintbalazs opened this issue Aug 3, 2020 · 7 comments

@balintbalazs

When opening a uint16 dataset, BigDataViewer tries to convert it to int16, which fails for any value too large for an int16. The error below is produced for every pixel whose value exceeds 32767; because it is raised per pixel, Fiji can become unresponsive and crash. The same datasets load correctly with the HDF5 plugin.
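The failure mode is easy to reproduce outside HDF5. A rough illustration using only the Python standard library (not the actual HDF5 conversion path, just the same range problem): a value that fits in uint16 but exceeds 32767 cannot be packed as a signed 16-bit integer.

```python
import struct

# A camera pixel value that fits in uint16 but not in int16.
value = 40000

struct.pack('<H', value)       # uint16: fine (range 0 .. 65535)
try:
    struct.pack('<h', value)   # int16: overflows (range -32768 .. 32767),
                               # analogous to H5T__conv_ushort_short failing
except struct.error as e:
    print('cannot convert to int16:', e)
```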

java.util.concurrent.ExecutionException: ncsa.hdf.hdf5lib.exceptions.HDF5DatatypeInterfaceException: Datatype:Can't convert datatypes ["..\..\src\H5Tconv.c line 5377 in H5T__conv_ushort_short(): can't handle conversion exception"]
	at net.imglib2.cache.ref.SoftRefLoaderCache.get(SoftRefLoaderCache.java:112)
	at net.imglib2.cache.util.LoaderCacheKeyAdapter.get(LoaderCacheKeyAdapter.java:20)
	at net.imglib2.cache.util.LoaderCacheAsCacheAdapter.get(LoaderCacheAsCacheAdapter.java:43)
	at net.imglib2.cache.ref.WeakRefVolatileCache.getBlocking(WeakRefVolatileCache.java:451)
	at net.imglib2.cache.ref.WeakRefVolatileCache.access$000(WeakRefVolatileCache.java:20)
	at net.imglib2.cache.ref.WeakRefVolatileCache$FetchEntry.call(WeakRefVolatileCache.java:498)
	at net.imglib2.cache.ref.WeakRefVolatileCache$FetchEntry.call(WeakRefVolatileCache.java:475)
	at net.imglib2.cache.queue.FetcherThreads$Fetcher.run(FetcherThreads.java:177)
Caused by: ncsa.hdf.hdf5lib.exceptions.HDF5DatatypeInterfaceException: Datatype:Can't convert datatypes ["..\..\src\H5Tconv.c line 5377 in H5T__conv_ushort_short(): can't handle conversion exception"]
	at ch.systemsx.cisd.hdf5.hdf5lib.H5.H5Dread(Native Method)
	at ch.systemsx.cisd.hdf5.hdf5lib.H5D.H5Dread(H5D.java:381)
	at bdv.img.hdf5.HDF5AccessHack.readShortMDArrayBlockWithOffset(HDF5AccessHack.java:198)
	at bdv.img.hdf5.HDF5AccessHack.readShortMDArrayBlockWithOffset(HDF5AccessHack.java:183)
	at bdv.img.hdf5.Hdf5VolatileShortArrayLoader.loadArray(Hdf5VolatileShortArrayLoader.java:47)
	at bdv.img.hdf5.Hdf5VolatileShortArrayLoader.loadArray(Hdf5VolatileShortArrayLoader.java:35)
	at bdv.img.cache.VolatileGlobalCellCache.lambda$createImg$0(VolatileGlobalCellCache.java:213)
	at net.imglib2.cache.util.LoaderCacheKeyAdapter.lambda$get$0(LoaderCacheKeyAdapter.java:22)
	at net.imglib2.cache.ref.SoftRefLoaderCache.get(SoftRefLoaderCache.java:102)
	... 7 more

Environment:

  • Windows 10 64 bit
  • ImageJ 1.53c
  • Java 1.8.0_172 (64-bit)
  • ran Update Fiji on 2020-08-03
@bogovicj
Contributor

How was your hdf5 file created?

There is a similar issue with tips on resolving it here:
PreibischLab/BigStitcher#60

@balintbalazs
Author

The files are from the Luxendo software. We directly copy the uint16 camera images to a uint16 HDF5 dataset using the C API. When opening them with the HDF5 Fiji plugin they are okay.

It seems the bdv reader assumes an int16 datatype without checking the actual datatype of the dataset:

```c
H5Dread( dataset.dataSetId, H5T_NATIVE_INT16, memorySpaceId, dataset.fileSpaceId, numericConversionXferPropertyListID, dataBlock );
```

I will try the suggestions in the linked thread to see if that helps, thanks for the info.

@zacsimile

zacsimile commented Apr 22, 2024

I made a minimal example that shows this breakage (attached) and discussed it a bit at https://forum.image.sc/t/bigstitcher-image-fusion-produces-black-bars/85726/10 and PreibischLab/BigStitcher#129. The int16 file opens perfectly, while the uint16 file crashes BDV.

Ultimately, I fixed my writer in the same fashion as described in PreibischLab/BigStitcher#60: I explicitly cast everything to int16 before writing to file. Should we be able to pass things into BDV as int16, or should only uint16 be supported?

test_bdv.zip

@tpietzsch
Member

The behaviour you describe (assuming int16 always) was how BDV worked for a long time.

I added proper support for more datatypes a while ago in #157. However, for this to be picked up, the datatype must be added as an attribute to each setup group. E.g., the info (pyramid resolutions, etc.) for the first setup is under group "/s00" in the h5 file. If "/s00" has a "dataType" string attribute with value "uint8", the datasets for this setup are read as UnsignedByteType. If the attribute is missing, BDV falls back to the legacy behaviour (to keep supporting the old files).

So, you can make your test_uint16.h5 work by adding "dataType" attributes to the s00 .. s05 groups.

> Ultimately, I fixed my writer in the same fashion as described at PreibischLab/BigStitcher#60: I explicitly cast everything to int16 before writing to file. Should we be able to pass things into BDV as int16? Or should only uint16 be supported.

Please don't do this... Moving forward, it would be better to add the dataType = uint16 attributes.

Something like

```java
IHDF5Writer hdf5Writer = HDF5Factory.open( "test_uint16.h5" );
hdf5Writer.string().setAttr( "s00", "dataType", "uint16" );
```

in Java, or

```python
hdf5_writer['s00'].attrs['dataType'] = 'uint16'
```

in Python.

@imagesc-bot

This issue has been mentioned on Image.sc Forum. There might be relevant details there:

https://forum.image.sc/t/bigstitcher-image-fusion-produces-black-bars/85726/14

@zacsimile

zacsimile commented Apr 24, 2024

Yep--that attribute specification fixes it. Thanks!

@balintbalazs
Author

This is great, thanks a lot! I think this issue can be closed, since the new spec seems to cover this use case.

Is there a place with an up-to-date specification of the BigDataViewer format? Until now we have just used the original Nature Methods paper plus the export function in Fiji, but those don't cover all use cases. Thanks again!
