model save in .h5 #191
Comments
Well, trial and error, it finally worked, but I get the following warning. I am not sure if this affects the usability of the network in any way.
I ran into the same issue. I tried your os.path.dirname workaround and some .h5 files seemed to be saved, but I also got this error:
serialize_concrete_function(concrete_function, node_ids, coder) KeyError: "Failed to add concrete function 'b'__inference_similarity_model_layer_call_fn_145923'' to object-based SavedModel as it captures tensor <tf.Tensor: shape=(), dtype=resource, value=> which is unsupported or not reachable from root. One reason could be that a stateful object or a variable that the function depends on is not assigned to an attribute of the serialized trackable object (see SaveTest.test_captures_unreachable_variable)."
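For reference, a minimal sketch of that kind of directory handling might look like the following; the exact workaround is not shown in the thread, and the model, path, and use of save_format="h5" below are assumptions:

```python
import os
import tensorflow as tf

# Placeholder model; in the thread this would be the SimilarityModel instance.
model = tf.keras.Sequential([tf.keras.layers.Dense(8, input_shape=(4,))])

# Hypothetical path; make sure the parent directory exists before writing
# the single .h5 file.
save_path = "checkpoints/sim_model.h5"
os.makedirs(os.path.dirname(save_path), exist_ok=True)

# save_format="h5" forces the single-file HDF5 writer instead of the
# directory-based SavedModel writer.
model.save(save_path, save_format="h5")
```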
Update: my particular model has some data augmentation layers in it; once I remove them, the model saves. There may be a serialization-related issue. One such layer is tensorflow.keras.layers.experimental.preprocessing.RandomFlip.
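One way to sidestep this, sketched below under the assumption that the augmentation does not need to live inside the saved model, is to apply layers like RandomFlip in the tf.data pipeline instead; the dataset here is a random placeholder:

```python
import tensorflow as tf
from tensorflow.keras.layers.experimental.preprocessing import RandomFlip

# Random placeholder data standing in for the real training set.
images = tf.random.uniform((32, 64, 64, 3))
labels = tf.zeros((32,), dtype=tf.int32)
train_ds = tf.data.Dataset.from_tensor_slices((images, labels)).batch(8)

# Keep the augmentation in the input pipeline instead of inside the model,
# so the saved model never has to serialize the preprocessing layer.
augment = RandomFlip("horizontal")

def augment_batch(x, y):
    return augment(x, training=True), y

train_ds = train_ds.map(augment_batch, num_parallel_calls=tf.data.AUTOTUNE)
```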
Actually, I discovered this is indeed a bug, and it is distinct from the inability to save in .h5 format (which may require more code to save the _index?). The error message I got was for saving with the expected 'tf' format: "Trackable Python objects referring to this tensor (from gc.get_referrers, limited to two hops)". This happens for EfficientNetSim (which is used in supervised_visualization.ipynb).
What I found is a bit surprising: the default for augmentation is "basic" (not no augmentation), and it involves layers such as RandomFlip.
So when I used this same model at work, I didn't know it had applied these augmentations. I am biased to think the default should be None, since augmentation is highly data- and task-centric. E.g., one may argue you shouldn't flip x-ray images if the disease doesn't have left/right symmetry. The model summary only shows a high-level sequential layer, so I missed this as a result.

I suspect that if I don't use augmentation (i.e., avoid RandomFlip), the model will save correctly. I will try this and post an update if that doesn't resolve it.

Finally, for those who really want to save a model and not redo expensive training, you can try model.save_weights(...) and load_weights(...) as a temporary workaround (see the sketch below). Let me know if this should be a separate issue.
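A minimal sketch of that save_weights / load_weights workaround, using a stand-in Keras model rather than the actual SimilarityModel:

```python
import tensorflow as tf

# Stand-in architecture; in practice this would build the trained model.
def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(4),
    ])

model = build_model()

# Save only the weights to a single HDF5 file; nothing about the
# augmentation layers or concrete functions needs to be serialized.
model.save_weights("sim_model_weights.h5")

# To restore, rebuild the same architecture and load the weights into it.
restored = build_model()
restored.load_weights("sim_model_weights.h5")
```

The catch is that the exact same architecture has to be rebuilt before calling load_weights, because the weights file does not store the model structure.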
I found it is possible to save if I don't use augmentation in EfficientNetSim:

```python
import tensorflow as tf
from tensorflow_similarity.architectures import EfficientNetSim  # assumed import path for 0.14.x

model = EfficientNetSim(train_ds.example_shape, embedding_size, augmentation=None)

tf.keras.models.save_model(model, "./sim_model.h5")
# OR
tf.keras.models.save_model(model, "./sim_model/0")
```
We just pushed 0.15 to the main branch and removed the augmentation arg from all the tfsim.architectures. This is a breaking change but should resolve some of these issues.
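For anyone upgrading, the construction presumably reduces to something like the sketch below; the import path, input shape, and embedding size are assumptions, and augmentation would now live in the data pipeline:

```python
import tensorflow as tf
from tensorflow_similarity.architectures import EfficientNetSim

# Placeholder values; use your dataset's shape and desired embedding size.
input_shape = (224, 224, 3)
embedding_size = 128

# As of 0.15 there is no augmentation argument on the architectures.
model = EfficientNetSim(input_shape, embedding_size)

# Directory-based SavedModel format.
tf.keras.models.save_model(model, "./sim_model/0")
```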
Hi, I have tried to save the model as shown on the TensorFlow page https://www.tensorflow.org/tutorials/keras/save_and_load and I get the following error:

tensorflow.python.framework.errors_impl.FailedPreconditionError: model.h5 is not a directory [Op:WriteFile]

I can save it the other way; it is the .h5 format that is not working, and it would be better for what I need if I could save it in just one file.

The versions are tensorflow==2.6.0, h5py==3.1.0, pyyaml==6.0, tensorflow_similarity==0.14.8. I think it could be a version issue.
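For context, the two save paths being compared are roughly the following; the model here is a plain Keras placeholder, not the reporter's similarity model:

```python
import tensorflow as tf

# Placeholder model; the report concerns a tensorflow_similarity model.
model = tf.keras.Sequential([tf.keras.layers.Dense(8, input_shape=(4,))])

# Single-file HDF5 save, as shown in the save_and_load tutorial. This is the
# call pattern that raised FailedPreconditionError for the reporter.
model.save("model.h5")

# Directory-based SavedModel format ("the other way"), which did work.
model.save("saved_model/model")
```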