
Problem with loading model via SavedModelBundle.load() #540

Open
EllenEn opened this issue May 13, 2024 · 1 comment
EllenEn commented May 13, 2024

Hello,

I am facing an issue with TensorFlow for Java when trying to load a TensorFlow model.

Here is the code:

TensorFlow.loadLibrary("/app/tensorflow/inference.so");
model = SavedModelBundle.load("/app/tensorflow/mc_performance_model_rf", "serve");

It worked fine locally, but when I tried to deploy it to DigitalOcean, the deployment failed. There seems to be no error in the logs: they simply end after the model starts loading, and the container gets killed.

I would really appreciate any information that could help solve or troubleshoot this problem.
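As a first troubleshooting step, it can help to rule out a path or packaging problem before looking at the load itself. The sketch below is a hypothetical pre-flight check (not part of the TF-Java API, and the helper name is my own): a SavedModel directory contains a saved_model.pb (or saved_model.pbtxt) file at its root, usually alongside variables/ and assets/ subdirectories, and this just verifies those exist inside the deployed container.

```java
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical pre-flight check: confirm the deployed path actually contains
// a SavedModel before calling SavedModelBundle.load on it.
public class SavedModelCheck {

    /** Returns true if the directory contains a SavedModel graph file. */
    public static boolean looksLikeSavedModel(Path dir) {
        return Files.exists(dir.resolve("saved_model.pb"))
                || Files.exists(dir.resolve("saved_model.pbtxt"));
    }

    public static void main(String[] args) {
        Path modelDir = Path.of("/app/tensorflow/mc_performance_model_rf");
        System.out.println("SavedModel graph present: " + looksLikeSavedModel(modelDir));
        System.out.println("variables/ present: "
                + Files.isDirectory(modelDir.resolve("variables")));
        System.out.println("assets/ present: "
                + Files.isDirectory(modelDir.resolve("assets")));
    }
}
```

If the graph file is present but the process still dies silently during loading, a resource limit in the container (e.g. memory) becomes the more likely suspect, since the logs above show loading progressing normally up to the restore step.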

Logs can be seen here:

[2024-05-13 08:40:42] [TIMESTAMP]: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/cc/saved_model/reader.cc:83] Reading SavedModel from: /app/tensorflow/mc_performance_model_rf
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/cc/saved_model/reader.cc:51] Reading meta graph with tags { serve }
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/cc/saved_model/reader.cc:146] Reading SavedModel debug info (if present) from: /app/tensorflow/mc_performance_model_rf
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
[2024-05-13 08:40:46] To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/cc/saved_model/loader.cc:234] Restoring SavedModel bundle.
[2024-05-13 08:40:46] [TIMESTAMP]: I tensorflow/cc/saved_model/loader.cc:218] Running initialization op on SavedModel bundle at path: /app/tensorflow/mc_performance_model_rf
[2024-05-13 08:40:46] [INFO [TIMESTAMP] UTC kernel.cc:1233] Loading model from path /app/tensorflow/mc_performance_model_rf/assets/ with prefix a52beec4ed424f4b
Craigacp (Collaborator) commented May 13, 2024

What version of TF-Java are you using, what's the deployment environment (OS, Java version etc), and what platform does it work locally on? Is this using TF-DF?
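A stdlib-only sketch (my own suggestion, not from the thread) for gathering most of the environment details asked about here inside the failing container: OS, CPU architecture, Java version, and the JVM memory ceiling. A low memory ceiling in the container would also be consistent with the process being killed mid-load without an error in the logs.

```java
// Collects basic environment facts useful for this kind of bug report.
public class EnvReport {

    /** Returns a one-line summary of OS, architecture, Java version, and max heap. */
    public static String report() {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        return String.format("os=%s arch=%s java=%s maxHeapMB=%d",
                System.getProperty("os.name"),
                System.getProperty("os.arch"),
                System.getProperty("java.version"),
                maxHeapMb);
    }

    public static void main(String[] args) {
        System.out.println(report());
    }
}
```

Note that the TensorFlow native library itself allocates outside the JVM heap, so the container's overall memory limit matters as much as the reported max heap.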
