I tried to create a selective build of TensorFlow Lite, following this guide.
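(For reference, the build command in that guide looks roughly like this; the model path and target architectures below are placeholders, not the exact ones I used:)

sh tensorflow/lite/tools/build_aar.sh \
  --input_models=/path/to/model.tflite \
  --target_archs=x86_64,arm64-v8a

The guide also notes that both the generated tensorflow-lite.aar and tensorflow-lite-select-tf-ops.aar should be added to the app as local dependencies.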
The build succeeded and I added the tensorflow-lite-select-tf-ops.aar to my Android Studio project, but loading it resulted in the following error:
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "_ZN6google8protobuf8internal26fixed_address_empty_stringE" referenced by "/data/app/~~PgTBlp4bIwZ8yl02UBjqqg==/com.inseye.core.test-yoruTyRORPvs5Ap3TE6dGw==/base.apk!/lib/x86_64/libtensorflowlite_flex_jni.so".
I noticed there are suggested fixes for this problem, e.g. here: one just needs to add the following line
--config=monolithic
to the .bazelrc file.
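(Presumably this means adding it as a build option, since .bazelrc lines start with a command name, i.e.:)

build --config=monolithic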
But the build with this configuration fails with the following error:
ERROR: /tensorflow_src/tensorflow/BUILD:1652:19: Action tensorflow/_api/v2/v2.py [for tool] failed: (Aborted): bash failed: error executing command (from target //tensorflow:tf_python_api_gen_v2)
(cd /root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow && \
exec env - \
DOCKER_HOST_CACHEBUSTER=1682977560680045781 \
LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64 \
PATH=/root/.cache/bazelisk/downloads/bazelbuild/bazel-6.1.0-linux-x86_64/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/android/sdk/cmdline-tools/latest/bin:/android/sdk/platform-tools:/android/ndk \
/bin/bash -c 'bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2 --root_init_template=tensorflow/api_template.__init__.py --apidir=bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/_api/v2/ --apiname=tensorflow --apiversion=2 --compat_apiversion=1 --compat_apiversion=2 --compat_init_template=tensorflow/compat_template_v1.__init__.py --compat_init_template=tensorflow/compat_template.__init__.py --packages=tensorflow.python,tensorflow.dtensor.python.accelerator_util,tensorflow.dtensor.python.api,tensorflow.dtensor.python.config,tensorflow.dtensor.python.d_checkpoint,tensorflow.dtensor.python.d_variable,tensorflow.dtensor.python.input_util,tensorflow.dtensor.python.layout,tensorflow.dtensor.python.mesh_util,tensorflow.dtensor.python.tpu_util,tensorflow.dtensor.python.save_restore,tensorflow.lite.python.analyzer,tensorflow.lite.python.lite,tensorflow.lite.python.authoring.authoring,tensorflow.python.modules_with_exports --output_package=tensorflow._api.v2 --use_relative_imports=True --loading=default bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/tf_python_api_gen_v2.params')
# Configuration: 9f320c2ab265ab521267a555257ce138659ef24e739cde7fde73c7665f4cf5e4
# Execution platform: @local_execution_config_platform//:platform
2023-06-09 21:58:46.512000: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2023-06-09 21:58:46.546669: F ./tensorflow/core/framework/variant_op_registry.h:114] Check failed: existing == nullptr (0x2df3898 vs. nullptr)UnaryVariantDeviceCopy for direction: 1 and type_index: tensorflow::Tensor already registered
Target //tmp:tensorflow-lite-select-tf-ops failed to build
ERROR: /tensorflow_src/tensorflow/python/tools/BUILD:303:17 Middleman _middlemen/tensorflow_Spython_Stools_Sprint_Uselective_Uregistration_Uheader-runfiles failed: (Aborted): bash failed: error executing command (from target //tensorflow:tf_python_api_gen_v2)
(cd /root/.cache/bazel/_bazel_root/43801f1e35f242fb634ebbc6079cf6c5/execroot/org_tensorflow && \
exec env - \
DOCKER_HOST_CACHEBUSTER=1682977560680045781 \
LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64 \
PATH=/root/.cache/bazelisk/downloads/bazelbuild/bazel-6.1.0-linux-x86_64/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/android/sdk/cmdline-tools/latest/bin:/android/sdk/platform-tools:/android/ndk \
/bin/bash -c 'bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/create_tensorflow.python_api_tf_python_api_gen_v2 --root_init_template=tensorflow/api_template.__init__.py --apidir=bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/_api/v2/ --apiname=tensorflow --apiversion=2 --compat_apiversion=1 --compat_apiversion=2 --compat_init_template=tensorflow/compat_template_v1.__init__.py --compat_init_template=tensorflow/compat_template.__init__.py --packages=tensorflow.python,tensorflow.dtensor.python.accelerator_util,tensorflow.dtensor.python.api,tensorflow.dtensor.python.config,tensorflow.dtensor.python.d_checkpoint,tensorflow.dtensor.python.d_variable,tensorflow.dtensor.python.input_util,tensorflow.dtensor.python.layout,tensorflow.dtensor.python.mesh_util,tensorflow.dtensor.python.tpu_util,tensorflow.dtensor.python.save_restore,tensorflow.lite.python.analyzer,tensorflow.lite.python.lite,tensorflow.lite.python.authoring.authoring,tensorflow.python.modules_with_exports --output_package=tensorflow._api.v2 --use_relative_imports=True --loading=default bazel-out/k8-opt-exec-50AE0418/bin/tensorflow/tf_python_api_gen_v2.params')
# Configuration: 9f320c2ab265ab521267a555257ce138659ef24e739cde7fde73c7665f4cf5e4
# Execution platform: @local_execution_config_platform//:platform
This issue, originally reported by @dachshund-ncu, has been moved to this dedicated LiteRT repository to improve issue tracking and prioritization. To ensure continuity, we have created this new issue on your behalf.
We appreciate your understanding and look forward to your continued involvement.
Issue Type
Bug
Have you reproduced the bug with TF nightly?
No
Source
source
Tensorflow Version
2.13
Custom Code
No
OS Platform and Distribution
Ubuntu 22.04 LTS, Ubuntu 23.04
Mobile device
No response
Python version
No response
Bazel version
6.1.0
GCC/Compiler version
9.4.0
CUDA/cuDNN version
No response
GPU model and memory
No response
Current Behaviour?
I tried to create a selective build of TensorFlow Lite, following this guide. The build succeeded and I added the tensorflow-lite-select-tf-ops.aar to my Android Studio project, but loading it failed with the UnsatisfiedLinkError shown above. I noticed there are suggested fixes for this problem, e.g. adding --config=monolithic to the .bazelrc file, but the build with that configuration fails with the error shown above.
Standalone code to reproduce the issue
Relevant log output
No response