
TfLite 2.13 with -DTFLITE_ENABLE_GPU=ON fails to build with Visual Studio 2019 and 2022 #61269

Closed
misterBart opened this issue Jul 13, 2023 · 9 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower subtype:windows Windows Build/Installation Issues TF 2.13 For issues related to Tensorflow 2.13 type:build/install Build and install issues

Comments

@misterBart
Contributor

Issue type

Bug

Have you reproduced the bug with TensorFlow Nightly?

No

Source

source

TensorFlow version

tf 2.13

Custom code

No

OS platform and distribution

Windows 10

Mobile device

No response

Python version

No response

Bazel version

No response

GCC/compiler version

Visual Studio 2019 and 2022

CUDA/cuDNN version

No response

GPU model and memory

No response

Current behavior?

Building TfLite 2.13 with CMake and -DTFLITE_ENABLE_GPU=ON fails when Visual Studio 2019 or 2022 is used. Tested on two different machines; it fails on both.
Steps executed in Windows Command Prompt:

git clone --single-branch --branch r2.13 https://github.com/tensorflow/tensorflow tensorflow_src
mkdir tflite_build_x64
cd tflite_build_x64
cmake -DTFLITE_ENABLE_GPU=ON ..\tensorflow_src\tensorflow\lite
cmake --build . -j 8 --config Release

The cmake build command yields two errors:

C:\Users\bartp\source\TfLite2.13Gpu\tensorflow_src\tensorflow\lite\delegates\gpu\common\selectors\operation_selector.cc(313,20): error C2039: 'any_cast': is not a member of 'std' [C:\Users\bartp\source\TfLite2.13Gpu\tflite_build_x64\tensorflow-lite.vcxproj]
C:\Users\bartp\source\TfLite2.13Gpu\tensorflow_src\tensorflow\lite\delegates\gpu\common\tasks\special\conv_pointwise.cc(129,12): error C2039: 'any_cast': is not a member of 'std' [C:\Users\bartp\source\TfLite2.13Gpu\tflite_build_x64\tensorflow-lite.vcxproj]

To fix this, add #include <any> to tensorflow\lite\delegates\gpu\common\selectors\operation_selector.cc and tensorflow\lite\delegates\gpu\common\tasks\special\conv_pointwise.cc.

Standalone code to reproduce the issue

To reproduce the issue, see the steps under "Current behavior?" above.

Relevant log output

No response

@google-ml-butler google-ml-butler bot added the type:bug Bug label Jul 13, 2023
@tilakrayal tilakrayal added TF 2.13 For issues related to Tensorflow 2.13 comp:lite TF Lite related issues labels Jul 14, 2023
@tilakrayal tilakrayal assigned pjpratik and unassigned sushreebarsa Jul 14, 2023
@pjpratik pjpratik added the subtype:windows Windows Build/Installation Issues label Jul 14, 2023
@pjpratik pjpratik assigned pkgoogle and unassigned pjpratik Jul 14, 2023
@pkgoogle

Hi @misterBart, thanks for reporting the issue.

Git and CMake are not available natively in the Windows Command Prompt, so I have a couple of questions in order to reproduce your issue.

Are you using PowerShell or Command Prompt? Are you using WSL? Are you using MinGW? Are you using Git for Windows? If you have trouble understanding these questions, a good first pass is to ask Bard: https://bard.google.com/. E.g.: "How to tell if I'm using _______?"

Usually the more information you provide, the faster I am able to assist you. Thanks!

@pkgoogle pkgoogle added the stat:awaiting response Status - Awaiting response from author label Jul 18, 2023
@misterBart
Contributor Author

I'm using Windows Command Prompt.
I installed Git with the Windows installer from https://git-scm.com/download/win
and CMake with the Windows installer from https://cmake.org/download/.
After installing, git and cmake are available in Windows Command Prompt.

By the way, you can also fix the two compile errors by replacing std::any_cast with absl::any_cast. I believe this is the preferable solution, because I notice absl::any_cast is already used more often in the two C++ files in question.
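For illustration, that alternative fix amounts to a one-token change at each of the two failing call sites (the expressions below are hypothetical placeholders, not the actual TfLite source):

```diff
-  auto attr = std::any_cast<SomeAttrType>(op.attributes);
+  auto attr = absl::any_cast<SomeAttrType>(op.attributes);
```

Since the files already include Abseil's any header for their other absl::any_cast calls, this variant avoids adding a new standard-library include.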

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Jul 19, 2023
@pkgoogle

Hi @misterBart, our internal tests/tools show it compiles correctly with clang. Is there a way to adjust your CMake installation to use clang? (I'm currently guessing it is using gcc, but I'm not sure.)
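If it helps, CMake's Visual Studio generators can select the ClangCL toolset that ships with Visual Studio via the -T option. A sketch of the standard mechanism, adapted to the repro steps above (not a command taken from this thread; requires the "C++ Clang tools for Windows" component in the Visual Studio installer):

```shell
# From a fresh tflite_build_x64 directory, generate with the ClangCL
# toolset instead of the default MSVC one, then build as before.
cmake -G "Visual Studio 17 2022" -T ClangCL -DTFLITE_ENABLE_GPU=ON ..\tensorflow_src\tensorflow\lite
cmake --build . -j 8 --config Release
```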

@pkgoogle pkgoogle added stat:awaiting response Status - Awaiting response from author type:build/install Build and install issues and removed type:bug Bug labels Jul 19, 2023
@misterBart
Contributor Author

misterBart commented Jul 20, 2023

I am using cmake with Visual Studio (also see title and opening post), so I'm using Microsoft Visual C++ (MSVC) compiler.

As for your follow-up email: "Hi @misterBart, let us know if you have tried multiple models to help us look into the problem further."
The issue is building TfLite. I have to build TfLite before I can use TfLite models.

To make things clear, I never asked for help. I reported an issue and posted a solution in my opening post and a second solution in my previous comment. I posted these solutions so that one of them could be applied to the TfLite code, so that other people using Visual Studio will not experience this error. Hopefully things are clear now.

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Jul 20, 2023
@pkgoogle

Hi @misterBart, there are complications/restrictions which make applying those solutions not so simple, but we'll take a deeper look. In the meantime, can you use Bazel to unblock yourself? Generally the Bazel workflow is better supported for Windows/macOS, and the CMake workflow is better supported for *nix systems.
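For reference, the Bazel route would look roughly like this, assuming the r2.13 checkout from the opening post (a sketch of the documented TFLite Bazel target; exact flags may need adjusting on Windows):

```shell
# From the root of the tensorflow_src checkout:
bazel build -c opt //tensorflow/lite:tensorflowlite
```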

@pkgoogle

pkgoogle commented Jul 20, 2023

Hi @terryheo, can you please take a look? Thanks

@pkgoogle pkgoogle added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jul 20, 2023
@misterBart
Contributor Author

misterBart commented Jul 31, 2023

@terryheo Have you been able to look at the matter yet?

@gaikwadrahul8
Contributor

Hi, @misterBart

Thanks for raising this issue. Are you aware of the migration to LiteRT? This transition is aimed at enhancing our project's capabilities and providing improved support and focus for our users. As we believe this issue is still relevant to LiteRT we are moving your issue there. Please follow progress here: google-ai-edge/LiteRT#165

Let us know if you have any questions. Thanks.
