Android crash when running inference with MobileNet-SSD.

Thread #15, name = 'inference', stop reason = signal SIGSEGV: invalid address (fault address: 0x0)
frame #1: 0xca0da56c libncnn_jni.so`ncnn::Net::forward_layer(int, std::__ndk1::vector<ncnn::Mat, std::__ndk1::allocator<ncnn::Mat> >&, ncnn::Option&) const [inlined] ncnn::Mat::release() at mat.h:636
frame #2: 0xca0da542 libncnn_jni.so`ncnn::Net::forward_layer(int, std::__ndk1::vector<ncnn::Mat, std::__ndk1::allocator<ncnn::Mat> >&, ncnn::Option&) const [inlined] ncnn::Mat::~Mat() at mat.h:262
frame #3: 0xca0da542 libncnn_jni.so`ncnn::Net::forward_layer(this=0x00000067, layer_index=<unavailable>, blob_mats=size=1, opt=0xcab0146c) const at net.cpp:884
frame #4: 0xca0d9c4a libncnn_jni.so`ncnn::Net::forward_layer(this=0xca1f7a24, layer_index=<unavailable>, blob_mats=size=1, opt=0xcab0146c) const at net.cpp:896
frame #5: 0xca0d9c4a libncnn_jni.so`ncnn::Net::forward_layer(this=0xca1f7a24, layer_index=<unavailable>, blob_mats=size=1, opt=0xcab0146c) const at net.cpp:896
frame #6: 0xca0da818 libncnn_jni.so`ncnn::Extractor::extract(this=0xcab0145c, blob_index=, feat=0xcab01438) at net.cpp:1016
frame #7: 0xca0cc92c libncnn_jni.so`::Java_com_davidchiu_ncnncam_Ncnn_nativeDetect(env=0xcacea540, thiz=0xcab0152c, bitmap=0xcab01530) at ncnn_jni.cpp:194
frame #8: 0xcb25c60e base.odex`nativeDetect(this=, (null)=) + 94
frame #9: 0xe5ddcd76 libart.so`art_quick_invoke_stub_internal + 70
frame #10: 0xe5de1eea libart.so`art_quick_invoke_stub + 234
Solved. It turned out the model files and the ncnn library did not match.
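For anyone hitting the same SIGSEGV: a quick sanity check before inference can surface a model/library mismatch instead of crashing deep inside forward_layer(). Below is a minimal sketch (the helper name is mine, not ncnn API) that relies on one fact about the format: current ncnn text .param files begin with the magic number 7767517, while files produced by an older converter lack it.

```cpp
#include <fstream>
#include <string>

// Hypothetical helper (not part of ncnn): check that a text .param file
// starts with the magic number 7767517. Files converted with an older
// tool lack the magic, and loading them against a newer libncnn can
// fail in confusing ways instead of reporting a clean error.
bool param_magic_ok(const std::string& path) {
    std::ifstream in(path);
    if (!in) return false;           // unreadable file -> treat as mismatch
    std::string first_token;
    in >> first_token;               // first whitespace-delimited token
    return first_token == "7767517";
}
```

In addition, ncnn::Net::load_param() and load_model() report failure through non-zero return values, so checking those before calling Extractor::extract() avoids running inference on a half-loaded net.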