TFLite Android #50

Open
zeynepgulhanuslu opened this issue Jan 21, 2022 · 6 comments

Comments

zeynepgulhanuslu commented Jan 21, 2022

Hi,
I want to use a TFLite model in an Android project. When I load the model into Android Studio, it generates code like the one below:

```kotlin
val model = Dtln.newInstance(context)

// Creates inputs for reference.
val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 1, 512), DataType.FLOAT32)
inputFeature0.loadBuffer(byteBuffer)
val inputFeature1 = TensorBuffer.createFixedSize(intArrayOf(1, 2, 128, 2), DataType.FLOAT32)
inputFeature1.loadBuffer(byteBuffer)

// Runs model inference and gets result.
val outputs = model.process(inputFeature0, inputFeature1)
val outputFeature0 = outputs.outputFeature0AsTensorBuffer
val outputFeature1 = outputs.outputFeature1AsTensorBuffer

// Releases model resources if no longer used.
model.close()
```

My question is: what are inputFeature0 and inputFeature1 in this code? Should I read the wav file as a byte array and then reshape it, or should I create a feature vector from the wav file? Can you help me with this?
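
My current guess (which may well be wrong) is that the first input is one frame of 512 raw audio samples and the second input is some kind of recurrent state carried between frames, roughly like the sketch below. Here wavShorts/offset are placeholders for the decoded PCM data, and the scaling and zero-initialised state are just my assumptions from the tensor shapes:

```kotlin
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer

// My guess only: 512 raw samples per frame, PCM16 scaled to [-1, 1].
// wavShorts and offset are placeholders for the decoded wav data.
val frame = FloatArray(512)
for (i in frame.indices) {
    frame[i] = wavShorts[offset + i] / 32768f
}

val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 1, 512), DataType.FLOAT32)
inputFeature0.loadArray(frame)

// Second input: some internal (LSTM?) state, zero at the start and maybe
// fed back from outputFeature1 on the next frame?
val states = FloatArray(2 * 128 * 2)
val inputFeature1 = TensorBuffer.createFixedSize(intArrayOf(1, 2, 128, 2), DataType.FLOAT32)
inputFeature1.loadArray(states)
```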

Thanks

zeynepgulhanuslu changed the title from "Hi," to "TFLite Android" on Jan 21, 2022
@ShivamSrivastavaAkaike

Hi,
I also want to use this model on Android. If you have used it on Android, could you help?

@zeynepgulhanuslu (Author)

Hi,

I used it, but I didn't get a chance to use it for streaming; it only runs as a batch. If you want, I can send the code, but it's a little messy since I didn't end up using it and didn't get the time to finish it properly. If you give me your email, I can send it to you as a reference.

@ShivamSrivastavaAkaike

Yeah sure, it will help.
Thanks in advance.
Here is my email id: [email protected]

@zeynepgulhanuslu (Author)

Okay, I will send it to you. If you want, you can delete the comment afterwards. I hope it helps, have a nice day.

mcig commented May 2, 2023

Hello all 👋, I have also been trying to make this model work on Android devices with Kotlin. Unfortunately, just like you @zeynepgulhanuslu, I got stuck while running model inference both on a recorded batch and with real-time data. I tried to reimplement real_time_processing_tf_lite.py in Kotlin, but it got way too complicated to perform the array operations and FFT calculations without NumPy.

Have you open-sourced or uploaded your version anywhere? Honestly, it would help me a lot at this point 😅.
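
For reference, this is roughly the loop structure I was aiming for, ported from real_time_processing_tf_lite.py. It is an untested sketch: blockShift = 128 is taken from the Python script, the model is assumed to take the raw 512-sample block directly, and I'm only guessing that outputFeature0 is the processed block and outputFeature1 the state to feed back:

```kotlin
import android.content.Context
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer
// plus the generated Dtln class from the app's ml package

// Untested sketch of block-wise (overlap-add) processing with the generated
// Dtln wrapper. blockLen/blockShift and the meaning of the two outputs are
// assumptions taken from the tensor shapes and real_time_processing_tf_lite.py.
fun denoise(context: Context, audio: FloatArray): FloatArray {
    val blockLen = 512
    val blockShift = 128

    val model = Dtln.newInstance(context)

    val inBuffer = FloatArray(blockLen)    // sliding input block
    val outBuffer = FloatArray(blockLen)   // overlap-add buffer
    var states = FloatArray(2 * 128 * 2)   // assumed recurrent state, zero at start
    val denoised = FloatArray(audio.size)

    val numBlocks = (audio.size - blockLen) / blockShift
    for (i in 0 until numBlocks) {
        // shift the input buffer left by blockShift and append the new samples
        inBuffer.copyInto(inBuffer, 0, blockShift, blockLen)
        audio.copyInto(inBuffer, blockLen - blockShift, i * blockShift, i * blockShift + blockShift)

        val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 1, blockLen), DataType.FLOAT32)
        inputFeature0.loadArray(inBuffer)
        val inputFeature1 = TensorBuffer.createFixedSize(intArrayOf(1, 2, 128, 2), DataType.FLOAT32)
        inputFeature1.loadArray(states)

        val outputs = model.process(inputFeature0, inputFeature1)
        val block = outputs.outputFeature0AsTensorBuffer.floatArray   // assumed: processed block
        states = outputs.outputFeature1AsTensorBuffer.floatArray      // assumed: state for next block

        // shift the output buffer, clear the tail, then overlap-add the new block
        outBuffer.copyInto(outBuffer, 0, blockShift, blockLen)
        outBuffer.fill(0f, blockLen - blockShift, blockLen)
        for (j in 0 until blockLen) outBuffer[j] += block[j]

        // the first blockShift samples are now final
        outBuffer.copyInto(denoised, i * blockShift, 0, blockShift)
    }
    model.close()
    return denoised
}
```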

@zeynepgulhanuslu (Author)

Hello Mustafa, I didn't complete the code either, but I created a TFLite model for the FFT calculations. You can find more information by searching for "create a TFLite model from one or more concrete functions in TensorFlow". This way you can do the NumPy-style operations inside a TFLite model. I hope this helps.
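
For anyone who lands here later: on the export side, tf.lite.TFLiteConverter.from_concrete_functions can wrap something like tf.signal.rfft into its own .tflite file. Below is a minimal sketch of how such an auxiliary model could then be called from Kotlin with the plain TFLite Interpreter; the file name fft_model.tflite and the (1, 257, 2) real/imaginary output layout are purely hypothetical and depend on how the concrete function is exported.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

// Hypothetical helper: runs a small tflite model that wraps an rfft concrete
// function, so the FFT happens inside TFLite instead of hand-written Kotlin.
// "fft_model.tflite" and the (1, 257, 2) output layout are assumptions.
fun rfftViaTflite(context: Context, frame: FloatArray): Array<FloatArray> {
    require(frame.size == 512) { "expected one 512-sample block" }

    val interpreter = Interpreter(FileUtil.loadMappedFile(context, "fft_model.tflite"))

    val input = arrayOf(frame)                              // shape (1, 512)
    val output = Array(1) { Array(257) { FloatArray(2) } }  // assumed (1, 257, 2): [real, imag]
    interpreter.run(input, output)

    interpreter.close()
    return output[0]
}
```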
