
Wdunn001/gpu #1 (Open)

wants to merge 6 commits into base: gpu

Conversation

@wdunn001 wdunn001 commented Oct 8, 2023

Adding commands from the non-GPU version.

@wdunn001 wdunn001 changed the base branch from master to gpu October 8, 2023 16:55
docker-compose.gpu.yml (outdated; resolved)
Signed-off-by: Jason Dunn <[email protected]>
@wdunn001 (Author) commented Oct 9, 2023

Resolved this issue; now receiving two errors, one with piper and one with openwakeword:

Piper:

usage: __main__.py [-h] --piper PIPER --voice VOICE [--uri URI] --data-dir
                   DATA_DIR [--download-dir DOWNLOAD_DIR] [--speaker SPEAKER]
                   [--noise-scale NOISE_SCALE] [--length-scale LENGTH_SCALE]
                   [--noise-w NOISE_W] [--auto-punctuation AUTO_PUNCTUATION]
                   [--samples-per-chunk SAMPLES_PER_CHUNK]
                   [--max-piper-procs MAX_PIPER_PROCS] [--update-voices]
                   [--debug]
__main__.py: error: unrecognized arguments: --cuda

OpenWakeWord:

Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.10/dist-packages/wyoming_openwakeword/__main__.py", line 176, in <module>
    asyncio.run(main())
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.10/dist-packages/wyoming_openwakeword/__main__.py", line 87, in main
    assert (
AssertionError: Missing model: hey_jarvis (looked in: /usr/local/lib/python3.10/dist-packages/wyoming_openwakeword/models)

@wdunn001 (Author) commented Oct 9, 2023

Do I need a PyTorch instance to run piper on GPU?

@edurenye (Owner):

About piper, see the issue I created here: rhasspy/rhasspy3#49
I couldn't fix that issue; we would need to build our own Python library, or just wait for the maintainer of the library to add support. Sadly, there is no repo where we can open a PR for that.
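Until the wrapper accepts `--cuda`, a run script can stick to the flags the usage message above actually lists. This is only a sketch, not this repo's script: the module name is assumed to be `wyoming_piper`, and the piper path, voice, and data dir below are placeholder values.

``` sh
#!/usr/bin/env bash
# Sketch of a CPU-only launch using only the flags shown in the usage
# output; every path and the voice name here are placeholder assumptions.
python3 -m wyoming_piper \
    --piper '/usr/share/piper/piper' \
    --voice 'en_US-lessac-medium' \
    --uri 'tcp://0.0.0.0:10200' \
    --data-dir '/data' "$@"
```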

@@ -32,13 +32,13 @@ docker compose -f docker-compose.base.yml build --no-cache
Run it with:

``` diff
-docker composer -f docker-compose.base.yml up -d
+docker compose -f docker-compose.base.yml up -d
```
@edurenye (Owner) commented:
Sorry for that

@@ -1,3 +1,2 @@
 #!/usr/bin/env bash
-python3 -m wyoming_openwakeword \
-    --uri 'tcp://0.0.0.0:10400' "$@"
+python3 -m wyoming_openwakeword --uri 'tcp://0.0.0.0:10400' "$@"
@edurenye (Owner) commented:
Why did you change it to single line? Seems unrelated and reduces legibility

@wdunn001 (Author) commented Oct 11, 2023:
Oh, I did this because if I don't, it won't run the script; it's not recognizing the `\`. I think this is a difference in my environment, and refactoring it ensures it's environment-agnostic. I get a "command not recognized" error for each new line.

@edurenye (Owner) commented Oct 11, 2023:
Weird, this is supposed to run inside the container; it should not care about your environment.
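One possible cause (an assumption, not confirmed in this thread): if the script was checked out with CRLF line endings, each trailing `\` is followed by a carriage return, so the shell no longer joins the lines and runs each wrapped argument line as its own unknown command. A minimal self-contained sketch of the symptom and fix:

``` sh
# A file saved with CRLF endings ends each line in "\r"; a trailing
# "\<CR>" is not a line continuation, so the wrapped line breaks.
# Stripping the carriage returns restores the multi-line form.
printf 'echo one \\\r\n  two\r\n' > /tmp/run_demo.sh
sed -i 's/\r$//' /tmp/run_demo.sh   # normalize to LF endings
sh /tmp/run_demo.sh                 # prints: one two
```

If this is the cause, a `.gitattributes` rule forcing LF for `*.sh` would keep the readable multi-line script working everywhere.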

@edurenye (Owner):

> Do I need a PyTorch instance to run piper on gpu?

No, according to the documentation, you just need onnxruntime-gpu
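This can be checked from inside the piper container; the sketch below only assumes the standard onnxruntime Python API (with onnxruntime-gpu installed, the provider list should include `CUDAExecutionProvider`):

``` sh
# Print whether ONNX Runtime can use CUDA in this environment.
python3 - <<'EOF'
try:
    import onnxruntime
    print("CUDAExecutionProvider" in onnxruntime.get_available_providers())
except ImportError:
    print("onnxruntime is not installed")
EOF
```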

@edurenye (Owner):

I think preloading the model should fix the openwakeword issue. Not sure if we should preload that one; it feels the most neutral and easy to pronounce to me, but maybe I'm wrong.

Seems like the maintainer was using ok_nabu, but because Nabu Casa is the company where he works, I personally think the default should be a neutral, non-commercial word; then they can use it for their products with whatever model they want.
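If preloading is the fix, the openwakeword run script could name the model explicitly. This is a sketch, not the repo's script: the `--preload-model` flag is taken from the upstream wyoming-openwakeword README and should be verified against the installed version.

``` sh
#!/usr/bin/env bash
# Sketch: preload a wake word model at startup. The flag name and the
# model 'ok_nabu' are assumptions from the upstream README, not this repo.
python3 -m wyoming_openwakeword \
    --uri 'tcp://0.0.0.0:10400' \
    --preload-model 'ok_nabu' "$@"
```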

@wdunn001 (Author):
I just wanted to update you: I cannot get OpenWakeWord going. I tried preloading a model, but the models can't be found. I need to redo the Dockerfile so I can peek into the default folder to see whether the model is actually there. What I think needs to happen is that we allow the end user to define the model location and provide a download link to a list of default models. I haven't had time to set all that up recently.
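One way to let the user define the model location, sketched under two assumptions: that the installed wyoming-openwakeword supports `--custom-model-dir` (the flag appears in the upstream README) and that the directory is bind-mounted into the container. `/models` is a hypothetical mount point, not a path from this repo.

``` sh
#!/usr/bin/env bash
# Sketch: point openwakeword at a user-supplied model directory.
# Both the flag and the /models mount point are assumptions to verify.
python3 -m wyoming_openwakeword \
    --uri 'tcp://0.0.0.0:10400' \
    --custom-model-dir '/models' "$@"
```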
