
SUSI.AI Too Slow: Shorten the time to get an answer from SUSI.AI #1417

Open
mariobehling opened this issue Dec 15, 2019 · 3 comments

@mariobehling (Member)

Currently, if the server does not get an answer from the first source, it simply requests an answer from the next service. This makes the service very slow. Speed is essential for keeping users on board: a fast user experience matters a great deal.

EXPECTED:
A good alternative would be: if the system is not able to get an answer from a source, it could simply reply with something like: “I tried to get an answer for your question by searching [source here], but I was not able to get a good response. Please ask me something else.”

@norbusan (Member)

We already have timeouts for each individual API call (by default), but we don't have a timeout for the whole thinking process of susi_server. That means it can, in principle, continue for a very long time (the maximum number of ideas is, I think, 100).

Maybe we need a total time limit within which an answer must be found; if no answer is found by that deadline, a null answer is returned.
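A minimal sketch of what such a total time budget could look like, in Python since susi_python is mentioned below as a possible home for this. The source functions, the function name `answer_with_budget`, and the fallback text are all illustrative assumptions, not the real susi_server/susi_python API:

```python
import concurrent.futures
import time

# Illustrative fallback, modelled on the wording proposed in this issue.
FALLBACK = ("I tried to get an answer for your question, "
            "but I was not able to get a good response. "
            "Please ask me something else.")

# Hypothetical stand-ins for SUSI.AI answer sources (names are made up).
def slow_source(query):
    time.sleep(0.5)       # simulates a source that does not answer in time
    return "late answer"

def fast_source(query):
    return "42"

def answer_with_budget(query, sources, budget_seconds=1.0):
    """Try each source in order, but never exceed one total time budget.

    Returns the first answer obtained within the budget, or FALLBACK
    once the deadline has passed.
    """
    deadline = time.monotonic() + budget_seconds
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        for source in sources:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                break  # budget exhausted: stop trying further sources
            future = pool.submit(source, query)
            try:
                return future.result(timeout=remaining)
            except concurrent.futures.TimeoutError:
                # We stop waiting; the running call itself cannot be
                # interrupted, only abandoned.
                future.cancel()
    return FALLBACK
```

The key design point is that `remaining` shrinks as each source consumes time, so one slow source cannot starve the overall deadline the way per-call timeouts alone can.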

@Dilshaad21 (Member)

@norbusan @mariobehling I would like to work on this issue.

@norbusan (Member)

Great, fine with me. It shouldn't be too difficult to add a simple timeout to each call, but it might be better to implement it directly in susi_python; at least, that is what I would suggest.
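For the per-call side in susi_python, a sketch of a hard timeout on a single request using only the standard library. The endpoint URL and the function name `ask_susi` are illustrative assumptions about how susi_python talks to susi_server, not its actual API:

```python
import json
import urllib.parse
import urllib.request

# Illustrative endpoint; susi_python's real URL handling may differ.
SUSI_API = "https://api.susi.ai/susi/chat.json"

def ask_susi(query, url=SUSI_API, timeout_seconds=5.0):
    """Ask the server once, with a hard timeout on the whole call.

    Returns the parsed JSON response, or None if the call times out
    or fails for any other network reason, instead of blocking.
    """
    full_url = url + "?" + urllib.parse.urlencode({"q": query})
    try:
        with urllib.request.urlopen(full_url, timeout=timeout_seconds) as resp:
            return json.load(resp)
    except OSError:
        # socket timeouts and connection errors both surface as OSError
        return None
```

With this shape, the caller can treat `None` as “no good response” and emit the fallback sentence proposed above, rather than waiting indefinitely on one source.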
