Inspired by the “where is this?” task, I think it would be really interesting to stress test our NLP by asking the robot questions about all the knowledge we have modeled. (Think of analysis mode in Westworld).
This can be built in stages and members can add the different things they’re working on to this “Analysis mode”.
Based on the last few days, things I can think of:
- Things about the robot itself:
  - knowledge it has: where it is, what it can see
  - low-level state, e.g. whether there is something in its hand, or an obstacle in front of it
  - some meta stuff: where it comes from, its name
  - other “fun” knowledge: pop culture, jokes, things it can look up on the internet
- Things about its environment:
  - directions (already mostly covered by “where is this?”, but with room for improvement)
  - how many rooms there are
  - the contents of a room
  - what category an object belongs to
  - which people it knows
  - properties of objects
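To make the staged idea concrete, here is a minimal sketch of how members could plug their own question handlers into a shared “Analysis mode”. All names here (`AnalysisMode`, `register`, the example answers) are hypothetical and only illustrate the approach; a real version would hook into our existing NLP pipeline and knowledge base instead of keyword matching.

```python
# Hypothetical sketch of a pluggable "Analysis mode" -- none of these names
# come from the existing codebase; they only illustrate the staged approach.
from typing import Callable, Dict, Optional


class AnalysisMode:
    """Registry mapping question patterns to knowledge lookups."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], str]] = {}

    def register(self, keyword: str, handler: Callable[[], str]) -> None:
        """Each member registers the questions their component can answer."""
        self._handlers[keyword] = handler

    def answer(self, question: str) -> Optional[str]:
        """Very naive keyword dispatch; a real version would use the NLP pipeline."""
        q = question.lower()
        for keyword, handler in self._handlers.items():
            if keyword in q:
                return handler()
        return None


# Example registrations mirroring the list above (all answers are placeholders).
mode = AnalysisMode()
mode.register("your name", lambda: "I am called Robo.")          # meta knowledge
mode.register("where are you", lambda: "I am in the kitchen.")   # localization
mode.register("in your hand", lambda: "My gripper is empty.")    # low-level state
mode.register("how many rooms", lambda: "I know of 4 rooms.")    # environment model

print(mode.answer("What is in your hand?"))  # -> "My gripper is empty."
```

The point of the registry is that each topic from the list can be added independently, so the mode grows in stages as people finish their components.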
I suppose it should be doable in a semester-long project - at least a basic version of the system. We should, however, try to decompose it into clear steps for students to follow; otherwise, they might easily get lost in the details.