Applications for grammars and formal semantics #60
Replies: 2 comments
-
It seems to me that ML is (at least currently) better at handling broad or unbounded areas like Copilot (or chatbots, or internet search engines), where you can ask literally anything and the user expects some kind of answer. Being generous, and generalizing, I'd say you get broad coverage but pay for it with a lack of control, which can often lead to quite random responses. Formal grammars provide wonderful predictability and explainability, but they are costly to build and take domain expertise. So they work well for bounded areas (like the games I am building) but are harder to scale.

I have seen enough success using formal grammars with games to still be investing in this approach for Interactive Fiction-style games, so I'd submit that as one area where formal grammars can make a difference. Users expect predictable and logical interactions with most games (unless the point is to be random, as in https://aidungeon.cc/).

BTW: I notice in my systems the same thing your IBM colleague noticed: people currently have a really hard time speaking normally to a computer. They have been trained that computers are awful at the task and quickly resort to keywords or very simplified, ungrammatical phrases. I suspect that will improve if we ever get a system that consistently responds well to full sentences. That is my current theory, at least.
-
Cleaning out my inbox, I found this message. Sounds like a great discussion topic for the Summit! Have you perhaps proposed it already, @arademaker?
-
Maybe this is off-topic, or a recurring and perhaps useless discussion…
I was having a conversation with an old friend, a computer scientist who is an author of the Lean theorem prover (https://leanprover.github.io/) and the Z3 SMT solver (https://en.m.wikipedia.org/wiki/Z3_Theorem_Prover), so a very influential person in the automated and interactive theorem proving community.
In particular, I was discussing some initial ideas that I have about converting MRS to dependent types instead of FOL. At some point, he challenged me with a question: "How could grammars, semantic representations like MRS, and your ideas about a dependent-type semantics for MRS be used in the context of a project like https://copilot.github.com/?"
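To make the contrast concrete, here is a minimal sketch in Lean of the two styles for "every dog barks". This is only an illustration of the general FOL-vs-dependent-types distinction (in the spirit of Ranta's type-theoretical semantics), not the actual MRS conversion I have in mind; the names `e`, `dog`, `bark`, `Dog`, and `Bark` are placeholders.

```lean
-- FOL-style: one flat domain of individuals `e`; nouns and verbs
-- are predicates over it, and quantification is restricted by an
-- implication.
axiom e : Type
axiom dog : e → Prop
axiom bark : e → Prop

def every_dog_barks_fol : Prop :=
  ∀ x : e, dog x → bark x

-- Dependent-type style: the noun "dog" is itself a type, the verb
-- is a predicate over that type, and the universal reading is a
-- Pi type ranging directly over dogs.
axiom Dog : Type
axiom Bark : Dog → Prop

def every_dog_barks_dtt : Prop :=
  ∀ x : Dog, Bark x
```

The interesting part of a conversion from MRS would be deciding which MRS predicates become types and how scopal information maps onto the dependency structure; the sketch above only fixes the target vocabulary.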
I remember a discussion I had a year back with a colleague at IBM who was implementing a translator from NL questions to DB queries. We concluded that, despite the good results in 'NL to queries', the users were not writing NL questions, but only keywords, so all the NL analysis turned out to be irrelevant… She called it the 'Google effect'.
My friend suggested that this example demonstrates why ML dominates the NLP area… at least for similar projects translating NL to some other formal language. In what applications can we still make a difference? Surely one is the use of computational methods for linguistics studies.