I followed the instructions here to add a new model and load it locally: https://llm.mlc.ai/docs/deploy/webllm.html. I was able to successfully convert the weights, generate a config, and compile a model. However, I get this error when trying to initialize the model: `Error: Cannot find global function mlc.grammar.BNFGrammarGetGrammarOfJSON`
I have also run the compilation on this model from Hugging Face, https://huggingface.co/mlc-ai/SmolLM2-135M-Instruct-q4f32_1-MLC, and got the same issue, so I know the problem is with the compilation to the .wasm file. When I use the prebuilt wasm file from GitHub (`SmolLM-135M-Instruct-q4f32_1-ctx2k_cs1k-webgpu.wasm`), the model loads fine.
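For reference, the compilation steps I ran followed the linked docs; this is a rough sketch, and the local paths, quantization, and conversation template here are placeholders for my setup, not exact values:

```shell
# Convert the Hugging Face weights to MLC format
# (./SmolLM2-135M-Instruct is a placeholder local checkout of the model)
mlc_llm convert_weight ./SmolLM2-135M-Instruct \
    --quantization q4f32_1 \
    -o ./SmolLM2-135M-Instruct-q4f32_1-MLC

# Generate mlc-chat-config.json for the converted model
mlc_llm gen_config ./SmolLM2-135M-Instruct \
    --quantization q4f32_1 \
    --conv-template chatml \
    -o ./SmolLM2-135M-Instruct-q4f32_1-MLC

# Compile the model library to WebAssembly for the WebGPU target;
# this is the .wasm that fails to initialize in web-llm
mlc_llm compile ./SmolLM2-135M-Instruct-q4f32_1-MLC/mlc-chat-config.json \
    --device webgpu \
    -o ./SmolLM2-135M-Instruct-q4f32_1-webgpu.wasm
```

Swapping in the wasm produced by the last step triggers the error above, while the prebuilt wasm from GitHub works.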
I also tested this on gemma-2b-it-q4f32_1-MLC and hit the same issue.
Compilation was tested on mlc-llm v0.18 and v0.17.2.
Tested compilation on the following systems:
Tested loading with web-llm in the following browsers: