Allow custom answer generation function in WWB (#507)
This functionality helps enable model APIs that have a different interface for generating answers (e.g. OpenVINO GenAI).

Example with GenAI:

```
from transformers import AutoModelForCausalLM, AutoTokenizer
import huggingface_hub as hf_hub

import openvino_genai
import whowhatbench

model_id = "databricks/dolly-v2-3b"
base_model = AutoModelForCausalLM.from_pretrained(model_id)

# Download the INT4-quantized OpenVINO model and load it with GenAI
ov_model_dir = "./dolly-v2-3b-int4-ov"
hf_hub.snapshot_download("OpenVINO/dolly-v2-3b-int4-ov", local_dir=ov_model_dir)
optimized_model = openvino_genai.LLMPipeline(ov_model_dir, "CPU")

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Custom answer-generation function for the GenAI pipeline interface
def genai_gen_answer(model, tokenizer, question, max_new_tokens, skip_question):
    out = model.generate(question, max_new_tokens=max_new_tokens)
    return out.texts[0]

evaluator = whowhatbench.Evaluator(base_model=base_model, tokenizer=tokenizer)
metrics_per_prompt, metrics = evaluator.score(optimized_model, gen_answer_fn=genai_gen_answer)
```
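The contract the `gen_answer_fn` hook relies on can be sketched with a toy evaluator. This is a minimal illustration of the callback pattern only; `ToyEvaluator` and `echo_gen_answer` are hypothetical stand-ins, not the real whowhatbench API:

```python
# Toy stand-in for whowhatbench.Evaluator (hypothetical, for illustration):
# it invokes the supplied gen_answer_fn once per prompt and collects answers.
class ToyEvaluator:
    def __init__(self, prompts):
        self.prompts = prompts

    def score(self, model, gen_answer_fn, max_new_tokens=32):
        # The evaluator only needs the callback to return a string answer;
        # how the wrapped model produces it is entirely up to the caller.
        return [
            gen_answer_fn(model, None, question, max_new_tokens, True)
            for question in self.prompts
        ]


# Any callable with the same five-argument signature as genai_gen_answer
# above can be plugged in; here the "model" is just a plain function.
def echo_gen_answer(model, tokenizer, question, max_new_tokens, skip_question):
    return model(question)


answers = ToyEvaluator(["What is 2 + 2?"]).score(
    lambda question: "4", gen_answer_fn=echo_gen_answer
)
print(answers)  # → ['4']
```

Because generation is routed through this single callback, backends with incompatible APIs (HF `generate` on token IDs vs. GenAI `generate` on raw strings) can both be scored without changing the evaluator itself.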