Update documentation for Hunflair2 release #3410
Conversation
…ow warnings if hunflair v1 models are loaded
flair/models/multitask_model.py (Outdated)
@@ -260,6 +260,14 @@ def _fetch_model(model_name) -> str:

    cache_dir = Path("models")
    if model_name in model_map:
        if model_name.startswith("hunflair") or model_name == "bioner":
"hunflair2" also starts with "hunflair", so I think this warning would always be printed.
True. However, HunFlair2 will not be loaded as a MultitaskModel. I'll fix it anyway.
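The pitfall raised above can be sketched in plain Python: a bare `startswith("hunflair")` check also matches `"hunflair2"`, so a v1 deprecation warning would fire for v2 model names as well. A minimal sketch of an exact-name check instead (the name set and the helper `warn_if_hunflair_v1` are illustrative, not the actual Flair model map or API):

```python
import warnings

# Hypothetical v1 model names for illustration; the real mapping lives
# in flair/models/multitask_model.py.
HUNFLAIR_V1_NAMES = {"hunflair", "hunflair-gene", "bioner"}

def warn_if_hunflair_v1(model_name: str) -> bool:
    """Warn only for exact v1 names, not every name sharing the prefix."""
    # model_name.startswith("hunflair") would also be True for "hunflair2",
    # so compare against an explicit set of v1 names instead.
    if model_name in HUNFLAIR_V1_NAMES:
        warnings.warn(
            f"'{model_name}' is a HunFlair v1 model; consider using HunFlair2.",
            DeprecationWarning,
            stacklevel=2,
        )
        return True
    return False
```

With this check, `warn_if_hunflair_v1("hunflair2")` stays silent while the exact v1 names still trigger the warning.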
@@ -781,6 +781,14 @@ def _fetch_model(model_name) -> str:
    elif model_name in hu_model_map:
        model_path = cached_path(hu_model_map[model_name], cache_dir=cache_dir)

        if model_name.startswith("hunflair"):
Same here
True. However, HunFlair2 will not be loaded as a SequenceTagger model. I'll fix it anyway.
Added full description
Fixed some syntax errors
Fixed descriptions
@@ -648,6 +649,8 @@ def p(text: str) -> str:
        emb = emb / torch.norm(emb)
        dense_embeddings.append(emb.cpu().numpy())
        sent.clear_embeddings()

    # empty cuda cache if device is a cuda device
    if flair.device.type == "cuda":
@sg-wbi Is this really required?
    # Sanity conversion: if flair.device was set as a string, convert to torch.device
    if isinstance(flair.device, str):
        flair.device = torch.device(flair.device)

    if flair.device.type == "cuda":
@sg-wbi Is this really required?
All good, thanks for adding this, and thanks for your patience! There are some smaller points that we will address in follow-up PRs.
One question: is manually clearing the CUDA cache really necessary?
Great, thanks! I am not sure either. @helpmefindaname / @mariosaenger, can you remember why this was added?
Hi @alanakbik @sg-wbi
Alright, thanks!
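For context, the behavior under discussion (normalizing a device given as a string, then emptying the CUDA cache when running on GPU) can be sketched as follows. `maybe_empty_cuda_cache` is a hypothetical helper for illustration, not Flair's API, and the `torch` import is optional so the sketch also runs on CPU-only machines:

```python
# Optional torch import so the sketch degrades gracefully without a GPU stack.
try:
    import torch
except ImportError:
    torch = None

def maybe_empty_cuda_cache(device) -> bool:
    """Normalize a string device and empty the CUDA cache if applicable.

    Returns True only if the cache was actually cleared.
    """
    if torch is None:
        return False
    # Sanity conversion: callers may have set the device as a plain string.
    if isinstance(device, str):
        device = torch.device(device)
    if device.type == "cuda" and torch.cuda.is_available():
        # Releases cached allocator blocks back to the driver; this can help
        # after embedding large batches, but is usually not strictly required,
        # since PyTorch reuses the cached memory for subsequent allocations.
        torch.cuda.empty_cache()
        return True
    return False
```

On a CPU device (or without torch installed) the helper is a no-op and returns False.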
This PR updates the documentation for HunFlair, lifting it to the new release. More specifically, the PR contains: