
Commit

more general approach
Ita Zaporozhets authored and committed on May 30, 2024
1 parent a47a209 · commit 76b89ee
Showing 1 changed file with 0 additions and 4 deletions.
4 changes: 0 additions & 4 deletions src/transformers/convert_slow_tokenizer.py
@@ -1391,10 +1391,6 @@ def tokenizer(self, proto):
                     AddedToken(self.original_tokenizer.convert_ids_to_tokens(2), normalized=False, special=True),
                 ]
             )
-            user_defined_symbols = [
-                AddedToken(token, normalized=True, special=False) for token in proto.trainer_spec.user_defined_symbols
-            ]
-            tokenizer.add_tokens(user_defined_symbols)
         else:
             raise Exception(
                 "You're trying to run a `Unigram` model but you're file was trained with a different algorithm"
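For context, the four deleted lines registered SentencePiece user-defined symbols directly in this converter. Below is a minimal standalone sketch of that removed behavior, assuming a parsed sentencepiece ModelProto named `proto` and a `tokenizers.Tokenizer` named `tokenizer` (hypothetical names, used only for illustration); per the commit title, this handling is presumably replaced by a more general code path elsewhere.

from tokenizers import AddedToken, Tokenizer

def add_user_defined_symbols(tokenizer: Tokenizer, proto) -> None:
    # SentencePiece stores user-defined symbols in trainer_spec; the removed
    # code re-registered each one as a regular (non-special) added token so
    # the converted fast tokenizer keeps treating it as an atomic piece.
    user_defined_symbols = [
        AddedToken(token, normalized=True, special=False)
        for token in proto.trainer_spec.user_defined_symbols
    ]
    tokenizer.add_tokens(user_defined_symbols)

Called right after the converter builds the fast tokenizer, this would reproduce the deleted per-converter behavior.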
