# Use tokenizers from 🤗 Tokenizers
The [`PreTrainedTokenizerFast`] depends on the 🤗 Tokenizers library. Tokenizers obtained from the 🤗 Tokenizers library can be loaded very simply into 🤗 Transformers.

Before getting into the specifics, let's start by creating a dummy tokenizer in a few lines:
```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Build a BPE tokenizer with an unknown token and whitespace pre-tokenization
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
trainer = BpeTrainer(special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]", "[MASK]"])
tokenizer.pre_tokenizer = Whitespace()

files = []  # replace with paths to your training text files
tokenizer.train(files, trainer)
```
We now have a tokenizer trained on the files we defined. We can either continue using it in that runtime, or save it to a JSON file for future re-use.
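For example, the raw tokenizer object can already encode text on its own. A minimal sketch (the sample sentence is purely illustrative, and assumes the tokenizer was trained on a real corpus rather than the empty file list above):

```python
# Encode a sample sentence with the raw 🤗 Tokenizers object
encoding = tokenizer.encode("Hello, how are you?")

print(encoding.tokens)  # the produced subword tokens
print(encoding.ids)     # their vocabulary ids
```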
## Loading directly from the tokenizer object
Let's see how to leverage this tokenizer object in the 🤗 Transformers library. The [`PreTrainedTokenizerFast`] class allows for easy instantiation by accepting the instantiated tokenizer object as an argument:
```python
from transformers import PreTrainedTokenizerFast

# Wrap the tokenizers.Tokenizer object in a fast 🤗 Transformers tokenizer
fast_tokenizer = PreTrainedTokenizerFast(tokenizer_object=tokenizer)
```
This object can now be used with all the methods shared by the 🤗 Transformers tokenizers! Head to the tokenizer page for more information.
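As a quick sanity check, the wrapped tokenizer is callable like any other 🤗 Transformers tokenizer. A minimal sketch (the input text is illustrative):

```python
# Calling the fast tokenizer returns input ids and an attention mask
batch = fast_tokenizer("Hello, how are you?")

print(batch["input_ids"])
print(batch["attention_mask"])
```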
## Loading from a JSON file
To load a tokenizer from a JSON file, let's first save our tokenizer:
```python
tokenizer.save("tokenizer.json")
```
The path to which we saved this file can be passed to the [`PreTrainedTokenizerFast`] initialization method using the `tokenizer_file` parameter:
```python
from transformers import PreTrainedTokenizerFast

# Load the fast tokenizer straight from the serialized JSON file
fast_tokenizer = PreTrainedTokenizerFast(tokenizer_file="tokenizer.json")
```
This object can now be used with all the methods shared by the 🤗 Transformers tokenizers! Head to the tokenizer page for more information.
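From here, you can also persist the tokenizer in the standard 🤗 Transformers format so it can be reloaded later with [`AutoTokenizer`]. A minimal sketch, where "my_tokenizer" is a hypothetical directory name:

```python
from transformers import AutoTokenizer

# Save in the standard Transformers layout ("my_tokenizer" is an illustrative path)
fast_tokenizer.save_pretrained("my_tokenizer")

# Reload it later like any other pretrained tokenizer
reloaded_tokenizer = AutoTokenizer.from_pretrained("my_tokenizer")
```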