Common Collection
The common collection contains components that are shared across all collections:
- Callbacks
- Losses
- Metrics
- Tokenizers
  - AutoTokenizer
    - AutoTokenizer.__init__()
    - AutoTokenizer.add_special_tokens()
    - AutoTokenizer.additional_special_tokens_ids
    - AutoTokenizer.bos_id
    - AutoTokenizer.cls_id
    - AutoTokenizer.eos_id
    - AutoTokenizer.ids_to_text()
    - AutoTokenizer.ids_to_tokens()
    - AutoTokenizer.mask_id
    - AutoTokenizer.name
    - AutoTokenizer.pad_id
    - AutoTokenizer.save_vocabulary()
    - AutoTokenizer.sep_id
    - AutoTokenizer.text_to_ids()
    - AutoTokenizer.text_to_tokens()
    - AutoTokenizer.token_to_id()
    - AutoTokenizer.tokens_to_ids()
    - AutoTokenizer.tokens_to_text()
    - AutoTokenizer.unk_id
    - AutoTokenizer.vocab
    - AutoTokenizer.vocab_size
  - SentencePieceTokenizer
    - SentencePieceTokenizer.__init__()
    - SentencePieceTokenizer.add_special_tokens()
    - SentencePieceTokenizer.additional_special_tokens_ids
    - SentencePieceTokenizer.bos_id
    - SentencePieceTokenizer.cls_id
    - SentencePieceTokenizer.eos_id
    - SentencePieceTokenizer.ids_to_text()
    - SentencePieceTokenizer.ids_to_tokens()
    - SentencePieceTokenizer.mask_id
    - SentencePieceTokenizer.pad_id
    - SentencePieceTokenizer.sep_id
    - SentencePieceTokenizer.text_to_ids()
    - SentencePieceTokenizer.text_to_tokens()
    - SentencePieceTokenizer.token_to_id()
    - SentencePieceTokenizer.tokens_to_ids()
    - SentencePieceTokenizer.tokens_to_text()
    - SentencePieceTokenizer.unk_id
    - SentencePieceTokenizer.vocab
  - TokenizerSpec
- Data
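Both tokenizers above expose the same interface: text is split into tokens, tokens are mapped to ids, and the composed conversions (text_to_ids, ids_to_text) chain those two steps. The sketch below illustrates that shape with a toy whitespace tokenizer; WhitespaceTokenizer is a hypothetical example written for this page, not part of the library, and only mirrors the method and attribute names listed above.

```python
from typing import List


class WhitespaceTokenizer:
    """Toy tokenizer exposing the TokenizerSpec-style method names."""

    def __init__(self, vocab: List[str]):
        self.vocab = vocab
        self._token_to_id = {tok: i for i, tok in enumerate(vocab)}
        self.unk_id = len(vocab)  # id reserved for out-of-vocabulary tokens

    @property
    def vocab_size(self) -> int:
        return len(self.vocab) + 1  # +1 for the unknown token

    def text_to_tokens(self, text: str) -> List[str]:
        return text.split()

    def tokens_to_ids(self, tokens: List[str]) -> List[int]:
        return [self._token_to_id.get(t, self.unk_id) for t in tokens]

    def text_to_ids(self, text: str) -> List[int]:
        # text_to_ids is the composition of the two steps above
        return self.tokens_to_ids(self.text_to_tokens(text))

    def ids_to_tokens(self, ids: List[int]) -> List[str]:
        return [self.vocab[i] if i < len(self.vocab) else "<unk>" for i in ids]

    def ids_to_text(self, ids: List[int]) -> str:
        return " ".join(self.ids_to_tokens(ids))


tok = WhitespaceTokenizer(["hello", "world"])
ids = tok.text_to_ids("hello there world")
print(ids)                   # → [0, 2, 1]; "there" maps to unk_id
print(tok.ids_to_text(ids))  # → "hello <unk> world"
```

A real tokenizer differs mainly in the splitting step (subword models rather than whitespace) and in carrying extra special-token ids (bos_id, eos_id, pad_id, and so on), but the round-trip structure is the same.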