Python transformers.PreTrainedModel() Examples
The following are two code examples of transformers.PreTrainedModel().
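For context, PreTrainedModel is the base class that concrete Hugging Face models inherit from, so any model loaded through the Auto* factories is an instance of it. A quick sketch (assuming the transformers package is installed):

    from transformers import AutoModel, PreTrainedModel

    # Models loaded via the Auto* factories are PreTrainedModel subclasses.
    model = AutoModel.from_pretrained("bert-base-uncased")
    assert isinstance(model, PreTrainedModel)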
Example #1
Source File: bert_token_embedder.py, from NLP_Toolkit (Apache License 2.0)
    from copy import deepcopy

    from transformers import PreTrainedModel

    # Constructor of a BERT token-embedder module (a torch.nn.Module subclass).
    def __init__(
        self,
        bert_model: PreTrainedModel,
        top_layer_only: bool = False,
        max_pieces: int = 512,
        num_start_tokens: int = 1,
        num_end_tokens: int = 1,
    ) -> None:
        super().__init__()
        # Deep-copy the model so the embedder owns an independent set of weights.
        # self.bert_model = bert_model
        self.bert_model = deepcopy(bert_model)
        self.output_dim = bert_model.config.hidden_size
        self.max_pieces = max_pieces
        self.num_start_tokens = num_start_tokens
        self.num_end_tokens = num_end_tokens
        # top_layer_only is accepted but not used in this excerpt;
        # _scalar_mix stays None.
        self._scalar_mix = None
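A minimal usage sketch: the snippet does not show the enclosing class, so the name BertEmbedder below is hypothetical and stands in for whatever class defines this __init__. It shows that the constructor accepts any loaded Hugging Face model:

    from transformers import AutoModel

    # "BertEmbedder" is a hypothetical name for the class whose __init__
    # is shown above.
    bert = AutoModel.from_pretrained("bert-base-uncased")
    embedder = BertEmbedder(bert, max_pieces=512)
    print(embedder.output_dim)  # hidden size of the wrapped model, e.g. 768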
Example #2
Source File: bert_token_embedder.py, from NLP_Toolkit (Apache License 2.0)
    from transformers import AutoModel, PreTrainedModel

    # A classmethod of PretrainedBertModel, which keeps a class-level
    # _cache dict mapping model names to loaded models.
    @classmethod
    def load(cls, model_name: str, cache_model: bool = True) -> PreTrainedModel:
        # Return the cached model if one was loaded earlier under this name.
        if model_name in cls._cache:
            return PretrainedBertModel._cache[model_name]
        model = AutoModel.from_pretrained(model_name)
        if cache_model:
            cls._cache[model_name] = model
        return model
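A short usage sketch, assuming load is defined on a PretrainedBertModel class with a class-level _cache dict as described above:

    # Repeated loads of the same name hit the in-memory cache,
    # so both calls return the very same object.
    model_a = PretrainedBertModel.load("bert-base-uncased")
    model_b = PretrainedBertModel.load("bert-base-uncased")
    assert model_a is model_b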