create_deberta_v2_model {aifeducation} | R Documentation |
Function for creating a new transformer based on DeBERTa-V2
Description
This function creates a transformer configuration based on the DeBERTa-V2 base architecture and a vocabulary based on the SentencePiece tokenizer, using the Python libraries 'transformers' and 'tokenizers'.
Usage
create_deberta_v2_model(
ml_framework = aifeducation_config$get_framework(),
model_dir,
vocab_raw_texts = NULL,
vocab_size = 128100,
do_lower_case = FALSE,
max_position_embeddings = 512,
hidden_size = 1536,
num_hidden_layer = 24,
num_attention_heads = 24,
intermediate_size = 6144,
hidden_act = "gelu",
hidden_dropout_prob = 0.1,
attention_probs_dropout_prob = 0.1,
sustain_track = TRUE,
sustain_iso_code = NULL,
sustain_region = NULL,
sustain_interval = 15,
trace = TRUE,
pytorch_safetensors = TRUE
)
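As a sketch of a typical call (the directory name and raw texts are hypothetical; the remaining defaults mirror the base configuration shown above), the function might be invoked as:

```r
# Sketch only: assumes 'aifeducation' is installed and a Python
# environment with 'transformers' and 'tokenizers' is configured.
library(aifeducation)

# Hypothetical corpus used for creating the vocabulary.
raw_texts <- c(
  "A first example document for building the vocabulary.",
  "A second example document with different wording."
)

create_deberta_v2_model(
  ml_framework = "pytorch",        # or "tensorflow"
  model_dir = "my_deberta_model",  # configuration and vocabulary are written here
  vocab_raw_texts = raw_texts,
  vocab_size = 128100,
  trace = TRUE
)
```

All other arguments keep their defaults, which correspond to the DeBERTa-V2 base architecture listed in Usage.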
Arguments
ml_framework |
Framework to use for the model: "tensorflow" or "pytorch". |
model_dir |
Path to the directory where the model should be saved. |
vocab_raw_texts |
Vector containing the raw texts for creating the vocabulary. |
vocab_size |
int Size of the vocabulary. |
do_lower_case |
TRUE if all words/tokens should be lower case. |
max_position_embeddings |
int Number of maximum position embeddings. This parameter also determines the maximum length of a sequence which can be processed with the model. |
hidden_size |
int Number of neurons in each layer. This parameter determines the dimensionality of the resulting text embedding. |
num_hidden_layer |
int Number of hidden layers. |
num_attention_heads |
int Number of attention heads. |
intermediate_size |
int Number of neurons in the intermediate layer of the attention mechanism. |
hidden_act |
string Name of the activation function. |
hidden_dropout_prob |
double Ratio of dropout. |
attention_probs_dropout_prob |
double Ratio of dropout for attention probabilities. |
sustain_track |
TRUE if energy consumption should be tracked during training via the python library codecarbon. |
sustain_iso_code |
string ISO code (Alpha-3-Code) for the country where the computation takes place. Only relevant if sustain_track = TRUE. See the documentation of codecarbon for more information. |
sustain_region |
Region within a country. Only available for USA and Canada. See the documentation of codecarbon for more information. https://mlco2.github.io/codecarbon/parameters.html |
sustain_interval |
integer Interval in seconds for measuring power usage. |
trace |
TRUE if information about the progress should be printed to the console. |
pytorch_safetensors |
TRUE if a pytorch model should be saved in safetensors format; FALSE if it should be saved in the standard pytorch format (.bin). |
Value
This function does not return an object. Instead, the configuration and the vocabulary of the new model are saved to disk.
Note
To train the model, pass the directory of the model to the function train_tune_deberta_v2_model.
For this model a WordPiece tokenizer is created, whereas the standard implementation of DeBERTa version 2 from Hugging Face uses a SentencePiece tokenizer. Thus, please use AutoTokenizer from the 'transformers' library to load the tokenizer of this model.
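Because the saved tokenizer differs from the DeBERTa-V2 default, it should be loaded through AutoTokenizer, which resolves the tokenizer class from the files in the model directory. A minimal sketch via the 'reticulate' bridge (the directory name is a placeholder; assumes a Python environment with 'transformers' is available):

```r
# Sketch only: assumes 'reticulate' is installed and a Python
# environment with the 'transformers' library is configured.
library(reticulate)

transformers <- import("transformers")

# AutoTokenizer inspects the saved tokenizer files and returns the
# matching tokenizer class (placeholder model directory).
tokenizer <- transformers$AutoTokenizer$from_pretrained("my_deberta_model")
```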
References
He, P., Liu, X., Gao, J. & Chen, W. (2020). DeBERTa: Decoding-enhanced BERT with Disentangled Attention. doi:10.48550/arXiv.2006.03654
Hugging Face Documentation https://huggingface.co/docs/transformers/model_doc/deberta-v2#debertav2
See Also
Other Transformer:
create_bert_model(),
create_funnel_model(),
create_longformer_model(),
create_roberta_model(),
train_tune_bert_model(),
train_tune_deberta_v2_model(),
train_tune_funnel_model(),
train_tune_longformer_model(),
train_tune_roberta_model()