I am training a sequence-to-sequence model using HuggingFace Transformers' Seq2SeqTrainer. When I run training, it reports the following warning:
/path/to/python3.9/site-packages/transformers/generation/utils.py:1219: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
Note that the HuggingFace documentation link in the warning is dead.
I use the following code:
from transformers import (
    BartForConditionalGeneration,
    EarlyStoppingCallback,
    IntervalStrategy,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model = BartForConditionalGeneration.from_pretrained(checkpoint)
model.config.output_attentions = True
model.config.output_hidden_states = True
training_args = Seq2SeqTrainingArguments(
    output_dir = "output_dir_here",
    evaluation_strategy = IntervalStrategy.STEPS,  # or "epoch"
    optim = "adamw_torch",  # use the newer PyTorch optimizer
    eval_steps = 1000,
    logging_steps = 1000,
    save_steps = 1000,
    learning_rate = 2e-5,
    per_device_train_batch_size = batch_size,
    per_device_eval_batch_size = batch_size,
    weight_decay = 0.01,
    save_total_limit = 3,
    num_train_epochs = 30,
    predict_with_generate = True,  # calls generate() during evaluation
    remove_unused_columns = True,
    fp16 = True,
    push_to_hub = True,
    metric_for_best_model = "bleu",  # or "f1"
    load_best_model_at_end = True,
)
trainer = Seq2SeqTrainer(
    model = model,
    args = training_args,
    train_dataset = train_ds,
    eval_dataset = eval_ds,
    tokenizer = tokenizer,
    data_collator = data_collator,
    compute_metrics = compute_metrics,
    callbacks = [EarlyStoppingCallback(early_stopping_patience=3)],
)
trainer.train()
Training completes without any problem, but I am concerned about the deprecation warning. How should I modify the code to get rid of it?
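From reading the warning, I suspect it fires because I mutate model.config after loading, and predict_with_generate=True makes the trainer call generate() during evaluation. My best guess at the intended fix is to move the two flags onto a GenerationConfig instead of model.config. Here is a sketch of what I tried; it assumes the checkpoint ships a generation_config.json for GenerationConfig.from_pretrained to load, and that the two flags are only needed at generation time:

from transformers import GenerationConfig

model = BartForConditionalGeneration.from_pretrained(checkpoint)

# Set the generation-time flags on a standalone GenerationConfig rather
# than on model.config, then attach it to the model; generate() should
# then skip the legacy model-config comparison that raises the warning.
generation_config = GenerationConfig.from_pretrained(checkpoint)  # assumes generation_config.json exists
generation_config.output_attentions = True
generation_config.output_hidden_states = True
model.generation_config = generation_config

Is this the right approach, or do the flags need to stay on model.config if I also want attentions and hidden states from the training forward passes?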
Versions:
- Transformers 4.28.1
- Python 3.9.7
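Update: the warning also says "Please use a generation configuration file". If I understand correctly, that file is the generation_config.json written by GenerationConfig.save_pretrained, so with the GenerationConfig from the sketch above, something like this should persist it ("my_model_dir" is a placeholder local path of my choosing):

# Write the settings to my_model_dir/generation_config.json so that later
# from_pretrained() calls can pick them up again ("my_model_dir" is hypothetical).
generation_config.save_pretrained("my_model_dir")
reloaded_config = GenerationConfig.from_pretrained("my_model_dir")

Is that the file the (dead) documentation link refers to?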