---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: DistilBERT-yelp-sentiment-analysis
  results: []
widget:
- text: This restaurant has the best food
  example_title: Positive review
- text: This restaurant has the worst food
  example_title: Negative review
datasets:
- noahnsimbe/yelp-dataset
---

# DistilBERT-yelp-sentiment-analysis

This model is a fine-tuned version of distilbert-base-uncased on the noahnsimbe/yelp-dataset dataset. It achieves the following results on the evaluation set:

- Loss: 0.2995
- Accuracy: 0.8930
- F1: 0.7863
- Precision: 0.7976
- Recall: 0.7768
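As a quick sanity check, the reported F1 can be recomputed from the reported precision and recall (a sketch; depending on the averaging used, exact agreement with 0.7863 is not expected):

```python
# F1 is the harmonic mean of precision and recall:
# F1 = 2 * P * R / (P + R)
precision = 0.7976
recall = 0.7768
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.7871 -- within ~0.001 of the reported 0.7863
```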

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
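The linear scheduler decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that schedule, assuming no warmup steps (the card does not state any) and the 19290 total optimizer steps logged over the 2 epochs:

```python
# Sketch of a linear LR schedule with no warmup (an assumption;
# warmup settings are not reported on this card).
def linear_lr(step, total_steps=19290, base_lr=2e-5):
    """Learning rate at a given optimizer step: decays linearly to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # 2e-05 at the start of training
print(linear_lr(9645))   # 1e-05 halfway through (end of epoch 1)
print(linear_lr(19290))  # 0.0 at the end of epoch 2
```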

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.2996        | 1.0   | 9645  | 0.2995          | 0.8930   | 0.7863 | 0.7976    | 0.7768 |
| 0.2233        | 2.0   | 19290 | 0.3381          | 0.8966   | 0.7957 | 0.8015    | 0.7907 |
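The logged step counts also imply the approximate size of the training split: with a batch size of 8 and 9645 optimizer steps per epoch, the training set holds on the order of 77,000 examples (a rough estimate; this assumes no gradient accumulation and ignores a possibly smaller final batch):

```python
# Estimate the training-set size from the logged step counts.
steps_per_epoch = 9645    # step count at epoch 1.0 in the table above
train_batch_size = 8      # from the training hyperparameters
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 77160
```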

### Framework versions

- Transformers 4.39.3
- Pytorch 2.2.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2