This model is a fine-tuned version of distilbert-base-uncased on the imdb dataset (training notebook is here).
It achieves the following results on the evaluation set:
- Loss: 0.1903
- Accuracy: 0.928
 
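A minimal usage sketch with the `transformers` pipeline API (the example review text and the label-name comment are illustrative assumptions; the exact labels come from the `id2label` mapping in the model's `config.json`):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub
classifier = pipeline("text-classification", model="lvwerra/distilbert-imdb")

# Classify a movie review; the label names (e.g. POSITIVE/NEGATIVE) depend on
# the id2label mapping stored in the model config -- check config.json
print(classifier("This movie was a delightful surprise from start to finish."))
```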
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
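The card states the model was fine-tuned on the imdb dataset; below is a minimal sketch of loading and tokenizing it with the `datasets` library (the truncation setting and batched mapping are assumptions, not details taken from the training notebook):

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# IMDB movie reviews: 25k labelled train and 25k labelled test examples
imdb = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate long reviews to the model's maximum sequence length
    return tokenizer(batch["text"], truncation=True)

tokenized = imdb.map(tokenize, batched=True)
```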
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch of an equivalent `TrainingArguments` setup follows this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
 
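A minimal sketch of how these hyperparameters might be expressed as `transformers.TrainingArguments` (the output directory and evaluation strategy are assumptions; the Adam betas and epsilon match the library defaults, so they are not set explicitly):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-imdb",        # assumed output path, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    evaluation_strategy="epoch",         # assumption: evaluate once per epoch
)
```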
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2195        | 1.0   | 1563 | 0.1903          | 0.928    |
### Framework versions

- Transformers 4.15.0
- PyTorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
 