A compl... built on the Vicuna13B base
bert-base-multilingual-unca...
DeBERTa: Decoding-enhanced ...
Twitter-roBERTa-base for Se...
xlm-roberta-base-language-d...
Emotion English DistilRoBER...
DistilBERT base uncased fin...
Distilbert-base-uncased-emo...
roberta-large-mnli Tab...
Twitter-roBERTa-base for Em...
distilbert-imdb This mode...
Cross-Encoder for MS Marco ...
FinBERT is a pre-trained NL...
Model description This mo...
Non Factoid Question Catego...
Parrot THIS IS AN ANCILLARY...
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on raw text only, with no humans labelling it in any way (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives:
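The BERT paper's two objectives are masked language modeling (MLM) and next sentence prediction. The automatic input/label generation for MLM can be sketched roughly as follows; this is a simplified illustration (real BERT additionally keeps or randomizes some selected tokens in an 80/10/10 split, and operates on subword token IDs rather than whitespace tokens):

```python
import random

def make_mlm_example(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Simplified sketch of BERT-style MLM data generation: a fraction of
    tokens is replaced by [MASK]; labels store the original token at each
    masked position and None (an "ignore" marker) everywhere else."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(mask_token)
            labels.append(tok)    # the model must recover the original token
        else:
            inputs.append(tok)
            labels.append(None)   # this position is ignored by the loss
    return inputs, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
inputs, labels = make_mlm_example(tokens)
```

The key point the sketch shows is that both inputs and labels come from the raw text itself, with no human annotation: the corruption process creates the supervision signal automatically.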
CodeBERT fine-tuned for Ins...
FinBERT is a BERT model pre...
Sentiment Analysis in Spani...
distilbert-base-uncased-go-...
Model Trained Using AutoNLP...
BERT codemixed base model f...
RoBERTa Base OpenAI Detecto...
German Sentiment Classifica...
SiEBERT - English-Language ...
Fine-tuned DistilRoBERTa-ba...