distilbert-base-uncased-finetuned-sst-2-english
Jan 4, 2024 · This model is a fine-tuned checkpoint of DistilBERT-base-uncased, fine-tuned on SST-2. It reaches an accuracy of 91.3 on the dev set.
May 28, 2024 · The distilbert-base-uncased-finetuned-sst-2-english model is capable of performing sentiment analysis — predicting whether a given text expresses positive or negative sentiment.
The DistilBERT base uncased finetuned SST-2 model is a fine-tuned version of the DistilBERT model, specifically designed for text classification tasks.
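A minimal sketch of using this checkpoint for text classification via the Hugging Face Transformers `pipeline` API (assumes `transformers` and a backend such as `torch` are installed; the model is downloaded from the Hub on first use):

```python
# Sentiment analysis with distilbert-base-uncased-finetuned-sst-2-english,
# using the high-level Transformers pipeline API.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Each prediction is a dict with a "label" (POSITIVE or NEGATIVE) and a "score".
result = classifier("I really enjoyed this movie!")
print(result)
```

The pipeline handles tokenization, inference, and softmax over the two SST-2 labels, so the caller only deals with plain strings in and labeled scores out.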
This model was converted using the official Transformers.js convert.py script. Works great in JS, in the browser. Enjoy! Downloads last month: 2.
Aug 23, 2022 · distilbert-base-uncased-finetuned-sst-2-english is an English model originally trained by HuggingFace. Predicted entities: POSITIVE, NEGATIVE.
May 12, 2022 · My dataset is only 10 thousand sentences. I run it in batches of 100, and clear the memory on each run. I manually slice the sentences to ...
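The batching approach described above can be sketched in plain Python: slice the sentence list into fixed-size chunks so each chunk is classified (and its intermediate tensors freed) before the next begins. The `batched` helper below is a hypothetical name for illustration, not an API from the forum post:

```python
def batched(items, batch_size=100):
    """Yield successive slices of `items` of length at most `batch_size`."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# 10,000 sentences in batches of 100 gives exactly 100 full batches.
sentences = [f"sentence {i}" for i in range(10_000)]
batches = list(batched(sentences))
print(len(batches))      # → 100
print(len(batches[-1]))  # → 100
```

Processing chunk by chunk keeps peak memory bounded by one batch rather than the whole dataset, which is why the poster clears memory between runs.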