
DistilBERT-base-uncased

DistilBERT-base-uncased: a distilled variant of BERT that retains roughly 97% of BERT's performance while being 40% smaller and 60% faster at inference.

This is the 'uncased' base model for DistilBERT (it does not distinguish between upper- and lowercase text): a smaller, faster alternative to the original BERT. It was created using knowledge distillation, a process that shrinks the architecture by 40% (reducing the parameter count to 66 million) while preserving approximately 97% of BERT's accuracy on key NLP benchmarks such as GLUE. It is also 60% faster at inference, making it well suited to production environments and resource-constrained applications. Pre-trained on the same English Wikipedia and BookCorpus data as its teacher, it is ready for fine-tuning on tasks like sequence classification and question answering.
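The distillation process mentioned above trains the smaller student model to match the teacher's "soft" output distribution, not just the hard labels. A minimal sketch of the soft-target component of such a loss (temperature-scaled KL divergence, in the style of standard knowledge-distillation objectives) might look like the following; the helper names are illustrative, not from any particular library:

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; higher T yields a softer distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the teacher's and student's softened
    # distributions, scaled by T^2 so gradients keep a consistent
    # magnitude as the temperature changes.
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

When the student's logits match the teacher's exactly, the loss is zero; the further its distribution drifts from the teacher's, the larger the penalty. In DistilBERT's actual training, this term is combined with the usual masked-language-model loss.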

https://huggingface.co/distilbert-base-uncased
