

MobileBERT

A thin version of BERT-large optimized for mobile devices through deep bottleneck structures and progressive knowledge transfer.

MobileBERT is competitive with BERT-base while being 4.3x smaller and 5.5x faster, running at roughly 62 ms latency on a Pixel 4 smartphone. Developed by researchers from Google and Carnegie Mellon University, it compresses a specially designed 24-layer inverted-bottleneck BERT-large (IB-BERT) teacher into a compact student model via progressive knowledge transfer. The resulting model reaches a GLUE score of 77.7 and a dev F1 score of 90.0 on SQuAD v1.1, making high-accuracy natural language processing viable for on-device applications without heavy cloud dependencies.
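To see where the size reduction comes from, here is a back-of-envelope sketch of per-layer weight counts in a standard transformer block versus a simplified bottleneck block. The accounting is an illustrative assumption (biases, embeddings, and MobileBERT's exact attention wiring are ignored), not the paper's precise figures; the layer widths (inter-block 512, intra-block 128, 4 stacked FFNs) follow the MobileBERT configuration.

```python
# Back-of-envelope parameter accounting for the bottleneck idea.
# Simplified assumption: biases, embeddings, and MobileBERT's exact
# attention wiring are ignored.

def transformer_block_params(d_model, d_ff):
    """Weights in a standard block: Q/K/V/O projections + 2-layer FFN."""
    return 4 * d_model * d_model + 2 * d_model * d_ff

def bottleneck_block_params(d_model, d_block, d_ff, n_ffn):
    """Weights in a simplified bottleneck block: linear down/up projections
    between the wide inter-block width (d_model) and the narrow intra-block
    width (d_block); attention and n_ffn stacked FFNs run at the narrow width."""
    projections = 2 * d_model * d_block   # entry + exit bottleneck
    attention = 4 * d_block * d_block     # Q/K/V/O at narrow width
    ffns = n_ffn * 2 * d_block * d_ff     # stacked feed-forward nets
    return projections + attention + ffns

# BERT-base body: 12 layers, hidden 768, FFN 3072
bert_base = 12 * transformer_block_params(768, 3072)
# MobileBERT-like body: 24 layers, inter-block 512, intra-block 128,
# FFN 512, 4 stacked FFNs per block
mobile = 24 * bottleneck_block_params(512, 128, 512, 4)

print(f"BERT-base body:  {bert_base:,}")   # ~85M weights
print(f"bottleneck body: {mobile:,}")      # ~17M weights
print(f"ratio: {bert_base / mobile:.1f}x")
```

Even this crude count lands near the paper's 4.3x overall compression: shrinking the width at which attention and FFNs operate cuts parameters quadratically, while the thin entry/exit projections keep the 24-layer stack's inter-block representation wide enough for knowledge transfer from the teacher.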

https://arxiv.org/abs/2004.02984

