

ResNet

ResNet (Residual Network) uses skip connections to enable the effective training of ultra-deep neural networks (e.g., 152 layers), winning the ILSVRC 2015 classification task with a 3.57% top-5 error.

ResNet, short for Residual Network, is a foundational convolutional neural network (CNN) architecture introduced by Microsoft Research in 2015. Its core innovation is the residual block, which uses "skip connections" (identity mappings) to bypass one or more layers, adding the block's input directly to its output. This design reformulates each block to learn a *residual* function relative to its input, which mitigates the degradation problem and the vanishing-gradient issues that plagued earlier deep networks. The breakthrough allowed researchers to train networks of unprecedented depth—up to 152 layers—while maintaining high accuracy. ResNet-based entries took first place in the ILSVRC 2015 classification, detection, and localization tasks as well as the COCO 2015 detection and segmentation tasks, establishing a new state of the art in computer vision.
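The idea above can be sketched in a few lines. This is a minimal, illustrative forward pass of a residual block using toy fully connected layers in place of the paper's conv–BN–ReLU stack; all function names here are hypothetical, not from any library:

```python
# Illustrative residual block: the block computes relu(F(x) + x),
# so its layers only need to learn the residual function F.

def relu(v):
    return [max(0.0, x) for x in v]

def linear(v, weight, bias):
    # Toy fully connected layer standing in for the block's conv layers.
    return [sum(w * x for w, x in zip(row, v)) + b
            for row, b in zip(weight, bias)]

def residual_block(x, w1, b1, w2, b2):
    # F(x): two toy layers with a nonlinearity in between.
    f = linear(relu(linear(x, w1, b1)), w2, b2)
    # Skip connection: add the input back before the final activation.
    return relu([fi + xi for fi, xi in zip(f, x)])

# With all weights zero, F(x) = 0 and the block reduces to an identity
# mapping -- the property that makes very deep networks trainable.
zero_w = [[0.0, 0.0], [0.0, 0.0]]
zero_b = [0.0, 0.0]
print(residual_block([1.0, 2.0], zero_w, zero_b, zero_w, zero_b))
# prints [1.0, 2.0]
```

Because the skip path is an identity, a deeper network can never do worse than its shallower counterpart simply by driving the extra blocks' weights toward zero.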

https://arxiv.org/abs/1512.03385