ResNet-50
A 50-layer convolutional neural network that uses skip connections to train very deep architectures without the accuracy degradation seen in plain deep networks.
Kaiming He and his colleagues at Microsoft Research introduced the ResNet family in 2015 to overcome the accuracy degradation observed in very deep models. The architecture employs identity shortcut connections (skip connections) that let each stack of layers learn a residual function, so gradients can propagate through the network's 25.6 million parameters without vanishing. ResNet-50 stacks an initial convolutional layer, 48 further convolutional layers arranged in bottleneck blocks, and a final fully connected layer. An ensemble of ResNet models secured first place in the ILSVRC 2015 classification task with a 3.57% top-5 error rate. ResNet-50 now serves as an industry-standard backbone for complex vision pipelines, including Mask R-CNN for segmentation and transfer learning applications in TensorFlow and PyTorch.
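The two ideas above can be sketched in a few lines of plain Python: the identity shortcut computes f(x) + x so the shortcut path passes the input through unchanged, and the 50-layer count follows from the stage layout of bottleneck blocks. This is an illustrative sketch, not real convolutional code; `toy_branch` is a hypothetical stand-in for a conv/BN/ReLU stack.

```python
def residual_block(x, f):
    """Identity shortcut: return f(x) + x elementwise.

    The residual branch f learns only the difference from the input;
    the addition lets gradients flow straight through the shortcut.
    """
    return [fi + xi for fi, xi in zip(f(x), x)]

def toy_branch(x):
    # Hypothetical stand-in for a conv -> BN -> ReLU stack.
    return [0.1 * xi for xi in x]

def resnet50_layer_count():
    # ResNet-50 arranges bottleneck blocks in four stages of 3, 4, 6, 3
    # blocks; each block holds three convolutions (1x1 -> 3x3 -> 1x1).
    blocks_per_stage = [3, 4, 6, 3]
    conv_in_blocks = 3 * sum(blocks_per_stage)   # 48 convolutional layers
    return 1 + conv_in_blocks + 1                # stem conv + blocks + final FC

out = residual_block([1.0, 2.0, 3.0], toy_branch)
layers = resnet50_layer_count()  # 50
```

The addition in `residual_block` is why depth stops hurting: if the residual branch learns nothing useful, the block degenerates to an identity mapping rather than corrupting the signal.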