Deep Networks for Image Classification
Exploring very deep convolutional networks based on VGG, ResNet, GoogLeNet principles.
Summary
Implementation of both the original models - VGG16, ResNet18, Inception v1 - and alternative versions of each, on the MNIST and CIFAR10 datasets.
The comparison between these six models indicates that the "improved" versions slightly outperform their vanilla counterparts.
For the avid reader, an extensive six-page paper is also compiled, describing the models' principles, the training process, and experimental results on the aforementioned datasets.
Paper can be found at:
Important Notes
- Due to time and hardware restrictions, all images were resized to 64 x 64.
- Necessary changes (e.g., setting the number of input channels of the first convolution layer to 1, to account for grayscale images) were introduced to allow the networks to train.
- Altered the training process: label smoothing, weight decay, a multi-step learning rate scheduler, and momentum.
- Augmented existing data depending on the dataset of choice.
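The altered training process above can be sketched in PyTorch roughly as follows; the model, learning rate, smoothing factor, and milestone epochs here are illustrative assumptions, not the exact values used in this project.

```python
import torch
import torch.nn as nn

# Placeholder model for illustration only.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 10))

# Label smoothing via the loss function.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

# SGD with momentum and weight decay.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)

# Multi-step scheduler: multiply the learning rate by gamma
# at the chosen milestone epochs.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[30, 45], gamma=0.1)

for epoch in range(60):
    # ... the per-batch training loop would go here ...
    scheduler.step()
```

After both milestones have passed, the learning rate has decayed twice (0.1 -> 0.01 -> 0.001).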
Overview of changes in VGG16
- Added batch normalization layers between each convolution layer and its activation layer.
- Replaced the Adaptive Average Pooling layer of VGG's original implementation with a 5 x 5 Adaptive Max Pooling layer.
- Initialized network layers' weights and biases from an appropriate distribution.
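A minimal sketch of these VGG16 changes: BatchNorm between each convolution and its ReLU, a 5 x 5 adaptive max pooling layer, and explicit weight initialization. The channel widths and the Kaiming initializer are illustrative choices, not necessarily the exact ones used here.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # BatchNorm sits between the convolution and its activation.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

features = nn.Sequential(
    conv_block(3, 64),
    conv_block(64, 64),
    nn.MaxPool2d(2),
    # 5 x 5 adaptive max pooling in place of VGG's adaptive average pooling.
    nn.AdaptiveMaxPool2d((5, 5)),
)

# Initialize convolution weights and biases from an appropriate
# distribution (Kaiming normal here, as one common choice).
for m in features.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
        nn.init.zeros_(m.bias)

x = torch.randn(1, 3, 64, 64)
out = features(x)  # shape: (1, 64, 5, 5)
```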
Overview of changes in ResNet18
- Replaced the original large 7 x 7 convolution layer with a stack of 3 x 3 convolutions.
- Added batch normalization layers to the aforementioned stack, between each convolution layer and its activation layer.
- Introduced a dropout layer with a small dropout ratio before the linear classifier.
- Initialized network layers' weights and biases from an appropriate distribution.
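The ResNet18 changes above can be sketched as follows: a stack of 3 x 3 convolutions (each followed by BatchNorm, then ReLU) replacing the 7 x 7 stem, and a dropout layer just before the linear classifier. The channel widths, the 0.1 dropout ratio, and the number of classes are assumed values for illustration.

```python
import torch
import torch.nn as nn

# Stack of 3 x 3 convolutions replacing ResNet's original 7 x 7 stem,
# with BatchNorm between each convolution and its ReLU.
stem = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
    nn.BatchNorm2d(32),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
)

# Dropout with a small ratio inserted before the linear classifier.
classifier = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Dropout(p=0.1),   # "small" ratio; 0.1 is an assumed value
    nn.Linear(64, 10),
)

x = torch.randn(2, 3, 64, 64)
logits = classifier(stem(x))  # shape: (2, 10)
```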
Overview of changes in Inception v1
- Replaced the original 7 x 7 convolution layer by a stack of 3 x 3 convolutions, inspired by VGG's findings.
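The same factorization idea, applied to Inception v1's stem, might look like the sketch below: a stack of 3 x 3 convolutions standing in for the original stride-2 7 x 7 convolution. The depth of the stack and the channel widths are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Three 3 x 3 convolutions replacing GoogLeNet's stride-2 7 x 7 stem
# convolution, following VGG's observation that stacked small kernels
# cover the same receptive field with fewer parameters.
stem = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 32, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
)

x = torch.randn(1, 3, 64, 64)
out = stem(x)  # spatial size halved by the stride-2 first conv
```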
Code & Installation Process available at:
Written with:
Python, PyTorch, Matplotlib, Pandas, scikit-learn
Since:
April 2021 - June 2021