Compressing U-net using Knowledge Distillation

This is my Master's research project, mentored by Dr. Mathieu Salzmann at Prof. Pascal Fua's Computer Vision Lab at École polytechnique fédérale de Lausanne, Switzerland. We focus on compressing Fully Convolutional Networks with skip connections, such as the U-net or the Stacked Hourglass Network, with minimal loss in performance using Knowledge Distillation (original paper by Hinton et al.). Along with proposing minor changes to the U-net architecture to improve performance, such as the introduction of Batch Normalization layers, we compress the original U-net architecture with over 31 million trainable parameters down to just 30,900 parameters (over 100x compression!).
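To make the distillation setup concrete, here is a minimal sketch of the standard Hinton-style distillation loss in PyTorch: a KL-divergence term between temperature-softened teacher and student outputs, blended with the usual cross-entropy against ground-truth labels. The function name, the temperature `T=4.0`, and the weighting `alpha=0.5` are illustrative assumptions, not the exact values used in this project.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Hinton-style knowledge distillation loss (illustrative sketch).

    T and alpha are hypothetical defaults; tune them per task.
    """
    # Soft-target term: KL divergence between temperature-softened
    # student and teacher distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy with the true labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```

For dense-prediction networks like the U-net, the same loss is applied per pixel over the class dimension of the output maps; the small student is trained to match the large teacher's softened predictions while still fitting the labels.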

The project report and presentation slides are available here and here. The entire codebase (U-net architecture in PyTorch + distillation code) is also available on GitHub.