
Network Pruning that Matters: A Case Study on Retraining Variants

We study the effect of different retraining mechanisms when pruning networks.
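A minimal sketch (not the paper's code) of what "retraining variants" can mean in practice: prune weights by magnitude, then retrain with either a small constant fine-tuning learning rate or a rewound version of the original decaying schedule. The model, data, and schedule values below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


def make_model():
    # Toy model standing in for the pruned network.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))


def magnitude_prune(model, amount=0.5):
    # Globally remove the smallest-magnitude weights across all Linear layers.
    params = [(m, "weight") for m in model.modules() if isinstance(m, nn.Linear)]
    prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=amount)
    return model


def retrain(model, epochs, lr_schedule):
    # lr_schedule maps epoch index -> learning rate; this is the "retraining variant".
    x, y = torch.randn(256, 32), torch.randint(0, 10, (256,))
    opt = torch.optim.SGD(model.parameters(), lr=lr_schedule(0), momentum=0.9)
    for epoch in range(epochs):
        for g in opt.param_groups:
            g["lr"] = lr_schedule(epoch)
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return model


# Variant A: fine-tuning with a small constant learning rate (assumed value).
finetune = lambda epoch: 1e-3
# Variant B: learning-rate rewinding, replaying a large, step-decayed schedule (assumed values).
rewind = lambda epoch: 0.1 * (0.1 ** (epoch // 30))

model = magnitude_prune(make_model())
retrain(model, epochs=5, lr_schedule=finetune)  # or lr_schedule=rewind
```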

Paying more Attention to Snapshots of Iterative Pruning: Improving Model Compression via Ensemble Distillation

We propose KESI, a method that combines knowledge distillation, network pruning, and ensemble learning to improve the performance of compact networks.
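A minimal sketch of the general idea, not KESI itself: the snapshots kept during iterative pruning act as an ensemble teacher, and the compact network is trained against their averaged softened predictions alongside the usual cross-entropy term. Function names, temperature, and the loss weighting are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def ensemble_distillation_loss(student_logits, snapshot_logits, targets, T=4.0, alpha=0.9):
    # Teacher distribution: average of the softened predictions of the pruning snapshots.
    teacher_probs = torch.stack([F.softmax(l / T, dim=1) for l in snapshot_logits]).mean(0)
    # Soft target term: KL divergence between student and ensemble teacher, scaled by T^2.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1), teacher_probs,
                    reduction="batchmean") * (T * T)
    # Hard target term: standard cross-entropy on ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard


# Usage with toy tensors: three snapshots retained from iterative pruning.
student_logits = torch.randn(8, 10, requires_grad=True)
snapshot_logits = [torch.randn(8, 10) for _ in range(3)]
targets = torch.randint(0, 10, (8,))
loss = ensemble_distillation_loss(student_logits, snapshot_logits, targets)
loss.backward()
```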