pytorch-admm-pruning

This is a PyTorch implementation of DNN weight pruning with ADMM, as described in A Systematic DNN Weight Pruning Framework using Alternating Direction Method of Multipliers.
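For reference, one ADMM round in the paper alternates three steps for each layer's weight tensor W: a W-step that minimizes the task loss plus a quadratic penalty (rho / 2) * ||W - Z + U||^2 by ordinary gradient training, a Z-step that projects W + U onto the set of tensors with a fixed fraction of zeros, and a dual update of U. Below is a minimal sketch for a single tensor; the names project, ratio, and rho are illustrative assumptions, not this repository's exact API.

import torch

def project(x, ratio):
    # Euclidean projection onto tensors with at least `ratio` of their
    # entries zero: keep the largest-magnitude entries, zero the rest.
    k = int(x.numel() * ratio)  # number of entries to prune
    if k == 0:
        return x.clone()
    threshold = x.abs().flatten().kthvalue(k).values
    return torch.where(x.abs() > threshold, x, torch.zeros_like(x))

def admm_round(W, Z, U, ratio):
    # The W-step happens during training: (rho / 2) * ||W - Z + U||^2 is
    # added to the task loss and W is updated by the optimizer as usual.
    Z = project(W.detach() + U, ratio)  # Z-step: back onto the sparse set
    U = U + W.detach() - Z              # dual variable update
    return Z, U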

Train and test

  • You can run the code with:
$ python main.py
  • In the paper, the authors use l2-norm regularization; you can add it with:
$ python main.py --l2
  • Beyond the paper, if you don't want to use a predefined pruning ratio, ADMM with l1-norm regularization is a good alternative; it can be tested with the command below (see the sketch after this list for how these flags enter the loss):
$ python main.py --l1
  • There are two datasets you can test in this code: [mnist, cifar10]. The default is mnist; you can change the dataset with:
$ python main.py --dataset cifar10
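The --l1 and --l2 flags above only change the regularizer added to the training loss. A minimal sketch, assuming an illustrative coefficient alpha (the actual flag handling in main.py may differ):

import torch

def regularizer(model, mode, alpha=5e-4):
    # mode mirrors the --l1 / --l2 flags; alpha is an assumed coefficient.
    if mode == 'l1':
        return alpha * sum(p.abs().sum() for p in model.parameters())
    if mode == 'l2':
        return alpha * sum(p.pow(2).sum() for p in model.parameters())
    return torch.tensor(0.0)

# loss = criterion(output, target) + regularizer(model, mode='l2')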

Models

  • This code includes two models: [LeNet, AlexNet]. By default, LeNet is used for mnist and AlexNet for cifar10; a sketch of this mapping follows.
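The dataset-to-model mapping might look like the sketch below; the import path and constructor signatures are assumptions, not necessarily this repository's exact code.

# from model import LeNet, AlexNet  # import path is an assumption

def build_model(dataset):
    if dataset == 'mnist':
        return LeNet()    # small convnet for 28x28 grayscale inputs
    if dataset == 'cifar10':
        return AlexNet()  # larger convnet for 32x32 RGB inputs
    raise ValueError('unknown dataset: ' + dataset)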

Optimizer

  • To prevent pruned weights from being updated by the optimizer, I modified Adam (named PruneAdam); a sketch of the idea follows.
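The core idea is to mask out updates at pruned positions so they stay exactly zero during retraining. A minimal sketch of one way to do this, assuming a single parameter group and a binary mask per parameter (MaskedAdam and masks are illustrative names, not the repository's exact PruneAdam):

import torch

class MaskedAdam(torch.optim.Adam):
    def __init__(self, params, masks, **kwargs):
        # masks: binary tensors aligned with params, zeros at pruned positions.
        super().__init__(params, **kwargs)
        self.masks = masks

    def step(self, closure=None):
        params = self.param_groups[0]['params']
        for p, m in zip(params, self.masks):
            if p.grad is not None:
                p.grad.mul_(m)  # drop gradient signal at pruned weights
        loss = super().step(closure)
        with torch.no_grad():
            for p, m in zip(params, self.masks):
                p.mul_(m)  # re-zero pruned weights in case momentum moved them
        return loss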

References

For this repository, I referred to KaiqiZhang's TensorFlow implementation.
