



The Lottery Ticket Hypothesis

Implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks".
Explore the docs »

View results · Report Bug · Request Feature


Table of Contents
  1. About The Project
  2. Videos
  3. Getting Started
  4. Usage
  5. Roadmap
  6. Contributing
  7. License
  8. Contact
  9. Acknowledgments

About The Project


The Lottery Ticket Hypothesis:
A randomly-initialized, dense neural network contains a subnetwork that is initialized such that—when trained in isolation—it can match the test accuracy of the original network after training for at most the same number of iterations

I found the Lottery Ticket Hypothesis fascinating, so I decided to re-implement the paper (purely for fun).

Key features:

  • Code and results with the Lenet-300-100 architecture on the MNIST dataset (see the sketch after this list)
  • Code and results with the Conv-2 architecture, a variant of VGG (Simonyan & Zisserman, 2014), on the CIFAR10 dataset
  • Code and results with the Conv-4 architecture, a variant of VGG (Simonyan & Zisserman, 2014), on the CIFAR10 dataset
  • Code and results with the Conv-6 architecture, a variant of VGG (Simonyan & Zisserman, 2014), on the CIFAR10 dataset
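
Lenet-300-100 is simply a small fully connected network: two hidden layers of 300 and 100 units applied to flattened 28x28 MNIST images. Below is a minimal sketch of that architecture, assuming a PyTorch implementation (the repo's actual model definition may differ):

```python
import torch.nn as nn

# Lenet-300-100: 784 -> 300 -> 100 -> 10 fully connected network for MNIST.
# Hypothetical sketch for illustration, not this repo's exact code.
lenet_300_100 = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 300),
    nn.ReLU(),
    nn.Linear(300, 100),
    nn.ReLU(),
    nn.Linear(100, 10),
)
```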

The paper also experiments with Resnet-18 and VGG-19, which I didn't have time to include (yet).
If you would like to add any of those models, please consider forking this repo and creating a pull request.
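
The paper finds these winning tickets with iterative magnitude pruning and weight rewinding: train the network, prune a fraction of the smallest-magnitude weights, reset the surviving weights to their values at initialization, and repeat. Below is a minimal sketch of one pruning round, assuming a PyTorch implementation; the function name, mask format, and train_fn are illustrative assumptions, not this repo's actual API:

```python
import torch

def one_pruning_round(model, train_fn, masks, initial_state, prune_fraction=0.2):
    """One round of iterative magnitude pruning with weight rewinding.

    Hypothetical sketch: `masks` maps parameter names to 0/1 tensors,
    `initial_state` holds the weights saved right after initialization,
    and `train_fn` is assumed to re-apply the masks after every optimizer step.
    """
    # 1. Train the (masked) network to completion.
    train_fn(model)

    # 2. Prune the smallest-magnitude surviving weights in each masked layer.
    for name, param in model.named_parameters():
        if name in masks:
            alive = param.data[masks[name].bool()].abs()
            threshold = torch.quantile(alive, prune_fraction)
            masks[name] = masks[name] * (param.data.abs() > threshold).float()

    # 3. Rewind the surviving weights to their values at initialization.
    for name, param in model.named_parameters():
        if name in masks:
            param.data = initial_state[name].clone() * masks[name]

    return masks
```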


Videos


The Lottery Ticket Hypothesis - Paper discussion (video)

The Lottery Ticket Hypothesis - Live coding (video)

(back to top)


Built With

(back to top)

Getting Started

To get a local copy up and running, follow these simple steps.

Make sure to install the Python dependencies: python3 -m pip install -r requirements.txt (having access to a GPU will greatly speed up training, but it is not mandatory)
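
Assuming the project runs on PyTorch (an assumption, not stated here), you can check whether a GPU is actually visible to it before launching a training run:

```python
import torch

# Prints True if a CUDA-capable GPU is available; training falls back to the CPU otherwise.
print(torch.cuda.is_available())
```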

(back to top)

Usage

Each folder corresponds to one of the main experiments described in the paper.

To reproduce an experiment, simply follow the instructions described in its README.md file.

Roadmap

  • Add a results section for each model architecture
  • Plot the evolution of the early-stopping iteration
  • Plot the evolution of the early-stopping iteration with weight resetting
  • Plot the graph based on the mean of five experiments
  • Add the min and max values to each plot
  • Add experiments with Conv-2 on CIFAR10
  • Add experiments with Conv-4 on CIFAR10
  • Add experiments with Conv-6 on CIFAR10
  • Add experiments with Resnet-18 on CIFAR10
  • Add experiments with VGG-19 on CIFAR10

See the open issues for a full list of proposed features and known issues.

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would improve this project, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/my-feature)
  3. Commit your Changes (git commit -m 'feat: my new feature')
  4. Push to the Branch (git push origin feature/my-feature)
  5. Open a Pull Request

Please try to follow Conventional Commits.

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Valentin De Matos - @ThytuVDM - vltn.dematos@gmail.com

Project Link: https://github.com/Thytu/The-Lottery-Ticket-Hypothesis

(back to top)

Acknowledgments

(back to top)
