Implementation for the paper "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization".
A poster illustrating the proposed algorithm and its relation to the previous BNN optimization strategy is included at ./poster.pdf.
Note: Bop has now been added to Larq, the open-source training library for BNNs. We recommend using the Larq implementation of Bop: it is compatible with more versions of TensorFlow and will be more actively maintained.
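As a rough sketch of what that looks like, the snippet below trains a small binarized Keras model with Larq's Bop. The optimizer API differs between Larq releases, so treat the constructor arguments (the `CaseOptimizer` pattern and the `threshold`/`gamma` values shown here) as assumptions and check the Larq documentation for your version:

```python
import tensorflow as tf
import larq as lq

# A small binarized model built from standard Larq quantized layers.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    lq.layers.QuantDense(256, kernel_quantizer="ste_sign",
                         kernel_constraint="weight_clip"),
    lq.layers.QuantDense(10, input_quantizer="ste_sign",
                         kernel_quantizer="ste_sign",
                         kernel_constraint="weight_clip"),
    tf.keras.layers.Activation("softmax"),
])

# Assumed recent-Larq pattern: Bop updates the binary kernels, while a
# regular optimizer (Adam here) handles the remaining real-valued variables.
optimizer = lq.optimizers.CaseOptimizer(
    (lq.optimizers.Bop.is_binary_variable,
     lq.optimizers.Bop(threshold=1e-8, gamma=1e-4)),
    default_optimizer=tf.keras.optimizers.Adam(0.01),
)

model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```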
Requirements:
- Python version 3.6 or 3.7
- TensorFlow version 1.14+ or 2.0.0
- Larq version 0.2.0
- Zookeeper version 0.1.1
You can also check out one of our prebuilt Docker images.
This is a complete Python module. To install it in your local Python environment, `cd` into the folder containing `setup.py` and run:

```bash
pip install -e .
```
To train a model locally, you can use the CLI:

```bash
bnno train binarynet --dataset cifar10
```
To reproduce the runs exploring various hyperparameters, run:
```bash
bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop \
    --hparams threshold=1e-6,gamma=1e-3
```
where you substitute the appropriate values for `threshold` and `gamma`.
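For intuition on what these two hyperparameters control, here is a minimal NumPy sketch of the Bop update rule described in the paper (illustrative only, not the optimizer used by the CLI): `gamma` is the adaptivity rate of the exponential moving average of the gradients, and `threshold` sets how strong that average must be before a weight is flipped.

```python
import numpy as np

def bop_step(w, grad, m, gamma=1e-3, threshold=1e-6):
    """One Bop update on a binary weight tensor w (entries in {-1, +1}).

    m is the running exponential moving average of the gradients.
    """
    # Smooth the raw gradient with adaptivity rate gamma.
    m = (1 - gamma) * m + gamma * grad
    # Flip a weight only when the averaged gradient is strong enough
    # (|m| > threshold) and points in the same direction as the weight,
    # i.e. keeping the current sign would increase the loss.
    flip = (np.abs(m) > threshold) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)
    return w, m
```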
To achieve the 91.3% accuracy reported in the paper, run:

```bash
bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop_sec52
```
To reproduce the reported results on ImageNet, run:
```bash
bnno train alexnet --dataset imagenet2012 --hparams-set bop
bnno train xnornet --dataset imagenet2012 --hparams-set bop
bnno train birealnet --dataset imagenet2012 --hparams-set bop
```
This should give the results listed below. Training and validation accuracy curves of the reported runs are available via TensorBoard.

| Network | Bop - top-1 accuracy |
|---|---|
| Binary Alexnet | 41.1% |
| XNOR-Net | 45.9% |
| Bi-Real Net | 56.6% |