Understanding the signal propagation properties of a neural network is not only essential for improving its trainability but also sheds light on its robustness properties. In this project, we study the effects of signal propagation on both of these aspects in neural networks, including compressed (sparse or quantized) networks.
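As a hedged illustration of what "signal propagation" means here (this is a generic sketch, not code from any of the papers below): one can track how the variance of activations evolves through the layers of a randomly initialized network. With a standard Gaussian initialization and tanh activations, the signal typically attenuates with depth; the layer widths, depth, and initialization below are illustrative assumptions.

```python
import numpy as np

# Sketch: measure activation variance across layers of a random MLP
# at initialization. All sizes here are illustrative choices.
rng = np.random.default_rng(0)
width, depth = 256, 20
x = rng.standard_normal(width)

variances = []
for _ in range(depth):
    # Standard Gaussian init scaled by 1/sqrt(fan_in), gain 1
    W = rng.standard_normal((width, width)) / np.sqrt(width)
    x = np.tanh(W @ x)
    variances.append(x.var())

# With this init and tanh, the activation variance decays with depth,
# i.e. the forward signal attenuates as it propagates.
print(variances[0], variances[-1])
```

Avoiding such attenuation (and the analogous degradation of backward gradients) is the kind of property that self-normalizing initializations and signal-propagation-aware pruning criteria aim to preserve.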

Bidirectional Self-Normalizing Neural Networks.
Yao Lu, Stephen Gould, and Thalaiyasingam Ajanthan.
arXiv:2006.12169, June 2020.
[pdf] [arxiv] [bib]

Improved Gradient based Adversarial Attacks for Quantized Networks.
Kartik Gupta, and Thalaiyasingam Ajanthan.
arXiv:2003.13511, March 2020.
[pdf] [arxiv] [bib]

A Signal Propagation Perspective for Pruning Neural Networks at Initialization.
Namhoon Lee, Thalaiyasingam Ajanthan, Stephen Gould, and Philip H. S. Torr.
International Conference on Learning Representations (ICLR), April 2020. (spotlight)
[pdf] [arxiv] [talk] [code] [bib]