Weight Initialization in Neural Networks: A Journey From the Basics to Kaiming
03 Apr 2019, Prathyush SP

I'd like to invite you to join me on an exploration through different approaches to initializing layer weights in neural networks. Step-by-step, through various short experiments and thought exercises, we'll discover why adequate weight initialization is so important in training deep neural nets. Along the way we'll cover various approaches that researchers have proposed over the years, and finally drill down on what works best for the contemporary network architectures that you're most likely to be working with.
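As a quick preview of where we'll end up, here is a minimal sketch of the Kaiming scheme the title refers to, assuming NumPy and a plain fully connected layer: weights are drawn from a standard normal distribution and scaled by sqrt(2 / fan_in), which keeps activation variance roughly constant across layers that use ReLU. The helper name `kaiming_init` and the layer sizes are purely illustrative.

```python
import numpy as np

def kaiming_init(fan_in, fan_out):
    """Illustrative helper: draw weights from N(0, 1) and scale by
    sqrt(2 / fan_in), the variance-preserving choice for ReLU layers."""
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)

# Example: a hypothetical 512 -> 256 fully connected layer
w = kaiming_init(512, 256)
print(w.std())  # roughly sqrt(2 / 512) ≈ 0.0625
```

We'll build up to this formula gradually, seeing along the way what goes wrong with naive initialization and with earlier schemes.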