Importance of initialization of weight matrices in deep learning neural networks

Authors

  • Nicholas Christopher A. Colina ⋅ PH National Institute of Physics, University of the Philippines Diliman
  • Carlos E. Perez ⋅ US Alluviate, Germantown, Maryland (ceperez@alluviate.com)
  • Francis N. C. Paraan ⋅ PH National Institute of Physics, University of the Philippines Diliman

Abstract

The success of deep neural networks depends on how their weight matrices are initialized before optimization. This work reports improved learning in a six-layer deep neural network initialized with orthogonal weight matrices, compared to other commonly used initialization schemes. An analysis of the eigenvalue spectra of the optimized solutions implies that the space of orthogonal weight matrices lies close to the manifold of learned states.
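A common way to generate a random orthogonal weight matrix is to take the Q factor of a QR decomposition of a Gaussian random matrix. The sketch below illustrates this standard construction; it is an assumption for illustration, not necessarily the exact procedure used by the authors, and the helper name `orthogonal_init` is hypothetical.

```python
import numpy as np

def orthogonal_init(shape, rng=None):
    """Return a random matrix with orthonormal columns.

    Illustrative sketch of orthogonal weight initialization via the QR
    decomposition of a Gaussian random matrix.
    """
    rng = np.random.default_rng() if rng is None else rng
    a = rng.standard_normal(shape)
    # QR decomposition: Q has orthonormal columns.
    q, r = np.linalg.qr(a)
    # Multiply each column by the sign of the corresponding diagonal
    # entry of R so that the resulting Q is uniformly distributed.
    q *= np.sign(np.diag(r))
    return q

W = orthogonal_init((128, 128))
# Orthogonality check: the columns of W are orthonormal.
print(np.allclose(W.T @ W, np.eye(128)))  # True
```

Stacking such matrices layer by layer preserves the norms of forward activations and backpropagated gradients at initialization, which is the usual motivation for this scheme in deep networks.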

Issue

Physics at the front and center: Strengthening core values in physics research
18–21 August 2016, University of the Philippines Visayas, Iloilo City

Article ID

SPP-2016-PA-21

Section

Poster Session PA

Published

2016-08-18

How to Cite

[1]
NCA Colina, CE Perez, and FNC Paraan, Importance of initialization of weight matrices in deep learning neural networks, Proceedings of the Samahang Pisika ng Pilipinas 34, SPP-2016-PA-21 (2016). URL: https://proceedings.spp-online.org/article/view/SPP-2016-PA-21.