Einstein: A proposed activation function for convolutional neural network (CNN) in image classification and its comparison to ReLU, Leaky ReLU, and Tanh

Authors

  • Luther A. Villacruz, Department of Physics, Ateneo de Manila University
  • Anthony M. Maagma, Department of Mathematics, West Visayas State University
  • Mariane Desiree C. Avendaño, Department of Physics, Ateneo de Manila University
  • Carlo H. Godoy Jr., Office of the AC of NS for C4ISR Systems, N6, Philippine Navy and Department of Physics, Ateneo de Manila University
  • Benjamin B. Dingel, Department of Physics and Ateneo Innovation Center, Ateneo de Manila University and Nasfine Photonics Inc.

Abstract

Activation functions (AFs) are integral in enabling convolutional neural networks (CNNs) to perform image classification and recognition effectively. Here, we propose a new AF called "Einstein", a novel, piecewise, and dynamic AF defined by a parameter r. Einstein is inspired by our earlier work on analogies with Einstein's theory of special relativity, whose equation "morphs to its present form" when we apply it to the AF setting. Our goal is to optimize Einstein by measuring its classification accuracy under a CNN for different values of r. Einstein outperforms well-established AFs such as (i) Tanh and (ii) Leaky ReLU when r ranges from 0.80 to 1.00, and (iii) ReLU when r is 0.93, 0.96, or 1.00.
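
The abstract does not reproduce Einstein's closed form, so the sketch below is not the authors' code: it only illustrates, under stated assumptions, how such an activation-function comparison could be wired up in PyTorch. The CustomAct class is a hypothetical stand-in for the proposed AF with tunable parameter r, and the network architecture, 32x32 RGB input shape, and helper name make_cnn are illustrative assumptions; each model would then be trained and its test accuracy compared across AFs and r values.

# Hedged sketch (not the authors' implementation): swapping activation
# functions in a small CNN image classifier using PyTorch.
import torch
import torch.nn as nn

class CustomAct(nn.Module):
    """Hypothetical placeholder for the proposed 'Einstein' AF.

    The paper's closed form is not given in this abstract; this saturating,
    piecewise nonlinearity with a parameter r only marks where a custom AF
    would plug into the comparison.
    """
    def __init__(self, r: float = 1.0):
        super().__init__()
        self.r = r

    def forward(self, x):
        # Illustrative form only, not the authors' equation.
        return torch.where(x >= 0, x / (1.0 + torch.abs(x) / self.r), 0.01 * x)

def make_cnn(act: nn.Module, num_classes: int = 10) -> nn.Sequential:
    """Small CNN whose hidden activation is swapped per experiment."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), act,
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), act,
        nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 8 * 8, num_classes),  # 32x32 input -> 8x8 after two pools
    )

if __name__ == "__main__":
    activations = {
        "ReLU": nn.ReLU(),
        "LeakyReLU": nn.LeakyReLU(0.01),
        "Tanh": nn.Tanh(),
        "Einstein(r=0.93)": CustomAct(r=0.93),  # one of the r values scanned in the paper
    }
    x = torch.randn(4, 3, 32, 32)  # dummy batch of 32x32 RGB images
    for name, act in activations.items():
        model = make_cnn(act)
        logits = model(x)
        print(name, tuple(logits.shape))  # training and accuracy evaluation would follow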

Issue

Proceedings of the Samahang Pisika ng Pilipinas 41 (2023)

Article ID

SPP-2023-PB-03

Section

Poster Session B (Complex Systems, Simulations, and Theoretical Physics)

Published

2023-07-03

How to Cite

[1] LA Villacruz, AM Maagma, MDC Avendaño, CH Godoy, and BB Dingel, Einstein: A proposed activation function for convolutional neural network (CNN) in image classification and its comparison to ReLU, Leaky ReLU, and Tanh, Proceedings of the Samahang Pisika ng Pilipinas 41, SPP-2023-PB-03 (2023). URL: https://proceedings.spp-online.org/article/view/SPP-2023-PB-03.