SIAM Undergraduate Research Online

Volume 17

In This Volume

  • DOI: 10.1137/24S1658528

    Authors

    John C. Breedis (Corresponding author – University of Texas at Austin)

    Project Advisors

    Dr. Tan Bui-Thanh (University of Texas at Austin)

    Abstract

    We provide a new perspective on comparing activation functions via the neural network approximate identity (nAI), a family of bell-curve activation functions Bθ with parameter θ > 0. We compare the performance of several single-hidden-layer neural networks, each using an nAI activation function, across multiple data sets: polynomial, discontinuous, and non-differentiable regression problems, as well as MNIST classification. Across all regression problems, hinge activation functions outperform sigmoidal activation functions; for classification, sigmoidal activation functions perform slightly better than hinge functions. In addition to traditional neural networks, we develop a machine learning architecture that utilizes integral convolution and quadrature by fixing the initial weight and bias matrices of the network, and we implement supplemental modifications to improve the accuracy of these networks. This modified architecture performs with efficacy similar to that of standard feedforward networks on each regression problem.
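
    To make the setup concrete, here is a minimal sketch of a single-hidden-layer network with a bell-curve activation. The specific Gaussian form of Bθ below is an assumption for illustration (normalized so it acts as an approximate identity as θ → 0); the paper's exact definition may differ.

    ```python
    import numpy as np

    def bell_activation(x, theta=1.0):
        # Hypothetical bell-curve activation B_theta: a Gaussian scaled so its
        # integral over the real line is 1, so it approximates the identity
        # under convolution as theta -> 0. The paper's exact form may differ.
        return np.exp(-(x / theta) ** 2) / (theta * np.sqrt(np.pi))

    def one_hidden_layer(x, W1, b1, W2, b2, theta=1.0):
        # Single hidden layer: affine map, bell-curve activation,
        # then a linear output layer.
        return bell_activation(x @ W1 + b1, theta) @ W2 + b2

    # Illustrative shapes: 1-D regression input, 16 hidden units, scalar output.
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(1, 16)); b1 = rng.normal(size=16)
    W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)

    x = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
    y = one_hidden_layer(x, W1, b1, W2, b2)
    print(y.shape)  # (5, 1)
    ```

    In the modified architecture described above, the first-layer weights W1 and biases b1 would be fixed (chosen from quadrature nodes rather than trained), so that the hidden layer evaluates an integral convolution and only the output layer is learned.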

  • Machine Learning for Hotel Reservation Prediction

    Published electronically October 17, 2024
  • Mixed Precision MINRES

    Published electronically October 4, 2024
  • MCRAGE: Synthetic Healthcare Data for Fairness

    Published electronically August 13, 2024
  • Pooling Matrix Designs for Group Testing

    Published electronically May 30, 2024