Artificial Intelligence

2206 Submissions

[2] viXra:2206.0142 [pdf] submitted on 2022-06-26 16:10:32

FASFA: A Novel Next-Generation Backpropagation Optimizer

Authors: Philip Naveen
Comments: 18 Pages.

This paper introduces the fast adaptive stochastic function accelerator (FASFA) for gradient-based optimization of stochastic objective functions. The method is based on Nesterov-enhanced first and second moment estimates. It is simple to implement because its hyperparameterization is intuitive and familiar. The training dynamics can be progressive or conservative depending on the sum of the decay rates, and the method works well with a low learning rate and a small mini-batch size. Experiments and statistical analysis provided convincing evidence that FASFA is a strong candidate for optimizing stochastic objective functions, particularly those arising from multilayer perceptrons with convolution and dropout layers. In addition, the convergence properties and regret bound align with the online convex optimization framework. As a first of its kind, FASFA addresses the growing need for diverse optimizers by providing next-generation training dynamics for artificial intelligence algorithms. Future work could modify FASFA based on the infinity norm.
Category: Artificial Intelligence
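The abstract describes an Adam-family optimizer built on Nesterov-enhanced first and second moment estimates. The exact FASFA update rule is given only in the full paper; the sketch below is a hypothetical illustration of that general family (a NAdam-style step with bias-corrected moments and a Nesterov look-ahead), not the authors' actual algorithm. All names and default hyperparameters here are assumptions.

```python
import numpy as np

def nesterov_adaptive_step(param, grad, m, v, t,
                           lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of a NAdam-style optimizer: Adam-like first/second moment
    estimates with a Nesterov look-ahead on the first moment.
    Illustrative only; FASFA's actual update is defined in the paper."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    # Nesterov enhancement: blend the current gradient into the momentum term
    m_nes = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    param = param - lr * m_nes / (np.sqrt(v_hat) + eps)
    return param, m, v
```

As a usage sketch, minimizing f(x) = x^2 from x = 5 with a few hundred of these steps drives x toward 0, with the decay rates beta1 and beta2 controlling how aggressive or conservative the trajectory is.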

[1] viXra:2206.0132 [pdf] submitted on 2022-06-24 04:59:04

Fractal Belief Jensen–Shannon Divergence

Authors: Yingcheng Huang, Fuyuan Xiao
Comments: 1 Page.

In this paper, a novel belief divergence measure, the fractal belief Jensen–Shannon (FBJS) divergence, is proposed to better measure the conflict between bodies of evidence. The proposed FBJS divergence is the first belief divergence to combine belief divergence theory with the concept of fractals.
Category: Artificial Intelligence
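The FBJS divergence builds on the classical Jensen–Shannon divergence, extended to belief functions with a fractal component; the abstract does not give the full construction. As background only, the sketch below computes the classical Jensen–Shannon divergence between two discrete distributions, which is the symmetric, bounded measure the proposed method generalizes. The function name and tolerance handling are assumptions.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Classical Jensen-Shannon divergence between two discrete probability
    distributions p and q. With base-2 logarithms the value lies in [0, 1]:
    0 for identical distributions, 1 for distributions with disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / (b[mask] + eps)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

Unlike the raw Kullback–Leibler divergence, this measure is symmetric and bounded, which is why Jensen–Shannon-style constructions are a natural starting point for quantifying conflict between pieces of evidence.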