Publications by D. Randall Wilson

The General Inefficiency of Batch Training for Gradient Descent Learning
Published in: Neural Networks
Subject areas: Artificial Intelligence, Cognitive Neuroscience

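For context on the topic named in the title, the sketch below contrasts a batch gradient descent update (one weight change per pass over the training set) with on-line, or stochastic, updates (one weight change per example). It is a minimal illustration on a made-up linear-regression problem; the data, learning rates, and function names are assumptions for demonstration, not the experiments or code of the listed publication.

```python
import numpy as np

# Illustrative only: batch vs. on-line (stochastic) gradient descent updates
# for least-squares linear regression on synthetic data.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # 200 examples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

def batch_epoch(w, lr=0.01):
    # Batch training: the gradient is accumulated over the whole training set,
    # so the weights change only once per pass through the data.
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def online_epoch(w, lr=0.01):
    # On-line (stochastic) training: the weights are updated after every
    # example, so a single pass applies many small corrections.
    for xi, yi in zip(X, y):
        w = w - lr * (xi @ w - yi) * xi
    return w

w_batch = np.zeros(3)
w_online = np.zeros(3)
for _ in range(20):
    w_batch = batch_epoch(w_batch)
    w_online = online_epoch(w_online)

print("batch  :", w_batch)
print("on-line:", w_online)
```
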
Related publications

The Inefficiency of Batch Training for Large Training Sets

Fully Distributed Privacy Preserving Mini-Batch Gradient Descent Learning
Published in: Lecture Notes in Computer Science
Subject areas: Computer Science, Theoretical Computer Science

Width of Minima Reached by Stochastic Gradient Descent Is Influenced by Learning Rate to Batch Size Ratio
Published in: Lecture Notes in Computer Science
Subject areas: Computer Science, Theoretical Computer Science

Learning Rate Adaptation in Stochastic Gradient Descent
Published in: Advances in Convex Analysis and Global Optimization

Learning for Hierarchical Fuzzy Systems Based on the Gradient-Descent Method

Gradient Descent Learning for Utility Current Compensation Using Active Regenerative PWM Filter
Published in: Journal of Computer Science
Subject areas: Computer Networks, Software, Artificial Intelligence, Communications

Stochastic Gradient Descent Training for L1-Regularized Log-Linear Models With Cumulative Penalty

An Experimental Study on Training Radial Basis Functions by Gradient Descent
Published in: Lecture Notes in Computer Science
Subject areas: Computer Science, Theoretical Computer Science

XGANDALF – Extended Gradient Descent Algorithm for Lattice Finding
Published in: Acta Crystallographica Section A: Foundations and Advances
Subject areas: Materials Science, Condensed Matter Physics, Theoretical Chemistry, Biochemistry, Structural Biology, Inorganic Chemistry, Physical