Amanote Research


New Error Bounds for Deep ReLU Networks Using Sparse Grids

SIAM Journal on Mathematics of Data Science
DOI: 10.1137/18M1189336

Date

January 1, 2019

Authors
Hadrien Montanelli, Qiang Du
Publisher

Society for Industrial & Applied Mathematics (SIAM)


Related search

Cascade Deep Networks for Sparse Linear Inverse Problems

2018, English

New Error Bounds for the Linear Complementarity Problem

Mathematics of Operations Research
Management Science, Computer Science Applications, Operations Research, Mathematics
1994, English

Learning Algorithm Analysis for Deep Neural Network With ReLu Activation Functions

ITM Web of Conferences
2018, English

Cubature Error Bounds for Analytic Functions

Mathematics of Computation
Computational Mathematics, Applied Mathematics, Number Theory, Algebra
1973, English

Error Bounds for Affine Fractal Interpolation

Mathematical Inequalities and Applications
Mathematics, Applied Mathematics
2006, English

Addressing Global Sensitivity in Chemical Kinetic Models Using Adaptive Sparse Grids

Chemie-Ingenieur-Technik
Chemistry, Chemical Engineering, Manufacturing Engineering, Industrial
2018, English

Sparse Bounds for a Prototypical Singular Radon Transform

Canadian Mathematical Bulletin
Mathematics
2019, English

Reconstruction Error Based Deep Neural Networks for Coronary Heart Disease Risk Prediction

PLoS ONE
Multidisciplinary
2019, English

Error Bounds for General Mixed Quasivariational Inequalities

International Journal of Mathematics and Mathematical Sciences
Mathematics
2005, English
