
Computation complexity of deep ReLU neural networks in high-dimensional approximation
The purpose of the present paper is to study the computation complexity of deep ReLU neural networks that approximate functions in Hölder-Nikol'skii spaces of mixed smoothness H_∞^α(𝕀^d) on the unit cube 𝕀^d:=[0,1]^d. In this context, for any function f∈ H_∞^α(𝕀^d), we explicitly construct non-adaptive and adaptive deep ReLU neural networks whose output approximates f with a prescribed accuracy ε, and we prove dimension-dependent bounds for the computation complexity of this approximation, characterized by the size and the depth of the deep ReLU neural network, explicitly in d and ε. Our results show the advantage of the adaptive method of approximation by deep ReLU neural networks over the non-adaptive one.
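The paper's explicit constructions are beyond the scope of an abstract, but the basic mechanism behind such approximation bounds can be illustrated with a classical building block: a deep ReLU network that approximates x² on [0,1] by composing a three-ReLU "hat" function (Yarotsky's sawtooth construction), with error decaying like 4^(−(m+1)) in the depth parameter m. The sketch below is illustrative only; the function names (`hat`, `relu_net_square`) are ours and not from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function on [0,1], realized with three ReLU units:
    # rises linearly from 0 to 1 on [0, 1/2], falls back to 0 on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def relu_net_square(x, m):
    # Deep ReLU approximation of x**2 on [0,1]:
    # x**2 = x - sum_{s>=1} g_s(x) / 4**s, where g_s is the s-fold
    # composition of the hat function. Truncating at s = m gives the
    # piecewise-linear interpolant of x**2 on a grid of width 2**-m,
    # so the uniform error is exactly 4**-(m+1). Depth grows like m,
    # i.e. like log(1/eps) for target accuracy eps.
    approx = x.copy()
    g = x.copy()
    for s in range(1, m + 1):
        g = hat(g)
        approx -= g / 4**s
    return approx

x = np.linspace(0.0, 1.0, 10001)
m = 5
err = np.max(np.abs(relu_net_square(x, m) - x**2))
print(f"m = {m}, uniform error = {err:.3e}, bound = {4.0**-(m+1):.3e}")
```

This toy example shows the size/depth-versus-accuracy trade-off in the simplest setting; the paper's contribution is to make such bounds explicit in both ε and the dimension d for the mixed-smoothness classes H_∞^α(𝕀^d), where the dependence on d is the central difficulty.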