Access Restriction

Author Sanguineti, Marcello
Source CiteSeerX
Content type Text
File Format PDF
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Limited Complexity ♦ Neural Network ♦ Hidden Unit ♦ Regularized Empirical Error ♦ Approximate Minimization ♦ Global Infimum ♦ Network Unit ♦ Regularized Empirical Error Functionals ♦ Upper Bound
Abstract Learning from data, formalized as minimization of a regularized empirical error, is studied in terms of approximate minimization over sets of functions computable by networks with an increasing number of hidden units. Upper bounds are derived on the speed of convergence of infima achievable over networks with hidden units to the global infimum. The bounds are expressed in terms of norms tailored to the type of network units and moduli of continuity of regularized empirical error functionals.
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study