Access Restriction Subscribed
Author Buschermöhle, Andreas ♦ Brockmann, Werner
Source SpringerLink
Content type Text
Publisher Springer Berlin Heidelberg
File Format PDF
Copyright Year ©2014
Language English
Subject Domain (in DDC) Technology ♦ Engineering & allied operations
Subject Keyword On-line learning ♦ Regression ♦ Nonstationary environments ♦ Big data ♦ Real-time ♦ Complexity ♦ Artificial Intelligence (incl. Robotics) ♦ Statistical Physics, Dynamical Systems and Complexity
Abstract On-line learning regression has been extensively studied, as it offers continuous adaptation to nonstationary environments, the ability to handle big data, and a fixed, low computation and memory demand. Most research deals with direct linear regression, but the influence of a nonlinear transformation of the inputs through a fixed model structure is still an open problem. We present an on-line learning approach that is able to deal with all kinds of nonlinear model structures. Its emphasis is on minimizing the effect of local training examples on changes of the global mapping. It thus yields robust behavior by preventing overfitting on sparse data as well as fatal forgetting. This paper first presents a first-order version, called the incremental risk minimization algorithm (IRMA), in detail. It then extends this approach to a second-order version of IRMA, which continuously adapts the learning process itself to the data at hand. For both versions it is proven that every learning step minimizes the worst-case loss. Finally, we demonstrate the effectiveness with a series of experiments on synthetic and real data sets.
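The following minimal sketch (Python, using NumPy) illustrates the general idea described in the abstract: an on-line regression update on a fixed nonlinear model structure, trading off the fit to the current example against changes of the global mapping. The feature mapping phi, the penalty weight lam, and the closed-form update are illustrative assumptions and not the authors' IRMA formulation.

import numpy as np

def phi(x):
    # Hypothetical fixed nonlinear model structure: cubic polynomial features of a scalar input.
    return np.array([1.0, x, x**2, x**3])

def online_update(w, x, y, lam=1.0):
    # One incremental step: minimize (phi(x) @ w_new - y)**2 + lam * ||w_new - w||**2.
    # lam limits how strongly a single local example may change the global mapping.
    f = phi(x)
    residual = y - f @ w
    # Closed-form minimizer of the one-example objective above.
    return w + (residual / (lam + f @ f)) * f

# Usage: parameters are adapted one streamed example at a time, with fixed cost per step.
w = np.zeros(4)
for x, y in [(0.1, 0.2), (0.5, 1.1), (0.9, 2.3)]:
    w = online_update(w, x, y)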
ISSN 1868-6478
Age Range 18 to 22 years ♦ above 22 years
Educational Use Research
Education Level UG and PG
Learning Resource Type Article
Publisher Date 2014-10-12
Publisher Place Berlin, Heidelberg
e-ISSN 1868-6486
Journal Evolving Systems
Volume Number 6
Issue Number 2
Page Count 21
Starting Page 131
Ending Page 151