Access Restriction
Open

Author Ting, Kai Ming ♦ Low, Boon Toh
Source CiteSeerX
Content type Text
Publisher Springer Verlag
File Format PDF
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Relative Performance ♦ Multiple Model ♦ Classification Task ♦ Practical Implication ♦ Base-line Behaviour ♦ Single Model ♦ Data Combination ♦ Available Data ♦ Common Practice ♦ Multiple-data-batches Scenario ♦ Multiple Batch ♦ Disjoint Batch ♦ Model Combination ♦ Data Combination Approach ♦ Interesting Result ♦ Near-asymptotic Performance ♦ Model Combination Approach
Description Proc. 9th ECML
The approach of combining models learned from multiple batches of data provides an alternative to the common practice of learning one model from all the available data (i.e., the data combination approach). This paper empirically examines the base-line behaviour of the model combination approach in this multiple-data-batches scenario. We find that model combination can lead to better performance even if the disjoint batches of data are drawn randomly from a larger sample, and relate the relative performance of the two approaches to the learning curve of the classifier used. The practical implication of our results is that one should consider using model combination rather than data combination, especially when multiple batches of data for the same task are readily available. Another interesting result is that we empirically show that the near-asymptotic performance of a single model, in some classification tasks, can be significantly improved by combining multiple models (derived from t...
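The contrast described in the abstract can be illustrated with a minimal sketch, not taken from the paper: a single classifier trained on all pooled data (data combination) versus several classifiers trained on disjoint random batches and combined by majority vote (model combination). The synthetic dataset, the decision-tree base learner, the number of batches, and the voting rule are all assumptions chosen only for illustration, not the authors' experimental setup.

```python
# Hypothetical sketch: "data combination" (one model on all data) versus
# "model combination" (majority vote over models trained on disjoint batches).
# Dataset, base learner, and batch count are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary classification sample with a held-out test set.
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Data combination: learn a single model from all available training data.
single = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
acc_single = accuracy_score(y_test, single.predict(X_test))

# Model combination: split the training data into disjoint random batches,
# learn one model per batch, and combine their predictions by majority vote.
n_batches = 5
perm = np.random.RandomState(0).permutation(len(X_train))
batches = np.array_split(perm, n_batches)
models = [DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx])
          for idx in batches]
votes = np.stack([m.predict(X_test) for m in models])   # shape: (n_batches, n_test)
combined_pred = (votes.mean(axis=0) > 0.5).astype(int)  # majority vote (labels 0/1)
acc_combined = accuracy_score(y_test, combined_pred)

print(f"data combination accuracy:  {acc_single:.3f}")
print(f"model combination accuracy: {acc_combined:.3f}")
```

Whether the voting ensemble beats the single model in such a sketch depends, as the abstract notes, on where the base classifier sits on its learning curve for the given batch size.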
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study
Learning Resource Type Article
Publisher Date 1997-01-01