Access Restriction Open
Author Klinke, Sigbert ♦ Grassmann, Janet
Source CiteSeerX
Content type Text
File Format PDF
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Real-world Application ♦ Hidden Unit ♦ Black Box ♦ Non-metric Multidimensional Scaling ♦ Feedforward Neural Network ♦ Individual Input ♦ Final Model ♦ Training Process ♦ Two-dimensional Figure ♦ Internal Behaviour ♦ Credit Data ♦ Different Layer
Description Humboldt Universität Berlin
Feedforward neural networks are widely used methods for regression and classification, but they are mostly treated as black boxes that are expected to find the "right" model by themselves. The advantage of their flexibility comes at the cost of a nontransparent training process and final model. To understand the internal behaviour of a feedforward neural network, we have applied non-metric multidimensional scaling. The weights of the connections between the units of different layers are transformed into distances between these units. Finally, we obtain two-dimensional figures of the projected units, in which the distances between them give an idea of the influence of individual input or hidden units on other units. This method should be seen as an opportunity to experiment with a feedforward neural network by removing or adding units, or clusters of units, and then observing what happens. To illustrate the idea of our work we have chosen two real-world applications, such as the credit data from Fahrmeier and Hamerle ...
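The abstract only sketches how connection weights become distances. The following Python snippet is a minimal, hypothetical illustration of the general idea, not the authors' procedure: it trains a small scikit-learn MLP, maps the absolute input-to-hidden weights to dissimilarities (the weight-to-distance mapping and the constant within-layer dissimilarity are assumptions made for this sketch), and projects all units to two dimensions with non-metric MDS.

    # Hypothetical sketch: visualising input-to-hidden weights with non-metric MDS.
    # The weight-to-distance mapping below is an assumption for illustration only.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.manifold import MDS
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))                  # toy data with 8 inputs
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    net = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
    net.fit(X, y)

    W = net.coefs_[0]                              # shape (8 inputs, 4 hidden units)

    # Assumed transformation: large absolute weight -> small distance,
    # so strongly connected units end up close together in the projection.
    strength = np.abs(W)
    D = strength.max() - strength                  # dissimilarity input i <-> hidden j

    # Embed input and hidden units jointly: symmetric dissimilarity matrix over
    # all 8 + 4 = 12 units, with a neutral constant within each layer
    # (another assumption made only to obtain a complete matrix).
    n_in, n_hid = D.shape
    full = np.full((n_in + n_hid, n_in + n_hid), D.mean())
    full[:n_in, n_in:] = D
    full[n_in:, :n_in] = D.T
    np.fill_diagonal(full, 0.0)

    mds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
              random_state=0)
    coords = mds.fit_transform(full)

    plt.scatter(coords[:n_in, 0], coords[:n_in, 1], label="input units")
    plt.scatter(coords[n_in:, 0], coords[n_in:, 1], label="hidden units")
    plt.legend()
    plt.title("Non-metric MDS projection of network units")
    plt.show()

In the resulting two-dimensional figure, input units lying close to a hidden unit are (under the assumed mapping) the ones connected to it with large weights, which is the kind of influence pattern the abstract describes.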
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study
Learning Resource Type Article
Publisher Date 1996-01-01