This paper presents a general theoretical framework for ensemble methods that construct significantly improved regression estimates. Given a population of regression estimators, we construct a hybrid estimator which is as good as or better than, in the MSE sense, any estimator in the population. We argue that the ensemble method presented has several properties: (1) it efficiently uses all the networks of a population, so none of the networks need be discarded; (2) it efficiently uses all the available data for training without over-fitting; (3) it inherently performs regularization by smoothing in functional space, which helps to avoid over-fitting; (4) it utilizes local minima to construct improved estimates, whereas other neural network algorithms are hindered by local minima; (5) it is ideally suited for parallel computation; (6) it leads to a very useful and natural measure of the number of distinct estimators in a population; (7) the optimal parameters of the ensemble estimator are given in closed form. Experimental results are provided which show that the ensemble method dramatically improves neural network performance on difficult real-world optical character recognition tasks.

Keywords: Generalized Ensemble Method, Hybrid Networks, Over-Fitting, Jackknife Method, Local Minima.
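As a rough illustration of property (7), the following is a minimal sketch of a generalized-ensemble-style combination. It assumes each estimator's misfits (prediction errors) on held-out data are available, and computes closed-form combination weights from the inverse of the misfit correlation matrix, normalized to sum to one. The function name `gem_weights`, the `ridge` stabilization term, and the toy data are illustrative assumptions, not details from the paper.

```python
import numpy as np

def gem_weights(misfits, ridge=1e-8):
    """Closed-form ensemble weights from misfits.

    misfits: (n_estimators, n_samples) array of errors f_i(x) - y.
    The weights minimize the ensemble MSE subject to summing to 1;
    the small ridge term is an assumption added for numerical stability.
    """
    n, m = misfits.shape
    C = misfits @ misfits.T / m          # misfit correlation matrix C_ij
    C += ridge * np.eye(n)               # regularize near-singular C
    row_sums = np.linalg.solve(C, np.ones(n))  # rows of C^{-1} summed
    return row_sums / row_sums.sum()     # normalize so weights sum to 1

# Toy example: three estimators of y with different noise levels.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = np.stack([y + rng.normal(scale=s, size=200) for s in (0.3, 0.5, 1.0)])

w = gem_weights(preds - y)               # weights from validation misfits
ens = w @ preds                          # hybrid (ensemble) prediction
mse = lambda p: float(np.mean((p - y) ** 2))
```

On the data used to fit the weights, the hybrid estimator's MSE cannot exceed that of any single estimator, since each individual estimator corresponds to a feasible (one-hot) weight vector.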