Improving ensemble decision tree performance using Adaboost and Bagging
Main Authors:
Format: Conference or Workshop Item
Language: English
Published: 2015
Online Access: http://repo.uum.edu.my/16741/1/14.pdf
http://repo.uum.edu.my/16741/
http://doi.org/10.1063/1.4937027
Summary: Ensemble classifier systems are considered among the most promising approaches for medical data classification, and the performance of a decision tree classifier can be increased by ensemble methods, which have been shown to outperform single classifiers. However, in an ensemble setting the performance depends on the selection of a suitable base classifier. This research employed two prominent ensemble methods, AdaBoost and Bagging, with independently selected base classifiers: Random Forest, Random Tree, J48, J48graft and Logistic Model Tree (LMT). The empirical study shows that performance varies when different base classifiers are selected, and overfitting was also noted in some cases. The evidence shows that ensemble decision tree classifiers using AdaBoost and Bagging improve performance on the selected medical data sets.
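The general setup described in the summary can be illustrated with a minimal sketch. The paper's experiments appear to use Weka-style classifiers (J48, J48graft, LMT); the example below is an assumed scikit-learn analogue, with DecisionTreeClassifier and RandomForestClassifier standing in for those base learners and the built-in breast cancer data standing in for the paper's medical data sets.

```python
# Minimal sketch, not the authors' Weka-based setup: comparing AdaBoost and
# Bagging ensembles over decision tree base learners on a medical data set.
# scikit-learn's breast cancer data stands in for the paper's data sets, and
# DecisionTreeClassifier / RandomForestClassifier stand in for J48 / Random Forest.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Base classifiers selected independently, mirroring the study's design.
base_learners = {
    "decision tree": DecisionTreeClassifier(max_depth=3, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=50, random_state=0),
}

for base_name, base in base_learners.items():
    ensembles = {
        "AdaBoost": AdaBoostClassifier(estimator=base, n_estimators=50, random_state=0),
        "Bagging": BaggingClassifier(estimator=base, n_estimators=50, random_state=0),
    }
    for ens_name, clf in ensembles.items():
        # 10-fold cross-validated accuracy; comparing fold scores against
        # training accuracy is one simple way to spot overfitting.
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"{ens_name} + {base_name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Swapping other tree-based learners into `base_learners` reproduces the kind of base-classifier comparison the summary describes.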