Random subspace method


The random subspace method[1] (or attribute bagging[2]) is an ensemble classifier that consists of several classifiers, each operating in a subspace of the original feature space, and that outputs a class based on the outputs of these individual classifiers. The random subspace method has been used for decision trees (random decision forests),[3][1] linear classifiers,[4] support vector machines,[5] nearest neighbours[6] and other types of classifiers. The method is also applicable to one-class classifiers.[7][8]

The algorithm is an attractive choice for classification problems where the number of features is much larger than the number of training objects, such as fMRI data[9] or gene expression data.[10]

Algorithm

The ensemble classifier is constructed using the following algorithm (a sketch in code follows the list):

  1. Let the number of training objects be N and the number of features in the training data be D.
  2. Choose L to be the number of individual classifiers in the ensemble.
  3. For each individual classifier l, choose dl (dl < D) to be the number of input variables for l. It is common to use a single value of dl for all the individual classifiers.
  4. For each individual classifier l, create a training set by choosing dl of the D features without replacement, and train the classifier on it.
  5. For classifying a new object, combine the outputs of the L individual classifiers by majority voting or by combining the posterior probabilities.
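A minimal sketch of these steps in Python, assuming scikit-learn's DecisionTreeClassifier as the base learner. The class name RandomSubspaceEnsemble and its parameters (n_estimators for L, n_subspace_features for dl) are illustrative, not part of any standard API:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

class RandomSubspaceEnsemble:
    """Illustrative random subspace ensemble: L classifiers, each trained on dl of D features."""

    def __init__(self, n_estimators=10, n_subspace_features=5, random_state=None):
        self.n_estimators = n_estimators                # L: number of individual classifiers
        self.n_subspace_features = n_subspace_features  # dl: same value for every classifier
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.default_rng(self.random_state)
        n_features = X.shape[1]                         # D: total number of features
        self.classes_ = np.unique(y)
        self.estimators_, self.subspaces_ = [], []
        for _ in range(self.n_estimators):
            # Step 4: draw dl of the D features without replacement.
            subspace = rng.choice(n_features, size=self.n_subspace_features,
                                  replace=False)
            tree = DecisionTreeClassifier(random_state=int(rng.integers(2**31 - 1)))
            tree.fit(X[:, subspace], y)
            self.subspaces_.append(subspace)
            self.estimators_.append(tree)
        return self

    def predict(self, X):
        # Step 5: combine the L outputs by majority voting.
        votes = np.stack([tree.predict(X[:, s])
                          for tree, s in zip(self.estimators_, self.subspaces_)])
        # Map each vote to a class index, then take the most frequent vote per object.
        idx = np.searchsorted(self.classes_, votes)
        majority = [np.bincount(col, minlength=len(self.classes_)).argmax()
                    for col in idx.T]
        return self.classes_[majority]

# Toy usage on synthetic data with many more features than training objects.
if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=60, n_features=200, n_informative=20,
                               random_state=0)
    model = RandomSubspaceEnsemble(n_estimators=25, n_subspace_features=20,
                                   random_state=0).fit(X, y)
    print(model.predict(X[:5]), y[:5])

In practice, scikit-learn's BaggingClassifier provides essentially the same scheme when bootstrap sampling of training objects is disabled and max_features is set below 1.0, so that each base estimator sees only a random subset of the features.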

References

  1. ^ a b Ho, Tin Kam (1998). "The Random Subspace Method for Constructing Decision Forests" (PDF). IEEE Transactions on Pattern Analysis and Machine Intelligence. 20 (8): 832–844. doi:10.1109/34.709601.
  2. ^ Bryll, R. (2003). "Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets". Pattern Recognition. 36 (6): 1291–1302. doi:10.1016/s0031-3203(02)00121-8.
  3. ^ Ho, Tin Kam (1995). Random Decision Forests (PDF). Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, QC, 14–16 August 1995. pp. 278–282.
  4. ^ Skurichina, Marina (2002). "Bagging, boosting and the random subspace method for linear classifiers". Pattern Analysis and Applications. 5 (2): 121–135. doi:10.1007/s100440200011.
  5. ^ Tao, D. (2006). "Asymmetric bagging and random subspace for support vector machines-based relevance feedback in image retrieval". IEEE Transactions on Pattern Analysis and Machine Intelligence. doi:10.1109/tpami.2006.134.
  6. ^ Tremblay, G. (2004). "Optimizing Nearest Neighbour in Random Subspaces using a Multi-Objective Genetic Algorithm" (PDF). 17th International Conference on Pattern Recognition: 208–211.
  7. ^ Nanni, L. (2006). "Experimental comparison of one-class classifiers for online signature verification". Neurocomputing. 69 (7).
  8. ^ Cheplygina, Veronika (2011). "Pruned random subspace method for one-class classifiers" (PDF). Multiple Classifier Systems. pp. 96–105.
  9. ^ Kuncheva, Ludmila; et al. (2010). "Random Subspace Ensembles for fMRI Classification" (PDF). IEEE Transactions on Medical Imaging. 29 (2): 531–542. http://pages.bangor.ac.uk/~mas00a/papers/lkjrcpdlsjtmi10.pdf
  10. ^ Bertoni, Alberto; Folgieri, Raffaella; Valentini, Giorgio (2005). "Bio-molecular cancer prediction with random subspace ensembles of support vector machines". Neurocomputing. 63: 535–539. doi:10.1016/j.neucom.2004.07.007.