
Min-Sum Matrix Products (MSP) (Felzenszwalb & McAuley, 2011), which have been shown to efficiently solve the Maximum-A-Posteriori (MAP) inference problem, Nonnegative Matrix Factorization (NMF) (Seung & Lee, 2001), which has become a popular choice for solving general pattern recognition problems, and the Matrix-pattern-oriented Modified Ho-Kashyap classifier (MatMHKS) (S. Chen, Wang, & Tian, 2007), which significantly decreases memory requirements. MatMHKS has recently been extended to UMatMHKS (H. Wang & Ahuja, 2005), so named because it combines matrix learning with Universum learning (Weston, Collobert, Sinz, Bottou, & Vapnik, 2006), a combination that was shown in that study to improve the generalization performance of classifiers.
   In the last ten years, many studies focused on generic classification problems have investigated the discriminative gains offered by matrix feature extraction methods (see, for instance, (S. C. Chen, Zhu, Zhang, & Yang, 2005; Liu & Chen, 2006; Z. Wang & Chen, 2008; Z. Wang et al., 2008)). Relevant to the work presented here is the development of novel methods that take vectors and reshape them into matrices so that state-of-the-art two-dimensional feature extraction methods can be applied. Studies along these lines include the reshaping methods investigated in (Z. Wang & Chen, 2008) and (Z. Wang et al., 2008), which were found capable of diversifying the design of classifiers, a diversification that was then exploited by a technique based on AdaBoost. In (Kim & Choi, 2007) a composite feature matrix representation, derived from discriminant analysis, was proposed; a composite feature combines a number of primitive features into a single input variable. In (Loris Nanni, 2011) Local Ternary Patterns (LTP), a variant of LBP, were extracted from vectors rearranged into fifty matrices by random assignment; an SVM was then trained on each of these matrices, and the results were combined using the mean rule. This method led the authors of (Loris Nanni, 2011) to observe that one-dimensional vector descriptors and two-dimensional texture descriptors can be combined to improve classifier performance; moreover, it was shown that linear SVMs consistently perform well with texture descriptors.
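
   The random-reshaping scheme of (Loris Nanni, 2011) can be summarized in a few lines of code. The sketch below is an illustrative reconstruction, not the authors' implementation: a uniform-LBP histogram from scikit-image stands in for the LTP descriptor, and the matrix size, number of rearrangements, and SVM settings are assumptions chosen only to make the example self-contained.

import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def random_indexings(n_features, n_matrices, shape, seed=0):
    # One fixed random permutation of feature indices per matrix
    # (assumes n_features >= shape[0] * shape[1]).
    rng = np.random.default_rng(seed)
    needed = shape[0] * shape[1]
    return [rng.permutation(n_features)[:needed] for _ in range(n_matrices)]

def texture_descriptor(vec, idx, shape, n_points=8, radius=1):
    # Rearrange the vector into a matrix and summarize it with a
    # uniform-LBP histogram (used here as a stand-in for LTP).
    img = vec[idx].reshape(shape)
    lbp = local_binary_pattern(img, n_points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=n_points + 2,
                           range=(0, n_points + 2), density=True)
    return hist

def fit_random_reshape_ensemble(X, y, n_matrices=50, shape=(10, 10)):
    # One linear SVM per random rearrangement of the feature vector.
    indexings = random_indexings(X.shape[1], n_matrices, shape)
    svms = []
    for idx in indexings:
        feats = np.array([texture_descriptor(x, idx, shape) for x in X])
        svms.append(SVC(kernel="linear", probability=True).fit(feats, y))
    return indexings, svms

def predict_mean_rule(X, indexings, svms, shape=(10, 10)):
    # Mean rule: average the per-SVM posterior estimates, then take argmax.
    scores = [clf.predict_proba(
                  np.array([texture_descriptor(x, idx, shape) for x in X]))
              for idx, clf in zip(indexings, svms)]
    return np.mean(scores, axis=0).argmax(axis=1)
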
   In this work, we propose a new classification system composed of an ensemble of Support Vector Machines (SVMs). The ensemble is built by training each SVM on a different set of features. Three novel approaches for representing a feature vector as an image are proposed; texture descriptors are then extracted from the images and used to train an SVM. To validate this idea, experiments are carried out on several datasets.
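
   The overall structure of such an ensemble can be outlined as follows. This is only a sketch under assumed names: image_representations() is a placeholder for the vector-to-image mappings introduced below, the descriptor argument is any function mapping a 2-D array to a feature vector (for example, the texture_descriptor above), and fusing the SVM scores with a simple mean is an illustrative assumption, not a statement of the chapter's fusion rule.

import numpy as np
from sklearn.svm import SVC

def image_representations(vec):
    # Placeholder for the vector-to-image mappings proposed in this chapter
    # (hypothetical here: a single trivial row-wise reshape).
    side = int(np.floor(np.sqrt(vec.size)))
    return [vec[:side * side].reshape(side, side)]

def feature_sets(X, descriptor):
    # One feature matrix per image representation of each pattern.
    n_views = len(image_representations(X[0]))
    return [np.array([descriptor(image_representations(x)[k]) for x in X])
            for k in range(n_views)]

def fit_ensemble(X, y, descriptor):
    # One linear SVM per feature set.
    return [SVC(kernel="linear", probability=True).fit(V, y)
            for V in feature_sets(X, descriptor)]

def predict_ensemble(X, models, descriptor):
    # Decision-level fusion of the per-SVM scores (mean assumed).
    probs = [m.predict_proba(V)
             for m, V in zip(models, feature_sets(X, descriptor))]
    return np.mean(probs, axis=0).argmax(axis=1)
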


                       Proposed Approach

   As mentioned in the introduction, it is quite common to represent a pattern as a one-dimensional feature vector, but a vector is not necessarily the most effective shape for