The IUP Journal of Computer Sciences:
Improved SVM Classifier Based on FODPSO and GSA Algorithms and Its Application in Face Recognition

Support Vector Machine (SVM) is widely used as a classifier in a number of image processing applications. However, its performance depends strongly on two parameters: the penalty parameter (C), which controls the trade-off between the margin width and the misclassification penalty, and the kernel parameter gamma (g) of the Radial Basis Function (RBF) kernel, which is inversely proportional to the kernel width and therefore strongly affects classification performance. C and g are thus the deciding factors for the performance of an SVM classifier. This paper tunes these parameters using a hybrid optimization approach that combines two algorithms: Fractional Order Darwinian Particle Swarm Optimization (FODPSO), an advanced variant of DPSO, and the Gravitational Search Algorithm (GSA); the first serves as a local optimizer and the second as a global optimizer. Two common datasets, JAFFE and Yale, are used in the experiments to validate the proposed method. The results demonstrate that the SVM parameters selected by FODPSO-GSA-SVM yield higher accuracy than those obtained with PSO, GA, GSA and other methods reported in the literature.
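For context, the roles of C and g can be read off the standard soft-margin SVM objective and the RBF kernel (the textbook formulation, not reproduced from the paper):

\min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
\quad \text{s.t.}\quad y_{i}\left(w^{\top}\phi(x_{i})+b\right)\ge 1-\xi_{i},\ \ \xi_{i}\ge 0,

K(x_{i},x_{j}) = \exp\!\left(-g\,\lVert x_{i}-x_{j}\rVert^{2}\right),\qquad g=\frac{1}{2\sigma^{2}}.

A larger C penalizes slack more heavily (a narrower margin and less regularization), while a larger g corresponds to a smaller kernel width sigma and hence a more localized decision boundary.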

 
 
 

Classification problems have been studied extensively in the field of face recognition. Support Vector Machine (SVM), a popular benchmark classification technique, was proposed by Vapnik (1998). It is based on the principle of structural risk minimization and the Vapnik-Chervonenkis (VC) dimension theory of Statistical Learning Theory. Owing to its outstanding performance, SVM has gained wide recognition across numerous domains, such as biological classification (Yu et al., 1999), pattern recognition (Pontil and Verri, 1998), text categorization (Joachims, 1998), regression, microarray classification and ranking. The major advantages of SVM are: (a) a complete theoretical foundation; (b) short training time; (c) global optimization; and (d) good generalization performance. Parameter selection has an important influence on classification accuracy, so to achieve the best generalization ability, the penalty parameter C, the kernel parameter gamma (g) of the Radial Basis Function (RBF), and the kernel function of SVM must be determined carefully. A rough sketch of this kind of parameter search is given below.
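As an illustration of the kind of parameter search the paper describes, the following sketch tunes (C, g) for an RBF-SVM with a plain particle swarm, using cross-validated accuracy as the fitness. It is a minimal stand-in, not the FODPSO-GSA hybrid: the fractional-order/Darwinian mechanics and the GSA component are omitted, and scikit-learn's digits dataset substitutes for JAFFE and Yale.

# Minimal sketch: plain PSO search over (log C, log gamma) for an RBF-SVM,
# with cross-validated accuracy as the fitness. This is NOT the paper's
# FODPSO-GSA hybrid; it only illustrates the parameter-tuning idea.
import numpy as np
from sklearn.datasets import load_digits          # stand-in for JAFFE/Yale
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)

def fitness(log_c, log_g):
    """5-fold CV accuracy of an RBF-SVM with the given (C, gamma)."""
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_g, kernel="rbf")
    return cross_val_score(clf, X, y, cv=5).mean()

# Search space in log10: C in [1e-2, 1e3], gamma in [1e-6, 1e0]
low, high = np.array([-2.0, -6.0]), np.array([3.0, 0.0])
n_particles, n_iters = 8, 15
pos = rng.uniform(low, high, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(*p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration weights
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    fit = np.array([fitness(*p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print(f"best C={10**gbest[0]:.4g}, gamma={10**gbest[1]:.4g}, "
      f"CV accuracy={pbest_fit.max():.4f}")

In the full method, the fitness would likewise be the classification accuracy of the trained SVM, but the particle update would follow the combined FODPSO-GSA rules rather than the plain PSO update shown here.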

 
 
 

Keywords: Support Vector Machine (SVM), Fractional Order Darwinian Particle Swarm Optimization (FODPSO), Gravitational Search Algorithm (GSA), Radial Basis Function (RBF), Tuning SVM parameters.