Friday, January 2, 2009

Classifiers: Bayes, PCA, ANN, SVM

It may be difficult for me to write down everything I learn from the book about classifiers, but I think it will be useful to keep notes here.
All of these classifiers need to be trained before use.

For Bayes: we need to know the class-conditional probability density function f(feature|class), and we can get it in two ways:
1, Assume the density is a normal distribution, and use the training data to estimate the mean and variance of that distribution;
2, Estimate the class-conditional density empirically from many experiments (training data);
Once we have the density function, we can calculate the posterior probability f(class|feature) by Bayes' rule (see Computer Vision: A Modern Approach for details);
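Here is a minimal sketch of way 1 in Python (assuming a one-dimensional feature and that NumPy and SciPy are available; the function names and toy data are my own):

import numpy as np
from scipy.stats import norm

def train(features, labels):
    """Estimate mean/std of f(feature|class) and the prior for each class."""
    params = {}
    for c in np.unique(labels):
        x = features[labels == c]
        params[c] = (x.mean(), x.std(), len(x) / len(features))
    return params

def posterior(x, params):
    """Bayes' rule: f(class|feature) is proportional to f(feature|class) * P(class)."""
    joint = {c: norm.pdf(x, mu, sigma) * prior
             for c, (mu, sigma, prior) in params.items()}
    evidence = sum(joint.values())
    return {c: p / evidence for c, p in joint.items()}

# Toy data: class 0 centered near 0, class 1 centered near 3.
features = np.array([-0.1, 0.2, 0.0, 2.9, 3.1, 3.0])
labels = np.array([0, 0, 0, 1, 1, 1])
params = train(features, labels)
print(posterior(2.5, params))   # class 1 should dominate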

For PCA: PCA (principal component analysis) is in fact not a classifier, but a tool for deriving a new, better feature vector from the original one. The new features point along the directions of greatest variance in the data, so they carry little redundant information.
We can get the new feature vector as follows: first, compute the eigenvectors of the covariance matrix of the original feature vectors; second, project the original features onto those eigenvector directions; the projections are the new feature vectors;
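A minimal sketch of those two steps, assuming NumPy and that each row of X is one sample (the function name and data are illustrative):

import numpy as np

def pca(X, k):
    """Project X onto the k eigenvectors of its covariance matrix."""
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)    # covariance matrix of the features
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: cov is symmetric
    order = np.argsort(eigvals)[::-1]         # largest variance first
    top = eigvecs[:, order[:k]]
    return X_centered @ top                   # the new feature vectors

X = np.random.randn(100, 5)
X_new = pca(X, 2)    # 5-D features reduced to 2-D
print(X_new.shape)   # (100, 2)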

For ANN: ANN (artificial neural network) is a method that iteratively updates its parameters to make the error between the actual output and the desired output smaller. We can use stochastic gradient descent to minimize the error and backpropagation to compute the derivatives;
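A minimal sketch of a one-hidden-layer network trained with stochastic gradient descent and backpropagation, assuming NumPy, sigmoid units, squared error, and the classic XOR toy problem (the sizes and learning rate are my own choices):

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.5                                         # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
Y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

for epoch in range(5000):
    for x, y in zip(X, Y):              # stochastic: one sample at a time
        h = sigmoid(x @ W1 + b1)        # forward pass
        out = sigmoid(h @ W2 + b2)
        # backpropagation: derivative of 0.5*(out - y)^2 through the sigmoids
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * np.outer(h, d_out); b2 -= lr * d_out   # gradient steps
        W1 -= lr * np.outer(x, d_h);   b1 -= lr * d_h

# outputs should approach [0, 1, 1, 0]
print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))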

For SVM: SVM (support vector machines) is a classifier that uses the training data to find a hyperplane separating the classes so that the minimum distance from the hyperplane to each class is the same and as large as possible. Why do we call it SVM? Because not all of the sample data affect the parameters of this hyperplane; only some points determine the hyperplane's parameters, and those points are the support vectors;
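A minimal sketch using scikit-learn's SVC, assuming that library is available (the toy data are my own); after fitting, the support vectors can be read off directly:

import numpy as np
from sklearn.svm import SVC

# Two toy classes in 2-D.
X = np.array([[0, 0], [1, 1], [1, 0], [3, 3], [4, 4], [3, 4]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear")
clf.fit(X, y)

print(clf.support_vectors_)        # only these points determine the hyperplane
print(clf.coef_, clf.intercept_)   # hyperplane: w . x + b = 0
print(clf.predict([[2.0, 2.0]]))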
