
Comparing Classifiers – Revisiting Bayes

I have been quite interested in data and its analysis lately. One of the major tasks involved in dealing with data is classifying it. Is a customer creditworthy or not? Would a customer be interested in buying the latest flavor of ice cream? Or better still, which flavor/brand is she likely to choose? While these questions involve predicting the future, more specifically they require you to classify people or objects into different bins based on what has been observed historically.

To address this, mathematicians, statisticians and computer scientists have developed many types of classifiers. Most of these make some kind of assumption about the underlying data and vary in their complexity as well as their accuracy. As a rule of thumb, the more complex classifiers make less stringent assumptions about the underlying data and thereby give more accurate results for data that isn't as well behaved as a statistician would ideally like it to be.

Since this piqued my interest, I decided to test two well-known classifiers of very different complexity on the famous iris data. The classifiers I tried out are the Naïve Bayes Classifier and the Multinomial Logistic Regression Model.

I think I'll talk a little about the data first. The iris dataset is pretty famous and is available as one of the pre-loaded datasets in R (the open-source statistical software).

The dataset is a table of observations collected by the botanist Edgar Anderson. It has measurements of the Sepal Length, Sepal Width, Petal Length and Petal Width of three species of the Iris flower, namely Iris Setosa, Iris Virginica and Iris Versicolor.

Sepal.Length Sepal.Width Petal.Length Petal.Width    Species
         6.2         2.9          4.3         1.3 versicolor
         4.8         3.4          1.9         0.2     setosa
         6.4         3.2          4.5         1.5 versicolor
         6.6         2.9          4.6         1.3 versicolor
         7.9         3.8          6.4         2.0  virginica
         6.0         2.9          4.5         1.5 versicolor

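The table above shows a handful of randomly sampled rows. Since iris ships with base R, a couple of lines (a minimal sketch; the seed is arbitrary) is enough to pull up a similar sample:

set.seed(1)                        # arbitrary seed, only for reproducibility
iris[sample(nrow(iris), 6), ]      # six randomly chosen flowers, all five columns
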
Our objective is to build a classifier that can learn from this data and assign future observations of iris flowers to one of these three species based on these measurements.

The Naïve Bayes classifier is an extremely simple classifier based on Bayes' theorem. It makes the strict assumption that the attributes (Sepal Length, Sepal Width, Petal Length and Petal Width) are conditionally independent given the species: the probability of a flower having a large petal width wouldn't depend on it also having a large petal length, once you know which species it is. If this were true, we would expect no correlation between any pair of attributes within a flower species.
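
This expectation is easy to probe in R before fitting anything; a minimal sketch using only base R:

# scatter-plot matrix of the four measurements, one color per species
pairs(iris[, 1:4], col = iris$Species)

# correlation between petal length and petal width within Versicolor alone
with(subset(iris, Species == "versicolor"), cor(Petal.Length, Petal.Width))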

A quick look at the scatter plot below (click and expand the gallery) tells us that this isn't exactly true: in the third box on the last row there is an evident correlation between Petal Width and Petal Length given that the species is Versicolor (red). There are also plenty of attribute pairs that don't look very correlated. I am going to go ahead and use the Naïve Bayes classifier anyway and see what it does. Naïve Bayes is a classifier whose algorithm breaks down into simple counting, so it is very easy to understand and computationally cheap to implement. Professor Andrew Moore's website is an excellent source for understanding this and other algorithms in data analysis.
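
Fitting the classifier itself takes a single line in R. The post doesn't say which implementation was used; the sketch below assumes the naiveBayes() function from the e1071 package:

library(e1071)                                      # assumed package; provides naiveBayes()
nb_model <- naiveBayes(Species ~ ., data = iris)    # stores per-class means and sds of each numeric attribute
head(predict(nb_model, newdata = iris))             # predicted species for the first few flowers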

The Multinomial Logistic Regression Model uses a more complex estimation process. It extends to multiple groups the logistic regression model, which provides estimators for data that falls into binary groups. Here we use the multinomial logit rather than the basic logit model because the data has three groups, namely Versicolor, Setosa and Virginica. Essentially, a separate regression is fitted for two of the classes while the remaining one is treated as the base class. Therefore five parameters (the four attributes, namely Sepal Length, Sepal Width, Petal Length and Petal Width, plus an intercept) are estimated for each regression, bringing the total to ten parameters. The estimation involves iteratively maximizing a likelihood function numerically, which is far more complex than the simple counting of the Naïve Bayes classifier.
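
For comparison, here is a minimal sketch of the multinomial logit fit. The post calls it Mlogit without naming a package; the sketch assumes multinom() from the nnet package:

library(nnet)                                       # assumed package; provides multinom()
ml_model <- multinom(Species ~ ., data = iris)      # Setosa, the first factor level, becomes the base class
coef(ml_model)                                      # a 2 x 5 matrix: two non-base species times (intercept + four slopes) = 10 parameters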

So why go to all this trouble? The answer is that multinomial logistic regression models make far less strict assumptions about the nature of the underlying data. The assumption this model makes is called Independence of Irrelevant Alternatives: adding another category to the existing three species should not change the relative odds between any two of the species already listed. This condition applies only to the choice variable (Species) and says nothing about the attributes, unlike the conditional independence assumption of the Naïve Bayes classifier.

So I used both classifiers. The iris dataset has 150 rows of data; that is, 150 flowers were observed and recorded in terms of the attributes mentioned and their species. To test the classifiers, I used only 75% of the data to train them and the remaining 25% to compare their predictions against the true species labels.
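
In R the split can be done with sample(); a minimal sketch (the seed is arbitrary):

set.seed(2012)                                             # arbitrary seed, only for reproducibility
train_rows <- sample(nrow(iris), floor(0.75 * nrow(iris)))
train <- iris[train_rows, ]                                # 112 flowers used for training
test  <- iris[-train_rows, ]                               # the remaining 38 held out for testing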

A simple function to calculate their respective error rates/accuracy was written in R.
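
The linked code isn't reproduced here, but the idea boils down to something like this sketch:

# proportion of predicted labels that match the true species
accuracy <- function(predicted, actual) mean(predicted == actual)

nb_fit <- naiveBayes(Species ~ ., data = train)           # train on the training split only
accuracy(predict(nb_fit, newdata = test), test$Species)   # share of the 38 held-out flowers classified correctly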

The result? Mlogit was more accurate, but only marginally so. Over 10 runs, the Naïve Bayes classifier was on average 95% accurate and Mlogit was 97% accurate. Is that a small price to pay for avoiding the more complex computation? Maybe not, if you have powerful processors and efficient algorithms and every mistake could cost you a lot; maybe so, if you just want to make quick classifications and a small loss in accuracy costs little compared to the gains in speed. Points to ponder.
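
For completeness, here is a sketch of the whole comparison repeated over ten random 75/25 splits, assuming the packages from the earlier sketches:

# average accuracy of both classifiers over ten random splits
accs <- replicate(10, {
  idx <- sample(nrow(iris), floor(0.75 * nrow(iris)))
  tr  <- iris[idx, ]
  te  <- iris[-idx, ]
  nb  <- naiveBayes(Species ~ ., data = tr)
  ml  <- multinom(Species ~ ., data = tr, trace = FALSE)
  c(naive_bayes = mean(predict(nb, te) == te$Species),
    mlogit      = mean(predict(ml, te) == te$Species))
})
rowMeans(accs)    # the post's reported averages were about 0.95 and 0.97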

You can download the code used for these comparisons here: Classifiers. It should work with any data frame as long as the last column is the choice variable (or so I believe).

Photo Credits

Wiki Commons:

Setosa – Radomil

Versicolor – Daniel Langlois

Virginica – Frank Mayfield
