In the previous article we discussed bagging and the Random Forest Classifier. There is another approach to reducing variance, called boosting. Boosting can be used for both regression and classification. The main difference between bagging and boosting is that bagging is a parallel algorithm: the decision trees are built independently of one another. Hence bagging … [Read more...]
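To make the parallel-vs-sequential contrast concrete, here is a minimal AdaBoost-style sketch over decision stumps: each round reweights the training points so the next stump focuses on the previous round's mistakes, which is why boosting cannot be parallelized the way bagging can. The 1-D toy data and three-round setup are my own illustration, not the implementation from this series:

```python
import math

# Toy 1-D dataset: roughly "label +1 if x is large", with one noisy point (x=3).
X = [1, 2, 3, 4, 6, 7, 8, 9]
y = [-1, -1, 1, -1, 1, 1, 1, 1]

def stump(threshold, sign):
    # Weak learner: predicts `sign` for x > threshold, `-sign` otherwise.
    return lambda x: sign if x > threshold else -sign

def best_stump(X, y, w):
    # Pick the stump with the lowest *weighted* training error.
    best, best_err = None, float("inf")
    for t in X:
        for sign in (1, -1):
            h = stump(t, sign)
            err = sum(wi for xi, yi, wi in zip(X, y, w) if h(xi) != yi)
            if err < best_err:
                best, best_err = h, err
    return best, best_err

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        h, err = best_stump(X, y, w)
        err = max(err, 1e-10)  # avoid log(0) for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # The sequential step: up-weight misclassified points, down-weight the rest.
        w = [wi * math.exp(-alpha * yi * h(xi)) for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    def predict(x):
        s = sum(alpha * h(x) for alpha, h in ensemble)
        return 1 if s >= 0 else -1
    return predict

predict = adaboost(X, y)
# After three rounds the weighted stumps fit all 8 training points,
# including the noisy one no single stump could isolate.
```

Each stump alone misclassifies at least one point; the weighted vote of three of them does not, which is the whole point of the sequential reweighting.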

## Face Detection with OpenCV – Computer Vision

Face Detection, and computer vision in general, is a hot topic nowadays. That's because it works quite well. Facebook uses this kind of algorithm to detect faces in images, and it is also useful for self-driving cars and pedestrian detection. But what is it exactly? … [Read more...]

## Random Forest Classifier – Machine Learning

In the previous article we discussed the Decision Tree Classifier and came to the conclusion that it has a tendency to overfit. Are there any solutions to this issue? Of course. There are two main methods to reduce overfitting: pruning and bagging (such as the Random Forest Classifier). Bias-variance tradeoff: if we want to understand pruning or bagging, first we … [Read more...]
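The variance-reduction idea behind bagging can be illustrated without any trees at all: averaging many independent noisy estimates shrinks the variance by roughly a factor of 1/n, which is exactly what a Random Forest does when it averages bootstrapped trees. A small simulation with made-up numbers (not from the article):

```python
import random

random.seed(0)

def noisy_estimate():
    # Stand-in for one high-variance model: true value 3.0 plus unit Gaussian noise.
    return 3.0 + random.gauss(0, 1.0)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Variance of a single estimate vs. the average of 25 independent estimates
# (the "ensemble of 25 trees" analogue).
singles = [noisy_estimate() for _ in range(2000)]
averages = [sum(noisy_estimate() for _ in range(25)) / 25 for _ in range(2000)]

print(var(singles))   # close to 1.0
print(var(averages))  # close to 1/25 = 0.04
```

In a real forest the trees are not fully independent (they share training data), so the reduction is smaller than 1/n, which is why Random Forests also decorrelate trees by sampling features at each split.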

## Decision Tree Classifier – Machine Learning

Decision Tree Classifier is a type of supervised learning approach. It is mostly used for classification problems, but it is useful for regression as well. The main advantage of decision trees is that they can handle both categorical and continuous inputs. Categorical variables: YES/NO or FAIL/PASS. Continuous values: numeric values for regression (for example: house … [Read more...]
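As a rough sketch of how a tree handles both input types, the snippet below scores a threshold split on a continuous feature and an equality split on a categorical feature using Gini impurity; the tree simply picks whichever split is purer. The toy loan data, feature names, and PASS/FAIL labels are invented for illustration:

```python
# Toy rows: (income, employed?, outcome)
rows = [
    (25_000, "no",  "FAIL"),
    (32_000, "no",  "FAIL"),
    (40_000, "yes", "PASS"),
    (48_000, "yes", "PASS"),
    (55_000, "yes", "PASS"),
    (60_000, "no",  "PASS"),
]

def gini(labels):
    # Gini impurity of a binary label list: 0 means pure, 0.5 means 50/50.
    n = len(labels)
    if n == 0:
        return 0.0
    p = labels.count("PASS") / n
    return 1 - p**2 - (1 - p)**2

def split_quality(left, right):
    # Size-weighted average impurity of the two children (lower is better).
    n = len(left) + len(right)
    return (len(left) * gini(left) + len(right) * gini(right)) / n

def continuous_split(rows, threshold):
    # Continuous feature: compare against a threshold.
    left = [r[2] for r in rows if r[0] <= threshold]
    right = [r[2] for r in rows if r[0] > threshold]
    return split_quality(left, right)

def categorical_split(rows, value):
    # Categorical feature: test equality against one category.
    left = [r[2] for r in rows if r[1] == value]
    right = [r[2] for r in rows if r[1] != value]
    return split_quality(left, right)

print(continuous_split(rows, 35_000))   # income <= 35k separates the data perfectly
print(categorical_split(rows, "yes"))   # employed == yes leaves an impure branch
```

On this toy data the income threshold gives impurity 0.0 while the categorical split does not, so the tree would split on income first; with real data the comparison goes the same way, one impurity score per candidate split.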

## Naive Bayes Classifier Explained Step by Step

Naive Bayes Classifier is a very efficient supervised learning algorithm. So far we have discussed the Linear Regression and Logistic Regression approaches. For both of these algorithms we had to solve an optimization problem. Why? In order to find the optimal β parameters. Once we know these parameters, we can make predictions for new data samples. Naive Bayes … [Read more...]
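Unlike those optimization-based approaches, Naive Bayes needs no iterative fitting at all: training is just counting, and prediction is Bayes' rule with a conditional-independence assumption across features. A from-scratch sketch on a made-up categorical weather dataset (the data and feature names are illustrative, not from the article):

```python
from collections import Counter, defaultdict

# Toy rows: (outlook, windy, play?)
data = [
    ("sunny", "no",  "yes"),
    ("sunny", "yes", "no"),
    ("rainy", "yes", "no"),
    ("rainy", "no",  "yes"),
    ("sunny", "no",  "yes"),
    ("rainy", "yes", "no"),
]

def train(data):
    # "Training" is nothing but counting: class priors and
    # per-feature value counts conditioned on the class.
    priors = Counter(label for *_, label in data)
    likelihoods = defaultdict(Counter)   # (feature_index, label) -> value counts
    for *features, label in data:
        for i, v in enumerate(features):
            likelihoods[(i, label)][v] += 1
    return priors, likelihoods

def predict(features, priors, likelihoods):
    total = sum(priors.values())
    best, best_score = None, -1.0
    for label, count in priors.items():
        score = count / total            # P(class)
        for i, v in enumerate(features):
            counts = likelihoods[(i, label)]
            # Laplace smoothing (+1, each toy feature has two possible values)
            # so an unseen value never zeroes out the whole product.
            score *= (counts[v] + 1) / (sum(counts.values()) + 2)
        if score > best_score:
            best, best_score = label, score
    return best

priors, likelihoods = train(data)
print(predict(("sunny", "no"), priors, likelihoods))   # -> "yes"
print(predict(("rainy", "yes"), priors, likelihoods))  # -> "no"
```

There are no β parameters to optimize: every quantity the classifier uses is read directly off the count tables, which is where the algorithm's efficiency comes from.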