Machine Learning Models Explained in 5 Minutes

LINEAR REGRESSION

  • The ultimate goal of linear regression is to find a line that best fits the data.
  • The goal of multiple linear regression and polynomial regression is to find the plane (or curve) that best fits the data in n dimensions; a short code sketch follows below.
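If you want to see this in action, here is a minimal sketch using scikit-learn’s LinearRegression; the library choice and the toy data are my own, purely for illustration.

```python
# Fit a line to toy data with scikit-learn (illustrative data, roughly y = 2x).
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])  # a single feature
y = np.array([2.1, 4.0, 6.2, 7.9])          # noisy target values

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)        # slope and intercept of the best-fit line
print(model.predict([[5.0]]))               # prediction for a new point
```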

LOGISTIC REGRESSION

  • Logistic regression is used to describe data and to explain the relationship between one dependent binary variable and one or more independent variables.
  • The logistic equation is constructed so that the output is a probability value between 0 and 1 that can be mapped to two classes (see the sketch below).
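Here is the same kind of minimal sketch for logistic regression, again with scikit-learn and invented toy data.

```python
# Binary classification with logistic regression (toy, illustrative data).
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])            # binary labels

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([[2.0]]))           # class probabilities, each between 0 and 1
print(clf.predict([[2.0]]))                 # the more probable class
```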

SUPPORT VECTOR MACHINE

  • SVM finds the hyperplane between classes of data that maximizes the margin between the classes.
  • There can be many hyperplanes that separate the classes, but only one maximizes the distance between them; this is illustrated in the code below.
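A minimal sketch of a linear SVM with scikit-learn’s SVC, on linearly separable toy data made up for illustration.

```python
# Maximum-margin classification with a linear SVM (toy data).
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 2], [2, 3], [3, 3], [6, 5], [7, 8], [8, 8]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)        # fits the maximum-margin hyperplane
print(clf.support_vectors_)                 # the points that define the margin
print(clf.predict([[4, 4]]))
```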

K NEAREST NEIGHBOR

  • K nearest neighbors is a simple algorithm that stores all available cases and classifies new cases based on a similarity measure.
  • It is an approach to data classification that estimates how likely a data point is to belong to one group or another based on the groups of the data points nearest to it, as the example below shows.
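Here is a minimal k-nearest-neighbors sketch with scikit-learn; k = 3 and the data are illustrative choices, not recommendations.

```python
# Classify a point by the majority vote of its 3 nearest neighbors (toy data).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

X = np.array([[1, 1], [1, 2], [2, 2], [8, 8], [8, 9], [9, 9]])
y = np.array([0, 0, 0, 1, 1, 1])

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([[2, 1]]))                # label chosen by the nearest neighbors
print(knn.kneighbors([[2, 1]]))             # distances and indices of those neighbors
```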

NAIVE BAYES

  • Naive Bayes is another popular classification algorithm.
  • The goal is to find the class with the highest posterior probability.
  • It answers the question “What is the probability of class A given the data B?”, and the naive assumption that the variables are independent given the class makes that probability simple to compute (a worked example follows).
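A minimal sketch using Gaussian Naive Bayes from scikit-learn (one common variant of the algorithm); the toy data is invented for illustration.

```python
# Pick the class with the highest posterior probability under Gaussian Naive Bayes.
import numpy as np
from sklearn.naive_bayes import GaussianNB

X = np.array([[1.0, 2.1], [1.2, 1.9], [0.9, 2.0],
              [5.0, 6.1], [5.2, 5.9], [4.9, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

nb = GaussianNB().fit(X, y)
print(nb.predict([[1.1, 2.0]]))             # class with the highest posterior
print(nb.predict_proba([[1.1, 2.0]]))       # per-class posterior probabilities
```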

DECISION TREES

  • Each split point in a decision tree is called a node.
  • The last nodes of the decision tree, where a decision is made, are called the leaves of the tree.
  • Decision trees are intuitive and easy to build but fall short when it comes to accuracy; a small example follows.
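For the curious, here is a minimal decision-tree sketch with scikit-learn; the data and the max_depth value are illustrative choices. export_text prints the learned nodes and leaves so you can see the splits.

```python
# Train a small decision tree and print its nodes and leaves (toy data).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

X = np.array([[1, 1], [2, 1], [3, 2], [7, 8], [8, 8], [9, 7]])
y = np.array([0, 0, 0, 1, 1, 1])

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree))                    # text view of the tree's splits and leaves
print(tree.predict([[2, 2]]))
```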

RANDOM FOREST

  • It creates multiple decision trees using bootstrapped datasets of the original data, randomly selecting a subset of variables at each step of each tree.
  • The model then selects the mode of all of the individual trees’ predictions.
  • By combining multiple trees, it reduces the risk of error from any individual tree (see the code below).
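A minimal random-forest sketch with scikit-learn; n_estimators and random_state are shown explicitly only for clarity on this toy data.

```python
# An ensemble of bootstrapped decision trees; the forest votes by majority (toy data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

X = np.array([[1, 1], [2, 1], [3, 2], [7, 8], [8, 8], [9, 7]])
y = np.array([0, 0, 0, 1, 1, 1])

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(forest.predict([[2, 2]]))             # the mode of the individual trees' predictions
print(len(forest.estimators_))              # number of trees in the ensemble
```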

CLUSTERING

  • Clustering is an unsupervised technique that involves the grouping, or clustering, of data points. It’s frequently used for customer segmentation, fraud detection, and document classification.
  • Common clustering techniques include k-means clustering, hierarchical clustering, mean-shift clustering, and density-based clustering. While each technique has a different method of finding clusters, they all aim to achieve the same thing; the k-means example below shows one of them.
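As one example of the techniques above, here is a minimal k-means sketch with scikit-learn on unlabeled toy data; choosing k = 2 is an illustrative assumption.

```python
# Group unlabeled points into 2 clusters with k-means (toy data).
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1, 1], [1, 2], [2, 2], [8, 8], [8, 9], [9, 9]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)                       # cluster assignment for each point
print(kmeans.cluster_centers_)              # learned cluster centers
```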

After reading this blog, I am sure you will have a grasp of the basic concepts behind these machine learning models.

