# A Quick Guide to Machine Learning Algorithms

As a machine learning beginner, you probably have lots of questions: which algorithm should I use, what are its advantages, what are its typical use cases, and so on. Today, we will answer these questions and clear up your doubts with simple explanations. So, stay tuned till the end...

1. Support vector machines

- Support vector machines classify groups of data with the help of hyperplanes.

- Support vectors determine the boundaries of the margin around the separating hyperplane, so the hyperplane can act as a linear classifier.

- SVMs are good for the binary classification of X versus other variables and are useful whether or not the relationship between variables is linear.

Use Cases of SVM:

- This algorithm is used for News Categorization, Handwriting Recognition, etc.
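To make the idea concrete, here is a minimal sketch of a linear SVM trained with subgradient descent on the hinge loss. The data points and hyperparameters are made up for illustration, and a real application would use a tuned library implementation:

```python
import numpy as np

# Toy 2-D data for illustration: class +1 vs class -1 (made-up points).
X = np.array([[2.0, 3.0], [3.0, 3.5], [3.5, 4.0],
              [-2.0, -3.0], [-3.0, -2.5], [-3.5, -4.0]])
y = np.array([1, 1, 1, -1, -1, -1])

w, b = np.zeros(2), 0.0
lr, lam = 0.01, 0.01  # learning rate and regularization strength (assumed values)

# Subgradient descent on the regularized hinge loss.
for epoch in range(200):
    for xi, yi in zip(X, y):
        if yi * (np.dot(w, xi) + b) < 1:   # point inside margin or misclassified
            w += lr * (yi * xi - lam * w)
            b += lr * yi
        else:
            w -= lr * lam * w

def predict(x):
    """Classify by which side of the hyperplane w.x + b = 0 the point lies on."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(predict([2.5, 3.0]), predict([-2.5, -3.0]))
```

The learned weight vector `w` is normal to the hyperplane; the margin's boundaries are the parallel planes where `w.x + b = ±1`.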

2. Naive Bayes Classification

- Naive Bayes classifiers compute probabilities along tree branches of possible conditions. Each individual feature is "naive", or conditionally independent of the others, and therefore does not influence them.

- For example, what's the probability you would draw two yellow marbles in a row from a jar of five marbles, two yellow and three red? Multiplying the probability of each draw, (2/5) × (1/4), gives one in ten. Naive Bayes classifiers compute the combined, conditional probabilities of multiple attributes in the same way.

- The naive Bayes method allows quick classification of relevant items in small data sets that have distinct features.

Use Cases of Naive Bayes:

- This algorithm is used for Sentiment Analysis and Consumer Segmentation.
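The marble calculation above, and a tiny multinomial naive Bayes sentiment classifier, can be sketched as follows. The four-document "corpus" is a hypothetical toy example, and Laplace smoothing is an assumed (standard) detail:

```python
from fractions import Fraction
from collections import Counter
import math

# Marble example from the text: 2 yellow and 3 red marbles, 5 total.
p_two_yellow = Fraction(2, 5) * Fraction(1, 4)
print(p_two_yellow)  # 1/10

# Minimal multinomial naive Bayes on a made-up toy sentiment corpus.
docs = [("good great fun", "pos"), ("great film", "pos"),
        ("bad boring", "neg"), ("bad awful film", "neg")]

counts = {"pos": Counter(), "neg": Counter()}
class_docs = Counter()
for text, label in docs:
    class_docs[label] += 1
    counts[label].update(text.split())

vocab = {word for c in counts.values() for word in c}

def predict(text):
    best, best_lp = None, -math.inf
    for label in counts:
        # Log prior, then "naively" multiply independent per-word likelihoods
        # (added in log space), with Laplace smoothing for unseen words.
        lp = math.log(class_docs[label] / len(docs))
        total = sum(counts[label].values())
        for word in text.split():
            lp += math.log((counts[label][word] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(predict("great fun"))
```

Treating each word as conditionally independent is exactly the "naive" assumption: it is rarely true, but it keeps the model fast and surprisingly effective on small data sets.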

3. Hidden Markov Model

- Observable Markov processes are purely deterministic: one given state always follows another given state. Traffic light patterns are an example.

- Hidden Markov models, by contrast, compute the probability of hidden states occurring by analyzing observable data, and then estimate the likely pattern of future observations with the help of the hidden state analysis. For example, the probability of high or low pressure (the hidden state) can be used to predict the likelihood of sunny, rainy, or cloudy weather.

- Hidden Markov models tolerate data variability and are effective for recognition and prediction.

Use cases of this Model:

- This algorithm is used in Facial Expression Analysis and Weather Prediction.
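The pressure/weather example can be sketched with the forward algorithm, which computes the likelihood of an observed weather sequence by summing over the hidden pressure states. All the probabilities below are illustrative assumptions, not values from any real model:

```python
import numpy as np

# Hidden states: air pressure. Observations: weather.
# All probabilities are made-up numbers for illustration.
states = ["high", "low"]
obs_symbols = ["sunny", "rainy", "cloudy"]

start = np.array([0.6, 0.4])        # P(initial pressure state)
trans = np.array([[0.7, 0.3],       # P(next pressure | current pressure)
                  [0.4, 0.6]])
emit = np.array([[0.6, 0.1, 0.3],   # P(weather | high pressure)
                 [0.2, 0.5, 0.3]])  # P(weather | low pressure)

def forward(observations):
    """Likelihood of an observation sequence, summed over hidden state paths."""
    idx = [obs_symbols.index(o) for o in observations]
    alpha = start * emit[:, idx[0]]          # joint prob. of state and first obs.
    for t in idx[1:]:
        alpha = (alpha @ trans) * emit[:, t]  # propagate, then weight by emission
    return alpha.sum()

p = forward(["sunny", "sunny", "rainy"])
print(round(p, 4))
```

The key point is that the hidden state is never observed directly: the algorithm marginalizes over both pressure states at every step, which is what makes the model robust to variability in the observed data.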

4. Random forest

- Random forest algorithms improve the accuracy of decision trees by training multiple trees on randomly selected subsets of the data and features. For example, a random forest can review the expression levels of various genes associated with breast cancer relapse and compute a relapse risk.

Advantages of the Random Forest Algorithm are as follows:

- Random forest methods prove useful with large data sets and items that have numerous and sometimes irrelevant features.

Use Cases of this Algorithm:

- It is used for Customer Churn Analysis and Risk Assessment.
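A minimal sketch of the bagging idea: train many simple trees (here reduced to one-split "stumps" to keep the code short) on bootstrap samples of the data, then classify by majority vote. The data points are made up for illustration:

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical toy data: rows of (feature_0, feature_1) with binary labels.
X = [(1.0, 5.0), (1.5, 4.5), (2.0, 5.5), (6.0, 1.0), (6.5, 1.5), (7.0, 0.5)]
y = [0, 0, 0, 1, 1, 1]

def fit_stump(Xs, ys):
    """Best single-feature threshold split, scored by training accuracy."""
    best, best_acc = None, -1.0
    for f in range(len(Xs[0])):
        for row in Xs:
            t = row[f]
            left = [lab for r, lab in zip(Xs, ys) if r[f] <= t]
            right = [lab for r, lab in zip(Xs, ys) if r[f] > t]
            for pred_l in (0, 1):
                pred_r = 1 - pred_l
                correct = sum(l == pred_l for l in left) + \
                          sum(l == pred_r for l in right)
                acc = correct / len(ys)
                if acc > best_acc:
                    best, best_acc = (f, t, pred_l, pred_r), acc
    return best

def bootstrap(Xs, ys):
    """Sample rows with replacement, as each tree in a forest does."""
    idx = [random.randrange(len(Xs)) for _ in Xs]
    return [Xs[i] for i in idx], [ys[i] for i in idx]

# A small "forest" of stumps, each trained on its own bootstrap sample.
forest = [fit_stump(*bootstrap(X, y)) for _ in range(15)]

def predict(row):
    votes = Counter()
    for f, t, pred_l, pred_r in forest:
        votes[pred_l if row[f] <= t else pred_r] += 1
    return votes.most_common(1)[0][0]

print(predict((1.2, 5.2)), predict((6.8, 0.8)))
```

Averaging over many trees built on different random subsets is what lets the ensemble shrug off noisy or irrelevant features that would mislead any single tree.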

5. Recurrent neural networks

- Each neuron in a neural network converts many inputs into a single output via one or more hidden layers. Recurrent neural networks (RNNs) additionally pass values from step to step, making step-by-step learning possible. In other words, RNNs have a form of memory, allowing previous outputs to affect subsequent inputs.

Advantages of RNN are listed below:

- Recurrent neural networks have predictive power when used with large amounts of sequenced information.

Use Cases of RNN:

- This algorithm is used mainly for Image Classification and Captioning, Political Sentiment Analysis, etc.
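The recurrence can be sketched as a single forward pass through a tiny RNN cell. The dimensions and random weights are illustrative assumptions; the point is that the hidden state `h` carries information from earlier steps into later outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative weight matrices for a tiny untrained RNN cell.
W_xh = rng.normal(scale=0.5, size=(3, 4))  # input -> hidden
W_hh = rng.normal(scale=0.5, size=(4, 4))  # hidden -> hidden (the "memory" path)
W_hy = rng.normal(scale=0.5, size=(4, 2))  # hidden -> output

def rnn_forward(inputs):
    h = np.zeros(4)
    outputs = []
    for x in inputs:
        # The previous hidden state feeds into the new one: this recurrence
        # is what lets earlier inputs influence later outputs.
        h = np.tanh(x @ W_xh + h @ W_hh)
        outputs.append(h @ W_hy)
    return outputs, h

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
outs, final_h = rnn_forward([a, b])
print(final_h.shape)
```

Feeding the same two inputs in the opposite order produces a different final state, which is exactly the memory property the text describes: the network's output depends on the sequence, not just on the current input.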

6. Long short-term memory & gated recurrent unit neural networks

- Older forms of RNNs can be lossy. While these older recurrent neural networks only allow small amounts of older information to persist, newer long short-term memory (LSTM) and gated recurrent unit (GRU) neural networks have both long- and short-term memory. In other words, these newer RNNs have greater memory control, allowing previous values to persist or to be reset as necessary for many sequences of steps, avoiding "gradient decay" or eventual degradation of the values passed from step to step. LSTM and GRU networks make this memory control possible with memory blocks and structures called gates that pass or reset values as appropriate.

- Long short-term memory and gated recurrent unit neural networks have the same advantages as other recurrent neural networks and are more frequently used than other recurrent neural networks because of their greater memory capabilities.

Use Cases of LSTM and GRU networks are given below:

- LSTM and GRU networks are used in Natural Language Processing and Translation.
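The gating mechanism can be sketched with a single GRU cell, which uses an update gate and a reset gate to decide how much of the old state to keep or discard at each step. The sizes and random weights below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_h = 3, 4  # assumed input and hidden sizes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative (untrained) weights for the two gates and the candidate state.
Wz, Uz = rng.normal(size=(n_in, n_h)), rng.normal(size=(n_h, n_h))
Wr, Ur = rng.normal(size=(n_in, n_h)), rng.normal(size=(n_h, n_h))
Wh, Uh = rng.normal(size=(n_in, n_h)), rng.normal(size=(n_h, n_h))

def gru_step(x, h):
    z = sigmoid(x @ Wz + h @ Uz)      # update gate: how much new state to accept
    r = sigmoid(x @ Wr + h @ Ur)      # reset gate: how much past state to reuse
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h + z * h_tilde  # gated blend of old state and candidate

h = np.zeros(n_h)
for x in [np.ones(n_in), np.zeros(n_in)]:
    h = gru_step(x, h)
print(h.shape)
```

Because the gates can hold `z` near zero, the old state can pass through many steps almost unchanged, which is how these networks avoid the gradient decay that plagues plain RNNs.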

7. Convolutional Neural Networks

- Convolutional neural networks (CNNs) slide small sets of shared weights, called filters, across local regions of the input; each subsequent layer blends these weighted features into progressively higher-level representations that are used to label the output layer.
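The sliding-filter idea can be sketched as a single "valid" 2-D convolution (strictly, cross-correlation, as in most deep learning libraries). The tiny image and edge-detecting kernel below are illustrative assumptions:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation: the kernel's weights blend each local patch
    of the image into a single output value."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image with a dark-to-bright vertical boundary, and a kernel that
# responds to exactly that kind of edge.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[1, -1],
                   [1, -1]], dtype=float)

out = conv2d(image, kernel)
print(out)
```

The output is large (in magnitude) only where the kernel overlaps the boundary, flat regions map to zero. Stacking many such learned filters, layer after layer, is what lets a CNN build up from edges to shapes to whole-object labels.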