# Data Science Simplified

Learning Machine Learning, in a Human-friendly Way

### Feature Selection using sklearn

In this post, we will understand how to perform Feature Selection using sklearn.

• Dropping features which have low variance
  • Dropping features with zero variance
  • Dropping features with variance below a given threshold
• Univariate feature selection
• Model based feature selection
• Feature Selection using pipeline
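As a taster for the first two bullets, here is a minimal sketch of dropping low-variance features with sklearn's `VarianceThreshold`; the toy data is made up for illustration.

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold

X = np.array([
    [0, 1, 0.5],
    [0, 2, 0.4],
    [0, 3, 0.6],
    [0, 4, 0.5],
])  # the first column is constant (zero variance)

selector = VarianceThreshold(threshold=0.0)  # drop zero-variance features
X_reduced = selector.fit_transform(X)
print(X_reduced.shape)  # the constant column is removed -> (4, 2)
```

Raising `threshold` above 0 would also drop features whose variance falls below that value.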

### Feature Engineering for Machine Learning

In this post, let us explore:

• What is the difference between Feature Selection, Feature Extraction, Feature Engineering and Feature Learning
• Process of Feature Engineering
• And examples of Feature Engineering
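To give a flavour of feature engineering, here is a small sketch that derives new features from a raw timestamp with pandas; the column names and records are invented for illustration.

```python
import pandas as pd

# Toy data: hypothetical sales records
df = pd.DataFrame({
    "order_date": pd.to_datetime(["2021-01-04", "2021-01-09", "2021-01-13"]),
    "amount": [120.0, 80.0, 200.0],
})

# Engineer new features from the raw date column
df["day_of_week"] = df["order_date"].dt.dayofweek          # Monday=0 ... Sunday=6
df["month"] = df["order_date"].dt.month
df["is_weekend"] = df["day_of_week"].isin([5, 6]).astype(int)

print(df[["day_of_week", "is_weekend"]])
```

A model cannot see "weekend" in a raw date; creating the flag explicitly is the kind of domain-driven step the post walks through.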

### Feature Selection: Filter method, Wrapper method and Embedded method

In this post, let us explore:

• What is feature selection?
• Why do we need to perform feature selection?
• Methods of feature selection
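As a quick illustration of the filter method, the sketch below scores each feature independently with a univariate statistic and keeps the top k, using the iris data as a stand-in.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

# Filter method: rank features by ANOVA F-score, keep the best 2
selector = SelectKBest(score_func=f_classif, k=2)
X_new = selector.fit_transform(X, y)
print(X_new.shape)  # (150, 2)
```

Wrapper and embedded methods differ in that they involve a model in the selection itself (e.g. recursive feature elimination, or L1 penalties).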

### Naïve Bayes classification model for Natural Language Processing problem using Python

In this post, let us understand how to fit a classification model using Naïve Bayes (read about Naïve Bayes in this post) to a natural language processing (NLP) problem.
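As a preview, here is a hedged sketch of a Naïve Bayes text classifier with sklearn; the tiny corpus and labels are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["great movie, loved it", "terrible plot, boring",
         "wonderful acting", "awful and dull"]
labels = ["pos", "neg", "pos", "neg"]

# Bag-of-words counts feeding a multinomial Naive Bayes model
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["loved the acting"]))  # -> ['pos']
```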

### Natural Language Processing made simple: Word Cloud, Sentiment Analysis and Topic Modelling

In this chapter, let us understand:

• What is NLP?
• Concepts
• How to generate a word cloud?
• How to perform sentiment analysis?
• How to build a topic model?
• Summary
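As a taste of the topic modelling part, here is a rough sketch using sklearn's `LatentDirichletAllocation`; the mini corpus is made up, and real use needs far more text.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock prices rose sharply",
    "the market fell on bad earnings",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

# Two latent topics (e.g. "animals" vs "finance" in this toy corpus)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)
print(doc_topics.shape)  # one topic distribution per document -> (4, 2)
```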

### Hierarchical and K-means cluster analysis with examples using sklearn

In this post, we will explore:

• What is cluster analysis?
• Hierarchical cluster analysis
• K-means cluster analysis
• Applications
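For a quick preview of the K-means part, here is a minimal sketch on synthetic 2-D data; the blob layout is arbitrary.

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

# Synthetic data with three well-separated groups
X, _ = make_blobs(n_samples=90, centers=3, random_state=42)

km = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X)
print(km.cluster_centers_.shape)  # three centroids in 2-D -> (3, 2)
```

Hierarchical clustering would instead be done with `sklearn.cluster.AgglomerativeClustering`, which needs no upfront choice of k when cutting by distance.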

### Demystifying Principal Component Analysis (PCA): A Beginner's Guide with Intuitive Examples & Illustrations

In this post, let us understand:

• What is Principal Component Analysis (PCA)?
• When to use it, and what are its advantages?
• How to perform PCA in Python, with an example
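As a preview of the Python part, here is a short PCA sketch on the iris data, keeping two components.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

# Project the four original features onto two principal components
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)

print(X_2d.shape)                      # (150, 2)
print(pca.explained_variance_ratio_)   # share of variance per component
```

On iris, the first two components capture well over 90% of the variance, which is why 2-D plots of it look so clean.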

### Understanding Naive Bayes: A Beginner's Guide with Visual Illustrations & Examples

Thomas Bayes was an English statistician. As Stigler states, Thomas Bayes was born in 1701, with a probability value of 0.8! (link). Bayes' theorem has useful applications in machine learning. His papers were published by his friend after his death. It is also said that his friend used the theorem to argue for the existence of God.

### Support Vector Machines (SVM) Explained with Visual Illustrations

Suppose there are two independent variables (features), x1 and x2, and two classes, Class A and Class B. The following graphic shows the scatter diagram.
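A sketch matching that two-feature, two-class setup: fit a linear SVM on toy points (the coordinates are invented for illustration).

```python
import numpy as np
from sklearn.svm import SVC

# Two features x1, x2; two clearly separated classes
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array(["A", "A", "A", "B", "B", "B"])

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([[1.5, 2.0], [7.5, 6.0]]))  # -> ['A' 'B']
```

The fitted hyperplane maximises the margin between the two classes, which is exactly what the scatter-diagram illustrations in the post visualise.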

### Logistic Regression: A Beginner's Visual Guide

Logistic regression is a supervised learning technique applied to classification problems.

In this post, let us explore:

• The logistic regression model
• An example
• Hyperparameters and tuning
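A minimal logistic regression sketch on the breast cancer data, with `C` as one of the regularisation hyperparameters the post discusses tuning.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C is the inverse regularisation strength; smaller C = stronger penalty
clf = LogisticRegression(C=1.0, max_iter=5000).fit(X_train, y_train)
print(round(clf.score(X_test, y_test), 3))
```

Tuning would typically sweep `C` (and the penalty type) with cross-validation, e.g. via `GridSearchCV`.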

### Building a Deep Learning Model using Keras

In this post, let us see how to build a deep learning model using Keras. If you haven't installed TensorFlow and Keras, I will show a simple way to install these two modules.
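As a preview, here is a hedged sketch of a small Keras model for binary classification; the layer sizes and the 20-feature input are arbitrary choices for illustration, and TensorFlow/Keras are assumed installed.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(20,)),           # 20 input features (assumed)
    layers.Dense(16, activation="relu"), # one hidden layer
    layers.Dense(1, activation="sigmoid"),  # binary output
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Training would then be a call to `model.fit(X_train, y_train, epochs=...)` on your data.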

### Mastering the Basics of Deep Learning with Illustrations

Deep learning is a powerful machine learning technique. It is widely used in:

• Natural Language Processing (NLP)
• image/speech recognition
• robotics
• and many other artificial intelligence projects

### ARIMA/SARIMA with Python: Understand with Real-life Example, Illustrations and Step-by-step Descriptions

Autoregressive Integrated Moving Average (ARIMA) is a popular time series forecasting model. It is used to forecast time series variables such as price, sales, production, and demand.