Have you ever grown tired of writing multiple lines of code just for a simple graph during EDA?
Did you ever wish for recommendation-based interactive graphs within the Jupyter notebook itself?
If that’s a big yes, there is good news:
we now have a new Python library for exactly this, Lux.
This article is based on Doris Jung-Lin Lee’s session at WiCDS 2021.
Lux is a Python API for intelligent visual discovery, which comes with a built-in interactive Jupyter widget.
A precise overview of how similar (or dissimilar) the Linear Discriminant Analysis dimensionality reduction technique is to Principal Component Analysis
This is a follow-up to my earlier article, The Power of Eigenvectors and Eigenvalues in Dimensionality Reduction Techniques such as PCA.
In this article, I will start with a brief explanation of the differences between LDA and PCA. We will then dive deep into how Linear Discriminant Analysis works and unravel how it achieves classification of the data along with dimensionality reduction.
Linear Discriminant Analysis is very similar to PCA; both look…
Let’s get started…
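Before the deep dive, a minimal sketch can make the key difference concrete: PCA is unsupervised, while LDA uses class labels. The Iris dataset and scikit-learn classes below are illustrative choices, not from the article.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)  # 150 samples, 4 features, 3 classes

# PCA is unsupervised: it ignores y and keeps directions of maximum variance
X_pca = PCA(n_components=2).fit_transform(X)

# LDA is supervised: it uses y to find directions that best separate the classes
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)
```

Both reduce the four features to two dimensions, but only LDA’s projection is chosen with class separability in mind.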
A confusion matrix is also known as an error matrix. It helps us analyze the performance of classification models and gives a clear insight into how well your classification algorithm is working.
The confusion matrix contains detailed information about how many data points were correctly classified with respect to the class of interest and how many were misclassified. It summarizes the true positives, true negatives, false positives, and false negatives.
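These four counts can be computed directly with scikit-learn; the toy labels below are made up for illustration.

```python
from sklearn.metrics import confusion_matrix

# toy binary labels; 1 = class of interest (positive), 0 = negative
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1]

# for binary labels, ravel() unpacks the 2x2 matrix in this fixed order
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(tp, tn, fp, fn)  # 3 true positives, 3 true negatives, 1 FP, 1 FN
```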
Consider an example of a binary classification problem.
Let’s say you have been assigned the task…
A no-code deep dive into the core concepts of Eigenvectors, Eigenvalues, and Principal Component Analysis
Here goes my first article. In it, I will give an overview of Eigenvectors and how they play a powerful role in feature extraction to achieve dimensionality reduction of the data.
I will further explain the complete working principle of the most commonly used unsupervised dimensionality reduction technique, Principal Component Analysis (PCA).
With the unprecedented growth of data collection and the Big Data revolution, the datasets that we deal with have extremely high dimensions. …
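Although the article itself takes a no-code approach, the core idea can be sketched numerically: the principal components are the eigenvectors of the data’s covariance matrix, and the eigenvalues measure how much variance each direction explains. The tiny dataset below is made up for illustration.

```python
import numpy as np

# small 2-D dataset; values are illustrative only
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0]])
X_centered = X - X.mean(axis=0)  # PCA starts by centering the data

# covariance matrix of the features, then its eigendecomposition
cov = np.cov(X_centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: symmetric matrices

# project onto the eigenvector with the largest eigenvalue (the first PC)
pc1 = eigenvectors[:, np.argmax(eigenvalues)]
X_reduced = X_centered @ pc1  # the 2-D points reduced to 1-D
```

Keeping only the eigenvectors with the largest eigenvalues is exactly what reduces dimensionality while retaining most of the variance.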