Data Enthusiast
Image by Colin Behrens from Pixabay

Data Analysis

Have you ever grown tired of writing multiple lines of code just to produce a simple graph during EDA?

Did you ever wish for recommendation-based interactive graphs within the Jupyter notebook itself?

If that’s a big yes, you’re in luck.

Thankfully, we now have a new Python library: Lux.

This article is based on Doris Jung-Lin Lee’s session at WiCDS 2021.

Lux is a Python API for intelligent visual discovery, which comes with a built-in interactive Jupyter widget.

  • Lux can act as an intelligent assistant, automating the visual aspects of exploratory data analysis (see the sketch below).
  • It provides powerful abstractions of the visualizations soon after…
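As a quick illustration, here is a minimal usage sketch, assuming the lux-api package is installed; the dataset URL and column names follow the examples in the Lux documentation and are not from the session itself:

```python
import pandas as pd
import lux  # importing lux registers its widget with pandas DataFrames

# Example dataset from the Lux documentation (assumed URL)
df = pd.read_csv("https://raw.githubusercontent.com/lux-org/lux-datasets/master/data/college.csv")

# In a Jupyter notebook, simply displaying the DataFrame now shows a toggle
# between the plain pandas table and Lux's recommended visualizations.
df

# You can steer the recommendations by declaring an intent,
# i.e. the columns you currently care about.
df.intent = ["AverageCost", "SATAverage"]
df
```

Displaying `df` again after setting the intent makes Lux recommend charts that enhance or complement the declared columns.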


Photo by Robert Katzki on Unsplash

Data Science

A precise overview of how similar, and how different, the Linear Discriminant Analysis dimensionality reduction technique is compared to Principal Component Analysis

This is the second part of my earlier article, The power of Eigenvectors and Eigenvalues in dimensionality reduction techniques such as PCA.

In this article, I will start with a brief explanation of the differences between LDA and PCA. We will then dive deep into the workings of Linear Discriminant Analysis and unravel the mystery of how it achieves classification of the data along with dimensionality reduction.

Introduction

LDA vs. PCA

Linear Discriminant Analysis is very similar to PCA: both look…
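To make the contrast concrete before the full article, here is a minimal scikit-learn sketch; the Iris dataset is a stand-in example of my own, not taken from the article. Both techniques reduce the data to two dimensions, but only LDA consumes the class labels:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA is unsupervised: it ignores y and finds directions of maximum variance.
X_pca = PCA(n_components=2).fit_transform(X)

# LDA is supervised: it uses y and finds directions that best separate classes.
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

print(X_pca.shape, X_lda.shape)  # (150, 2) (150, 2)
```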


Photo by Robert Ruggiero on Unsplash

Let’s get started…

Introduction

A confusion matrix, also known as an error matrix, helps us analyze the performance of classification models. It gives a clear insight into how well your classification algorithm is working.

The confusion matrix contains detailed information about how many data points have been correctly classified for the class of interest and how many have been misclassified. It gives a summary of the True Positives, True Negatives, False Positives, and False Negatives.
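For instance, a minimal scikit-learn sketch with made-up labels shows how the four counts fall out of a binary confusion matrix:

```python
from sklearn.metrics import confusion_matrix

# Made-up ground truth and predictions, for illustration only
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# For binary labels, ravel() unpacks the 2x2 matrix in this fixed order.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")  # TP=3  TN=3  FP=1  FN=1
```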

Consider an example of a binary classification problem.

Let’s say you have been assigned the task…


Photo by Johannes Plenio on Unsplash

A no-code deep dive into the core concepts of Eigenvectors, Eigenvalues, and Principal Component Analysis

Here goes my first article. In it, I will give an overview of Eigenvectors and how they play a powerful role in feature extraction to achieve dimensionality reduction of the data.

I will further explain the complete working principle of the most commonly used unsupervised dimensionality reduction technique: Principal Component Analysis (PCA).

What is the curse of dimensionality?

With the unprecedented growth of data collection and the Big Data revolution, the datasets we deal with can have extremely high dimensions. …
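Although the article itself takes a no-code approach, a minimal numpy sketch (with random data standing in for a real high-dimensional dataset) shows how eigenvectors of the covariance matrix drive the reduction:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # 200 samples in 10 dimensions

X_centered = X - X.mean(axis=0)           # PCA assumes mean-centered data
cov = np.cov(X_centered, rowvar=False)    # 10x10 covariance matrix

# Eigenvectors of the covariance matrix are the principal components;
# the eigenvalues give the variance captured along each component.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]         # sort by decreasing variance

k = 2                                     # keep the top-2 components
X_reduced = X_centered @ eigvecs[:, order[:k]]
print(X_reduced.shape)                    # (200, 2)
```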
