As 2021 rolls into our lives and we make resolutions, I decided to come back to writing and to start refreshing my memory on the details of analytics and machine learning that are often overlooked while working at a day job.

With that said, below is the plan for the next 30 days to refresh the basics of statistics, data science and machine learning. This is NOT going to include deep learning and bleeding-edge stuff; that will be part of a future study guide. We will be using An Introduction to Statistical Learning (ISLR) as a reference…

Keras has support for most of the optimizers and loss functions that are needed, but sometimes you need something extra out of Keras and don’t know what to do. Worry not! Keras supports custom losses and optimizers.

Recently at work I had to figure out a custom loss function that best suited the problem at hand, and I also wanted to avoid tweaking the learning rate; so, some research-paper reading later, I found SMAPE and CoCoB!

SMAPE — Symmetric Mean Absolute Percentage Error

CoCoB — Continuous Coin Betting algorithm
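To make SMAPE concrete, here is a minimal sketch of one common formulation in NumPy (the `eps` guard and the halved denominator are choices of this sketch, not necessarily the exact variant used in the article). A Keras loss would swap the NumPy calls for their TensorFlow equivalents (`tf.abs`, `tf.reduce_mean`) so gradients can flow, and would then be passed as `loss=` to `model.compile`.

```python
import numpy as np

def smape(y_true, y_pred, eps=1e-8):
    """Symmetric Mean Absolute Percentage Error, in percent (0-200 range).

    A Keras version would use tf.abs / tf.reduce_mean instead of np;
    eps guards against division by zero when both values are 0.
    """
    num = np.abs(y_pred - y_true)
    den = (np.abs(y_true) + np.abs(y_pred)) / 2.0 + eps
    return 100.0 * np.mean(num / den)

# Perfect predictions score 0; the measure is symmetric in y_true / y_pred.
print(smape(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 3.0])))  # → 0.0
```

Because the denominator averages the true and predicted magnitudes, over- and under-forecasting by the same amount are penalized alike, which is the "symmetric" part of the name.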

Now this article is not meant to…

Dogs are man’s best friend and they deserve to be identified correctly. In pursuit of differentiating a Husky (Go Dawgs!) from an Alaskan Malamute, let’s learn how to use transfer learning to classify dog breeds.

Find the entire Jupyter Notebook on my GitHub.

**NOTE**: *This project/article is based on Udacity’s skeleton **Dog Breed Classifier** project, part of the AIND program, with certain modifications.*

As always with most of my technical posts, we need to make sure we have the data we want to work with. Now this post is not for building the entire architecture from scratch and…
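To ground the transfer-learning idea, here is a minimal sketch with `tf.keras`: a pretrained convolutional base with frozen weights and a small classification head on top. Note the assumptions: `weights=None` is used here only so the snippet runs without downloading ImageNet weights (in practice you would pass `weights='imagenet'`), and the 133-class output follows the Udacity dog-breed dataset; your breed count may differ.

```python
import tensorflow as tf

NUM_BREEDS = 133  # breed count in the Udacity dataset; adjust for your data

# Convolutional base; in real use pass weights='imagenet' (downloads weights).
base = tf.keras.applications.ResNet50(
    include_top=False, weights=None, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pretrained features

# New classification head, the only part that trains.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_BREEDS, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Freezing the base means only the final Dense layer’s weights update, which is what lets transfer learning work with a comparatively small dog-image dataset.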

In this article we will try to build a Convolutional Neural Network model for the MNIST dataset, which contains handwritten digits and labels. We will use Keras to create and train the model, and also visualize, at certain steps along the way, what the model actually sees and does. Let’s get started.

You can find the entire code in a Jupyter Notebook in my GitHub repo.

The very first thing for anyone building a model is to get the data. Here are some code snippets to get the data ready and loaded into your Python environment. …
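As a sketch of what that step might look like, here is a small CNN built with `tf.keras`. A random batch shaped like MNIST stands in for the real data so the snippet runs offline; in the notebook the data would come from `tf.keras.datasets.mnist.load_data()` as noted in the comment. The layer sizes are illustrative, not the article’s exact architecture.

```python
import numpy as np
import tensorflow as tf

# In the real notebook:
#   (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
#   x_train = x_train[..., None] / 255.0  # add channel axis, scale to [0, 1]
# Here a random MNIST-shaped batch stands in so the snippet runs offline.
x_train = np.random.rand(64, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 10, size=64)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # one unit per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, verbose=0)

probs = model.predict(x_train, verbose=0)
print(probs.shape)  # one probability row per image, 10 classes
```

The intermediate `Conv2D` and `MaxPooling2D` outputs are exactly the feature maps the article proposes to visualize: each one can be read out with a sub-model that stops at that layer.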

There is a constant fight to fit your model to the data: to find the global minimum of the loss function, to get a p-value below 0.05 on the variables selected for the model, to push the area under the ROC curve as close to 1 as possible. Every data scientist wages this fight with their data, and it is a constant struggle.

Being an engineer, I have pondered this: if it is an optimization problem, why can’t we automate the process? Throw a data set at an application, set the type of the output variable, and let the app iterate…
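That pondering can be sketched as a naive search loop with scikit-learn: given a dataset and an output type (classification here), iterate over candidate models, score each by cross-validation, and keep the winner. Real AutoML tools do far more (feature engineering, hyperparameter search), and the candidate list and names below are purely illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Any dataset with a categorical target would do; synthetic for illustration.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=50, random_state=0),
}

# The "app" simply iterates, scores, and keeps the best performer.
scores = {name: cross_val_score(est, X, y, cv=3).mean()
          for name, est in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Swapping the scoring function (e.g. to ROC AUC) or the candidate dictionary is all it takes to point the same loop at a different problem, which is why the idea feels so automatable.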

ML | Deep Learning | Analytics