“Once upon a time, I, Chuang Tzu, dreamt I was a butterfly, fluttering hither and thither, to all intents and purposes a butterfly. I was conscious only of following my fancies as a butterfly, and was unconscious of my individuality as a man. Suddenly, I awoke, and there I lay, myself again. Now I do not know whether I was then a man dreaming I was a butterfly, or whether I am now a butterfly dreaming that I am a man.” – from The Brain: The Story of You by David Eagleman

“Thought is a great big vector of neural activity” – Prof Geoffrey Hinton

This is the third part in my series on Deep Learning from first principles in Python, R and Octave. In the first part, Deep Learning from first principles in Python, R and Octave – Part 1, I implemented logistic regression as a…

Original Post: Deep Learning from first principles in Python, R and Octave – Part 3

# Posts by Tinniam V Ganesh

## Deep Learning from first principles in Python, R and Octave – Part 2

“What does the world outside your head really ‘look’ like? Not only is there no color, there’s also no sound: the compression and expansion of air is picked up by the ears, and turned into electrical signals. The brain then presents these signals to us as mellifluous tones and swishes and clatters and jangles. Reality is also odorless: there’s no such thing as smell outside our brains. Molecules floating through the air bind to receptors in our nose and are interpreted as different smells by our brain. The real world is not full of rich sensory events; instead, our brains light up the world with their own sensuality.” – The Brain: The Story of You by David Eagleman

“The world is Maya, illusory. The ultimate reality, the Brahman, is all-pervading and all-permeating, which is colourless, odourless, tasteless, nameless and formless” – Bhagavad Gita 1.…

Original Post: Deep Learning from first principles in Python, R and Octave – Part 2

## Deep Learning from first principles in Python, R and Octave – Part 1

“You don’t perceive objects as they are. You perceive them as you are.” “Your interpretation of physical objects has everything to do with the historical trajectory of your brain – and little to do with the objects themselves.” “The brain generates its own reality, even before it receives information coming in from the eyes and the other senses. This is known as the internal model.” – David Eagleman, The Brain: The Story of You

This is the first in a series of posts I intend to write on Deep Learning. This post is inspired by the Deep Learning Specialization by Prof Andrew Ng on Coursera and Neural Networks for Machine Learning by Prof Geoffrey Hinton, also on Coursera. In this post I implement logistic regression with a 2-layer Neural Network, i.e. a Neural Network that just has an input layer and an output layer, and with…
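The post itself works through this in Python, R and Octave; as a rough illustration of the idea (not the post's own code), a 2-layer network with a single sigmoid output unit is exactly logistic regression trained by gradient descent on the cross-entropy loss:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Logistic regression as a 2-layer net: input layer -> sigmoid output.
    X: (n_samples, n_features), y: (n_samples,) with 0/1 labels."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        a = sigmoid(X @ w + b)      # forward pass: activation of output unit
        dz = a - y                  # gradient of cross-entropy w.r.t. pre-activation
        w -= lr * (X.T @ dz) / n    # "backward pass" / weight update
        b -= lr * dz.mean()
    return w, b

def predict(X, w, b):
    return (sigmoid(X @ w + b) >= 0.5).astype(int)
```

On two well-separated Gaussian clusters this recovers a near-perfect decision boundary after a few hundred epochs.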

Original Post: Deep Learning from first principles in Python, R and Octave – Part 1

## The 3rd paperback editions of my books on Cricket, now on Amazon

The 3rd paperback edition of both my books on cricket is now available on Amazon for $12.99 each:

a) Cricket analytics with cricketr, Third Edition ($12.99). This book is based on my R package ‘cricketr’, available on CRAN, and uses ESPN Cricinfo Statsguru
b) Beaten by sheer pace! Cricket analytics with yorkr, 3rd edition ($12.99). This is based on my R package ‘yorkr’ on CRAN and uses data from Cricsheet

Pick up your copies today!

Note: In the 3rd edition of the paperback books, the charts will be in black and white. If you would like the charts to be in color, please check out the 2nd edition of these books.

You may also like:
1. My book ‘Practical Machine Learning with R and Python’ on Amazon
2. A crime map of India in R: Crimes against women
3. What’s up Watson? Using IBM Watson’s…

Original Post: The 3rd paperback editions of my books on Cricket, now on Amazon

## My book ‘Practical Machine Learning with R and Python’ on Amazon

My book ‘Practical Machine Learning with R and Python – Machine Learning in stereo’ is now available in both paperback ($9.99) and Kindle ($6.97/Rs 449) versions. In this book I implement some of the most common, but important, Machine Learning algorithms in R and equivalent Python code. This is almost like listening to parallel channels of music in stereo! This book is ideal for both beginners and experts in R and/or Python. Those starting their journey into data science and ML will find the first 3 chapters useful, as they touch upon the most important programming constructs in R and Python and also deal with equivalent statements in R and Python. Those who are expert in either of the languages, R or Python, will find the equivalent code ideal for brushing up on the other language. And finally, those who are proficient…

Original Post: My book ‘Practical Machine Learning with R and Python’ on Amazon

## Practical Machine Learning with R and Python – Part 6

This is the final and concluding part of my series on ‘Practical Machine Learning with R and Python’. In this series I included the implementations of the most common Machine Learning algorithms in R and Python. The algorithms implemented were:

1. Practical Machine Learning with R and Python – Part 1: In this initial post, I touch upon regression of a continuous target variable. Specifically I touch upon Univariate, Multivariate, Polynomial regression and KNN regression in both R and Python
2. Practical Machine Learning with R and Python – Part 2: In this post, I discuss Logistic Regression, KNN classification and Cross Validation error for both LOOCV and K-Fold in both R and Python
3. Practical Machine Learning with R and Python – Part 3: This 3rd part included feature selection in Machine Learning. Specifically I touch upon best fit, forward fit, backward fit, ridge (L2 regularization)…
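As a flavour of the regression material recapped above (an illustrative sketch, not code from the posts), univariate polynomial regression is a one-liner least-squares fit in Python:

```python
import numpy as np

# Fit a degree-2 polynomial to a (noiseless) quadratic target by least squares.
x = np.linspace(0, 10, 50)
y = 3.0 + 2.0 * x - 0.5 * x ** 2

coeffs = np.polyfit(x, y, deg=2)   # highest-degree coefficient first: [a2, a1, a0]
y_hat = np.polyval(coeffs, x)
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
```

On exact data the recovered coefficients match the generating polynomial and the RMSE is essentially zero; with noisy data the same call gives the least-squares polynomial fit.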

Original Post: Practical Machine Learning with R and Python – Part 6

## Practical Machine Learning with R and Python – Part 5

This is the 5th and probably penultimate part of my series on ‘Practical Machine Learning with R and Python’. The earlier parts of this series included:

1. Practical Machine Learning with R and Python – Part 1: In this initial post, I touch upon univariate, multivariate, polynomial regression and KNN regression in R and Python
2. Practical Machine Learning with R and Python – Part 2: In this post, I discuss Logistic Regression, KNN classification and cross validation error for both LOOCV and K-Fold in both R and Python
3. Practical Machine Learning with R and Python – Part 3: This post covered ‘feature selection’ in Machine Learning. Specifically I touch upon best fit, forward fit, backward fit, ridge (L2 regularization) & lasso (L1 regularization). The post includes equivalent code in R and Python.
4. Practical Machine Learning with R and Python – Part 4: In this part I discussed SVMs, Decision…
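The ridge (L2) regularization mentioned in the recap of Part 3 has a closed-form solution, which makes it easy to sketch (this is an illustration, not the series' own code; lasso has no closed form and is usually solved iteratively, e.g. by coordinate descent):

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Ridge regression: w = (X^T X + alpha * I)^{-1} X^T y.
    Larger alpha shrinks the coefficient vector toward zero."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)
```

With alpha near zero this reduces to ordinary least squares; increasing alpha trades a little bias for lower variance by shrinking the weights.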

Original Post: Practical Machine Learning with R and Python – Part 5

## Practical Machine Learning with R and Python – Part 4

This is the 4th installment of my ‘Practical Machine Learning with R and Python’ series. In this part I discuss classification with Support Vector Machines (SVMs), using both a Linear and a Radial basis kernel, and Decision Trees. Further, a closer look is taken at some of the metrics associated with binary classification, namely accuracy vs. precision and recall. I also touch upon Validation curves, Precision-Recall curves, ROC curves and AUC, with equivalent code in R and Python.

This post is a continuation of my 3 earlier posts on Practical Machine Learning in R and Python:
1. Practical Machine Learning with R and Python – Part 1
2. Practical Machine Learning with R and Python – Part 2
3. Practical Machine Learning with R and Python – Part 3

The RMarkdown file with the code and the associated data files can be downloaded from…
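Of the metrics above, AUC is perhaps the least obvious; as a hedged sketch (not the post's own implementation), it can be computed directly from classifier scores via the Mann–Whitney U statistic, which equals the area under the ROC curve:

```python
import numpy as np

def auc_score(y_true, scores):
    """AUC = fraction of (positive, negative) pairs the classifier ranks
    correctly; tied scores count as half a correct ranking."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    diff = pos[:, None] - neg[None, :]      # all pairwise score differences
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))
```

A perfect ranker scores 1.0, a random one about 0.5, and a ranker that is systematically backwards scores 0.0.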

Original Post: Practical Machine Learning with R and Python – Part 4

## Practical Machine Learning with R and Python – Part 3

In this post, ‘Practical Machine Learning with R and Python – Part 3’, I discuss ‘Feature Selection’ methods. This post is a continuation of my 2 earlier posts. While applying Machine Learning techniques, the data set will usually include a large number of predictors for a target variable. It is quite likely that not all the predictors or feature variables will have an impact on the output. Hence it becomes necessary to choose only those features which influence the output variable, thus simplifying to a reduced feature set on which to train the ML model. The techniques that are used are the following. This post includes the above ML models in R and equivalent Python code. All these methods remove those features which do not sufficiently influence the output. As in my previous 2 posts on “Practical Machine Learning with R…
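As a minimal sketch of one such method (illustrative only, not the post's code), greedy forward selection adds, at each step, the feature that most reduces the residual sum of squares of an ordinary least-squares fit:

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy forward feature selection for linear regression.
    Returns the indices of the k features chosen, in selection order."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ w) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

Backward fit works the other way around (start with all features, drop the least useful one at a time), while ridge and lasso shrink coefficients rather than selecting discretely.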

Original Post: Practical Machine Learning with R and Python – Part 3

## Practical Machine Learning with R and Python – Part 2

In this 2nd part of the series ‘Practical Machine Learning with R and Python’, I continue where I left off in my first post, Practical Machine Learning with R and Python – Part 1. In this post I cover some classification algorithms and cross validation. Specifically I touch upon
- Logistic Regression
- K Nearest Neighbors (KNN) classification
- Leave One Out Cross Validation (LOOCV)
- K-Fold Cross Validation
in both R and Python. As in my initial post, the algorithms are based on the following courses. You can download this R Markdown file along with the data from Github. I hope these posts can be used as a quick reference in R and Python and Machine Learning. I have tried to include the coolest part of either course in this post. The following classification problem is based on Logistic Regression. The data is an included data set…
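The cross-validation idea in the list above can be sketched in a few lines (an illustration, not the post's code): split the sample indices into K folds, train on K-1 folds, score on the held-out fold, and average. LOOCV is simply the special case K = n_samples.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Shuffle 0..n-1 and split into k (nearly) equal folds."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def cv_error(X, y, fit, predict, k=5):
    """Mean held-out misclassification rate over k folds.
    `fit(X, y)` returns a model; `predict(model, X)` returns labels."""
    errs = []
    for fold in kfold_indices(len(y), k):
        mask = np.ones(len(y), dtype=bool)
        mask[fold] = False                      # train on everything but the fold
        model = fit(X[mask], y[mask])
        errs.append(np.mean(predict(model, X[fold]) != y[fold]))
    return float(np.mean(errs))
```

Any classifier that exposes a fit/predict pair plugs in unchanged, which is what makes the same loop reusable for Logistic Regression and KNN alike.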

Original Post: Practical Machine Learning with R and Python – Part 2