By the end of this article, you should at least have an idea of how these questions are answered and be able to test yourself on simple examples.
Original Post: Beginners Ask “How Many Hidden Layers/Neurons to Use in Artificial Neural Networks?”
Machine learning has huge potential for the future of humanity, but it won't solve all our problems.
Original Post: AI Solutionism
This post is a distilled collection of conversations, messages, and debates on how to optimize deep models. If you have tricks you’ve found impactful, please share them in the comments below!
Original Post: Deep Learning Tips and Tricks
Most deep learning frameworks currently focus on producing a single best estimate as defined by a loss function. Occasionally, something beyond a point estimate is required to make a decision, and this is where a distribution is useful. This article focuses purely on inferring quantiles.
Original Post: Deep Quantile Regression
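The loss a quantile-regression model minimizes is the pinball (quantile) loss. As a rough sketch of the idea, here is a minimal NumPy version; the function name and toy data are illustrative, not taken from the original post:

```python
import numpy as np

def quantile_loss(y_true, y_pred, q):
    """Pinball loss: under-prediction is penalized by q and
    over-prediction by (1 - q), so minimizing it yields the
    q-th conditional quantile rather than the mean."""
    err = y_true - y_pred
    return np.mean(np.maximum(q * err, (q - 1.0) * err))

# For symmetric errors, predicting the median (q = 0.5) gives a
# lower pinball loss than a biased constant prediction.
rng = np.random.default_rng(0)
y = rng.normal(loc=0.0, scale=1.0, size=10_000)
loss_median = quantile_loss(y, np.full_like(y, 0.0), q=0.5)
loss_biased = quantile_loss(y, np.full_like(y, 1.0), q=0.5)
assert loss_median < loss_biased
```

In a deep learning framework, the same expression would simply be used as a custom loss function, with one output head per quantile of interest.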
Understand the inner workings of neural network models as this post covers three related topics: histogram of weights, visualizing the activation of neurons, and interior / integral gradients.
Original Post: Inside the Mind of a Neural Network with Interactive Code in Tensorflow
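Of the three topics, the histogram of weights is the simplest to reproduce. Below is a minimal NumPy sketch (the layer shape is hypothetical) of the bucketing that a tool like TensorBoard's histogram view performs on a weight matrix:

```python
import numpy as np

# Hypothetical weight matrix of a 784 -> 128 dense layer.
rng = np.random.default_rng(42)
weights = rng.normal(loc=0.0, scale=0.05, size=(784, 128))

# Flatten the matrix and bucket every weight into bins; plotting
# counts against bin_edges over training steps shows whether the
# weight distribution is drifting, collapsing, or exploding.
counts, bin_edges = np.histogram(weights.ravel(), bins=30)

assert counts.sum() == weights.size  # every weight lands in a bin
```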
The approach follows Chollet’s four-step Keras workflow, which he outlines in his book “Deep Learning with Python”: using the MNIST dataset, the model built is a Sequential network of Dense layers. It serves as a building block for additional posts.
Original Post: Building a Basic Keras Neural Network Sequential Model
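As a rough illustration of what such a Sequential stack of Dense layers computes, here is a plain-NumPy forward pass; the layer widths match common MNIST examples, but the specific shapes and weights here are hypothetical, not taken from the post:

```python
import numpy as np

def dense(x, w, b, activation):
    """One Dense layer: affine transform followed by an activation."""
    return activation(x @ w + b)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.random((32, 784))  # a batch of 32 flattened 28x28 images
w1, b1 = rng.normal(0, 0.05, (784, 512)), np.zeros(512)
w2, b2 = rng.normal(0, 0.05, (512, 10)), np.zeros(10)

# Equivalent in spirit to:
#   Sequential([Dense(512, activation="relu", input_shape=(784,)),
#               Dense(10, activation="softmax")])
h = dense(x, w1, b1, relu)
probs = dense(h, w2, b2, softmax)
assert probs.shape == (32, 10)
```

Each row of `probs` is a distribution over the ten digit classes, which is what the softmax output layer in the Keras model produces.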
Neural networks are powerful but complex and opaque tools. Using Topological Data Analysis, we can describe the functioning and learning of a convolutional neural network in a compact and understandable way.
Original Post: Using Topological Data Analysis to Understand the Behavior of Convolutional Neural Networks
Also: 30 Free Resources for Machine Learning, Deep Learning, NLP; 7 Simple Data Visualizations You Should Know in R.
Original Post: KDnuggets™ News 18:n25, Jun 27: 5 Clustering Algorithms Data Scientists Need to Know; Detecting Sarcasm with Deep Convolutional Neural Networks?
This article explains batch normalization in a simple way. I wrote it based on what I learned from Fast.ai and deeplearning.ai.
Original Post: Batch Normalization in Neural Networks
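The core computation of batch normalization fits in a few lines. Here is a minimal NumPy sketch of the training-time forward pass (the function name and toy batch are illustrative, not from the article):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch to zero mean and unit
    variance, then apply the learned scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))  # pre-activation batch
out = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))

# With gamma = 1 and beta = 0, the output is standardized per feature.
assert np.allclose(out.mean(axis=0), 0.0, atol=1e-6)
```

At inference time, frameworks substitute running averages of the mean and variance for the per-batch statistics, since a single test example has no batch to normalize over.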
In this post I will discuss the issues related to initialization of weight matrices and ways to mitigate them. Before that, let’s cover some basics and the notation we will use going forward.
Original Post: Deep Learning Best Practices – Weight Initialization
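Two of the standard remedies are He initialization (for ReLU layers) and Xavier/Glorot initialization (for tanh/sigmoid layers). A minimal NumPy sketch, with illustrative shapes not taken from the post:

```python
import numpy as np

def he_init(fan_in, fan_out, rng):
    """He initialization: variance 2 / fan_in keeps activation
    magnitudes roughly stable across ReLU layers."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def xavier_init(fan_in, fan_out, rng):
    """Xavier/Glorot initialization, suited to tanh/sigmoid layers."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 256))  # unit-variance input batch

# With He init, the signal variance after a ReLU layer stays on the
# order of 1, instead of shrinking or exploding layer after layer.
w = he_init(256, 256, rng)
h = np.maximum(x @ w, 0.0)
assert 0.3 < h.var() < 1.5
```

The point of both schemes is the same: scale the initial weights by the layer's fan-in (and fan-out, for Xavier) so that repeated matrix multiplications neither amplify nor attenuate the signal as it propagates through a deep stack.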