Building an Artificial Neural Network from Scratch using Keras

Artificial Neural Networks, or ANNs, as they are sometimes called, were among the very first neural network architectures. They are inspired by the networks of biological neurons in our brains. I find it very important to mention here that artificial neural networks are only inspired by biological neurons and are not designed to model …
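
To give a taste of what the post builds, here is a minimal sketch of a small fully connected network in Keras; the input size, layer widths, and number of output classes are illustrative assumptions, not necessarily the architecture the post uses.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small fully connected (dense) network: every unit in one layer
# connects to every unit in the next. All shapes below are assumptions
# for illustration.
model = keras.Sequential([
    keras.Input(shape=(784,)),               # e.g. a flattened 28x28 image
    layers.Dense(128, activation="relu"),    # hidden layer 1
    layers.Dense(64, activation="relu"),     # hidden layer 2
    layers.Dense(10, activation="softmax"),  # 10-class output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```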

Building a Convolutional Neural Network from Scratch using Keras

Convolutional Neural Networks, or CNNs, as they are better known, are widely used nowadays for a variety of tasks, ranging from Natural Language Processing (NLP) to computer vision tasks such as image classification and semantic segmentation. In this blog post, we will be building our own Convolutional Neural Network from scratch using the Keras library. Keras …
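
As a preview, here is a minimal sketch of a small CNN in Keras; the input shape, filter counts, and number of classes are illustrative assumptions rather than the post's exact model.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Convolution + pooling blocks extract local image features,
# then a dense head classifies them. Shapes are assumptions.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),          # e.g. grayscale 28x28 images
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),  # 10-class output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```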

Introduction to Nearest Neighbours Classification with Scikit-Learn

In this blog post, we will be talking about how to use the Nearest Neighbours algorithm for classification with Scikit-Learn. In an earlier post, we discussed how to use the Nearest Neighbours algorithm to perform a regression task using Scikit-Learn. Further, we will be dealing only with binary classification in this post, …
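
Here is a minimal sketch of binary classification with Scikit-Learn's KNeighborsClassifier; the toy points below are made up for illustration, not the post's data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Two well-separated clusters with binary labels (illustrative data).
X = np.array([[0.0], [0.5], [1.0], [4.0], [4.5], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# Each query point is labelled by a majority vote of its 3 nearest neighbours.
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y)
print(clf.predict([[0.8], [4.2]]))  # -> [0 1]
```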

An Introduction to the Nearest Neighbours Algorithm in Scikit-Learn

In this blog post, we will be talking about how to use the Nearest Neighbours algorithm in Scikit-Learn, a Python library for training machine learning models. Nearest Neighbours is a very simple and easy-to-use algorithm. In this particular post, we will not be dealing with any dataset; we will simply use a …
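
Since the excerpt is truncated, here is a minimal sketch of the basic NearestNeighbors API on a few hand-made points; the data and the choice of k are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# A handful of hand-made 2D points (illustrative data).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [8.0, 8.0]])

# Find the 2 points in X closest to a query point.
nn = NearestNeighbors(n_neighbors=2)
nn.fit(X)
distances, indices = nn.kneighbors([[1.1, 1.1]])
print(indices)    # indices of the 2 closest points in X
print(distances)  # their Euclidean distances to the query
```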

Leaky ReLU as an Activation Function in Neural Networks

The Rectified Linear Unit, also known as ReLU, overcame some of the serious disadvantages of earlier activation functions such as Sigmoid and Hyperbolic Tangent, including the exploding and vanishing gradients problems, while being fast and simple to compute. However, despite all this, it was far …
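
The difference between the two functions is easy to see in code. Below is a minimal NumPy sketch of ReLU versus Leaky ReLU; the slope alpha=0.01 is a common default, chosen here as an assumption.

```python
import numpy as np

def relu(x):
    # Zero for all negative inputs, so the gradient there is zero
    # ("dying ReLU").
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # A small slope for negative inputs keeps a nonzero gradient;
    # alpha=0.01 is a common default (an assumption here).
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.  1.5]
```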
