## Welcome to Our Blog

### Deriving the Gradient Descent Rule (PART-2)

What Will You Learn? In our previous post, we talked about the meaning of gradient descent and how it…

### The Derivative of Softmax(z) Function w.r.t z

What will you learn? Ask any machine learning expert! They will all have to google the answer to this question…

### Deriving the Gradient Descent Rule (PART-1)

The Gradient Descent Rule (video: https://www.youtube.com/watch?v=gYqG4OT2Kj4) When training a model, we strive to minimize a certain error function. This error…

### Reproducibility in PyTorch

What is Reproducibility All About? As a computer scientist or an academic, you run experiments with a bunch of…

### What is the Delta Rule? (Part-2)

What We Have Learned So Far… So far, we have learned that the Delta rule is guaranteed to converge to…

### What is the Delta Rule? (Part-1)

The Beauty that is the Delta Rule In general, there are two main ways to train an Artificial Neural Network…

### The Perceptron Training Rule

The Perceptron Training Rule It is important to learn the training process of huge neural networks. However, we need to…

### ECML-PKDD-2019: Elliptical Basis Function Data Descriptor (EBFDD) for Anomaly Detection

ECML-PKDD-2019 on EBFDD networks for Anomaly Detection This paper introduces the Elliptical Basis Function Data Descriptor (EBFDD) network, a one-class…

### Train a Perceptron to Learn the AND Gate from Scratch in Python

What Will You Learn in This Post? Neural Networks are function approximators. For example, in a supervised learning setting, given…