This article moves from an understanding of normalization, through its different types, to a discussion of some common normalization layers.
SATA: Will COVID-19 ever end?
RAM: No dear! It will take time for things to get normalized.
Data science employs the word normalization in a similar sense.
Normalization, in layman's terms, basically means bringing something to a normal or stable state. In this article we will dig a little deeper into normalization, some of its types, and the different normalization layers used to normalize the outputs of hidden layers in a neural network.
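Before getting into normalization layers, it helps to see the two most common data-normalization formulas in action. The following is a minimal illustrative sketch (the function names `min_max` and `z_score` are my own, not from any particular library):

```python
import numpy as np

def min_max(x):
    # rescale values linearly into the range [0, 1]
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    # standard score: shift to zero mean, scale to unit variance
    return (x - x.mean()) / x.std()

data = np.array([2.0, 4.0, 6.0, 8.0])
print(min_max(data))  # values spread between 0 and 1
print(z_score(data))  # values centered around 0
```

Normalization layers inside a network (such as batch normalization) apply essentially the same z-score idea to the activations of a hidden layer, with learnable scale and shift parameters on top.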
Isn't it amazing when someone replaces their face with the face of a well-known personality in real time? Or do you get extremely curious and excited when you see a stylish portrait being built by a computer from nothing but a small layout? Well, this is what we are going to uncover in this article. All these cool ideas are mostly implemented with the help of a modern idea in machine learning, i.e. GANs (Generative Adversarial Networks).
Note: The aforementioned ideas could also be implemented with efficient use of computer-graphics packages, which is out of scope for this article…
As we know, to improve the performance of an ML model we need to do some data preprocessing before actually training it. One such preprocessing step is data augmentation. Over recent years, data augmentation has substantially improved model performance, and since augmentation techniques tend to boost results, efforts have also been made to improve the techniques themselves. One recently introduced technique is CutMix, which we are going to discuss in this article.
Note: this article reflects a study of the original paper that introduced CutMix augmentation, and thus some definitions and phrases…
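To make the idea concrete: CutMix cuts a rectangular patch from one training image, pastes it into another, and mixes the labels in proportion to the patch area. Below is a minimal NumPy sketch of that procedure (the function name `cutmix` and its signature are my own simplification, not the paper's reference implementation):

```python
import numpy as np

def cutmix(img_a, label_a, img_b, label_b, alpha=1.0, rng=np.random):
    # sample the mixing ratio lambda from a Beta(alpha, alpha) distribution
    lam = rng.beta(alpha, alpha)
    H, W = img_a.shape[:2]
    # choose a patch whose area fraction is roughly (1 - lam)
    cut_h = int(H * np.sqrt(1.0 - lam))
    cut_w = int(W * np.sqrt(1.0 - lam))
    cy, cx = rng.randint(H), rng.randint(W)
    top, bottom = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, H)
    left, right = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, W)
    # paste the patch from img_b into a copy of img_a
    mixed = img_a.copy()
    mixed[top:bottom, left:right] = img_b[top:bottom, left:right]
    # recompute lambda from the actual (clipped) patch area, then mix labels
    lam = 1.0 - (bottom - top) * (right - left) / (H * W)
    mixed_label = lam * label_a + (1.0 - lam) * label_b
    return mixed, mixed_label
```

Because the label is mixed by the actual pasted area, the model is trained to attribute its prediction proportionally to how much of each image is visible.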
I am an ML developer who knows very little about web development. I needed to showcase my skills to the world, so I recently built a portfolio for myself in just two days. Have a look at it.
Did you like it? Well, you could also make a pretty, interactive, and beautiful-looking portfolio for yourself in very little time. In this tutorial we will focus on getting a starter template from Gatsby to build a portfolio, and on hosting it on GitHub Pages so that you can share it with anyone, anywhere, easily.
Recently I participated in a Kaggle competition: SIIM-ISIC Melanoma Classification. In this competition one has to output the probability that an image of a skin lesion shows melanoma, one of the two classes of skin cancer, so it is a binary image classification task. The evaluation criterion is the AUC (Area Under the Curve) metric. At first I worked on a model with cross-entropy as the loss function. Then, after some searching on the internet, I found this paper, in which the team at Facebook AI Research (FAIR) introduced a new loss function: Focal Loss.
I got a good AUC score…
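For intuition, here is a minimal NumPy sketch of the binary focal loss from the FAIR paper: cross-entropy scaled by a factor (1 - p_t)^gamma, which shrinks the contribution of easy, well-classified examples (the function name and the default gamma=2.0, alpha=0.25 follow the paper; this is an illustration, not my competition code):

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    # p: predicted probability of the positive class, y: 0/1 ground-truth label
    p_t = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    # (1 - p_t)**gamma down-weights examples the model already gets right
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

With gamma = 0 this reduces to (alpha-weighted) cross-entropy; increasing gamma focuses training on the hard examples.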
Machine Learning | Competitive Programming | Data Science | AI. An undergrad continuously learning and spreading knowledge.