
Research Papers, Blogs & Resources

Graph-Based Methods

  1. DeepWalk: Online Learning of Social Representations
  2. HOPE: Asymmetric Transitivity Preserving Graph Embedding
  3. Feature Extraction for Graphs
  4. An Overview of Feature Extraction in Graphs

Disciplined Training Methods

  1. A Disciplined Approach to Neural Network Hyper-Parameters: Part 1 – Learning Rate, Batch Size, Momentum, and Weight Decay

Loss Functions

  1. Comprehensive Survey of Loss Functions in Machine Learning
  2. Classification
    1. Study of deep learning loss functions for multi-label remote sensing image classification
    2. Recall Loss for Semantic Segmentation
    3. Focal Loss for Dense Object Detection (see the sketch after this list)
    4. Class Distance Weighted Cross-Entropy Loss
    5. Squared Earth Mover’s Distance-based Loss for Training Deep Neural Networks
  3. Regression
    1. Regression Based Loss Functions for Time Series Forecasting
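
The focal loss linked above (Classification, item 3) reduces to a one-line reweighting of cross-entropy. Below is a minimal NumPy sketch of the binary case, using the paper's default gamma = 2 and alpha = 0.25; the example inputs are illustrative, not from the paper.

```python
import numpy as np

def focal_loss(y_true, p_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: FL = -alpha_t * (1 - p_t)^gamma * log(p_t)."""
    p = np.clip(p_pred, eps, 1.0 - eps)
    p_t = np.where(y_true == 1, p, 1.0 - p)            # prob. assigned to the true class
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    # (1 - p_t)^gamma down-weights well-classified (easy) examples.
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))

# Confident correct predictions contribute far less than hard ones.
y = np.array([1, 0, 1])
p = np.array([0.95, 0.05, 0.30])
print(focal_loss(y, p))
```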

Tech Blogs

  1. Airbnb Engineering
  2. Spotify Research
  3. Netflix Research
  4. DoorDash ML Blog
  5. Uber Engineering
  6. Lyft Engineering
  7. Shopify Engineering
  8. Meta Engineering
  9. LinkedIn Engineering
  10. Kaggle Competition Blog

Knowledge Distillation

  1. [Knowledge Distillation: A Survey](https://arxiv.org/pdf/2006.05525.pdf)
  2. [Distilling the Knowledge in a Neural Network](https://arxiv.org/pdf/1503.02531.pdf) (soft-target loss sketched after this list)
  3. [DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter](https://arxiv.org/pdf/1910.01108.pdf)
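
The Hinton et al. paper (item 2) trains a small student to match temperature-softened teacher probabilities. A minimal NumPy sketch of that soft-target term, assuming the usual KL formulation with the paper's T^2 gradient scaling; the logits and T = 4 are illustrative.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T                                  # temperature-soften the logits
    z = z - z.max(axis=-1, keepdims=True)      # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0, eps=1e-9):
    """KL divergence between softened teacher and student distributions,
    scaled by T^2 as in Hinton et al. (arXiv:1503.02531)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + eps) - np.log(p_s + eps)), axis=-1)
    return (T ** 2) * kl.mean()

# The student is nudged toward the teacher's "dark knowledge".
teacher = np.array([[8.0, 2.0, 0.5]])
student = np.array([[5.0, 4.0, 1.0]])
print(distillation_loss(student, teacher))
```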

Modalities and Mixture of Experts

Deep learning for Tabular data

Embeddings

GANs

  1. GAN = generative model + adversarial model (a discriminator that judges the generator's outputs); see the minimal training-loop sketch after this list.
  2. GAN tricks and Hacks: https://github.com/soumith/ganhacks
  3. https://medium.com/@jonathan_hui/gan-some-cool-applications-of-gans-4c9ecca35900
  4. [MNIST GAN](https://medium.com/datadriveninvestor/generative-adversarial-network-gan-using-keras-ce1c05cfdfd3)
  5. deepgenerativemodels research
  6. https://medium.com/@sanjay035/sketch-to-color-anime-translation-using-generative-adversarial-networks-gans-8f4f69594aeb
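
A minimal Keras sketch of item 1: a generator and an adversarial discriminator trained in alternation on a toy 1-D Gaussian. It assumes tf.keras 2.x semantics (where `trainable` is captured at compile time); the layer sizes, optimizer, and target distribution are illustrative, not taken from the linked posts.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

latent_dim, batch = 8, 64

# Generative model: maps noise to a 1-D sample.
generator = keras.Sequential([
    keras.Input(shape=(latent_dim,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1),
])

# Adversarial model: judges samples as real (1) or generated (0).
discriminator = keras.Sequential([
    keras.Input(shape=(1,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Freeze the discriminator inside the stacked model so training `gan`
# updates only the generator (tf.keras 2.x captures `trainable` at
# compile time, so the standalone discriminator still trains).
discriminator.trainable = False
gan = keras.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

for step in range(2000):
    # 1) Discriminator step: real samples vs. generator output.
    real = np.random.normal(4.0, 1.0, size=(batch, 1))   # toy "real" data
    fake = generator.predict(np.random.normal(size=(batch, latent_dim)), verbose=0)
    discriminator.train_on_batch(real, np.ones((batch, 1)))
    discriminator.train_on_batch(fake, np.zeros((batch, 1)))
    # 2) Generator step: try to make the discriminator say "real".
    noise = np.random.normal(size=(batch, latent_dim))
    gan.train_on_batch(noise, np.ones((batch, 1)))

# Generated samples should drift toward the target mean of 4.0.
print(generator.predict(np.random.normal(size=(256, latent_dim)), verbose=0).mean())
```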

Encoder Decoders

  1. Sequence to Sequence Learning with Neural Networks
  2. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
  3. Deep Visual-Semantic Alignments for Generating Image Descriptions
  4. https://ai.googleblog.com/2016/09/a-neural-network-for-machine.html
  5. https://ai.googleblog.com/2018/05/smart-compose-using-neural-networks-to.html
  6. https://medium.com/@martin.monperrus/sequence-to-sequence-learning-program-repair-e39dc5c0119b
  7. https://towardsdatascience.com/image-captioning-with-keras-teaching-computers-to-describe-pictures-c88a46a311b8
  8. http://www.manythings.org/anki/
  9. https://github.com/keras-team/keras/blob/master/examples/lstm_seq2seq.py

AutoEncoders

Attention

  1. Neural Machine Translation
  2. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
  3. Attention in Deep Networks with Keras
  4. Tx is a hyperparameter in the 2015 paper (arXiv:1508.04025), not in the original 2014 attention paper (arXiv:1409.0473); in the 2014 paper, Tx is the length of the whole input sentence. See the additive-attention sketch after this list.
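
A minimal NumPy sketch of the additive attention in arXiv:1409.0473, making explicit how the alignment score is computed over all Tx encoder states; the weight matrices Wa, Ua, va and all dimensions here are illustrative random values.

```python
import numpy as np

def additive_attention(s_prev, H, Wa, Ua, va):
    """Bahdanau-style attention (arXiv:1409.0473).

    s_prev: decoder state at t-1, shape (n,)
    H:      encoder states h_1..h_Tx, shape (Tx, m)
    Returns the context vector and the attention weights over Tx positions.
    """
    # Alignment scores e_j = va^T tanh(Wa s_{t-1} + Ua h_j), j = 1..Tx
    scores = np.tanh(s_prev @ Wa.T + H @ Ua.T) @ va
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over the Tx positions
    context = weights @ H                     # weighted sum of encoder states
    return context, weights

Tx, n, m, d = 6, 4, 5, 3                      # Tx = input sentence length
rng = np.random.default_rng(0)
s_prev, H = rng.normal(size=n), rng.normal(size=(Tx, m))
Wa, Ua, va = rng.normal(size=(d, n)), rng.normal(size=(d, m)), rng.normal(size=d)
ctx, w = additive_attention(s_prev, H, Wa, Ua, va)
print(w.round(3), ctx.shape)
```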

Transformers

  1. https://jalammar.github.io/illustrated-transformer/
  2. TabNet Transformer
  3. PyTorch TabNet YouTube

Explainable AI

  1. Explaining the Predictions of Any Classifier
  2. Integrated Gradients
  3. Robustness of Interpretability Methods
  4. Interpretable Machine Learning Web Book
  5. LIME TDS 1 | LIME Blog | LIME Text Explain

Siamese Networks

Python Libraries

Graph Analysis

Web Development

Explainable AI

Trading

GitHub pages

Other Resources