A big part of the utility of math (especially in ML) is having breadth rather than depth. The strategy of picking out specific things you don't know from papers and looking them up is only effective if you have the breadth in your background to understand the answers you find.

Broad knowledge is also what helps you manage the exponential tree of complexity you're encountering.

You won't have seen all the things you come across, but you'll develop the ability to make good judgements about what you need to read to achieve your goals. You'll learn how to recognize when a reference you're reading is more (or less) technical than you need, and how to search for something more appropriate. You'll also learn how and when you can use results without understanding the details.

Finally, as a general grad-student strategy, trying to learn everything just in time is not a path to success. Even if you had the perfect math oracle you want, it would set you up to be left behind: all the oracle gives you is the ability to catch up quickly to the ideas of others. Your job as a grad student is to generate new knowledge, and to do that you need to seek things out on your own, not just follow the latest trend. Part of your job is to go out hunting for ideas your peers haven't found yet and bring them back to your field.

AI doesn't need to follow the human model, just as planes don't need to flap their wings like birds. For most jobs, AI will be very different from humans. Even when AI acts human for entertainment, I'd expect it to be very different internally: its job is to mimic aspects of human behavior, not to be a human as a whole.

Almost all of machine learning is about representing data as vectors and performing linear and non-linear transformations in order to perform classification, regression, etc.
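This "vectors plus transformations" view can be made concrete with a minimal NumPy sketch (all values here are arbitrary, chosen only for illustration): data points become rows of a matrix, a linear transformation is a matrix multiply plus a bias, and a non-linearity is applied elementwise.

```python
import numpy as np

# Represent three data points as 2-D feature vectors (rows).
X = np.array([[0.5, 1.2],
              [2.0, 0.3],
              [1.1, 1.1]])

# A linear transformation: weight matrix and bias (values arbitrary).
W = np.array([[0.4, -0.6],
              [0.7,  0.2]])
b = np.array([0.1, -0.1])

def relu(z):
    # A common non-linear transformation, applied elementwise.
    return np.maximum(0.0, z)

# Linear map followed by a non-linearity: the core pattern behind
# most classifiers and regressors in ML.
hidden = relu(X @ W.T + b)
print(hidden.shape)  # (3, 2): three transformed 2-D vectors
```

Stacking several such linear + non-linear layers is exactly what a feed-forward neural network does.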

Most of ML is fitting models to data. To fit a model you minimize some error measure as a function of its real valued parameters, e.g. the weights of the connections in a neural network. The algorithms to do the minimization are based on gradient descent, which depends on derivatives, i.e. differential calculus.
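That loop — error as a function of real-valued parameters, minimized by following the derivative downhill — fits in a few lines. A minimal sketch (toy data, hand-computed derivative): fit y ≈ w·x by gradient descent on the squared error.

```python
# Toy data, roughly y = 2x.
xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.2]

w = 0.0    # the single real-valued parameter
lr = 0.02  # learning rate (step size)

for _ in range(500):
    # Differential calculus supplies the gradient:
    # d/dw of sum((w*x - y)^2) = sum(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys))
    # Gradient descent: step opposite the gradient.
    w -= lr * grad

print(round(w, 2))  # → 2.04, close to the underlying slope of 2
```

A neural network is the same idea with millions of parameters, where the derivatives are computed automatically by backpropagation instead of by hand.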

Deep Learn - Implementation of research papers on Deep Learning, NLP, and CV in Python using Keras, TensorFlow and Scikit-Learn.

Machine Learning for Humans - Great article.

KubeFlow - Machine Learning Toolkit for Kubernetes. (Winding Road to Better Machine Learning Infrastructure Through Tensorflow Extended and Kubeflow)

KALE (Kubeflow Automated pipeLines Engine) - Aims at simplifying the Data Science experience of deploying Kubeflow Pipelines workflows.

TL-GAN: transparent latent-space GAN - Use supervised learning to illuminate the latent space of GANs for controlled generation and editing.

Grokking Deep Learning - Repository accompanying "Grokking Deep Learning" book.

Grenade - Deep Learning in Haskell.

Deep Learning Book Chapter Summaries - Attempting to make the Deep Learning Book easier to understand.

PracticalAI - Practical approach to learning machine learning.

RLgraph - Flexible computation graphs for deep reinforcement learning.

Nevergrad - Gradient-free optimization platform.

Convolution arithmetic - Technical report on convolution arithmetic in the context of deep learning.

FloydHub - Managed cloud platform for data scientists.

Style Transfer as Optimal Transport - Algorithm that transfers the distribution of visual characteristics, or style, of a reference image onto a subject image via an Optimal Transport plan.

Recommenders - Examples and best practices for building recommendation systems, provided as Jupyter notebooks.

AdaNet - Lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention.

DAWNBench - Benchmark suite for end-to-end deep learning training and inference.

Interpretable machine learning book (2018) - Explaining the decisions and behavior of machine learning models.

KubeFlow Pipelines - Machine learning (ML) toolkit that is dedicated to making deployments of ML workflows on Kubernetes simple, portable, and scalable.

Machine Learning Feynman Experience - Collection of concepts I tried to implement using only Python, NumPy and SciPy on Google Colaboratory.

Tensor2Tensor - Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.

Deep learning drizzle - Various ML and reinforcement learning video lectures. (Code)

Xfer - Transfer Learning library for Deep Neural Networks.

Learning to Discover Efficient Mathematical Identities - Exploring how machine learning techniques can be applied to the discovery of efficient mathematical identities.

CleverHans - Adversarial example library for constructing attacks, building defenses, and benchmarking both.

Google AI Research - Contains code released by Google AI Research.

Deploying Deep Learning - Training guide for inference and deep vision runtime library for NVIDIA DIGITS and Jetson Xavier/TX1/TX2.

fairseq - Sequence-to-sequence learning toolkit for Torch from Facebook AI Research tailored to Neural Machine Translation (NMT).

TinyFlow - Tutorial code on how to build your own Deep Learning System in 2k Lines.

Deep Learning Models - Collection of various deep learning architectures, models, and tips.

Multi-Level Intermediate Representation Overview - The MLIR project aims to define a common intermediate representation (IR) that will unify the infrastructure required to execute high-performance machine learning models in TensorFlow and similar ML frameworks. (Talks) (HN) (Slides)

PySparNN - Approximate Nearest Neighbor Search for Sparse Data in Python.

ICML - International Conference on Machine Learning.

Differentiation for Hackers - The goal of this handbook is to demystify algorithmic differentiation, the tool that underlies modern machine learning.

ML and DS Applications in Industry - Curated list of applied machine learning and data science notebooks and libraries across different industries.

HoloClean - Machine Learning System for Data Enrichment.

Snorkel - System for quickly generating training data with weak supervision.

RAdam - On The Variance Of The Adaptive Learning Rate And Beyond.

Machine Learning Notebooks - Series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.

Streamlit - Fastest way to build custom ML tools. (Web) (Awesome Streamlit) (Streamlit Cheat Sheet) (Tips, tricks, methods, and techniques for building apps with streamlit) (Best of Streamlit)

Papers with Code - The latest in machine learning.

TASO - Tensor Algebra SuperOptimizer for Deep Learning.

TRAINS - Auto-Magical Experiment Manager & Version Control for AI.

Polyaxon - Platform for reproducing and managing the whole life cycle of machine learning and deep learning applications.

Spell - Fastest and most powerful end-to-end platform for machine learning and deep learning.

DeepMind Research - Contains implementations and illustrative code to accompany DeepMind publications.

Prodigy - Radically efficient machine teaching. An annotation tool powered by active learning.

Runway ML - Discover, create, and use artificial intelligence capabilities in your creative work.

Teachable Machine - Fast, easy way to create machine learning models for your sites, apps, and more – no expertise or coding required.

Clipper - Prediction serving system that sits between user-facing applications and a wide range of commonly used machine learning models and frameworks.

Arcade Learning Environment - Simple object-oriented framework that allows researchers and hobbyists to develop AI agents for Atari 2600 games.

Machine Learning Crash Course with TensorFlow APIs - Google's fast-paced, practical introduction to machine learning.

Dive into Deep Learning - Interactive deep learning book with code, math, and discussions, based on the NumPy interface. (HN) (Code)

Foundations of Machine Learning book - New edition of a graduate-level machine learning textbook that focuses on the analysis and theory of algorithms.

Deep Learning book - Resource intended to help students and practitioners enter the field of machine learning in general and deep learning in particular. (Code) (Notes) (Exercises) (LaTeX files for book notation) (PDF) (PDF 2)

Introduction to Deep Learning - Eugene Charniak - Project-based guide to the basics of deep learning.

Turi Create - Simplifies the development of custom machine learning models.

Demucs - Code for the paper Music Source Separation in the Waveform Domain.

Trax - Helps you understand and explore advanced deep learning.

Awesome Data Labeling - Curated list of awesome data labeling tools.

MediaPipe - Cross-platform framework for building multimodal applied machine learning pipelines.

Google Colaboratory - Research project created to help disseminate machine learning education and research.

Population Based Augmentation - Algorithm that quickly and efficiently learns data augmentation functions for neural network training.

AutoML-Zero - Open source code for the paper: "AutoML-Zero: Evolving Machine Learning Algorithms From Scratch". (HN)

Made With ML - Share what you've Made With ML. (Code)

Backpropagation 101 (2020) - How to trick yourself into understanding backprop without even trying.

Awesome Graph Classification - Collection of important graph embedding, classification and representation learning papers with implementations.

fast.ai - Making neural nets uncool again. (Code) (Docs) (Course launch) (HN)

SVM tutorial (HN)

Deep Learning in Production - Notes and references about deploying deep learning-based models in production.

Weights & Biases - Developer tools for ML. Experiment tracking, hyperparameter optimization, model and dataset versioning. (Code) (Docs) (Examples)

Protocols and Structures for Inference (PSI) spec - Aims to develop an architecture for presenting machine learning algorithms, their inputs (data) and outputs (predictors) as resource-oriented RESTful web services.

Reverb - Efficient and easy-to-use data storage and transport system designed for machine learning research.

NeurIPS - Conference on Neural Information Processing Systems.

Distill - Latest articles about machine learning.

Model Card Toolkit - Streamlines and automates generation of Model Cards, machine learning documents that provide context and transparency into a model's development and performance. (Article)

Bethge Lab - Perceiving Neural Networks.

SciML - Open Source Software for Scientific Machine Learning.

Compose - Machine learning tool for automated prediction engineering. It allows you to structure prediction problems and generate labels for supervised learning.

Applied ML - Curated papers, articles, and blogs on data science & machine learning in production.

explained.ai - Deep explanations of machine learning and related topics.

Determined - Deep Learning Training Platform. (Web)

Confetti AI - Ace Your Machine Learning Interviews.

Awesome Teachable Machine List - Curated list of awesome machine learning projects built with Google's Teachable Machine.

Synthetic Data Vault (SDV) - Synthetic Data Generation for tabular, relational, time series data. (Web)

Penn Machine Learning Benchmarks - Large, curated repository of benchmark datasets for evaluating supervised machine learning algorithms. (Web)

Responsible Machine Learning - Collection of tools for eXplainable AI (XAI). (Web)

MI2 DataLab - Group of mathematicians and computer scientists that love to play with data. (GitHub)

Papers of Robust ML - Mainly focus on defenses.

Awesome AutoML Papers - Curated list of automated machine learning papers, articles.

Preferred Networks - Develops practical applications of deep learning and other cutting-edge technologies. (GitHub)

ML Art - Curated showcase of creative machine learning artworks and projects.

ML Visuals - Contains figures and templates which you can reuse and customize to improve your scientific writing.

DL Visuals - Deep Learning Visuals.

OpenMined Courses - Learn how privacy technology is changing our world and how you can lead the charge.

Adversarial Robustness Toolbox - Python library for Machine Learning Security.

Brain Tokyo Workshop - Research materials released by members of the Google Brain team in Tokyo.

create-ml-app - Template Makefile for ML projects in Python.

telesto.ai - Competitive marketplace, where you can work on real-life machine learning challenges.

ML from the Fundamentals - Machine learning in a "from the first principles" style. (Code)

MIT Mądry Lab - Towards a Principled Science of Deep Learning. (GitHub)

DeepFaceLab - Leading software for creating deepfakes.

Deep Learning DIY (Code) (GitHub)

MLCommons - Machine learning innovation to benefit everyone.

ZenML - Extensible, open-source MLOps framework for creating production-ready Machine Learning pipelines.

MLJAR Automated Machine Learning - Automates Machine Learning Pipeline with Feature Engineering and Hyper-Parameters Tuning. (Web) (HN)

Noah ML Research - Research related code released by Huawei Noah's Ark Lab.

ML Surveys - Survey papers summarizing advances in deep learning, NLP, CV, graphs, reinforcement learning, recommendations, etc.

Diverse Counterfactual Explanations (DiCE) for ML - Generate Diverse Counterfactual Explanations for any machine learning model. (Docs)

Interpretable Machine Learning - Techniques & resources for training interpretable ML models, explaining ML models, and debugging ML models.

Awesome Causality - Resources related to causality.

Patterns, Predictions, and Actions Book - A story about machine learning.

MIT HAN Lab - Accelerate Deep Learning Computing. (GitHub)

Transformers - Collection of resources to study Transformers in depth.

Label Errors - Label errors in benchmark ML test sets. (Lobsters)

AutoML.org (GitHub)

Machine Learning Collection - Resource for learning about ML, DL, PyTorch and TensorFlow.

Reproducible Deep Learning (2021) - PhD Course in Data Science. (Code)