A big part of the utility of math (especially in ML) is having breadth rather than depth. The strategy of picking out specific things you don't know from papers and looking them up is only effective if you have the breadth in your background to understand the answers you find.

Broad knowledge is also what helps you manage the exponential tree of complexity you're encountering.

You won't have seen all the things you come across, but you'll develop the ability to make good judgements about what you need to read to achieve your goals. You'll learn how to recognize when a reference you're reading is more (or less) technical than you need, and how to search for something more appropriate. You'll also learn how and when you can use results without understanding the details.

Finally, as a general grad-student strategy, trying to learn everything just in time is not a path to success. Even if you had the perfect math oracle you wanted, it would set you up to be left behind: all the oracle gives you is the ability to catch up quickly to the ideas of others. Your job as a grad student is to generate new knowledge, and to do that you need to seek things out on your own, not just follow the latest trend. Part of your job is to go out hunting for ideas your peers haven't found yet and bring them back to your field.

AI doesn't need to follow the human model, just as planes don't need to flap their wings like birds. For most jobs, AI will be very different from humans. Even when an AI acts human for entertainment, I would imagine it being very different internally, since its job is to mimic aspects of human behavior, not to be a human as a whole.

Almost all of machine learning is about representing data as vectors and performing linear and non-linear transformations in order to perform classification, regression, etc.
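As a minimal sketch of that idea (the array shapes, weights, and the ReLU non-linearity below are illustrative choices, not from the text): data points become rows of a matrix, and a model is a stack of linear maps and elementwise non-linearities applied to them.

```python
import numpy as np

# Represent two data points as 4-dimensional feature vectors (rows).
X = np.array([[1.0, 2.0, 0.5, -1.0],
              [0.3, -0.7, 1.2, 0.8]])

# A linear transformation: project the 4-d vectors down to 3 dimensions.
W = np.random.default_rng(0).normal(size=(4, 3))
linear = X @ W                       # shape (2, 3)

# A non-linear transformation (ReLU) applied elementwise.
nonlinear = np.maximum(linear, 0.0)  # shape (2, 3)
```

Stacking such linear/non-linear pairs, with the linear weights learned from data, is the basic recipe behind neural classifiers and regressors.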

Most of ML is fitting models to data. To fit a model, you minimize some error measure as a function of its real-valued parameters, e.g. the weights of the connections in a neural network. The algorithms that do the minimization are based on gradient descent, which depends on derivatives, i.e. differential calculus.
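For instance, fitting a one-parameter linear model y ≈ w·x by gradient descent on the mean squared error looks like this (the synthetic data, learning rate, and step count are made up for illustration):

```python
import numpy as np

# Synthetic data: y = 3.0 * x plus a little noise, so the true weight is 3.0.
rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)

w = 0.0    # the real-valued parameter we are fitting
lr = 0.1   # learning rate (step size)

for _ in range(200):
    error = w * x - y              # residuals of the current fit
    grad = 2 * np.mean(error * x)  # derivative of mean squared error w.r.t. w
    w -= lr * grad                 # gradient descent step

print(w)  # close to the true weight 3.0
```

The same loop, with the single derivative replaced by a gradient over millions of weights (computed by backpropagation), is what training a neural network amounts to.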

- Stanford CS229 course
- Learn ML in 3 months
- Deep Learn - Implementations of research papers on deep learning, NLP, and CV in Python using Keras, TensorFlow and Scikit-learn.
- Building Brundage Bot
- Summaries of ML papers
- FB AI Tools
- Dive Into ML
- Machine Learning for Humans - Great article.
- Deep Learning World
- Kubeflow - Machine learning toolkit for Kubernetes.
- TL-GAN: transparent latent-space GAN - Uses supervised learning to illuminate the latent space of a GAN for controlled generation and editing.

- Grokking Deep Learning - Repository accompanying the "Grokking Deep Learning" book.
- Grenade - Deep learning in Haskell.
- Deep Learning Book Chapter Summaries - Attempting to make the Deep Learning Book easier to understand.
- PracticalAI - Practical approach to learning machine learning.
- RLgraph - Flexible computation graphs for deep reinforcement learning.
- Nevergrad - Gradient-free optimization platform.
- Convolution arithmetic - Technical report on convolution arithmetic in the context of deep learning.
- FloydHub - Managed cloud platform for data scientists.
- Style Transfer as Optimal Transport - Algorithm that transfers the distribution of visual characteristics, or style, of a reference image onto a subject image via an optimal transport plan.
- Recommenders - Examples and best practices for building recommendation systems, provided as Jupyter notebooks.
- AdaNet - Lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention.
- DAWNBench - Benchmark suite for end-to-end deep learning training and inference.
- Interpretable Machine Learning book (2018) - Explaining the decisions and behavior of machine learning models.

- Machine Learning Feynman Experience - Collection of concepts I tried to implement using only Python, NumPy and SciPy on Google Colaboratory.
- Tensor2Tensor - Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
- Deep Learning Drizzle - Various ML and reinforcement learning video lectures.
- Xfer - Transfer learning library for deep neural networks.
- Learning to Discover Efficient Mathematical Identities - Exploring how machine learning techniques can be applied to the discovery of efficient mathematical identities.
- CleverHans - Adversarial example library for constructing attacks, building defenses, and benchmarking both.
- Google AI Research - Code released by Google AI Research.
- Deploying Deep Learning - Training guide for inference and deep vision runtime library for NVIDIA DIGITS and Jetson Xavier/TX1/TX2.
- fairseq - Sequence-to-sequence learning toolkit for Torch from Facebook AI Research, tailored to Neural Machine Translation (NMT).
- TinyFlow - Tutorial code on how to build your own deep learning system in 2k lines.
- Deep Learning Models - Collection of various deep learning architectures, models, and tips.

- Multi-Level Intermediate Representation Overview - The MLIR project aims to define a common intermediate representation (IR) that unifies the infrastructure required to execute high-performance machine learning models in TensorFlow and similar ML frameworks.

- PySparNN - Approximate nearest neighbor search for sparse data in Python.
- ICML - International Conference on Machine Learning.
- Differentiation for Hackers - Handbook that aims to demystify algorithmic differentiation, the tool that underlies modern machine learning.
- ML and DS Applications in Industry - Curated list of applied machine learning and data science notebooks and libraries across different industries.
- HoloClean - Machine learning system for data enrichment.
- Snorkel - System for quickly generating training data with weak supervision.
- RAdam - On the Variance of the Adaptive Learning Rate and Beyond.

- Google ML/AI Comic
- Machine Learning Notebooks - Series of Jupyter notebooks that walk you through the fundamentals of machine learning and deep learning in Python using Scikit-Learn, Keras and TensorFlow 2.
- Papers with Code - The latest in machine learning.
- TASO - Tensor algebra superoptimizer for deep learning.
- TRAINS - Auto-magical experiment manager and version control for AI.
- Polyaxon - Platform for reproducing and managing the whole life cycle of machine learning and deep learning applications.
- Spell - End-to-end platform for machine learning and deep learning.
- ML portfolio tips (2019)
- DeepMind Research - Implementations and illustrative code to accompany DeepMind publications.
- Deep Learning Tutorials

- Prodigy - Radically efficient machine teaching. An annotation tool powered by active learning.

- Runway ML - Discover, create, and use artificial intelligence capabilities in your creative work.
- Notes on Deep Learning
- Teachable Machine - Fast, easy way to create machine learning models for your sites, apps, and more, with no expertise or coding required.