
Andrej Karpathy
I like to train deep neural nets on large datasets 🧠🤖💥. AI researcher and founder of Eureka Labs, focused on modernizing education in the age of AI. Previously Director of AI at Tesla and a founding member of OpenAI. During my PhD at Stanford, I designed and was the lead instructor of the university's first deep learning course (CS231n), which became one of its most popular classes.
Founder at Eureka Labs
Building an AI-native education platform. Creating educational videos about AI on YouTube, with both general-audience and technical tracks.
OpenAI
Built a new team working on midtraining and synthetic data generation.
Director of AI at Tesla
Led the computer vision team of Tesla Autopilot and Tesla Optimus. Handled all in-house data labeling, neural network training and deployment on Tesla's custom inference chip.
Research Scientist at OpenAI
Research scientist and founding member at OpenAI.
PhD at Stanford
PhD focused on convolutional/recurrent neural networks and their applications in computer vision and NLP. Designed and was primary instructor for CS 231n: Convolutional Neural Networks for Visual Recognition.
MSc at University of British Columbia
Worked on learning controllers for physically-simulated figures with Michiel van de Panne.
BSc at University of Toronto
Double major in computer science and physics with a minor in math. First exposure to deep learning through Geoff Hinton's class.
micrograd
A tiny scalar-valued autograd engine with a bite! Implements backpropagation over a dynamically built DAG and a small neural networks library with a PyTorch-like API.
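The core idea can be sketched in a few dozen lines: each scalar records the operations that produced it, and backpropagation walks the resulting DAG in reverse topological order. This is a minimal illustration in the spirit of micrograd, not the library's exact code; it supports only `+` and `*`.

```python
class Value:
    """A scalar that remembers how it was computed, for backpropagation."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to push gradients to children
        self._prev = set(_children)     # nodes of the dynamically built DAG

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # c = a*b + a = 8.0
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```

Building the graph dynamically (as a side effect of ordinary arithmetic) is what gives the PyTorch-like feel: no separate graph-construction step is needed before calling `backward()`.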
char-rnn
A Torch character-level language model built out of LSTMs/GRUs/RNNs. Companion code to the "The Unreasonable Effectiveness of Recurrent Neural Networks" blog post.
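The recurrent core of such a model is one state update per character: the hidden state is mixed with a one-hot encoding of the current character, and the output logits give a distribution over the next character. A toy pure-Python sketch of a single vanilla-RNN step (the real project uses Torch LSTMs/GRUs and trained weights; the sizes and random weights here are illustrative only):

```python
import math
import random

random.seed(0)
VOCAB, HIDDEN = 4, 8  # toy sizes; in practice VOCAB is the corpus alphabet

def mat(rows, cols):
    """Small random weight matrix (stand-in for trained parameters)."""
    return [[random.gauss(0, 0.01) for _ in range(cols)] for _ in range(rows)]

Wxh, Whh, Why = mat(HIDDEN, VOCAB), mat(HIDDEN, HIDDEN), mat(VOCAB, HIDDEN)

def rnn_step(x, h):
    """One step: h_new = tanh(Wxh @ x + Whh @ h); logits = Why @ h_new."""
    h_new = [math.tanh(sum(Wxh[i][j] * x[j] for j in range(VOCAB)) +
                       sum(Whh[i][j] * h[j] for j in range(HIDDEN)))
             for i in range(HIDDEN)]
    logits = [sum(Why[k][i] * h_new[i] for i in range(HIDDEN))
              for k in range(VOCAB)]
    return h_new, logits

x = [0.0] * VOCAB
x[0] = 1.0                              # one-hot encoding of the current char
h, logits = rnn_step(x, [0.0] * HIDDEN)
exps = [math.exp(v) for v in logits]
probs = [e / sum(exps) for e in exps]   # softmax: distribution over next char
```

Sampling text then amounts to drawing a character from `probs`, feeding it back in as the next one-hot input, and repeating.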
arxiv-sanity
Tames the overwhelming flood of papers on arXiv. Allows researchers to discover relevant papers, search/sort by similarity, see recent/popular papers, and get recommendations.
neuraltalk2
Early image captioning project in (lua)Torch. Generates natural language descriptions of images using neural networks.
ConvNetJS
A deep learning library written from scratch in JavaScript, enabling web-based demos that train neural networks entirely in the browser.
CS 231n Course
Designed and was primary instructor for the first deep learning class at Stanford. Became one of the largest classes at Stanford with 750 students in 2017.