Exploring GPT2
Writing GPT2 from scratch and assigning weights from a pre-trained Huggingface model
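One detail that trips people up when assigning pretrained GPT-2 weights to a from-scratch model: Hugging Face's GPT-2 stores its attention and MLP projection weights in a `Conv1D` layout of shape `(in_features, out_features)`, while a typical from-scratch `nn.Linear` stores `(out_features, in_features)`, so those matrices must be transposed on copy. A minimal pure-Python sketch of why (toy matrices, no real checkpoint; `conv1d_forward` and `linear_forward` are stand-ins for the two layer conventions):

```python
def transpose(m):
    # Swap rows and columns of a nested-list matrix.
    return [list(row) for row in zip(*m)]

def conv1d_forward(x, w):
    # HF Conv1D convention: y = x @ w, with w of shape (in, out).
    return [sum(x[i] * w[i][j] for i in range(len(x))) for j in range(len(w[0]))]

def linear_forward(x, w):
    # nn.Linear convention: y = x @ w.T, with w of shape (out, in).
    return [sum(x[i] * w[j][i] for i in range(len(x))) for j in range(len(w))]

w_hf = [[1.0, 2.0, 3.0],        # pretend checkpoint weight, shape (in=2, out=3)
        [4.0, 5.0, 6.0]]
x = [1.0, -1.0]

y_hf = conv1d_forward(x, w_hf)
w_linear = transpose(w_hf)      # assign the transpose to the Linear-style model
y_mine = linear_forward(x, w_linear)
assert y_mine == y_hf           # same outputs once the transpose is applied
```

Copying the checkpoint weights without this transpose produces a model that loads without shape errors on the square attention projections but generates garbage, which is why it is worth verifying layer outputs match after assignment.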
Another article in the wild on writing transformers from scratch
Performance-focused talk on using torch.compile to generate fused kernels, and learning Triton along the way
Train LSTM on Animal Farm and create new text
Use convolutional neural networks for image compression
Use transfer learning on VGG-16 to detect dog breeds
Train a convolutional neural network to detect dog breeds
How to Solve the Dynamic Discovery Problem in ZeroMQ
How to generate text from a transformer model
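The core of text generation from any autoregressive transformer is the same loop: run the model on the tokens so far, turn the next-token logits into a distribution, pick a token, append it, and repeat. A minimal pure-Python sketch of that loop with greedy decoding; `toy_logits` is a hypothetical stand-in for a real model's forward pass:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def toy_logits(tokens, vocab_size=5):
    # Hypothetical stand-in model: strongly favors (last_token + 1) mod vocab_size.
    last = tokens[-1]
    return [3.0 if t == (last + 1) % vocab_size else 0.0 for t in range(vocab_size)]

def generate(prompt, steps, model=toy_logits):
    # Autoregressive decoding: feed the growing sequence back into the model.
    tokens = list(prompt)
    for _ in range(steps):
        probs = softmax(model(tokens))
        next_token = max(range(len(probs)), key=probs.__getitem__)  # greedy argmax
        tokens.append(next_token)
    return tokens

print(generate([0], 4))  # -> [0, 1, 2, 3, 4]
```

Swapping the greedy argmax for sampling from `probs` (optionally after temperature scaling or top-k truncation) gives the stochastic decoding strategies most articles on this topic cover.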
How to approach a system design interview