Weekly

Deep Learning

Neural networks, architectures, and the math that powers modern AI.

Transformers, diffusion models, state space models, CNNs, GNNs — the architectures behind the models everyone talks about. This newsletter goes deeper than "GPT is good" into how and why these systems work, what's changing, and where the field is heading.

What we cover

  • Neural network architectures
  • Training techniques & optimization
  • Transformers & attention mechanisms
  • Diffusion models & generative architectures
  • Hardware & compute for deep learning

Who it's for

ML engineers, researchers, and students who want to understand AI at the architecture level.