
Module 1 – I See Tensors Everywhere 🕶️

"Behold, fledgling datanauts! The world is naught but tensors awaiting my command, and soon, yours!"

– Prof. Torchenstein

Salutations, my brilliant (and delightfully reckless) apprentices! By opening this manuscript you have volunteered to join my clandestine legion of PyTorch adepts. Consider this your official red-pill moment: from today, every pixel, every token, every measly click-through rate shall reveal its true form, a multidimensional array begging to be torch.tensor-ed... and we shall oblige it with maniacal glee! Mwahaha! 🔥🧪


Over the next notebooks we will:

  • Conjure tensors from thin air, coffee grounds, and suspiciously random seeds.
  • Shape-shift them with view, reshape, squeeze, unsqueeze, permute & the occasional dramatic flourish of einops.
  • Crunch mathematics so ferocious it makes matrix multiplications whimper, the very math that powers mighty Transformers.
  • Charm the GPU, dodge gradient explosions 🏃‍♂️💥, and look diabolically clever while doing it.

Rebel Mission Checklist 📝

Tensors: The Building Blocks

  1. Summoning Your First Tensors - Learn to create tensors from scratch, access their elements and inspect their fundamental properties like shape, type, and device.
  2. Tensor Surgery & Assembly - Master the dark arts of tensor dissection! Slice with surgical precision, fuse separate tensors with torch.cat and torch.stack, and divide them with torch.split. Your scalpel awaits!
  3. Tensor Metamorphosis: Shape-Shifting Mastery - Transform tensor forms without altering their essence! Reshape reality with torch.reshape and torch.view, manipulate dimensions with squeeze and unsqueeze, expand and replicate data with expand and repeat, and flatten complex structures into submission.
  4. DTypes & Devices: The Soul of the Neural Network - Master the reduced-precision floating-point dtypes (float16, bfloat16) crucial for efficient modern training, and learn to teleport your tensors to the correct device for maximum power.
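To whet your appetite, here is a minimal sketch of the summoning, surgery, shape-shifting, and dtype/device spells above, using standard PyTorch calls (the particular values are just for illustration):

```python
import torch

# Summoning: create a tensor and inspect its fundamental properties.
t = torch.arange(12)                 # tensor([0, 1, ..., 11])
print(t.shape, t.dtype, t.device)

# Metamorphosis: reshape into a 3x4 matrix; view shares the same storage.
m = t.view(3, 4)

# Surgery: slice out the second row, then fuse two copies along a new dim.
row = m[1]                           # tensor([4, 5, 6, 7])
stacked = torch.stack([row, row])    # shape (2, 4)

# unsqueeze adds a size-1 dimension (squeeze removes one).
col = m[:, 0].unsqueeze(1)           # shape (3, 1)

# DTypes & devices: cast to bfloat16, and move to GPU only if one exists.
half = m.to(torch.bfloat16)
device = "cuda" if torch.cuda.is_available() else "cpu"
m_dev = m.to(device)
```

Every call here appears again, with full dramatic commentary, in the notebooks listed above.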

Tensor Operations: Computation at Scale

  1. Elemental Tensor Alchemy - Perform powerful element-wise and reduction operations to transform your tensors.
  2. Matrix Mayhem: Multiply or Perish - Unleash the raw power of matrix multiplication, the core of modern neural networks.
  3. Broadcasting: When Dimensions Bow to You - Discover the magic of broadcasting, where PyTorch intelligently handles operations on tensors of different shapes.
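The three incantations above fit in a few lines. A minimal sketch (values chosen so you can check the arithmetic by hand):

```python
import torch

a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.tensor([[10.0, 20.0], [30.0, 40.0]])

# Element-wise alchemy: operations act entry by entry.
s = a + b                       # [[11, 22], [33, 44]]

# Reductions collapse a dimension.
col_sums = a.sum(dim=0)         # tensor([4., 6.])

# Matrix mayhem: (2x2) @ (2x2) -> (2x2).
p = a @ b                       # [[70, 100], [150, 220]]

# Broadcasting: a (2,)-vector stretches across each row of a (2, 2) matrix.
bias = torch.tensor([100.0, 200.0])
shifted = a + bias              # [[101, 202], [103, 204]]
```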

Einstein Summation: The Power of einsum

  1. Einstein Summation: Harness the ฮ›-Power - Wield the elegant einsum to perform complex tensor operations with a single, concise command.
  2. Advanced Einsum Incantations - Combine multiple tensors in arcane einsum expressions for operations like batched matrix multiplication.
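A brief taste of the elegance in store, sketched with torch.einsum (these are the standard subscript conventions; the notebooks derive them properly):

```python
import torch

a = torch.randn(3, 4)
b = torch.randn(4, 5)

# Plain matrix multiplication as an einsum: sum over the shared index k.
mm = torch.einsum("ik,kj->ij", a, b)
assert torch.allclose(mm, a @ b, atol=1e-5)

# Batched matrix multiplication: the batch index is carried through unchanged.
x = torch.randn(8, 3, 4)
y = torch.randn(8, 4, 5)
bmm = torch.einsum("bik,bkj->bij", x, y)
assert torch.allclose(bmm, torch.bmm(x, y), atol=1e-5)

# Omitting an index from the output sums it away: here, a trace.
tr = torch.einsum("ii->", torch.eye(3))   # 3.0
```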

Autograd: Automatic Differentiation

  1. Autograd: Ghosts in the Machine (Learning) - Uncover the secrets of automatic differentiation and see how PyTorch automatically computes gradients.
  2. Gradient Hoarding for Grand Spells - Learn the technique of gradient accumulation to simulate larger batch sizes and train massive models.
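Both ghosts can be summoned in a dozen lines. A minimal sketch of autograd and of the accumulation trick (tiny scalar "losses" stand in for real ones):

```python
import torch

# Autograd: PyTorch records operations on tensors with requires_grad=True
# and replays them backward to compute gradients.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x            # y = x^2 + 2x, so dy/dx = 2x + 2
y.backward()
print(x.grad)                 # tensor(8.)

# Gradient hoarding: .grad sums across backward() calls until zeroed,
# so several micro-batches can stand in for one large batch.
w = torch.tensor(1.0, requires_grad=True)
for micro_batch in [2.0, 4.0]:
    loss = w * micro_batch    # d(loss)/dw = micro_batch
    loss.backward()
print(w.grad)                 # tensor(6.)  (2 + 4 accumulated)
w.grad.zero_()                # reset before the next optimizer step
```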

Enough talk! The tensors are humming with anticipation. Your first incantation awaits.

Proceed to the Summoning Ritual!