Summoning Your First Tensors¶
Module 1 | Lesson 1
Professor Torchenstein's Grand Directive¶
Mwahahaha! Welcome, my brilliant acolytes. Today, we shall peel back the very fabric of reality—or, at the very least, the fabric of a PyTorch tensor. We are not merely learning; we are engaging in the sacred act of creation!
"Prepare your minds! The tensors... they are about to be summoned!"
Your Mission Briefing¶
By the end of this dark ritual, you will have mastered the arcane arts of:
- Understanding what a tensor is and why it's the fundamental building block of all modern AI.
- Summoning tensors from nothingness using a variety of powerful PyTorch functions.
- Inspecting the very soul of a tensor: its shape, data type, and the device it inhabits.
- Simple indexing: the primary way to access individual elements of a tensor.
- Creating sequences of numbers with `torch.arange` and `torch.linspace`, and mimicking existing tensors with the `_like` methods.
Estimated Time to Completion: 15 glorious minutes of pure, unadulterated learning.
What You'll Need:
- A mind hungry for forbidden knowledge!
- A working PyTorch environment, ready for spellcasting.
- (Optional but recommended) A beverage of your choice—creation is thirsty work!
The Theory Behind the Magic: What is a Tensor, Really?¶
First, we must understand the incantation before we cast the spell. You've heard the word "tensor," whispered in the hallowed halls of academia and screamed during GPU memory overflows. But what is it?
Forget what the mathematicians told you about coordinate transformations for a moment. In our glorious domain of PyTorch, a tensor is simply a multi-dimensional array of numbers. It is the generalization of vectors and matrices to an arbitrary number of dimensions. Think of it as the ultimate container for your data!
- A scalar (a single number, like `5`) is a 0-dimensional tensor.
- A vector (a list of numbers, like `[1, 2, 3]`) is a 1-dimensional tensor.
- A matrix (a grid of numbers) is a 2-dimensional tensor.
- And a tensor? It can be all of those, and so much more! 3D, 4D, 5D... all await your command!
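You can verify these dimensionalities yourself with the `.ndim` attribute. A minimal sketch (the specific values here are arbitrary):

```python
import torch

scalar = torch.tensor(5)                 # 0-D: a single number
vector = torch.tensor([1, 2, 3])         # 1-D: a list of numbers
matrix = torch.tensor([[1, 2], [3, 4]])  # 2-D: a grid of numbers
cube = torch.zeros(2, 2, 2)              # 3-D: where "tensor" earns its name

for t in (scalar, vector, matrix, cube):
    print(f"{t.ndim}-D tensor with shape {tuple(t.shape)}")
```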
Why not just use matrices? Mwahaha, a foolish question! Modern data is complex!
- An image is not a flat grid; it's a 3D tensor (`height`, `width`, `channels`).
- A batch of images for training is a 4D tensor (`batch_size`, `height`, `width`, `channels`).
- Text data is often represented as 3D tensors (`batch_size`, `sequence_length`, `embedding_dimension`).
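Here is a quick sketch of those shapes in practice. The concrete sizes (224x224 images, a batch of 32, 300-dimensional embeddings) are just illustrative assumptions:

```python
import torch

image = torch.randn(224, 224, 3)      # (height, width, channels)
batch = torch.randn(32, 224, 224, 3)  # (batch_size, height, width, channels)
text = torch.randn(16, 50, 300)       # (batch_size, sequence_length, embedding_dimension)

print(image.ndim, batch.ndim, text.ndim)  # 3 4 3
```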
Tensors give us the power to mold and shape all this data with a single, unified tool. They are the clay from which we will sculpt our magnificent AI creations!
The Ritual: Summoning Your First Tensors¶
Enough theory! The time has come to channel the raw power of PyTorch. We will now perform the summoning rituals—the core functions you will use constantly in your dark arts.
First, let's prepare the laboratory by importing `torch` and setting a manual seed. Why the seed? To ensure our "random" experiments are reproducible! We are scientists, not gamblers!
1. Conjuring from Existing Data (`torch.tensor`)¶
The most direct way to create a tensor is from existing data, like a Python list. The `torch.tensor()` command consumes your data and transmutes it into a glorious PyTorch tensor.
import torch
# Set the seed for reproducibility
torch.manual_seed(42)
# A humble Python list
my_list = [[1, 2, 3], [4, 5, 6]]
# The transmutation!
my_tensor = torch.tensor(my_list)
print(my_tensor)
print(type(my_tensor))
tensor([[1, 2, 3],
        [4, 5, 6]])
<class 'torch.Tensor'>
2. Summoning Tensors of a Specific Size¶
Often, you won't have data yet. You simply need a tensor of a particular shape, a blank canvas for your masterpiece.
- `torch.randn(shape)`: Summons a tensor filled with random numbers from a standard normal distribution (mean 0, variance 1). Perfect for initializing weights in a neural network!
- `torch.zeros(shape)`: Creates a tensor of the given shape filled entirely with zeros.
- `torch.ones(shape)`: Creates a tensor of the given shape filled entirely with ones.
# A 2x3 tensor of random numbers
random_tensor = torch.randn(2, 3)
print(f"A random tensor:\n {random_tensor}\n")
# A 3x2 tensor of zeros
zeros_tensor = torch.zeros(3, 2)
print(f"A tensor of zeros:\n {zeros_tensor}\n")
# A 2x3x4 tensor of ones
ones_tensor = torch.ones(2, 3, 4)
print(f"A tensor of ones:\n {ones_tensor}")
A random tensor:
 tensor([[-0.7658, -0.7506,  1.3525],
        [ 0.6863, -0.3278,  0.7950]])

A tensor of zeros:
 tensor([[0., 0.],
        [0., 0.],
        [0., 0.]])

A tensor of ones:
 tensor([[[1., 1., 1., 1.],
         [1., 1., 1., 1.],
         [1., 1., 1., 1.]],

        [[1., 1., 1., 1.],
         [1., 1., 1., 1.],
         [1., 1., 1., 1.]]])
3. Inspecting Your Creation¶
A true master understands their creation. Once you have summoned a tensor, you must learn to inspect its very soul. These are the three most critical attributes you will constantly examine:
- `.shape`: Reveals the dimensions of your tensor. A vital sanity check!
- `.dtype`: Shows the data type of the elements within the tensor (e.g., `torch.float16`, `torch.float32`, `torch.int64`).
- `.device`: Tells you where the tensor lives—on the humble CPU or the glorious GPU.
You will learn more about data types and devices in 03_data_types_and_device_types.
# Let's create a tensor to inspect
inspection_tensor = torch.randn(3, 4)
print(f"The tensor:\n {inspection_tensor}\n")
# Inspecting its soul
print(f"Shape of the tensor: {inspection_tensor.shape}")
print(f"Data type of the tensor: {inspection_tensor.dtype}")
print(f"Device the tensor is on: {inspection_tensor.device}")
The tensor:
 tensor([[ 2.2082, -0.6380,  0.4617,  0.2674],
        [ 0.5349,  0.8094,  1.1103, -1.6898],
        [-0.9890,  0.9580,  1.3221,  0.8172]])

Shape of the tensor: torch.Size([3, 4])
Data type of the tensor: torch.float32
Device the tensor is on: cpu
4. Precision Strikes: Accessing Elements¶
To access a single, quivering element within our tensor, we use the `[row, column]` notation, just as you would with a common Python list of lists. Remember, my apprentice: dimensions are zero-indexed! The first row is row `0`, not row 1! A classic pitfall for the uninitiated.
subject_tensor = torch.randint(0, 100, (5, 4))
# Let's pluck the element at the 2nd row (index 1) and 4th column (index 3).
single_element = subject_tensor[1, 3]
print(f"Element at [1, 3]: {single_element}")
# .item() is a glorious spell to extract the raw Python number from a single-element tensor.
# Use it when you need to pass a tensor's value to other libraries or just print it cleanly!
print(f"Its value is: {single_element.item()}")
print(f"Notice its data type: {single_element.dtype}")
print(f"And its shape: {single_element.shape} (a 0-dimensional tensor!)")
Element at [1, 3]: 9
Its value is: 9
Notice its data type: torch.int64
And its shape: torch.Size([]) (a 0-dimensional tensor!)
5. Creating Sequential Tensors¶
Sometimes, you need tensors with predictable, orderly values.
- `torch.arange(start, end, step)`: Creates a 1D tensor with values from `start` (inclusive) to `end` (exclusive), with a given `step`. It's the PyTorch version of Python's `range()`.
- `torch.linspace(start, end, steps)`: Creates a 1D tensor with a specific number of `steps` evenly spaced between `start` and `end` (both inclusive).
Your Mission: A Gauntlet of Sequences!¶
Your list of challenges grows, apprentice! Prove your mastery.
- Odd Numbers: Create a 1D tensor of all odd numbers from 1 to 19.
- Evenly Spaced: Create a 1D tensor with 9 evenly spaced numbers from 50 to 100.
- Countdown: Create a tensor that counts down from 10 to 0 in steps of 0.5.
- Pi Sequence: The famous `sin` and `cos` functions, used in positional encodings, operate on radians. Create a tensor with 17 evenly spaced numbers from `-π` to `π`. (Hint: `torch.pi` is your friend!)
- `arange` vs. `linspace`: Create a tensor of numbers from 0 to 1 with a step of 0.1 using `arange`. Then, create a tensor from 0 to 1 with 11 steps using `linspace`. Observe the subtle but critical difference in their outputs! What causes it?
# Your code for the challenges goes here!
print("--- 1. Odd Numbers ---")
odd_numbers = torch.arange(1, 20, 2)
print(f"{odd_numbers}\n")
print("--- 2. Evenly Spaced ---")
evenly_spaced = torch.linspace(50, 100, 9)
print(f"{evenly_spaced}\n")
print("--- 3. Countdown ---")
countdown = torch.arange(10, -0.1, -0.5)
print(f"{countdown}\n")
print("--- 4. Pi Sequence ---")
pi_seq = torch.linspace(-torch.pi, torch.pi, 17)
print(f"{pi_seq}\n")
print("--- 5. arange vs. linspace ---")
# arange may suffer from floating-point error and does not include the endpoint!
arange_ex = torch.arange(0, 1, 0.1)
# linspace is often safer for float ranges: it guarantees the number of points and includes both endpoints.
linspace_ex = torch.linspace(0, 1, 11)
print(f"arange result (0 to 0.9): {arange_ex}")
print(f"linspace result (0 to 1, 11 steps): {linspace_ex}\n")
print("Notice how arange's result doesn't include 1, while linspace does!\n")
--- 1. Odd Numbers ---
tensor([ 1,  3,  5,  7,  9, 11, 13, 15, 17, 19])

--- 2. Evenly Spaced ---
tensor([ 50.0000,  56.2500,  62.5000,  68.7500,  75.0000,  81.2500,  87.5000,
         93.7500, 100.0000])

--- 3. Countdown ---
tensor([10.0000,  9.5000,  9.0000,  8.5000,  8.0000,  7.5000,  7.0000,  6.5000,
         6.0000,  5.5000,  5.0000,  4.5000,  4.0000,  3.5000,  3.0000,  2.5000,
         2.0000,  1.5000,  1.0000,  0.5000,  0.0000])

--- 4. Pi Sequence ---
tensor([-3.1416, -2.7489, -2.3562, -1.9635, -1.5708, -1.1781, -0.7854, -0.3927,
         0.0000,  0.3927,  0.7854,  1.1781,  1.5708,  1.9635,  2.3562,  2.7489,
         3.1416])

--- 5. arange vs. linspace ---
arange result (0 to 0.9): tensor([0.0000, 0.1000, 0.2000, 0.3000, 0.4000, 0.5000, 0.6000, 0.7000, 0.8000, 0.9000])
linspace result (0 to 1, 11 steps): tensor([0.0000, 0.1000, 0.2000, 0.3000, 0.4000, 0.5000, 0.6000, 0.7000, 0.8000, 0.9000, 1.0000])

Notice how arange's result doesn't include 1, while linspace does!
6. Creating Tensors from Other Tensors (the `_like` methods)¶
Behold, a most elegant form of mimicry! Often, you will need to create a new tensor that has the exact same shape as another. PyTorch provides the `_like` methods for this very purpose.
- `torch.zeros_like(input_tensor)`: Creates a tensor of all zeros with the same `shape`, `dtype`, and `device` as the input tensor.
- `torch.ones_like(input_tensor)`: The same, but for ones!
- `torch.randn_like(input_tensor)`: The same, but for random numbers!
# Let's start with a template tensor
template_tensor = torch.ones(2, 4)
print(f"Our template tensor:\n {template_tensor}\n")
# Now, create tensors LIKE our template
zeros_mimic = torch.zeros_like(template_tensor)
print(f"A zeros tensor created from the template:\n {zeros_mimic}\n")
random_mimic = torch.randn_like(template_tensor)
print(f"A random tensor created from the template:\n {random_mimic}")
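Note that the `_like` methods copy more than the shape. A small sketch to convince yourself that `dtype` and `device` come along too (the template values are arbitrary):

```python
import torch

# An integer template on the CPU
template = torch.arange(6).reshape(2, 3)  # dtype is torch.int64

mimic = torch.zeros_like(template)

# The mimic inherits shape, dtype, AND device from the template
print(mimic.shape == template.shape)  # True
print(mimic.dtype)                    # torch.int64, not the default float32
print(mimic.device)                   # same device as the template
```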
Real-World Sorcery: Where are Sequential Tensors Used?¶
You may wonder, "Professor, is this just for making neat little rows of numbers?" A fair question from a novice! The answer is a resounding NO! These sequential tensors are the silent bedrock of many powerful constructs:
- Positional Encodings in Transformers: How does a Transformer know the order of words in a sentence? It doesn't, inherently! We must inject that information. The very first step is often to create a tensor representing the positions `[0, 1, 2, ..., sequence_length - 1]` using `torch.arange`. This sequence is then transformed into a high-dimensional positional embedding.
- Generating Time-Series Data: When working with audio, financial data, or any kind of signal, you often need a time axis. `torch.linspace` is perfect for creating a smooth, evenly spaced time vector to plot or process your data against.
- Creating Coordinate Grids in Vision: For advanced image manipulation, you might need a grid representing the `(x, y)` coordinates of every pixel. You can generate the `x` and `y` vectors separately using `torch.arange` and then combine them to form this essential grid.
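As a taste of the coordinate-grid idea, here is a minimal sketch combining `torch.arange` with `torch.meshgrid` (the 4x3 grid size is an arbitrary assumption):

```python
import torch

height, width = 4, 3
ys = torch.arange(height)  # row coordinates 0..3
xs = torch.arange(width)   # column coordinates 0..2

# indexing="ij" keeps the (row, column) convention
grid_y, grid_x = torch.meshgrid(ys, xs, indexing="ij")

# One (row, column) coordinate pair per pixel
print(grid_y.shape, grid_x.shape)  # torch.Size([4, 3]) torch.Size([4, 3])
```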
Your Mission: Forge Your Own Creation!¶
A true master never stops practicing. I leave you with these challenges to solidify your newfound power. Do not be afraid to experiment! To the lab!
- Apprentice Challenge: Create a 2D tensor (a matrix) of shape `(3, 5)` filled with random numbers. Then, print its shape and data type to the console.
- Artisan Challenge: Create a 1D tensor of your favorite numbers (at least 4). Then, create a second tensor of all ones that has the exact same shape as your first tensor.
- Create the Bias Vector: You are tasked with creating the initial "bias" vector for a small neural network layer with 10 output neurons. For arcane reasons, the master architect (me!) has decreed that it must be a 1D tensor, filled with zeros, except for the very last element, which must be `1`. Create this specific tensor!
- Positional Encoding Denominator: In the legendary Transformer architecture, a key component is the denominator `10000^(2i / d_model)`. Your mission is to create this 1D tensor. Let `d_model = 128`. The term `i` represents dimension pairs, so it goes from `0` to `d_model/2 - 1`. Use `torch.arange` to create the `2i` sequence first, then perform the final calculation. This is a vital step in building the neural networks of tomorrow!
# Your code for the final challenges goes here!
# Apprentice Challenge Solution
print("--- Apprentice Challenge ---")
apprentice_tensor = torch.randn(3, 5)
print(f"Tensor Shape: {apprentice_tensor.shape}")
print(f"Tensor DType: {apprentice_tensor.dtype}\n")
# Artisan Challenge Solution
print("--- Artisan Challenge ---")
favorite_numbers = torch.tensor([3.14, 42, 1337, 99.9])
ones_like_faves = torch.ones_like(favorite_numbers)
print(f"Favorite Numbers Tensor: {favorite_numbers}")
print(f"Ones-Like Tensor: {ones_like_faves}\n")
# Master Challenge Solution
print("--- Bias Vector Challenge ---")
bias_vector = torch.zeros(10)
bias_vector[9] = 1
print(f"Masterful Bias Vector: {bias_vector}")
print("--- 4. Positional Encoding Denominator ---")
d_model = 128
# Create the sequence for 2i (i.e., 0, 2, 4, ... up to d_model-2)
two_i = torch.arange(0, d_model, 2)
# Calculate the denominator
denominator = 10000 ** (two_i / d_model)
print(f"The first 5 values of the denominator are:\n{denominator[:5]}")
print(f"\nThe last 5 values of the denominator are:\n{denominator[-5:]}")
print(f"\nShape of the denominator tensor: {denominator.shape}")
--- 4. Positional Encoding Denominator ---
The first 5 values of the denominator are:
tensor([1.0000, 1.1548, 1.3335, 1.5399, 1.7783])

The last 5 values of the denominator are:
tensor([4869.6753, 5623.4131, 6493.8164, 7498.9419, 8659.6436])

Shape of the denominator tensor: torch.Size([64])
Summary: The Knowledge Is Yours!¶
Magnificent! You've wrestled with the raw chaos of creation and emerged victorious! Let's recount the powerful secrets you've assimilated today:
- Tensors are Everything: You now understand that a tensor is a multi-dimensional array, the fundamental data structure for every piece of data you will encounter in your machine learning journey.
- The Summoning Rituals: You have mastered the core incantations for creating tensors: `torch.tensor`, `torch.randn`/`zeros`/`ones`, the powerful `_like` variants, and the sequence generators `torch.arange` and `torch.linspace`.
- Know Your Creation: You have learned the vital importance of inspecting your tensors using `.shape`, `.dtype`, and `.device` to understand their nature and prevent catastrophic errors.
You have taken your first, most important step. The power is now in your hands!
Professor Torchenstein's Outro¶
Do you feel it? The hum of latent power in your very fingertips? That, my apprentice, is the feeling of true understanding. You have summoned your first tensors, and they have answered your call. But this is merely the beginning! Our creations are still... rigid. Inflexible.
In our next lesson, we will learn the dark arts of Tensor Shape-Shifting & Sorcery! We will slice, squeeze, and permute our tensors until reality itself seems to bend to our will.
Until then, keep your learning rates high and your gradients flowing. The future of AI is in our hands! Mwahahahahaha!