Tensor Shape-Shifting & Sorcery¶
Module 1 | Lesson 2
Professor Torchenstein's Grand Directive¶
Mwahahaha! You have summoned your first tensors from the ether! They are... raw. Untamed. Clumps of numerical clay awaiting a master's touch. A lesser mind would be content with their existence, but not you. Not us!
Today, we sculpt! We will learn the arcane arts of tensor manipulation. We will not merely use data; we will bend it, twist it, and reshape it to our will until it confesses its secrets. This is not data processing; this is tensor sorcery! Prepare to command the very dimensions of your data!
Your Mission Briefing¶
By the time you escape this chamber of knowledge, you will have etched the following incantations into your very soul:
- The Art of Selection: Pluck elements, rows, or slices from a tensor with masterful slicing.
- Forbidden Fusions: Combine disparate tensors into unified monstrosities with `torch.cat` and `torch.stack`.
- Metamorphic Mastery: Change a tensor's very form without altering its essence using `reshape` and `view`.
- Dimensional Sorcery: Add or remove dimensions at will with the mystical `squeeze` and `unsqueeze` commands.
- The Grand Permutation: Reorder dimensions to your strategic advantage with `permute` and `transpose`.
Estimated Time to Completion: 20 minutes of exhilarating dimensional gymnastics.
What You'll Need:
- The wisdom from our last lesson on summoning tensors.
- A will of iron and a mind ready to be bent!
- Your PyTorch environment, humming with anticipation.
Part 1: The Art of Selection - Slicing¶
Before you can reshape a tensor, you must learn to grasp its individual parts. Indexing is your scalpel, allowing you to perform precision surgery on your data. Slicing is your cleaver, letting you carve out whole sections for your grand experiments.
We will start by summoning a test subject: a 2D tensor brimming with potential! We must also prepare our lab with the usual incantations (`import torch` and `torch.manual_seed`) to ensure our results are repeatable. We are scientists, not chaos-wizards!
import torch
# Set the seed for cosmic consistency
torch.manual_seed(42)
# Our test subject: A 2D tensor of integers. Imagine it's a map to a hidden treasure!
# Or perhaps experimental results from a daring new potion.
subject_tensor = torch.randint(0, 100, (5, 4))
print(f"Our subject tensor of shape {subject_tensor.shape}, ripe for dissection:")
print(subject_tensor)
Our subject tensor of shape torch.Size([5, 4]), ripe for dissection:
tensor([[42, 67, 76, 14],
        [26, 35, 20, 24],
        [50, 13, 78, 14],
        [10, 54, 31, 72],
        [15, 95, 67,  6]])
Sweeping Strikes: Accessing Rows and Columns¶
The previous lesson (01_introduction_to_tensors.ipynb) covered the basics of accessing individual elements of a tensor.
But what if we require an entire row or column for our dark machinations? For this, we use the colon `:`, the universal symbol for "give me everything along this dimension!"

- `[row, :]` - Fetches the entire row.
- `[:, column]` - Fetches the entire column.
Let's seize the entire 3rd row (index 2) and the 2nd column (index 1).
# Get the entire 3rd row (index 2)
third_row = subject_tensor[2, :] # or simply subject_tensor[2]
print(f"The third row: {third_row}")
print(f"Shape of the row: {third_row.shape}\n")
# Get the entire 2nd column (index 1)
second_column = subject_tensor[:, 1]
print(f"The second column: {second_column}")
print(f"Shape of the column: {second_column.shape}")
The third row: tensor([50, 13, 78, 14])
Shape of the row: torch.Size([4])

The second column: tensor([67, 35, 13, 54, 95])
Shape of the column: torch.Size([5])
Carving Chunks: The Power of Slicing¶
Mere elements are but trivialities! True power lies in carving out entire sub-regions of a tensor. Slicing uses the `start:end` notation. As with all Pythonic sorcery, the `start` is inclusive, but the `end` is exclusive.
Let us carve out the block containing the 2nd and 3rd rows (indices 1 and 2), and the last two columns (indices 2 and 3).
# Carve out rows 1 and 2, and columns 2 and 3
sub_tensor = subject_tensor[1:3, 2:4]
print("Our carved sub-tensor:")
print(sub_tensor)
print(f"Shape of the sub-tensor: {sub_tensor.shape}")
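The `start:end` notation hides one more trick worth knowing: an optional step, as in `start:end:step`. A quick sketch (not part of the lesson's cells, just standard PyTorch slicing behavior) to illustrate:

```python
import torch

torch.manual_seed(42)
subject_tensor = torch.randint(0, 100, (5, 4))

# Every other row (step of 2): rows 0, 2, and 4
every_other_row = subject_tensor[::2]
print(f"Every other row has shape: {every_other_row.shape}")

# Beware: unlike Python lists, PyTorch slices do NOT accept a negative
# step such as [::-1]. To reverse a tensor along a dimension, use torch.flip.
flipped = torch.flip(subject_tensor, dims=[0])
print(f"First row of the flipped tensor equals the last row of the original: "
      f"{torch.equal(flipped[0], subject_tensor[-1])}")
```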
Conditional Conjuring: Boolean Mask Indexing¶
Now for a truly diabolical technique! We can use a boolean mask to summon only the elements that meet our nefarious criteria. A boolean mask is a tensor of the same shape as our subject, but it contains only `True` or `False` values. When used for indexing, it returns a 1D tensor containing only the elements where the mask was `True`.
Let's find all the alchemical ingredients in our tensor with a value greater than 50!
# Create the boolean mask
mask = subject_tensor > 50
print("The boolean mask (True where value > 50):")
print(mask)
print()
# Apply the mask
selected_elements = subject_tensor[mask]
print("Elements greater than 50:")
print(selected_elements)
print(f"Shape of the result: {selected_elements.shape} (always a 1D tensor!)")
# You can also combine conditions! Mwahaha!
# Let's find elements between 20 and 40.
mask_combined = (subject_tensor > 20) & (subject_tensor < 40)
print("\nElements between 20 and 40:")
print(subject_tensor[mask_combined])
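Conditions can also be OR-combined with `|`, and when you want to keep the tensor's shape instead of flattening it, `torch.where` substitutes a value wherever the mask is `False`. A small sketch, using the same `subject_tensor` from above:

```python
import torch

torch.manual_seed(42)
subject_tensor = torch.randint(0, 100, (5, 4))

# OR-combine conditions with | (parentheses are required, just as with &)
mask_or = (subject_tensor < 15) | (subject_tensor > 90)
print("Extreme elements (below 15 or above 90):")
print(subject_tensor[mask_or])

# torch.where(condition, x, y) keeps the original shape, picking x where the
# condition is True and y where it is False -- handy when you need a full
# tensor back rather than a flat 1D selection
capped = torch.where(subject_tensor > 50, torch.tensor(50), subject_tensor)
print("Tensor with every value capped at 50:")
print(capped)
```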
Your Mission: The Slicer's Gauntlet¶
Enough of my demonstrations! The scalpel is now in your hand. Prove your mastery with these challenges!
- The Corner Pocket: From our `subject_tensor`, select the element in the very last row and last column.
- The Central Core: Select the inner `3x2` block of the `subject_tensor` (that's rows 1-3 and columns 1-2).
- The Even Stevens: Create a boolean mask to select only the elements in `subject_tensor` that are even numbers. (Hint: The modulo operator `%` is your friend!)
- The Grand Mutation: Use your boolean mask from challenge 3 to change all even numbers in the `subject_tensor` to the value `-1`. Then, print the mutated tensor. Yes, my apprentice, indexing can be used for assignment! This is a pivotal secret!
# Your code for the Slicer's Gauntlet goes here!
# --- 1. The Corner Pocket ---
print("--- 1. The Corner Pocket ---")
corner_element = subject_tensor[-1, -1] # Negative indexing for the win!
print(f"The corner element is: {corner_element.item()}\n")
# --- 2. The Central Core ---
print("--- 2. The Central Core ---")
central_core = subject_tensor[1:4, 1:3]
print(f"The central core:\n{central_core}\n")
# --- 3. The Even Stevens ---
print("--- 3. The Even Stevens ---")
even_mask = subject_tensor % 2 == 0
print(f"The mask for even numbers:\n{even_mask}\n")
print(f"The even numbers themselves: {subject_tensor[even_mask]}\n")
# --- 4. The Grand Mutation ---
print("--- 4. The Grand Mutation ---")
# Let's not mutate our original, that would be reckless! Let's clone it first.
mutated_tensor = subject_tensor.clone()
mutated_tensor[even_mask] = -1
print(f"The tensor after mutating even numbers to -1:\n{mutated_tensor}")
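Why was cloning before the Grand Mutation so wise? Because in PyTorch, basic slicing returns a view that shares storage with the original tensor, while boolean-mask indexing returns a copy. A short demonstration of the difference (standard PyTorch semantics, not a cell from the lesson):

```python
import torch

torch.manual_seed(42)
subject_tensor = torch.randint(0, 100, (5, 4))

# Basic slicing returns a VIEW: it shares storage with the original,
# so writing through the slice mutates subject_tensor itself
row_view = subject_tensor[0]
row_view[0] = -99
print(f"Original tensor's corner after mutating the view: {subject_tensor[0, 0]}")

# Boolean-mask indexing returns a COPY: mutating it leaves the original alone
evens_copy = subject_tensor[subject_tensor % 2 == 0]
evens_copy[:] = -77
print(f"Does the original contain -77? {(subject_tensor == -77).any()}")
```

This is why `mutated_tensor[even_mask] = -1` works for assignment even though `subject_tensor[even_mask]` alone returns a copy: indexed assignment is a distinct operation that writes directly into the original tensor's storage.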