Common tensor shapes: (batch_size, height, width), (batch_size, dim)
- usually a 1D tensor is a bias
- usually a 2D tensor is a weight matrix
- usually a 3D tensor is sequential or image/video data (see the sketch after this list)
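A minimal sketch of these conventions; the concrete shapes below are illustrative assumptions, not anything fixed by PyTorch:

```python
import torch

bias = torch.zeros(10)            # 1D: bias vector
weight = torch.randn(10, 20)      # 2D: weight matrix (out_features, in_features)
images = torch.randn(32, 28, 28)  # 3D: batch of 32 single-channel 28x28 images
print(bias.dim(), weight.dim(), images.dim())  # 1 2 3
```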
# One way
tensor = torch.ones([100, 2]).cuda()

# Better: create the tensor directly on the GPU
tensor = torch.ones([100, 2], device="cuda:0")
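The first form allocates the tensor in host memory and then copies it to the GPU; passing `device="cuda:0"` allocates it on the device in a single step, avoiding the extra CPU allocation and transfer.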
PyTorch Tensor member functions (see the sketch after this list)
torch.Tensor.view()
torch.Tensor.reshape()
torch.Tensor.backward()
torch.Tensor.squeeze()
torch.Tensor.unsqueeze()
torch.Tensor.dim()
torch.Tensor.size()
torch.Tensor.pin_memory()
torch.tril()
torch.sum()
torch.rand()
torch.Tensor.bmm()
torch.allclose()
torch.Tensor.transpose()
torch.arange()
torch.Tensor.register_hook()
torch.flatten()
torch.Tensor.expand()
torch.Tensor.expand_as()
torch.Tensor.detach()
torch.Tensor.half()
torch.Tensor members
PyTorch Tensors
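A minimal sketch exercising several of the member functions listed above; the shapes are arbitrary assumptions chosen only for illustration:

```python
import torch

a = torch.rand(2, 3, 4)          # random tensor
print(a.dim(), a.size())         # 3, torch.Size([2, 3, 4])

b = a.view(2, 12)                # reshape without copying (requires contiguous memory)
c = a.reshape(6, 4)              # like view, but may copy if needed
d = a.unsqueeze(0)               # insert a new leading dimension -> (1, 2, 3, 4)
e = d.squeeze(0)                 # remove it again -> (2, 3, 4)
f = a.transpose(1, 2)            # swap dims 1 and 2 -> (2, 4, 3)

# batched matrix multiply: (2, 3, 4) @ (2, 4, 5) -> (2, 3, 5)
g = torch.bmm(a, torch.rand(2, 4, 5))

print(torch.allclose(e, a))      # True: squeeze(unsqueeze(a)) gives back the same values
```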
numpy index manipulation
- x[..., None] appends a new last dimension
- x[None, ...] inserts a new first dimension
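A minimal sketch of both patterns; the same indexing works on NumPy arrays and PyTorch tensors (the shape here is an arbitrary assumption):

```python
import numpy as np

x = np.zeros((3, 4))
print(x[..., None].shape)  # (3, 4, 1): new last dimension appended
print(x[None, ...].shape)  # (1, 3, 4): new first dimension inserted
```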
Indexing for in-place operations to save memory (NumPy-style boolean indexing)
x = torch.randn(100, 100, 100)  # Large tensor with random values
x[x < 0] += 0.1                 # Use advanced (boolean) indexing to update the negative values in place
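For comparison, a sketch of the same update written out of place with torch.where: the masked += above modifies x directly (it still materializes a boolean mask and the selected values, but not a full float copy of x), whereas the torch.where version allocates a whole new tensor of the same size.

```python
import torch

x = torch.randn(100, 100, 100)
y = torch.where(x < 0, x + 0.1, x)  # out-of-place: allocates a new (100, 100, 100) tensor

x[x < 0] += 0.1                     # in-place: shifts only the negative entries of x
print(torch.allclose(x, y))         # True: both versions produce the same values
```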