In PyTorch, detach() is a method that blocks gradient flow: it returns a new tensor that shares storage with the original but is removed from the current computation graph, so you can use the result without affecting gradients. Because the detached tensor no longer holds a reference to the autograd graph, keeping only the detached copy can also save memory. The result also never has forward-mode AD gradients. See the documentation: https://pytorch.org/docs/stable/generated/torch.Tensor.detach.html
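A minimal sketch of this behavior: the detached tensor has requires_grad=False, and gradients flow only through the non-detached path. The variable names here are illustrative.

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = x * x          # y is tracked in the computation graph
z = y.detach()     # z shares storage with y but has no grad history

print(z.requires_grad)  # False

# Gradients flow through y but not through z.
loss = (y + z).sum()
loss.backward()
print(x.grad)  # only the y path contributes: d(x*x)/dx = 2x -> tensor([4., 6.])
```

Note that because z shares storage with y, modifying z in place also changes y, and autograd will raise an error on backward() if it detects such a modification to a tensor needed for gradient computation.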