torch.jit.trace()

Creator
Seonglae Cho
Created
2025 Apr 16 12:57
Editor
Edited
2025 Jun 16 21:00
Refs

Converts a torch.nn model to a TorchScript model by running it once with an example input and recording the operations executed. Unlike torch.jit.script(), the trace captures only the path taken by that example, so models with data-dependent control flow may raise errors or silently follow the wrong path when the saved model is loaded and run on different inputs.
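A minimal sketch of this failure mode, using a hypothetical module with a value-dependent branch: the trace bakes in only the branch taken by the example input, while torch.jit.script() compiles both branches.

```python
import torch
from torch import nn

class Gate(nn.Module):
    # Hypothetical module: the branch depends on the input's values
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        return x + 10

example = torch.ones(3)                    # sum > 0, so only the "* 2" path is recorded
traced = torch.jit.trace(Gate(), example)  # emits a TracerWarning about the data-dependent branch
scripted = torch.jit.script(Gate())        # compiles both branches

neg = -torch.ones(3)
print(traced(neg))    # wrong: still takes the "* 2" path baked into the trace
print(scripted(neg))  # correct: takes the "+ 10" path
```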

Dynamic Shape Inference with TorchScript Shape Caching

import torch
from torch import nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
example = torch.randn(1, 128)
traced = torch.jit.trace(model, example)

# A different batch size runs fine; its shape gets cached on first use
output = traced(torch.randn(32, 128))
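Since tracing is typically done before serialization, a minimal save/reload sketch may help; this uses an in-memory buffer in place of a file, and the variable names are illustrative.

```python
import io
import torch
from torch import nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
traced = torch.jit.trace(model, torch.randn(1, 128))

# Serialize the traced graph, then reload it without the Python class
buf = io.BytesIO()
torch.jit.save(traced, buf)
buf.seek(0)
loaded = torch.jit.load(buf)

# The reloaded module still accepts a different batch size
out = loaded(torch.randn(32, 128))
print(out.shape)  # torch.Size([32, 10])
```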

Recommendations