Converting a torch.nn model to a TorchScript model
Unlike torch.jit.script(), torch.jit.trace() records only the operations executed for the example input. Data-dependent control flow ("dynamic paths") is therefore baked into the trace, and a saved traced model may silently take the wrong branch when loaded and run on other inputs.
Handling Dynamic Input Shapes with a Traced Model
import torch
from torch import nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
example = torch.randn(1, 128)
traced = torch.jit.trace(model, example)

# The traced graph records shape-agnostic ops (matmul, relu), so a
# different batch size works without retracing: the batch dimension is
# not fixed by the trace. Changing the feature size (128) would fail.
output = traced(torch.randn(32, 128))  # shape: (32, 10)
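By contrast, torch.jit.script() compiles the Python source itself, so both branches of data-dependent control flow survive serialization. A minimal sketch of the difference (the Gate module and its threshold are illustrative, not from the original):

```python
import io
import torch
from torch import nn

class Gate(nn.Module):
    """Module with data-dependent control flow that tracing would freeze."""
    def forward(self, x):
        if x.sum() > 0:   # dynamic path: depends on input values
            return x * 2
        return x - 1

scripted = torch.jit.script(Gate())  # compiles both branches

# Round-trip through serialization; both branches still work after loading.
buf = io.BytesIO()
torch.jit.save(scripted, buf)
buf.seek(0)
loaded = torch.jit.load(buf)

pos = loaded(torch.ones(3))    # takes the x * 2 branch
neg = loaded(-torch.ones(3))   # takes the x - 1 branch
```

Tracing the same module with a positive example input would hard-code the first branch, and the saved model would then return x * 2 even for negative inputs.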