Replies: 1 comment
-
I don't think TorchScript supports lazy parameters. You need to initialize them by running a forward pass before converting the model:

```python
model(x, edge_index)  # Run a forward pass to initialize the lazy parameters
model.conv1 = model.conv1.jittable()
model.conv2 = model.conv2.jittable()
model = torch.jit.script(model)
```
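Putting that together for the heterogeneous case: one way to initialize the lazy parameters of a `to_hetero` model is to run a single forward pass with the data's feature and edge-index dictionaries before calling `torch.jit.script`. This is a minimal sketch, not the exact model from the question below; the `SAGEConv` layers, layer sizes, and the `data` object are assumptions, and whether the scripted module then works end-to-end still depends on the PyG version.

```python
import torch
from torch_geometric.nn import SAGEConv, to_hetero

class GNN(torch.nn.Module):
    def __init__(self, hidden_channels, out_channels):
        super().__init__()
        # Lazy input size (-1): the weight shapes are only known
        # after the first forward pass.
        self.conv1 = SAGEConv((-1, -1), hidden_channels)
        self.conv2 = SAGEConv((-1, -1), out_channels)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)

model = GNN(hidden_channels=512, out_channels=32)
model = to_hetero(model, data.metadata(), aggr='sum')  # `data` is the HeteroData object from the question

# One dummy forward pass materializes all lazy parameters
# before TorchScript ever sees the model.
with torch.no_grad():
    model(data.x_dict, data.edge_index_dict)

model = torch.jit.script(model)
```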
-
I am trying to convert my GraphSAGE model to a JIT-scripted version based on this example. I keep getting an "uninitialized parameters" error for the lazy weight parameters. My data looks like this:
```
HeteroData(
  user={
    num_nodes=305171,
    x=[305171, 1153],
    y=[305171],
    train_mask=[305171],
    val_mask=[305171],
    test_mask=[305171]
  },
  seller={
    num_nodes=31375,
    x=[31375, 771],
    train_mask=[31375],
    val_mask=[31375],
    test_mask=[31375]
  },
  (user, amount, seller)={ edge_index=[2, 349379] },
  (user, date_of_txn, seller)={ edge_index=[2, 349379] },
  (seller, rev_amount, user)={ edge_index=[2, 349379] },
  (seller, rev_date_of_txn, user)={ edge_index=[2, 349379] }
)
```
My model looks like this:

```python
class GNN(torch.nn.Module):
    def __init__(self, hidden_channels, out_channels):
        super().__init__()
        torch.manual_seed(1234567)
        # ... (layer definitions and forward() not shown in the post)

model = GNN(hidden_channels=512, out_channels=32)
model = to_hetero(model, data.metadata(), aggr='sum')
model.to(device)
model = torch.jit.script(model)
```
The error was:

```python
raise RuntimeError("'{}' has uninitialized parameters {}. Did you forget to run a forward pass?"
                   .format(torch.typename(type(mod)), name))
```
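As the message says, scripting refuses any parameter that is still a `torch.nn.parameter.UninitializedParameter`, which is what lazy layers hold before their first forward pass. A quick way to see which parameters are still uninitialized (a small diagnostic sketch, not part of the original post):

```python
from torch.nn.parameter import UninitializedParameter

# List every parameter that has not been materialized yet;
# these are the ones torch.jit.script complains about.
for name, param in model.named_parameters():
    if isinstance(param, UninitializedParameter):
        print(f"still uninitialized: {name}")
```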