Description
In `FeaturesLinear`:

```python
import numpy as np
import torch


class FeaturesLinear(torch.nn.Module):

    def __init__(self, field_dims, output_dim=1):
        super().__init__()
        self.fc = torch.nn.Embedding(sum(field_dims), output_dim)
        self.bias = torch.nn.Parameter(torch.zeros((output_dim,)))
        # offsets shift each field's local index into the single shared embedding table
        self.offsets = np.array((0, *np.cumsum(field_dims)[:-1]), dtype=np.long)

    def forward(self, x):
        """
        :param x: Long tensor of size ``(batch_size, num_fields)``
        """
        x = x + x.new_tensor(self.offsets).unsqueeze(0)
        return torch.sum(self.fc(x), dim=1) + self.bias
```
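For context, a minimal usage sketch of the quoted module (the `field_dims` values below are my own assumption, not from the issue): each field holds a categorical index, the offsets shift those indices into the shared embedding table, and the selected rows are summed.

```python
# Hypothetical example: two fields with cardinalities 3 and 4 (assumed values).
linear = FeaturesLinear(field_dims=[3, 4])
# offsets == [0, 3], so index 2 in the second field looks up row 3 + 2 = 5.
x = torch.tensor([[1, 2]])   # (batch_size=1, num_fields=2) of categorical indices
out = linear(x)              # row 1 + row 5 + bias -> shape (1, 1)
print(out.shape)
```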
The model only assigns a weight (an embedding row) to each feature index, but it does not take the feature's value into account.
For example, if the input is [[1, 0]], the model computes embedding(1) + embedding(0), but what is really needed is embedding(1) * 1 + embedding(0) * 0.
Why does the model drop the effect of the feature value?
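For comparison, here is a minimal sketch of the value-weighted behaviour the question describes. This is not the library's implementation, just an illustration under the assumption that each field's looked-up weight should be multiplied by that field's raw value before summation; the class and parameter names are hypothetical.

```python
import torch


class ValueWeightedLinear(torch.nn.Module):
    """Hypothetical variant that multiplies each field's weight by its value."""

    def __init__(self, num_embeddings, output_dim=1):
        super().__init__()
        self.fc = torch.nn.Embedding(num_embeddings, output_dim)
        self.bias = torch.nn.Parameter(torch.zeros((output_dim,)))

    def forward(self, index, value):
        """
        :param index: Long tensor of size ``(batch_size, num_fields)``
        :param value: Float tensor of size ``(batch_size, num_fields)``
        """
        # weight_i * value_i summed over fields, i.e. embedding(1)*1 + embedding(0)*0
        # in the example from the question
        return torch.sum(self.fc(index) * value.unsqueeze(-1), dim=1) + self.bias
```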