The `custom.CustomSiLU` operator appears to be incorrectly classified as a binary operator in the `_AIMET_V1_BINARY_MODULES` list (line 752), which causes quantization to fail.
SiLU (Sigmoid Linear Unit) is an activation function defined as f(x) = x * sigmoid(x), which operates on a single input tensor. This is a unary operation, not a binary one, so the classification looks like an accidental mistake.
```python
_AIMET_V1_BINARY_MODULES = [
    custom.MatMul,
    custom.Add,
    custom.Multiply,
    # ... other binary operators ...
    custom.CustomSiLU,  # <-- This appears to be incorrectly classified
    custom.Maximum,
    # ... other operators ...
]
```
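As a sanity check that SiLU is unary, here is a minimal standalone PyTorch snippet (not AIMET code) showing that `torch.nn.SiLU` consumes exactly one input tensor and matches x * sigmoid(x):

```python
import torch
import torch.nn as nn

# SiLU is unary: it takes a single input tensor,
# unlike genuinely binary modules such as Add or MatMul.
x = torch.randn(4)
silu = nn.SiLU()

# f(x) = x * sigmoid(x)
assert torch.allclose(silu(x), x * torch.sigmoid(x))
```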
Reference: aimet/TrainingExtensions/torch/src/python/aimet_torch/v2/nn/fake_quant/_legacy_impl.py, line 752 (commit 414e7c4)
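For illustration only (the issue doesn't include AIMET's actual traceback, so this is an assumed failure mode, not the confirmed one): if quantization machinery treats a unary module as binary and calls it with two inputs, the call fails immediately:

```python
import torch
import torch.nn as nn

# Illustration (not AIMET code): calling a unary module the way a
# binary module would be called raises a TypeError right away.
silu = nn.SiLU()
x, y = torch.randn(3), torch.randn(3)

try:
    silu(x, y)  # binary-style call on a unary module
except TypeError as err:
    # e.g. "forward() takes 2 positional arguments but 3 were given"
    print(f"TypeError: {err}")
```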