A quantization scheme that James (or someone else at occulus) suggested:
Use normal L2 regularization.
Quantize each weight to either -1 or 1 stochastically, with probabilities chosen so the expected quantized value equals the current weight.
For example: if W = -0.5, set W = -1 with 75% probability and W = 1 with 25% probability.
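A minimal sketch of that stochastic binarization, assuming weights are clipped to [-1, 1] before quantizing (the function name and NumPy usage are illustrative, not part of the original suggestion):

```python
import numpy as np

def stochastic_binarize(w, rng=None):
    """Quantize each weight to -1 or +1 so that E[quantized] == clip(w, -1, 1).

    P(+1) = (1 + w) / 2, so w = -0.5 gives -1 with 75% and +1 with 25% probability.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.clip(w, -1.0, 1.0)
    p_plus = (1.0 + w) / 2.0  # probability of quantizing to +1
    return np.where(rng.random(w.shape) < p_plus, 1.0, -1.0)

# Quick check of the W = -0.5 example: about 75% of samples should be -1.
w = np.full(10_000, -0.5)
q = stochastic_binarize(w)
print((q == -1).mean())  # ~0.75
```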