If GPUs are available and utilized, loading the model by instantiating a DialogTag object drives GPU memory allocation to virtually 100%, even though my card has far more memory than the size of the model. Without confirmation that this model loads correctly on a GPU, it looks as though the model was saved incorrectly or there is some kind of memory leak.
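(For what it's worth, near-100% allocation on its own is not necessarily a leak: by default, TensorFlow's allocator reserves almost all GPU memory at startup regardless of model size. A minimal sketch of opting into on-demand allocation instead, assuming a TensorFlow backend; the env-var and API routes below are standard TensorFlow, not anything specific to this library:)

```python
# Sketch: ask TensorFlow to allocate GPU memory on demand instead of
# reserving nearly the whole card up front (which shows as ~100%
# utilization in nvidia-smi even for small models).
import os

# Environment-variable route: must be set before TensorFlow is imported.
os.environ["TF_FORCE_GPU_ALLOW_GROWTH"] = "true"

try:
    import tensorflow as tf
    # Equivalent programmatic route, per physical GPU device.
    for gpu in tf.config.list_physical_devices("GPU"):
        tf.config.experimental.set_memory_growth(gpu, True)
except ImportError:
    # TensorFlow not installed here; the env var alone suffices when it is.
    pass
```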
Additionally, the code is not written to accept a vector of examples for prediction.
If anyone has successfully trained this with a GPU - mind sharing your exact environment?
Hi @argideritzalpea,
I trained this model with TensorFlow without any GPU support, as far as I remember. I'm planning to port this library to PyTorch with GPU support and minor improvements to the model. It does accept a list of sentences if you want to run inference on a batch of sentences instead of a single one.
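(In case it helps anyone landing here: even if the library only exposed single-sentence inference, batching is a thin wrapper. A minimal sketch; the constructor argument and the method name `predict_tag` are assumptions about the DialogTag API, and the stand-in predictor below exists only so the example runs without the model.)

```python
# Hedged sketch: apply a single-sentence predictor to a list of sentences.
from typing import Callable, List

def predict_batch(predict_one: Callable[[str], str], sentences: List[str]) -> List[str]:
    """Run a one-sentence predictor over each sentence and collect the tags."""
    return [predict_one(s) for s in sentences]

# Stand-in predictor for illustration; with the real library this would be
# something like DialogTag("distilbert-base-uncased").predict_tag (assumed API).
fake_predict = lambda s: "Yes-No-Question" if s.endswith("?") else "Statement-non-opinion"

tags = predict_batch(fake_predict, ["Are you coming?", "I am here."])
# tags -> ["Yes-No-Question", "Statement-non-opinion"]
```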