
Model maxes GPU memory allocation #5

Open
argideritzalpea opened this issue Dec 16, 2021 · 1 comment

Comments

@argideritzalpea

If GPUs are available and utilized, loading the model by instantiating a DialogTag object drives GPU memory allocation to virtually 100%, even though my card has far more memory than the model should need. Without corroboration that this model loads correctly on a GPU, it seems like the model was saved incorrectly or there is some kind of memory leak.
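For reference, TensorFlow 2.x by default reserves nearly all free GPU memory at startup, so near-100% allocation on load does not by itself indicate a leak. A minimal sketch of one common workaround, enabling incremental ("memory growth") allocation before the model is loaded (this assumes a TF 2.x environment; the DialogTag import below is illustrative, not verified):

```python
import tensorflow as tf

# TensorFlow grabs almost all free GPU memory for all visible GPUs at
# startup unless told otherwise. Enabling memory growth makes it
# allocate on demand instead. This must run before any GPU op
# initializes the device.
gpus = tf.config.list_physical_devices('GPU')
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)

# Returns [] on a CPU-only machine, so the loop is a safe no-op there.
print(f"GPUs visible to TensorFlow: {len(gpus)}")

# Only then load the model, e.g.:
# from dialog_tag import DialogTag  # import path assumed
# model = DialogTag('distilbert-base-uncased')
```

With memory growth enabled, `nvidia-smi` should report allocation closer to the model's actual footprint rather than the whole card.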

Additionally, the code is not written to accept a vector of examples for prediction.

If anyone has successfully run this on a GPU, would you mind sharing your exact environment?

@bhavimalik

Hi @argideritzalpea,
I trained this model with TensorFlow without any GPU support, as far as I remember. I'm planning to port this library to PyTorch with GPU support and minor improvements to the model. It does accept a list of sentences if you want to run inference on a batch instead of a single sentence.
