This repository has been archived by the owner on Dec 29, 2022. It is now read-only.

Link pre-trained models from docs #150

Open

papajohn opened this issue Apr 6, 2017 · 3 comments

Comments

papajohn commented Apr 6, 2017

Would you please consider linking pre-trained models for each of the configurations in https://github.com/google/seq2seq/tree/master/example_configs from the docs?

Lots of exploratory research involves fine-tuning an existing model, inspecting its output, applying it in a new way, and so on. As it stands, many people are spending a week of compute training the same model from scratch.

E.g., OpenNMT models are shared here: http://opennmt.net/Models/
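If checkpoints like those were shared here, resuming from one should not require new tooling: bin/train builds a tf.learn Estimator, and Estimators restore from the latest checkpoint in their model directory rather than starting fresh. A minimal sketch, assuming a hypothetical checkpoint archive and the flags documented in the repo's NMT tutorial (the vocabulary files and model configuration would have to match the original run):

```bash
# Sketch only: the checkpoint archive and its name are hypothetical.
# Unpack the shared checkpoint into the directory used as --output_dir.
export MODEL_DIR=${HOME}/nmt_large_pretrained
mkdir -p $MODEL_DIR
tar -xzf nmt_large_checkpoint.tar.gz -C $MODEL_DIR  # hypothetical archive

# Continue training on new data; the Estimator behind bin/train restores
# the latest checkpoint found in --output_dir before running further steps.
python -m bin.train \
  --config_paths="./example_configs/nmt_large.yml,./example_configs/train_seq2seq.yml" \
  --model_params "
      vocab_source: $VOCAB_SOURCE
      vocab_target: $VOCAB_TARGET" \
  --input_pipeline_train "
    class: ParallelTextInputPipeline
    params:
      source_files:
        - $TRAIN_SOURCES
      target_files:
        - $TRAIN_TARGETS" \
  --train_steps 10000 \
  --output_dir $MODEL_DIR
```

The vocab files passed here must be the same ones the checkpoint was trained with; a mismatched vocabulary would make the restored embedding and softmax weights meaningless.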

coventry (Contributor) commented

Are there any third-party pretrained models out there? Training one up myself is proving expensive.

matthias-samwald commented

Additionally or alternatively, we in the user community could share trained models and results for different settings (at the moment I'm testing what effect layer normalization has on the large model...).
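A settings comparison like that layer-norm experiment is easiest to share as the exact parameter overrides used. A sketch, assuming cell_class resolves to cell classes in tf.contrib.rnn as the training docs describe, where LayerNormBasicLSTMCell is the TF 1.x LSTM variant with built-in layer normalization:

```bash
# Sketch only, not a verified recipe: assumes cell_class accepts any
# tf.contrib.rnn cell name, and leaves all other hyperparameters at the
# values in nmt_large.yml so the run differs only in layer normalization.
python -m bin.train \
  --config_paths="./example_configs/nmt_large.yml,./example_configs/train_seq2seq.yml" \
  --model_params "
      encoder.params:
        rnn_cell:
          cell_class: LayerNormBasicLSTMCell
      decoder.params:
        rnn_cell:
          cell_class: LayerNormBasicLSTMCell" \
  --output_dir ${HOME}/nmt_large_layernorm
```

Publishing the resulting checkpoints together with overrides like these would make such comparisons reproducible by anyone.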

PapaMadeleine2022 commented

@papajohn @coventry @matthias-samwald I also need pre-trained models for English to German or other language pairs. Did you ever find a link?
