Use conda packages cache as we do with pip #3261
My project's builds on RTD have been timing out recently. The biggest contributor (likely ~300 seconds) appears to be the time spent downloading conda packages as part of the environment setup. So if we could set up caching, that would make a big difference for us.
I want to second this. I have been having similar problems recently (#4071): the download + install of the conda packages takes between 500 and 800 seconds (once even up to 1500 seconds), of which the download is a significant part, I suspect.
Just to clarify: even if this is implemented, you will still have the timeout problem in your first builds, since the cache starts empty. How is this?
Once I thought that maybe a global cache for conda/pip would be great, but then I realized that it has two main problems. We also talked about some alternatives, although none of them went further. Anyway, these are just ideas and, unfortunately, they are not on our current roadmap.
This is true -- but it would still be better than our current state of affairs!
Unfortunately, I didn't find a proxy for conda channels similar to the one for pip. I found ...
We no longer cache things, and we have bigger builders for users using conda.
At the moment our `conda` command uses these settings. For `pip` we use the `--cache-dir` attribute, so we save all the packages in the project's directory and avoid re-downloading them. In `conda` we are doing nothing. I found that we can use the `CONDA_PKGS_DIRS` env variable for this (I didn't find an attribute yet). So, we could use this env variable to point to something like `./user_builds/<project-slug>/.cache/conda`.
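
A minimal sketch of the idea, assuming builds shell out to `pip` and `conda`; the project slug, cache paths, and requirement/environment file names below are illustrative, not RTD's actual layout:

```python
import os
import subprocess

# Hypothetical per-project cache root; RTD's real directory layout may differ.
project_slug = "my-project"
cache_dir = os.path.abspath(os.path.join("user_builds", project_slug, ".cache"))

# pip already supports a per-project cache via the --cache-dir flag.
subprocess.run(
    ["pip", "install", "--cache-dir", os.path.join(cache_dir, "pip"),
     "-r", "requirements.txt"],
    check=True,
)

# conda has no equivalent CLI flag (that I know of), but the CONDA_PKGS_DIRS
# environment variable redirects its package cache, so downloaded packages
# survive between builds of the same project.
env = dict(os.environ, CONDA_PKGS_DIRS=os.path.join(cache_dir, "conda"))
subprocess.run(
    ["conda", "env", "create", "--file", "environment.yml"],
    env=env,
    check=True,
)
```

Note that, as pointed out above, the first build of a project still pays the full download cost; only subsequent builds benefit from the warmed cache.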