Torch-XLA not compatible with static python #8948
Comments
Thank you for the question. I believe the excerpt just means that, for build purposes, it will link against the
Hi, I work with @drewjenks01. Our Python interpreter is a `cc_binary` built from source during the build. It is statically linked, so there is no libpython shared library that we can point extensions at. Python 3.8 added support for statically-linked interpreters by changing how native extensions on unix are expected to depend on libpython: in short, they should no longer declare a dynamic library dependency on it (https://docs.python.org/3.8/whatsnew/3.8.html#changes-in-the-c-api); instead, libpython symbols are resolved at runtime from whatever binary (here, our `cc_binary`) the interpreter is linked into.
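As a quick illustration of the distinction being drawn above, CPython exposes whether it was built with a shared libpython via the `Py_ENABLE_SHARED` config variable (a minimal check, not part of the reporter's build):

```python
import sysconfig

# Py_ENABLE_SHARED is 1 when CPython was configured with --enable-shared
# (i.e. a libpython3.x.so exists), and 0 for a statically linked interpreter
# like the cc_binary described above.
shared = sysconfig.get_config_var("Py_ENABLE_SHARED")
print("shared libpython" if shared else "statically linked interpreter")
```

On a static build there is simply no `libpython3.x.so` for an extension's `-lpython3.x` link flag to resolve against, which is why the Python 3.8 change matters here.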
@ysiraichi Just following up to see if you had any thoughts on this or plans to address it? This is currently a blocking issue for us, sadly.
❓ Questions and Help
I am trying to use Torch-XLA v2.3.0 but it fails with:
I noticed this message here:
which suggests XLA is pulling in a two-year-old version of pybind11_bazel, which discovers its Python binary, library, headers, and paths by inspecting the copy of the interpreter installed on the operating system. During this probing, pybind11_bazel explicitly asks the Python interpreter for the linker flags it would need to embed the interpreter, which introduces a dependency on the libpython shared library. This renders it unusable with a statically-linked Python.
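To make the probing step concrete, here is a sketch of the kind of query such a configure rule runs against the interpreter (the variable names queried are real `sysconfig` keys; treating them as what pybind11_bazel reads is an assumption, not its actual repository-rule code):

```python
import sysconfig

# LIBDIR: directory where libpython lives; LDLIBRARY: the library filename
# (e.g. libpython3.10.so for shared builds, libpython3.10.a for static ones).
# A build rule that turns these into "-L{LIBDIR} -lpython3.x" bakes in a
# dynamic libpython dependency -- the behavior described above.
libdir = sysconfig.get_config_var("LIBDIR")
ldlibrary = sysconfig.get_config_var("LDLIBRARY")
print(f"probed linker inputs: -L{libdir} via {ldlibrary}")
```

Equivalently, `python3-config --embed --ldflags` reports the flags for embedding the interpreter; applying those flags to an extension module is exactly what Python 3.8 says not to do.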
Is there a way to make this work/could you provide a different build of Torch-XLA which is compatible with static python?