FFMLP (fp16) not compatible with Jittor=13.8.5 #73

Open
RobotDogCyberslacking opened this issue Aug 13, 2023 · 2 comments

@RobotDogCyberslacking

Following the instructions in README.md, I got this when running the ngp_fox test:

/hy-tmp/JNeRF-master/python/jnerf/runner/runner.py:193: RuntimeWarning: invalid value encountered in cast
  ndarr = (img*255+0.5).clip(0, 255).astype('uint8')

The output image in the log folder is all black, and some of the loss values are NaN.
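The warning itself comes from NaN values reaching the uint8 cast at runner.py:193. A minimal sketch of what that line does, with a guard added so the failure is explicit (the helper name and the nan_to_num fallback are mine, not part of JNeRF):

```python
import numpy as np

def to_uint8(img: np.ndarray) -> np.ndarray:
    """Convert a float image in [0, 1] to uint8, flagging non-finite values first."""
    if not np.isfinite(img).all():
        # NaNs produced by the diverged fp16 training reach this cast,
        # which is why the saved images come out all black.
        print("warning: non-finite values in rendered image")
        img = np.nan_to_num(img, nan=0.0, posinf=1.0, neginf=0.0)
    return (img * 255 + 0.5).clip(0, 255).astype('uint8')
```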

My hardware environment:
Intel(R) Xeon(R) CPU E5-2686 v4
RTX3090-24G

Training and the output are all correct when I set FP_16 = False in the config file, which makes the code use the MLP from pytorch.nn instead of FFMLP.
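For reference, the whole workaround is a one-line change in the config; a minimal sketch that keeps the FP_16 spelling from the post (check your own config file for the exact name and location of the flag):

```python
# JNeRF ngp config file (flag spelling taken from the post above; verify against your config)
FP_16 = False  # use the plain MLP path instead of the fused fp16 FFMLP kernels
```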

I fixed it by downgrading my Python to 3.8 and Jittor to 1.3.4.13 (exactly the version in requirements.txt), without running python setup.py, and everything works fine.

If you encounter the same problem, try:

  1. Downgrade your Python to 3.8 and Jittor to 1.3.4.13 (you can verify the pin with the sketch after this list).
  2. Do not run python setup.py again, because it will automatically upgrade your Jittor package to the latest version.
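To confirm the pin actually took effect before re-running the fox demo, a quick check (jt.__version__ is Jittor's version string; nothing here modifies the install):

```python
import jittor as jt

EXPECTED = "1.3.4.13"  # the version pinned in requirements.txt

print("installed jittor:", jt.__version__)
if jt.__version__ != EXPECTED:
    print(f"warning: expected {EXPECTED}; running setup.py may have "
          "silently upgraded jittor to the latest release")
```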

I hope you guys fix it ASAP.

@pechpo

pechpo commented Nov 28, 2023

I also encountered "RuntimeWarning: invalid value encountered in cast" at ndarr = (img*255+0.5).clip(0, 255).astype('uint8').
Using FP_16 = False solved my problem.
Thank you for sharing.

@scymz2

scymz2 commented Jun 12, 2024

jittor=1.3.4.13 doesn't support -arch=compute_89 (40-series GPUs); CUDA needs to be manually updated to >= 11.8, but I don't know how to do that.
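If it helps, here is a small check of which CUDA compiler is visible before attempting the upgrade; this is only a diagnostic sketch (it assumes nvcc is on PATH and does not upgrade anything itself):

```python
import shutil
import subprocess

# compute_89 (Ada / RTX 40-series) needs nvcc from CUDA 11.8 or newer.
nvcc = shutil.which("nvcc")
if nvcc is None:
    print("nvcc not found on PATH; check how Jittor locates its CUDA toolchain")
else:
    subprocess.run([nvcc, "--version"], check=True)
```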
