Universal Physics Transformers (UPT)

[Project Page] [Paper] [Talk] [Tutorial] [Codebase Demo Video] [BibTeX]

Train your own models

Instructions for setting up the codebase in your own environment are provided in SETUP_CODE and SETUP_DATA.

A video motivating the design choices of the codebase and giving an overview of its structure can be found here.

Configurations to train models can be found here.
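For orientation, config-driven training codebases like this one are typically launched by pointing an entry script at a YAML configuration file. The snippet below is a minimal sketch, not the repository's documented interface: the script name, flag, and config path are assumptions, and SETUP_CODE contains the actual commands.

# Hypothetical invocation; main_train.py, --hp, and the YAML path are
# illustrative placeholders, not verified against this repository.
python main_train.py --hp yamls/train_upt.yaml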

Citation

If you like our work, please consider giving it a star ⭐ and citing us:

@article{alkin2024upt,
  title={Universal Physics Transformers},
  author={Benedikt Alkin and Andreas Fürst and Simon Schmid and Lukas Gruber and Markus Holzleitner and Johannes Brandstetter},
  journal={arXiv preprint arXiv:2402.12365},
  year={2024}
}
