igibson --> omnigibson
cremebrule committed Oct 22, 2022
1 parent 129b28e commit 04747e3
Showing 366 changed files with 2,497 additions and 2,497 deletions.
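Identical addition and deletion counts across 366 files are the signature of a scripted find-and-replace rather than hand edits. A minimal sketch of how such a rename can be driven, assuming GNU sed and a POSIX shell (the sample file and its contents below are illustrative, not taken from the repository):

```shell
# Illustrative only: stage a throwaway file containing the old name,
# then rewrite every occurrence in place. In a real checkout, the file
# list would come from `git grep -l igibson` instead of a fixed path.
set -e
tmp=$(mktemp -d)
printf 'REGISTRY_USER: igibson\nworking-directory: igibson\n' > "$tmp/sample.yml"

# GNU sed -i edits in place; BSD/macOS sed would need `sed -i ''`.
sed -i 's/igibson/omnigibson/g' "$tmp/sample.yml"
cat "$tmp/sample.yml"
```

Case-sensitive variants (`iGibson` → `OmniGibson`, `ig_dataset` → `og_dataset`) would each need their own substitution, which is consistent with the diff below handling them as distinct spellings.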
2 changes: 1 addition & 1 deletion .github/workflows/build-containers.yml
@@ -7,7 +7,7 @@ on:
workflow_dispatch:

env:
-REGISTRY_USER: igibson
+REGISTRY_USER: omnigibson
IMAGE_REGISTRY: docker.io
REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}

2 changes: 1 addition & 1 deletion .github/workflows/docs.yml
@@ -6,7 +6,7 @@ jobs:
build-docs:
name: Build and deploy documentation
runs-on: ubuntu-latest
-if: github.repository == 'StanfordVL/iGibson' && github.ref == 'refs/heads/master'
+if: github.repository == 'StanfordVL/OmniGibson' && github.ref == 'refs/heads/master'

steps:
- name: Checkout code
20 changes: 10 additions & 10 deletions .github/workflows/examples-as-test.yml
@@ -15,14 +15,14 @@ concurrency:
jobs:
test:
runs-on: [self-hosted, linux, gpu]
-if: github.repository == 'StanfordVL/iGibson-dev'
+if: github.repository == 'StanfordVL/OmniGibson-dev'

steps:
- name: Checkout source
uses: actions/checkout@v2
with:
submodules: true
-path: igibson
+path: omnigibson

- name: Add CUDA to env
run: echo "/usr/local/cuda/bin" >> $GITHUB_PATH
@@ -34,19 +34,19 @@ jobs:
architecture: x64

- name: Install dev requirements
-working-directory: igibson
+working-directory: omnigibson
run: pip install -r requirements-dev.txt

- name: Install additional dev requirements
-working-directory: igibson
+working-directory: omnigibson
run: pip install -r tests/requirements-tests.txt

- name: Install
-working-directory: igibson
+working-directory: omnigibson
run: pip install -e .

- name: Uninstall pip bddl
-working-directory: igibson
+working-directory: omnigibson
run: pip uninstall -y bddl

- name: Checkout BDDL
@@ -64,15 +64,15 @@ jobs:
run: pip install -e .

- name: Link Dataset
-working-directory: igibson
-run: ln -s /scr/ig-data igibson/data
+working-directory: omnigibson
+run: ln -s /scr/ig-data omnigibson/data

- name: Create tests of examples
-working-directory: igibson
+working-directory: omnigibson
run: python tests/create_tests_of_examples.py

- name: Run tests
-working-directory: igibson
+working-directory: omnigibson
run: pytest /tmp/tests_of_examples

- name: Remove Files
6 changes: 3 additions & 3 deletions .github/workflows/release.yml
@@ -9,7 +9,7 @@ jobs:
build:
runs-on: "ubuntu-latest"
steps:
-- name: Checkout iGibson source
+- name: Checkout OmniGibson source
uses: actions/checkout@master
with:
submodules: true
@@ -24,7 +24,7 @@ jobs:
build
--user
- name: Remove unnecessary files to reduce file size
-run: rm -r igibson/render/openvr/samples
+run: rm -r omnigibson/render/openvr/samples
- name: Build a binary wheel and a source tarball
run: >-
python -m
@@ -33,7 +33,7 @@
--outdir dist/
.
- name: Publish a Python distribution to PyPI
-if: github.repository == 'StanfordVL/iGibson' && startsWith(github.ref, 'refs/tags')
+if: github.repository == 'StanfordVL/OmniGibson' && startsWith(github.ref, 'refs/tags')
uses: pypa/gh-action-pypi-publish@release/v1
with:
user: __token__
8 changes: 4 additions & 4 deletions .github/workflows/sync-repos.yml
@@ -1,4 +1,4 @@
-name: Sync iGibson-dev/master to iGibson/master
+name: Sync OmniGibson-dev/master to OmniGibson/master

on:
workflow_dispatch:
@@ -12,7 +12,7 @@ jobs:
if: github.ref == 'refs/heads/master'
steps:
- uses: actions/checkout@v2
-- name: Sync iGibson-dev/master to iGibson/master
-if: github.repository == 'StanfordVL/iGibson-dev' && startsWith(github.ref, 'refs/tags')
+- name: Sync OmniGibson-dev/master to OmniGibson/master
+if: github.repository == 'StanfordVL/OmniGibson-dev' && startsWith(github.ref, 'refs/tags')
run:
-git push https://$PERSONAL_ACCESS_TOKEN:[email protected]/StanfordVL/iGibson.git master:master
+git push https://$PERSONAL_ACCESS_TOKEN:[email protected]/StanfordVL/OmniGibson.git master:master
30 changes: 15 additions & 15 deletions .github/workflows/tests.yml
@@ -9,14 +9,14 @@ concurrency:
jobs:
test:
runs-on: [self-hosted, linux, gpu]
-if: github.repository == 'StanfordVL/iGibson-dev'
+if: github.repository == 'StanfordVL/OmniGibson-dev'

steps:
- name: Checkout source
uses: actions/checkout@v2
with:
submodules: true
-path: igibson
+path: omnigibson

- name: Add CUDA to env
run: echo "/usr/local/cuda/bin" >> $GITHUB_PATH
@@ -28,15 +28,15 @@ jobs:
architecture: x64

- name: Install dev requirements
-working-directory: igibson
+working-directory: omnigibson
run: pip install -r requirements-dev.txt

- name: Install
-working-directory: igibson
+working-directory: omnigibson
run: pip install -e .

- name: Uninstall pip bddl
-working-directory: igibson
+working-directory: omnigibson
run: pip uninstall -y bddl

- name: Checkout BDDL
@@ -54,33 +54,33 @@ jobs:
run: pip install -e .

- name: Link Dataset
-working-directory: igibson
-run: ln -s /scr/ig-data igibson/data
+working-directory: omnigibson
+run: ln -s /scr/ig-data omnigibson/data

# The below method of checking out ig-dataset is currently unused due to poor speed.
# - name: Create data directory
-# run: mkdir -p igibson/igibson/data
+# run: mkdir -p omnigibson/omnigibson/data
#
-# - name: Checkout ig_dataset
+# - name: Checkout og_dataset
# uses: actions/checkout@v2
# with:
-# repository: StanfordVL/ig_dataset
+# repository: StanfordVL/og_dataset
# token: ${{ secrets.PERSONAL_ACCESS_TOKEN }} # PAT is required since this is a different repo
-# path: igibson/igibson/data/ig_dataset
+# path: omnigibson/omnigibson/data/og_dataset
# submodules: recursive
# lfs: true
#
-# - name: Checkout ig_assets
+# - name: Checkout og_assets
# uses: actions/checkout@v2
# with:
-# repository: StanfordVL/ig_assets
+# repository: StanfordVL/og_assets
# token: ${{ secrets.PERSONAL_ACCESS_TOKEN }} # PAT is required since this is a different repo
-# path: igibson/igibson/data/assets
+# path: omnigibson/omnigibson/data/assets
# submodules: recursive
# lfs: true

- name: Run tests
-working-directory: igibson
+working-directory: omnigibson
run: pytest

- name: Upload coverage to Codecov
8 changes: 4 additions & 4 deletions .gitignore
@@ -74,12 +74,12 @@ build
dist

# Directories used for QC pipeline
-igibson/utils/data_utils/mesh_decimation/collision
-igibson/utils/data_utils/mesh_decimation/visual
-igibson/utils/data_utils/mesh_decimation/final_videos
+omnigibson/utils/data_utils/mesh_decimation/collision
+omnigibson/utils/data_utils/mesh_decimation/visual
+omnigibson/utils/data_utils/mesh_decimation/final_videos

# libcryptopp
-igibson/render/mesh_renderer/libcryptopp.so.8.6
+omnigibson/render/mesh_renderer/libcryptopp.so.8.6

# Coverage
.coverage
20 changes: 10 additions & 10 deletions .gitmodules
@@ -1,15 +1,15 @@
-[submodule "igibson/render/pybind11"]
-path = igibson/render/pybind11
+[submodule "omnigibson/render/pybind11"]
+path = omnigibson/render/pybind11
url = https://github.com/pybind/pybind11.git
-[submodule "igibson/render/glfw"]
-path = igibson/render/glfw
+[submodule "omnigibson/render/glfw"]
+path = omnigibson/render/glfw
url = https://github.com/glfw/glfw
-[submodule "igibson/render/glm"]
-path = igibson/render/glm
+[submodule "omnigibson/render/glm"]
+path = omnigibson/render/glm
url = https://github.com/g-truc/glm
-[submodule "igibson/render/openvr"]
-path = igibson/render/openvr
+[submodule "omnigibson/render/openvr"]
+path = omnigibson/render/openvr
url = https://github.com/ValveSoftware/openvr
-[submodule "igibson/render/cryptopp"]
-path = igibson/render/cryptopp
+[submodule "omnigibson/render/cryptopp"]
+path = omnigibson/render/cryptopp
url = https://github.com/fxia22/cryptopp
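Renaming the directories that host submodules, as in the .gitmodules hunk above, is best done with `git mv`, which rewrites the `path` entry in .gitmodules as a side effect; editing the file by hand leaves the index out of sync. A sketch under the assumption of a reasonably recent git, using throwaway local repos in place of the real submodule remotes:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Throwaway repo standing in for a submodule remote such as pybind11.
git init -q sub
git -C sub -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init

# Superproject with the submodule at its pre-rename path. With no origin
# configured, the relative URL ../sub resolves against the current directory;
# protocol.file.allow is needed for local-path clones on git >= 2.38.
git init -q main
cd main
git -c protocol.file.allow=always submodule add -q ../sub igibson/render/pybind11
git -c user.email=ci@example.com -c user.name=ci commit -q -m "add submodule"

# The rename: git mv moves the gitlink and updates .gitmodules in one step
# (the destination's parent directories must already exist).
mkdir -p omnigibson/render
git mv igibson/render/pybind11 omnigibson/render/pybind11
grep 'path' .gitmodules
```

The submodule's *name* (the string in `[submodule "…"]`) is not changed by `git mv`; the commit above rewrites those headers as well, which keeps the names matching the new paths.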
6 changes: 3 additions & 3 deletions MANIFEST.in
@@ -1,7 +1,7 @@
include LICENSE

-graft igibson
-prune igibson/data
-prune igibson/render/openvr/samples
+graft omnigibson
+prune omnigibson/data
+prune omnigibson/render/openvr/samples

global-exclude *.py[co]
48 changes: 24 additions & 24 deletions README.md
@@ -1,34 +1,34 @@
-# iGibson: A Simulation Environment to train Robots in Large Realistic Interactive Scenes
+# OmniGibson: A Simulation Environment to train Robots in Large Realistic Interactive Scenes

-<img src="./docs/images/igibsonlogo.png" width="500"> <img src="./docs/images/igibson.gif" width="250">
+<img src="./docs/images/omnigibsonlogo.png" width="500"> <img src="./docs/images/omnigibson.gif" width="250">

-iGibson is a simulation environment providing fast visual rendering and physics simulation based on Bullet. iGibson is equipped with fifteen fully interactive high quality scenes, hundreds of large 3D scenes reconstructed from real homes and offices, and compatibility with datasets like CubiCasa5K and 3D-Front, providing 8000+ additional interactive scenes. Some of the features of iGibson include domain randomization, integration with motion planners and easy-to-use tools to collect human demonstrations. With these scenes and features, iGibson allows researchers to train and evaluate robotic agents that use visual signals to solve navigation and manipulation tasks such as opening doors, picking up and placing objects, or searching in cabinets.
+OmniGibson is a simulation environment providing fast visual rendering and physics simulation based on Bullet. OmniGibson is equipped with fifteen fully interactive high quality scenes, hundreds of large 3D scenes reconstructed from real homes and offices, and compatibility with datasets like CubiCasa5K and 3D-Front, providing 8000+ additional interactive scenes. Some of the features of OmniGibson include domain randomization, integration with motion planners and easy-to-use tools to collect human demonstrations. With these scenes and features, OmniGibson allows researchers to train and evaluate robotic agents that use visual signals to solve navigation and manipulation tasks such as opening doors, picking up and placing objects, or searching in cabinets.

### Latest Updates
-[8/9/2021] Major update to iGibson to reach iGibson 2.0, for details please refer to our [arxiv preprint](https://arxiv.org/abs/2108.03272).
+[8/9/2021] Major update to OmniGibson to reach OmniGibson 2.0, for details please refer to our [arxiv preprint](https://arxiv.org/abs/2108.03272).

-- iGibson 2.0 supports object states, including temperature, wetness level, cleanliness level, and toggled and sliced states, necessary to cover a wider range of tasks.
-- iGibson 2.0 implements a set of predicate logic functions that map the simulator states to logic states like Cooked or Soaked.
-- iGibson 2.0 includes a virtual reality (VR) interface to immerse humans in its scenes to collect demonstrations.
+- OmniGibson 2.0 supports object states, including temperature, wetness level, cleanliness level, and toggled and sliced states, necessary to cover a wider range of tasks.
+- OmniGibson 2.0 implements a set of predicate logic functions that map the simulator states to logic states like Cooked or Soaked.
+- OmniGibson 2.0 includes a virtual reality (VR) interface to immerse humans in its scenes to collect demonstrations.


-[12/1/2020] Major update to iGibson to reach iGibson 1.0, for details please refer to our [arxiv preprint](https://arxiv.org/abs/2012.02924).
+[12/1/2020] Major update to OmniGibson to reach OmniGibson 1.0, for details please refer to our [arxiv preprint](https://arxiv.org/abs/2012.02924).

-- Release of iGibson dataset that includes 15 fully interactive scenes and 500+ object models annotated with materials and physical attributes on top of [existing 3D articulated models](https://cs.stanford.edu/~kaichun/partnet/).
+- Release of OmniGibson dataset that includes 15 fully interactive scenes and 500+ object models annotated with materials and physical attributes on top of [existing 3D articulated models](https://cs.stanford.edu/~kaichun/partnet/).
- Compatibility to import [CubiCasa5K](https://github.com/CubiCasa/CubiCasa5k) and [3D-Front](https://tianchi.aliyun.com/specials/promotion/alibaba-3d-scene-dataset) scene descriptions leading to more than 8000 extra interactive scenes!
-- New features in iGibson: Physically based rendering, 1-beam and 16-beam LiDAR, domain randomization, motion planning integration, tools to collect human demos and more!
+- New features in OmniGibson: Physically based rendering, 1-beam and 16-beam LiDAR, domain randomization, motion planning integration, tools to collect human demos and more!
- Code refactoring, better class structure and cleanup.

[05/14/2020] Added dynamic light support :flashlight:

[04/28/2020] Added support for Mac OSX :computer:

### Citation
-If you use iGibson or its assets and models, consider citing the following publication:
+If you use OmniGibson or its assets and models, consider citing the following publication:

```
-@misc{li2021igibson,
-title={iGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks},
+@misc{li2021omnigibson,
+title={OmniGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks},
author={Chengshu Li and Fei Xia and Roberto Mart\'in-Mart\'in and Michael Lingelbach and Sanjana Srivastava and Bokui Shen and Kent Vainio and Cem Gokmen and Gokul Dharan and Tanish Jain and Andrey Kurenkov and Karen Liu and Hyowon Gweon and Jiajun Wu and Li Fei-Fei and Silvio Savarese},
year={2021},
eprint={2108.03272},
@@ -38,8 +38,8 @@ If you use iGibson or its assets and models, consider citing the following publi
```

```
-@inproceedings{shen2021igibson,
-title={iGibson 1.0: a Simulation Environment for Interactive Tasks in Large Realistic Scenes},
+@inproceedings{shen2021omnigibson,
+title={OmniGibson 1.0: a Simulation Environment for Interactive Tasks in Large Realistic Scenes},
author={Bokui Shen and Fei Xia and Chengshu Li and Roberto Mart\'in-Mart\'in and Linxi Fan and Guanzhi Wang and Claudia P\'erez-D'Arpino and Shyamal Buch and Sanjana Srivastava and Lyne P. Tchapmi and Micael E. Tchapmi and Kent Vainio and Josiah Wong and Li Fei-Fei and Silvio Savarese},
booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year={2021},
@@ -49,30 +49,30 @@ If you use iGibson or its assets and models, consider citing the following publi
```

### Documentation
-The documentation for iGibson can be found here: [iGibson Documentation](http://svl.stanford.edu/igibson/docs/). It includes installation guide (including data download instructions), quickstart guide, code examples, and APIs.
+The documentation for OmniGibson can be found here: [OmniGibson Documentation](http://svl.stanford.edu/omnigibson/docs/). It includes installation guide (including data download instructions), quickstart guide, code examples, and APIs.

-If you want to know more about iGibson, you can also check out [our webpage](http://svl.stanford.edu/igibson), [iGibson 2.0 arxiv preprint](https://arxiv.org/abs/2108.03272) and [iGibson 1.0 arxiv preprint](https://arxiv.org/abs/2012.02924).
+If you want to know more about OmniGibson, you can also check out [our webpage](http://svl.stanford.edu/omnigibson), [OmniGibson 2.0 arxiv preprint](https://arxiv.org/abs/2108.03272) and [OmniGibson 1.0 arxiv preprint](https://arxiv.org/abs/2012.02924).

### Dowloading the Dataset of 3D Scenes

-For instructions to install iGibson and download dataset, you can visit [installation guide](http://svl.stanford.edu/igibson/docs/installation.html) and [dataset download guide](http://svl.stanford.edu/igibson/docs/dataset.html).
+For instructions to install OmniGibson and download dataset, you can visit [installation guide](http://svl.stanford.edu/omnigibson/docs/installation.html) and [dataset download guide](http://svl.stanford.edu/omnigibson/docs/dataset.html).

-There are other datasets we link to iGibson. We include support to use CubiCasa5K and 3DFront scenes, adding up more than 10000 extra interactive scenes to use in iGibson! Check our [documentation](https://github.com/StanfordVL/iGibson/tree/master/igibson/utils/data_utils/ext_scene) on how to use those.
+There are other datasets we link to OmniGibson. We include support to use CubiCasa5K and 3DFront scenes, adding up more than 10000 extra interactive scenes to use in OmniGibson! Check our [documentation](https://github.com/StanfordVL/OmniGibson/tree/master/omnigibson/utils/data_utils/ext_scene) on how to use those.

-We also maintain compatibility with datasets of 3D reconstructed large real-world scenes (homes and offices) that you can download and use with iGibson. For Gibson Dataset and Stanford 2D-3D-Semantics Dataset, please fill out this [form](https://forms.gle/36TW9uVpjrE1Mkf9A). For Matterport3D Dataset, please fill in this [form](http://dovahkiin.stanford.edu/matterport/public/MP_TOS.pdf) and send it to [[email protected]](mailto:[email protected]). Please put "use with iGibson simulator" in your email. Check our [dataset download guide](http://svl.stanford.edu/igibson/docs/dataset.html) for more details.
+We also maintain compatibility with datasets of 3D reconstructed large real-world scenes (homes and offices) that you can download and use with OmniGibson. For Gibson Dataset and Stanford 2D-3D-Semantics Dataset, please fill out this [form](https://forms.gle/36TW9uVpjrE1Mkf9A). For Matterport3D Dataset, please fill in this [form](http://dovahkiin.stanford.edu/matterport/public/MP_TOS.pdf) and send it to [[email protected]](mailto:[email protected]). Please put "use with OmniGibson simulator" in your email. Check our [dataset download guide](http://svl.stanford.edu/omnigibson/docs/dataset.html) for more details.

-### Using iGibson with VR
-If you want to use iGibson VR interface, please visit the [VR guide (TBA)].
+### Using OmniGibson with VR
+If you want to use OmniGibson VR interface, please visit the [VR guide (TBA)].


### Contributing
-This is the github repository for iGibson (pip package `igibson`) 2.0 release. (For iGibson 1.0, please use `1.0` branch.) Bug reports, suggestions for improvement, as well as community developments are encouraged and appreciated. Please, consider creating an issue or sending us an email.
+This is the github repository for OmniGibson (pip package `omnigibson`) 2.0 release. (For OmniGibson 1.0, please use `1.0` branch.) Bug reports, suggestions for improvement, as well as community developments are encouraged and appreciated. Please, consider creating an issue or sending us an email.

The support for our previous version of the environment, Gibson, can be found in the [following repository](http://github.com/StanfordVL/GibsonEnv/).

### Acknowledgments

-iGibson uses code from a few open source repositories. Without the efforts of these folks (and their willingness to release their implementations under permissable copyleft licenses), iGibson would not be possible. We thanks these authors for their efforts!
+OmniGibson uses code from a few open source repositories. Without the efforts of these folks (and their willingness to release their implementations under permissable copyleft licenses), OmniGibson would not be possible. We thanks these authors for their efforts!

- Syoyo Fujita: [tinyobjloader](https://github.com/syoyo/tinyobjloader)
- Erwin Coumans: [egl_example](https://github.com/erwincoumans/egl_example)
2 changes: 1 addition & 1 deletion clean.sh
@@ -1,4 +1,4 @@
#!/bin/sh

rm -rf build
-rm -rf igibson/render/mesh_renderer/build/
+rm -rf omnigibson/render/mesh_renderer/build/
2 changes: 1 addition & 1 deletion docker/.env
@@ -1,3 +1,3 @@
REGISTRY=docker.io
-REPO=igibson
+REPO=omnigibson
VERSION=v2.0.4