UPose - AI motion tracking for Unity

UPose is a Unity resource that uses several AI methods for human motion tracking and demonstrates them in a gallery of sample applications. You can use the UPose framework as a starting point for your own projects, as an experimental setup for human-computer interaction research, or even as instructional material for courses that involve movement, media, and machines.

Features

  • 🦾 MediaPipe human motion tracking
  • 🦿 MMPose human motion tracking
  • 📐 Compute joint rotations from MediaPipe or other skeletal data
  • 💻 Simple API for accessing human pose data
  • 🧍 Standard Human IK skeleton support
  • ⌨️ Real-time data streaming from Python to Unity
  • 🏃 Several interactive demos included
Demo video: UPoseDemo.mp4

For more info on how UPose calculates joint angles from MediaPipe world coordinates, read our tutorial.
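As a simplified illustration of what such a calculation involves, the sketch below recovers an elbow flexion angle from three MediaPipe Pose world landmarks with basic vector geometry. The landmark indices are MediaPipe Pose's standard left-arm indices; this is a minimal sketch, not the exact method covered in the tutorial.

```python
import numpy as np

# MediaPipe Pose world-landmark indices for the left arm
LEFT_SHOULDER, LEFT_ELBOW, LEFT_WRIST = 11, 13, 15

def elbow_flexion_deg(landmarks):
    """Return the angle (degrees) at the elbow, given an (N, 3) array
    of MediaPipe world coordinates."""
    shoulder = np.array(landmarks[LEFT_SHOULDER], dtype=float)
    elbow = np.array(landmarks[LEFT_ELBOW], dtype=float)
    wrist = np.array(landmarks[LEFT_WRIST], dtype=float)
    upper = shoulder - elbow   # elbow -> shoulder
    fore = wrist - elbow       # elbow -> wrist
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
```

A fully extended arm gives an angle near 180 degrees; a right-angle bend gives 90 degrees.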

For direct access to UPose from Python, use https://pypi.org/project/upose/0.1.0/.

✍ Cite as

If you use this repository in your research, please cite it as:

UPose - AI motion tracking for Unity [Computer software]. Digital Worlds Institute, 2025. https://github.com/digitalworlds/UPose

📔 Contents

Demo 1 - 🏃‍♂️ Exercise Room

In this demo, the user must complete an exercise routine with hip abductions and elbow flexions. Scene: ExerciseScene.unity


Demo 2 - ⚽ Soccer

In this demo, the user kicks a soccer ball in a soccer stadium. Scene: SoccerScene.unity


Demo 3 - 🏹 Archer

In this demo, the user must shoot the target hanging from the tree by mimicking an archer.


Demo 4 - 🕺 Dance Scene

In this scene, you dance with four other avatars that mimic your moves at their own pace. Scene: DanceScene.unity

Dance Scene

Demo 5 - 🏞️ Interactive Waterfall

Where motion meets water, each movement sparks a shimmer of falling light. Scene: Waterfallscene1.unity


Demo 6 - 🏏 CatchBall

Use both hands to control a platform and catch the ball into the basket!

💻 Unity C# Example

The UPose API is simple and easy to use from C# in Unity. Here is a three-line example that shows how to get a bone rotation from the motion tracker and apply it to an avatar.

UPose pose = FindFirstObjectByType<UPose>();

//Get the left forearm rotation
Quaternion rotation = pose.GetRotation(Landmark.LEFT_ELBOW);

//Apply the rotation to your avatar
LeftForeArm.localRotation = rotation;

💀 Skeletal Structure

UPose supports avatars with the standard human IK skeleton. Avatars with this structure can be created using tools such as readyplayer.me. The UPose API gets the 3D coordinates of the corresponding human joints from the motion tracking source (MediaPipe, MMPose, etc.) and calculates joint rotations that can then be assigned as localRotations to the corresponding bones of the skeleton. The bone hierarchy and naming convention are shown below:

- Hips
  - Spine
    - Spine1
      - Spine2
        - Neck
          - Head
            - HeadTop_End
            - LeftEye
            - RightEye
        - LeftShoulder
          - LeftArm
            - LeftForeArm
              - LeftHand
        - RightShoulder
          - RightArm
            - RightForeArm
              - RightHand
  - LeftUpLeg
    - LeftLeg
      - LeftFoot
  - RightUpLeg
    - RightLeg
      - RightFoot
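To sketch the rotation step described above: one common way to turn tracked 3D coordinates into a bone rotation is the shortest-arc quaternion that rotates the bone's rest direction onto its observed direction. The NumPy function below is a minimal illustration of that idea, not UPose's actual implementation; quaternions are in (x, y, z, w) order, as Unity uses.

```python
import numpy as np

def bone_rotation(rest_dir, observed_dir):
    """Shortest-arc quaternion (x, y, z, w) rotating rest_dir onto observed_dir."""
    a = np.array(rest_dir, dtype=float)
    b = np.array(observed_dir, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    axis = np.cross(a, b)
    w = 1.0 + np.dot(a, b)
    if w < 1e-8:  # opposite vectors: rotate 180 deg about any perpendicular axis
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:  # a was parallel to x; use y instead
            axis = np.cross(a, [0.0, 1.0, 0.0])
        w = 0.0
    q = np.array([axis[0], axis[1], axis[2], w])
    return q / np.linalg.norm(q)
```

For example, a bone resting along -Y that is observed pointing along +X yields a 90-degree rotation about +Z.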

▶️ How to run

The motion tracking methods are implemented in Python and stream the data to Unity. To install Python and the motion tracking libraries, please refer to the installation guide below. Once you complete the installation, you can run UPose with the following steps:

Step 1 - Start the Motion Tracking Stream

To run MediaPipe, go into the folder GitHub/UPose/MotionCapture/mediapipe and run:

conda activate mediapipe
python run_mediapipe.py

This program will attempt to connect to Unity and stream the motion capture data to your Unity project. You can close the Python program by pressing the Escape key while the camera window is in focus.
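The streaming side can be pictured as serializing each tracked frame and sending it to a socket that the Unity side listens on. The sketch below is an illustrative assumption (UDP transport, JSON payload, hypothetical port 5065); the actual protocol is defined by run_mediapipe.py and the UPose Unity scripts.

```python
import json
import socket

# Hypothetical endpoint; the real host/port are set by run_mediapipe.py.
UNITY_HOST, UNITY_PORT = "127.0.0.1", 5065

def encode_frame(landmarks):
    """Serialize one frame of named 3D landmarks as a UTF-8 JSON payload."""
    return json.dumps({"landmarks": landmarks}).encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = [{"name": "LEFT_ELBOW", "x": 0.1, "y": 0.4, "z": -0.2}]
sock.sendto(encode_frame(frame), (UNITY_HOST, UNITY_PORT))  # fire-and-forget UDP
sock.close()
```

UDP keeps the loop fire-and-forget: a dropped frame is simply replaced by the next one, which suits real-time tracking better than a blocking, ordered stream.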

Step 2 - Run a Unity Demo Scene

Open the project GitHub/UPose/UPose in Unity and select one of the sample scenes provided in the UPose framework. The sample scenes can be found in the Unity project folder Assets/Scenes.

Keep the Python program running while using UPose in Unity to see the motion capture data in action!

Enjoy! 🧍🏃‍♂️🕺

⚙️ Installation

MediaPipe

The MediaPipe tracking works in Python. It is recommended that you install Miniconda, a minimal version of Anaconda, in order to keep the Python setup of this project separate from other Python installations on your computer: https://www.anaconda.com/docs/getting-started/miniconda/install

After installation, open a terminal and run the following from the folder miniconda3/bin or miniconda3/Library/bin:

conda init

You can verify your Miniconda installation with:

conda --version

Then you can create a new environment for the mediapipe setup:

conda create -n mediapipe python=3.9
conda activate mediapipe

Then you can install the dependencies of this project:

pip install opencv-python upose
pip install mediapipe==0.10.14
conda deactivate

MMPose

conda create --name openmmlab python=3.8 -y
conda activate openmmlab

🤝 Credits

We would like to acknowledge the sources of assets we used in this project:

