The IRCHAD Navigation System is a Raspberry Pi-powered prototype that provides indoor navigation assistance for visually impaired users. It uses IMU sensors, ultrasonic modules, a Pi camera, and printed AprilTags to estimate the user’s position and guide them in real time within indoor environments.
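The core idea above, dead-reckoning from the IMU corrected by absolute position fixes from detected AprilTags, can be sketched as follows. This is a minimal illustration of the fusion concept; the function names, gain value, and 2-D state are assumptions, not the project's actual `position_tracking.py` API:

```python
# Minimal sketch: fuse IMU dead-reckoning with absolute AprilTag fixes.
# All names and constants here are illustrative, not the project's real code.

def integrate_imu(position, velocity, accel, dt):
    """Dead-reckon: advance a 2-D position/velocity from one IMU acceleration sample."""
    x, y = position
    vx, vy = velocity
    ax, ay = accel
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt
    return (x, y), (vx, vy)

def correct_with_tag(position, tag_position, gain=0.8):
    """Blend the drifting IMU estimate toward an absolute AprilTag position fix."""
    x, y = position
    tx, ty = tag_position
    return (x + gain * (tx - x), y + gain * (ty - y))

# Accelerate forward at 1 m/s^2 for two 0.1 s steps, then sight a tag at (0.5, 0.0).
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(2):
    pos, vel = integrate_imu(pos, vel, (1.0, 0.0), dt=0.1)
pos = correct_with_tag(pos, (0.5, 0.0))
print(pos)
```

The tag correction is what keeps long-term drift bounded: the IMU alone accumulates error, while each tag sighting snaps the estimate back toward a known landmark.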
Irchadlot/
└── tags/                        # example AprilTags
    ├── camera_calibration.py    # Calibrate the Pi Camera for AprilTag detection
    ├── commandes.py             # Command interpretation and voice output
    ├── device_monitoring.py     # Hardware status monitoring (IMU, ultrasonic, etc.)
    ├── navigation.py            # Path planning and instruction generation
    ├── obstacle_detection.py    # Obstacle avoidance using ultrasonic sensors and a YOLOE model
    └── position_tracking.py     # Position estimation using IMU and AprilTags
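The path-planning step in `navigation.py` (the dependency list later pulls in `networkx` and `shapely` for this) can be illustrated with a dependency-free breadth-first search over a small occupancy grid. The grid layout and helper below are illustrative only, not the repository's actual implementation:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS over a 2-D occupancy grid (0 = free, 1 = blocked).

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}  # maps each visited cell to its predecessor
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk predecessors back to the start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A 3x3 room with a wall across most of the middle row.
room = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = shortest_path(room, (0, 0), (2, 0))
print(route)
```

Each cell in the returned path can then be turned into a spoken instruction ("go straight", "turn right"), which is the kind of output `commandes.py` is responsible for voicing.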
- Raspberry Pi 4 running Raspberry Pi OS (Raspbian)
- Pi Camera Module v4
- IMU Sensor (MPU6050)
- Ultrasonic Sensor (HC-SR04)
- AprilTags (must be printed and placed in the environment)
- Python 3.8+
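For reference, the HC-SR04 does not report distance directly: it returns an echo pulse whose width is the sound's round-trip time, which must be converted using the speed of sound (~343 m/s). GPIO wiring and pulse timing are omitted here; `echo_to_distance_cm` is an illustrative helper, not part of the repository:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C

def echo_to_distance_cm(echo_seconds):
    """Convert an HC-SR04 echo pulse width to one-way distance in cm.

    The pulse spans the round trip (out to the obstacle and back), so halve it.
    """
    return (echo_seconds * SPEED_OF_SOUND_M_S / 2.0) * 100.0

# A 2.9 ms echo corresponds to roughly half a metre.
print(round(echo_to_distance_cm(0.0029), 1))
```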
- This system must be run on a Raspberry Pi.
- Before using the navigation system, run camera_calibration.py to calibrate the Pi Camera.
- AprilTags must be printed and placed in the environment for accurate position tracking.
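Calibration estimates the camera's intrinsic matrix and distortion coefficients, which AprilTag pose estimation depends on. As a rough illustration of what the intrinsics mean (the matrix values below are made up, not output from camera_calibration.py), a 3-D point in camera coordinates projects to pixels like this:

```python
import numpy as np

# Hypothetical intrinsics of the kind calibration estimates:
# fx, fy = focal lengths in pixels; cx, cy = principal point.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_3d):
    """Project a 3-D point in camera coordinates to pixel coordinates."""
    p = K @ np.asarray(point_3d, dtype=float)
    return p[:2] / p[2]  # perspective divide

# A point 2 m ahead and 0.1 m to the right lands slightly right of image centre.
u, v = project([0.1, 0.0, 2.0])
print(u, v)
```

An inaccurate `K` skews this projection, which is why skipping calibration degrades the estimated distance and bearing to each tag.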
On Raspberry Pi, run:
sudo raspi-config
Then:
- Enable the Camera interface
- Enable I2C for the IMU
- Reboot the Raspberry Pi
sudo apt install python3-venv -y
python3 -m venv irchad-env
source irchad-env/bin/activate
pip install --upgrade pip
sudo apt update && sudo apt upgrade -y
sudo apt install i2c-tools libcamera-apps python3-opencv
pip install numpy scipy matplotlib pillow
pip install opencv-python picamera2
pip install smbus2 mpu6050-raspberrypi adafruit-blinka
pip install RPi.GPIO
pip install pyttsx3
pip install ultralytics dt-apriltags
pip install shapely networkx
python3 tags/camera_calibration.py
python3 tags/navigation.py