An AI system that fuses live earthquake data, real-time weather signals, and NLP-based human distress detection into a unified disaster risk score — with explainable AI showing exactly WHY each decision was made.
When disasters strike, emergency responders face three critical questions:
- Where is the danger right now?
- How severe is the risk in each location?
- Why is that area flagged as high risk?

Existing systems answer these questions in silos: seismic agencies track earthquakes, weather services track storms, and social media monitors track human signals. No open-source system fuses all three in real time.
DisasterSenseAI solves this.
```
Live Seismic Data (USGS) ───────┐
                                │
Live Weather Data (OpenWeather) ┼──► XGBoost ML Model ──► Risk Score + SHAP Explanation
                                │            ▲
NLP Distress Signals ───────────┘            │
(HuggingFace Transformers)          Trained on 5,000 scenarios
```
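The fusion step can be sketched as a simple feature builder. The field names and feature order below are assumptions for illustration; the actual schema lives in `risk_predictor.py`:

```python
def build_feature_vector(quake, weather, distress_count):
    """Fuse one location's seismic, weather, and NLP signals into
    the flat feature list the model consumes (hypothetical schema)."""
    return [
        quake.get("magnitude", 0.0),    # Richter magnitude from USGS
        quake.get("depth_km", 10.0),    # shallower quakes are more damaging
        weather.get("wind_speed", 0.0), # m/s from OpenWeather
        weather.get("temp_c", 20.0),
        float(distress_count),          # SOS messages flagged by the NLP module
    ]

features = build_feature_vector(
    {"magnitude": 6.1, "depth_km": 8.0},
    {"wind_speed": 14.2, "temp_c": 31.0},
    distress_count=3,
)
print(features)  # [6.1, 8.0, 14.2, 31.0, 3.0]
```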
| Source | Data | Update Frequency |
|---|---|---|
| USGS Earthquake API | Magnitude, location, depth | Every 5 minutes |
| OpenWeather API | Wind speed, temperature, storm severity | Every 5 minutes |
| HuggingFace NLP | Distress signal detection from text | On demand |
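Consuming the USGS feed is a straightforward GeoJSON parse. The feed URL below is USGS's real public hourly summary; the output dictionary shape is an illustrative choice, not necessarily what the repo uses:

```python
import json
import urllib.request

USGS_FEED = ("https://earthquake.usgs.gov/earthquakes/feed/v1.0/"
             "summary/all_hour.geojson")

def parse_quakes(geojson):
    """Extract magnitude, coordinates, depth, and place name
    from a USGS GeoJSON summary feed."""
    quakes = []
    for feature in geojson.get("features", []):
        lon, lat, depth = feature["geometry"]["coordinates"]
        props = feature["properties"]
        quakes.append({
            "magnitude": props.get("mag"),
            "lat": lat, "lon": lon, "depth_km": depth,
            "place": props.get("place"),
        })
    return quakes

# Live usage (requires network access):
# with urllib.request.urlopen(USGS_FEED, timeout=10) as resp:
#     quakes = parse_quakes(json.load(resp))
```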
- Live Earthquake Map — Real-time USGS data plotted on interactive world map
- Weather Fusion — OpenWeather API layered with seismic signals per location
- NLP Distress Detection — Zero-shot HuggingFace classifier detects SOS messages with confidence scores
- XGBoost Risk Model — 94% accuracy, 0.9777 AUC-ROC, trained on 5,000 disaster scenarios
- SHAP Explainability — Every prediction comes with a bar chart explaining the top risk factors
- Live Alert Panel — Auto-ranked danger zones with expandable SHAP explanations
- Custom Scenario Predictor — Input any values and get instant AI prediction with explanation
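The zero-shot distress detection likely wraps the standard `transformers` pipeline, sketched in the comments below (the `0.8` confidence threshold and the two candidate labels are assumptions, not values taken from `distress_detector.py`):

```python
# A zero-shot classifier with facebook/bart-large-mnli is typically built as:
#   from transformers import pipeline
#   clf = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
#   result = clf("Trapped under rubble, please send help!",
#                candidate_labels=["distress", "normal"])
# It returns {"labels": [...], "scores": [...]} sorted by descending score.

def is_distress(result, threshold=0.8):
    """Flag a message as an SOS when the top zero-shot label is 'distress'
    and its confidence clears the threshold (threshold is an assumption)."""
    return result["labels"][0] == "distress" and result["scores"][0] >= threshold

print(is_distress({"labels": ["distress", "normal"], "scores": [0.97, 0.03]}))  # True
print(is_distress({"labels": ["normal", "distress"], "scores": [0.90, 0.10]}))  # False
```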
| Layer | Technology |
|---|---|
| ML Model | XGBoost (94% accuracy, 0.9777 AUC-ROC) |
| Explainability | SHAP (TreeExplainer) |
| NLP | HuggingFace Transformers (facebook/bart-large-mnli) |
| Seismic Data | USGS Earthquake API |
| Weather Data | OpenWeatherMap API |
| Frontend | Streamlit |
| Maps | Folium + streamlit-folium |
| Deployment | Streamlit Cloud |
```bash
git clone https://github.com/Suresh-1116/DisasterSenseAI.git
cd DisasterSenseAI

python -m venv venv
venv\Scripts\activate       # Windows
source venv/bin/activate    # Mac/Linux

pip install -r requirements.txt
```

Create a `.env` file in the root folder:
```
OPENWEATHER_API_KEY=your_api_key_here
```
Get a free key at openweathermap.org
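Loading that key at runtime might look like the sketch below, assuming the app uses `python-dotenv` (an assumption; the import is guarded so plain shell environment variables also work):

```python
import os

try:
    from dotenv import load_dotenv  # provided by the python-dotenv package
    load_dotenv()                   # reads .env into the process environment
except ImportError:
    pass  # fall back to variables already exported in the shell

api_key = os.getenv("OPENWEATHER_API_KEY", "")
if not api_key:
    print("Warning: OPENWEATHER_API_KEY is not set; weather fetches will fail")
```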
```bash
python train_model.py
python -m streamlit run app.py
```

| Metric | Score |
|---|---|
| Accuracy | 94% |
| AUC-ROC | 0.9777 |
| Precision (High Risk) | 95% |
| Recall (High Risk) | 98% |
| F1 Score | 0.96 |
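As a sanity check, the reported F1 score follows directly from the high-risk precision and recall:

```python
# F1 is the harmonic mean of precision and recall.
precision, recall = 0.95, 0.98
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.96
```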
The model learned that magnitude and distress signal count are the strongest predictors of disaster risk, followed by wind speed and population density.
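That ranking can be recovered from a matrix of SHAP values by taking the mean absolute contribution per feature. The sketch below uses a toy matrix and hypothetical feature names; the repo's actual pipeline goes through `shap.TreeExplainer`:

```python
import numpy as np

def rank_features(shap_values, feature_names):
    """Order features by mean |SHAP value| across samples
    (a standard measure of global importance)."""
    importance = np.abs(shap_values).mean(axis=0)
    order = np.argsort(importance)[::-1]
    return [feature_names[i] for i in order]

# Toy SHAP matrix: rows = samples, columns = features.
shap_values = np.array([
    [ 0.9, 0.5, 0.2, 0.1],
    [-0.8, 0.6, 0.3, 0.1],
])
names = ["magnitude", "distress_count", "wind_speed", "population_density"]
print(rank_features(shap_values, names))
# ['magnitude', 'distress_count', 'wind_speed', 'population_density']
```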
```
DisasterSenseAI/
│
├── app.py                 # Main Streamlit dashboard
├── distress_detector.py   # HuggingFace NLP module
├── risk_predictor.py      # ML prediction + SHAP explanation
├── train_model.py         # XGBoost model training pipeline
│
├── disaster_model.pkl     # Trained XGBoost model
├── shap_explainer.pkl     # SHAP TreeExplainer
│
├── requirements.txt       # Python dependencies
├── .env                   # API keys (not committed)
└── .gitignore
```
Building this project from scratch taught me:
- Multi-modal AI — fusing structured data (seismic, weather) with unstructured data (text signals)
- Explainable AI — using SHAP to make ML decisions transparent and trustworthy
- Real-time data pipelines — connecting live APIs and handling failures gracefully
- Production deployment — Docker concepts, environment secrets, Streamlit Cloud
- Integrate real Twitter/X API for live distress signal detection
- Add satellite imagery analysis using YOLOv8 for building damage detection
- Historical simulation mode using past disaster datasets (e.g., the 2018 Kerala floods)
- SMS/email alert system for high-risk zone notifications
- Mobile-responsive UI
V Suresh Kumar
- GitHub: @Suresh-1116
- LinkedIn: suresh-kumar-43a458255
- Email: vsureshkumar1116@gmail.com
MIT License — feel free to use and build on this project.
If this project helped you, please give it a star!


