
Mixture-of-Experts for Multimodal Fusion

This repository contains the implementation of the paper FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion.

Set Up Environment

Run the following commands to create a conda environment:

conda create -n MulEHR python=3.8
conda activate MulEHR
pip install -r requirements.txt

Repository Structure

  • src/: Source code
    • preprocessing/: Scripts for MIMIC-III and MIMIC-IV data preprocessing
    • core/: Core implementation of the MoE and irregularity/modality encoder modules (an illustrative MoE sketch follows this list)
    • scripts/: Scripts to run experiments in different settings
    • utils/: Hyperparameters, I/O, and utility functions
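
For orientation, the sketch below shows the generic mixture-of-experts pattern that core/ builds on: a learned router scores a set of expert networks per input and combines the outputs of the top-k experts. This is a minimal, hypothetical PyTorch example, not the repository's code; the class name MoEFusion and all dimensions are illustrative, and the paper's actual gating function and encoders differ.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFusion(nn.Module):
    # Generic top-k mixture-of-experts fusion layer (illustrative only,
    # not the FuseMoE implementation from this repository).
    def __init__(self, input_dim, hidden_dim, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(input_dim, num_experts)  # learned router
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, input_dim),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (batch, input_dim), e.g. concatenated per-modality embeddings
        scores = self.gate(x)                           # (batch, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep the top-k experts
        weights = F.softmax(weights, dim=-1)            # normalize over kept experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                # samples routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

# Example: route a batch of 8 fused 256-dimensional embeddings
fusion = MoEFusion(input_dim=256, hidden_dim=512)
y = fusion(torch.randn(8, 256))  # y has shape (8, 256)

Sparse top-k routing like this keeps the per-sample compute roughly constant as the number of experts grows, which is the usual motivation for MoE-style fusion.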

Run Experiments

Under src/scripts/:

MIMIC-III experiments

sh run.sh

MIMIC-IV experiments

sh run_mimiciv.sh

Load Results

First update the file path in load_result.py, then run:

python load_result.py
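
As a rough illustration of what this step involves, the sketch below loads a pickled results file and prints its contents. The path results/output.pkl and the metric keys are hypothetical; inspect load_result.py for the actual path variable and output format used by this repository.

import pickle

result_path = "results/output.pkl"  # hypothetical path; set to your actual output file

with open(result_path, "rb") as f:
    metrics = pickle.load(f)

# Key names are placeholders; inspect the loaded object for the real keys.
for name, value in metrics.items():
    print(f"{name}: {value}")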

Acknowledgement

Parts of our implementation are based on the following papers:

Citation

@article{han2024fusemoe,
  title={FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion},
  author={Han, Xing and Nguyen, Huy and Harris, Carl and Ho, Nhat and Saria, Suchi},
  journal={arXiv preprint arXiv:2402.03226},
  year={2024}
}
