ChartMoE

ICLR 2025 Oral

arXiv

Project Page

Github Repo

Hugging Face Model

ChartMoE is a multimodal large language model with a Mixture-of-Experts connector, built on InternLM-XComposer2 for advanced chart tasks: 1) understanding, 2) replotting, 3) editing, 4) highlighting, and 5) transformation.
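
For reference, a minimal sketch of loading a ChartMoE-style model with 🤗 Transformers. The repo ID and dtype here are assumptions for illustration; see the Github repo and Hugging Face model links above for the official checkpoint and usage:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Repo ID is an assumption; substitute the actual ChartMoE checkpoint.
model = AutoModel.from_pretrained(
    "IDEA-FinAI/chartmoe",
    trust_remote_code=True,   # the model ships custom InternLM-XComposer2-based code
    torch_dtype=torch.bfloat16,
).eval()
tokenizer = AutoTokenizer.from_pretrained(
    "IDEA-FinAI/chartmoe",
    trust_remote_code=True,
)
```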

This is a reproduction of the diversely-aligned MoE connector. Please feel free to use it for continued SFT training! A loading sketch is shown below.
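
A minimal sketch of fetching the aligned connector weights and loading them before continued SFT. The checkpoint filename and the `moe_connector` attribute are assumptions; check this repo's file list and the ChartMoE code for the actual names:

```python
import torch
from huggingface_hub import hf_hub_download

# Download the aligned MoE-connector checkpoint from this repo.
ckpt_path = hf_hub_download(
    repo_id="Coobiw/ChartMoE-Aligned-Connector",
    filename="connector.pth",  # assumed filename; verify in the repo's file list
)

# Load the connector state dict on CPU.
connector_state = torch.load(ckpt_path, map_location="cpu")

# Load it into an InternLM-XComposer2-based ChartMoE model before continued SFT.
# `model.moe_connector` is a hypothetical attribute name for illustration:
# model.moe_connector.load_state_dict(connector_state)
```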

Open Source License

The data is licensed under Apache-2.0.
