ChartMoE (ICLR 2025 Oral)
[ICLR2025 Oral] ChartMoE: Mixture of Diversely Aligned Expert Connector for Chart Understanding
github: https://github.com/IDEA-FinAI/ChartMoE
ChartMoE is a multimodal large language model with a Mixture-of-Experts connector, built on InternLM-XComposer2, for advanced chart 1) understanding, 2) replotting, 3) editing, 4) highlighting, and 5) transformation.
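To illustrate the idea behind a Mixture-of-Experts connector, here is a minimal NumPy sketch: a router softly gates each visual token across several expert projectors, and the gated outputs are summed before being fed to the LLM. All shapes, names, and the dense (non-top-k) gating are illustrative assumptions, not ChartMoE's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d_vis, d_llm, n_experts, n_tokens = 8, 16, 4, 5  # hypothetical sizes

# One linear projector per expert (each maps vision dim -> LLM dim)
experts = [rng.standard_normal((d_vis, d_llm)) * 0.02 for _ in range(n_experts)]
# Router scores each token against each expert
router = rng.standard_normal((d_vis, n_experts)) * 0.02

tokens = rng.standard_normal((n_tokens, d_vis))        # visual tokens
gates = softmax(tokens @ router)                        # (n_tokens, n_experts)
outs = np.stack([tokens @ W for W in experts], axis=1)  # (n_tokens, n_experts, d_llm)
connected = (gates[..., None] * outs).sum(axis=1)       # (n_tokens, d_llm)

print(connected.shape)  # (5, 16)
```

In practice the experts would be initialized from the diversely aligned connectors and the router trained jointly; this sketch only shows the gating arithmetic.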
This is a reproduction of the diversely aligned MoE connector; please feel free to use it for continued SFT training!
The data is licensed under Apache-2.0.