Tags: Safetensors · English · llava_next · remote-sensing
AdaptLLM committed (verified) · Commit 74948d1 · Parent(s): 1765dfd

Update README.md

Files changed (1): README.md (+3 −3)
README.md CHANGED

````diff
@@ -9,7 +9,7 @@ tags:
 datasets:
 - AdaptLLM/remote-sensing-visual-instructions
 ---
-# Adapting Multimodal Large Language Models to Domains via Post-Training
+# Adapting Multimodal Large Language Models to Domains via Post-Training (EMNLP 2025)
 
 This repo contains the **remote sensing MLLM developed from LLaVA-NeXT-Llama3-8B** in our paper: [On Domain-Specific Post-Training for Multimodal Large Language Models](https://huggingface.co/papers/2411.19930).
 
@@ -71,10 +71,10 @@ See [Post-Train Guide](https://github.com/bigai-ai/QA-Synthesizer/blob/main/docs
 ## Citation
 If you find our work helpful, please cite us.
 
-[AdaMLLM](https://huggingface.co/papers/2411.19930)
+[Adapt MLLM to Domains](https://huggingface.co/papers/2411.19930) (EMNLP 2025 Findings)
 ```bibtex
 @article{adamllm,
-  title={On Domain-Specific Post-Training for Multimodal Large Language Models},
+  title={On Domain-Adaptive Post-Training for Multimodal Large Language Models},
   author={Cheng, Daixuan and Huang, Shaohan and Zhu, Ziyu and Zhang, Xintong and Zhao, Wayne Xin and Luan, Zhongzhi and Dai, Bo and Zhang, Zhenliang},
   journal={arXiv preprint arXiv:2411.19930},
   year={2024}
````