[CVPR 2026 Oral] FINER: MLLMs Hallucinate under Fine-grained Negative Queries

Paper | Project Page | Models | FINER-Tuning Data

Authors: Rui Xiao, Sanghwan Kim, Yongqin Xian, Zeynep Akata, Stephan Alaniz

News

  • [2026-04-08] 🎉 Our paper was accepted to CVPR 2026 as an Oral Presentation.

Abstract

Multimodal large language models (MLLMs) struggle with hallucinations, particularly with fine-grained queries, a challenge underrepresented by existing benchmarks that focus on coarse image-related questions. We introduce FIne-grained NEgative queRies (FINER), alongside two benchmarks: FINER-CompreCap and FINER-DOCCI. Using FINER, we analyze hallucinations across four settings: multi-object, multi-attribute, multi-relation, and “what” questions. Our benchmarks reveal that MLLMs hallucinate when fine-grained mismatches co-occur with genuinely present elements in the image. To address this, we propose FINER-Tuning, which applies Direct Preference Optimization (DPO) to FINER-inspired data. Finetuning four frontier MLLMs with FINER-Tuning yields gains of up to 24.2% on our hallucination benchmarks, while simultaneously improving performance on eight existing hallucination suites and enhancing general multimodal capabilities across six benchmarks.
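
For readers new to the setup, here is an illustrative example of a fine-grained negative query (made up for this README, not an item from FINER-CompreCap or FINER-DOCCI): one detail of an otherwise matching description is swapped, so the correct answer is "No".

# Illustrative only; these strings are invented, not drawn from the benchmarks.
positive_caption = "a red mug on a wooden table"  # what the image actually shows
negative_query = "Is there a blue mug on a wooden table in this image?"
expected_answer = "No"  # the attribute mismatch ("blue" vs. "red") makes this a negative query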

Methodology

Please refer to our project page, where we walk you through the paper in detail.

FINER-Benchmarks

FINER-Tuning
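
As the abstract notes, FINER-Tuning applies Direct Preference Optimization (DPO) to FINER-inspired preference data. For background, below is a minimal PyTorch sketch of the standard DPO objective (Rafailov et al., 2023); this is generic reference code, not the authors' implementation, and all function and argument names are our own.

import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    # Each input holds the summed log-probability of the chosen / rejected
    # response under the trainable policy or the frozen reference model.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Logistic loss on the reward margin: pushes the policy to prefer the
    # chosen (non-hallucinating) response over the rejected one.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()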

Pre-trained Models

We have released the pre-trained FINER models on Hugging Face.

Code coming soon.
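
Until the code is released, models published on the Hugging Face Hub can usually be loaded with the transformers library. The snippet below is a hypothetical sketch: the repository ID is a placeholder (see the Models link above for the actual checkpoints), and the appropriate Auto class depends on the base MLLM.

from transformers import AutoModelForCausalLM, AutoProcessor

repo_id = "ExplainableML/finer-model"  # placeholder, not a confirmed repo ID
processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)  # or the Auto class matching the base model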

Citations

If you find our work useful, please star this repo and cite:

@inproceedings{xiao2026finer,
  title={FINER: MLLMs Hallucinate under Fine-grained Negative Queries},
  author={Xiao, Rui and Kim, Sanghwan and Xian, Yongqin and Akata, Zeynep and Alaniz, Stephan},
  booktitle={CVPR},
  year={2026}
}
