arXiv:2401.09861 [cs.CV]

Temporal Insight Enhancement: Mitigating Temporal Hallucination in Multimodal Large Language Models

Li Sun, Liuan Wang, Jun Sun, Takayuki Okatani

Published 2024-01-18 (Version 1)

Recent advancements in Multimodal Large Language Models (MLLMs) have significantly enhanced the comprehension of multimedia content, bringing together diverse modalities such as text, images, and videos. However, a critical challenge faced by these models, especially when processing video inputs, is the occurrence of hallucinations: erroneous perceptions or interpretations, particularly at the event level. This study introduces a method to address event-level hallucinations in MLLMs, focusing on temporal understanding of video content. Our approach leverages a novel framework that extracts and utilizes event-specific information from both the event query and the provided video to refine MLLMs' responses. We propose a mechanism that decomposes on-demand event queries into iconic actions, then employs models such as CLIP and BLIP2 to predict specific timestamps for event occurrences. Our evaluation, conducted on the Charades-STA dataset, demonstrates a significant reduction in temporal hallucinations and an improvement in the quality of event-related responses. This research not only offers a new perspective on a critical limitation of MLLMs but also contributes a quantitatively measurable method for evaluating MLLMs on temporal questions.
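
The abstract describes a pipeline in which event queries are decomposed into iconic actions and vision-language models such as CLIP are used to localize those actions in time. The sketch below is only an illustration of that general idea, not the authors' implementation: it samples frames from a video with OpenCV, scores each frame against a single action phrase with the Hugging Face CLIP model, and returns the timestamp of the best-scoring frame. The function name, sampling rate, and single-phrase setup are assumptions made for this example.

```python
# Illustrative sketch (not the paper's framework): localize an action phrase in
# a video by scoring sampled frames with CLIP and mapping the best frame back
# to a timestamp.
import cv2
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")


def predict_event_timestamp(video_path: str, action_phrase: str,
                            fps_sample: float = 1.0) -> float:
    """Return the timestamp (seconds) of the frame most similar to the phrase."""
    cap = cv2.VideoCapture(video_path)
    native_fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # assume 30 fps if unknown
    step = max(int(native_fps / fps_sample), 1)

    frames, timestamps = [], []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            # CLIP expects RGB images; OpenCV decodes frames as BGR.
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            timestamps.append(idx / native_fps)
        idx += 1
    cap.release()

    # Score every sampled frame against the single action phrase.
    inputs = processor(text=[action_phrase], images=frames,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        scores = model(**inputs).logits_per_image.squeeze(-1)  # one score per frame

    return timestamps[int(scores.argmax())]
```

In practice, a full event query would be decomposed into several such action phrases (the paper mentions "iconic actions"), each localized separately, with the resulting timestamps fed back to the MLLM to ground its answer; the sketch covers only the frame-scoring step.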

Related articles:
arXiv:2402.12750 [cs.CV] (Published 2024-02-20, updated 2024-07-26)
Model Composition for Multimodal Large Language Models
Chi Chen et al.
arXiv:2404.09204 [cs.CV] (Published 2024-04-14)
TextHawk: Exploring Efficient Fine-Grained Perception of Multimodal Large Language Models
arXiv:2402.12451 [cs.CV] (Published 2024-02-19, updated 2024-06-06)
The Revolution of Multimodal Large Language Models: A Survey