
Title Generation for User Generated Videos

2016-09-08
Kuo-Hao Zeng, Tseng-Hung Chen, Juan Carlos Niebles, Min Sun

Abstract

A great video title describes the most salient event compactly and captures the viewer's attention. In contrast, video captioning tends to generate sentences that describe the video as a whole. Although generating a video title automatically is a very useful task, it is much less addressed than video captioning. We address video title generation for the first time by proposing two methods that extend state-of-the-art video captioners to this new task. First, we make video captioners highlight-sensitive by priming them with a highlight detector. Our framework allows for jointly training a model for title generation and video highlight localization. Second, we induce high sentence diversity in video captioners, so that the generated titles are also diverse and catchy. Learning the sentence structure of titles in this way may require a large number of example sentences, so we propose a novel sentence augmentation method to train a captioner with additional sentence-only examples that come without corresponding videos. We collected a large-scale Video Titles in the Wild (VTW) dataset of 18,100 automatically crawled user-generated videos and titles. On VTW, our methods consistently improve title prediction accuracy and achieve the best performance in both automatic and human evaluation. Finally, our sentence augmentation method also outperforms the baselines on the M-VAD dataset.
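The abstract only outlines the two methods, so the sketch below is a rough illustration of how they might fit together, not the paper's actual architecture: a per-segment highlight score weights the pooled video representation (highlight-sensitive priming), and sentence-only augmentation examples decode from a learned placeholder vector instead of video features. All module names, dimensions, and the placeholder-vector trick are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's architecture) of:
#  (1) highlight-sensitive pooling of segment features for title decoding,
#  (2) sentence augmentation via a learned stand-in for missing video input.
import torch
import torch.nn as nn

class HighlightSensitiveTitler(nn.Module):
    def __init__(self, feat_dim=2048, hid_dim=512, vocab_size=10000):
        super().__init__()
        self.seg_enc = nn.GRU(feat_dim, hid_dim, batch_first=True)
        self.highlight = nn.Linear(hid_dim, 1)             # per-segment highlight score
        self.no_video = nn.Parameter(torch.zeros(hid_dim)) # placeholder for sentence-only examples
        self.embed = nn.Embedding(vocab_size, hid_dim)
        self.dec = nn.GRU(hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, segments, title_tokens):
        # segments: (B, T, feat_dim), or None for sentence-only (augmented) examples
        if segments is None:
            video_vec = self.no_video.expand(title_tokens.size(0), -1)
            scores = None
        else:
            h, _ = self.seg_enc(segments)                               # (B, T, hid_dim)
            scores = torch.softmax(self.highlight(h).squeeze(-1), dim=1)  # (B, T)
            video_vec = (scores.unsqueeze(-1) * h).sum(dim=1)           # highlight-weighted pooling
        # Condition the title decoder on the (possibly placeholder) video vector.
        dec_out, _ = self.dec(self.embed(title_tokens),
                              video_vec.unsqueeze(0).contiguous())
        return self.out(dec_out), scores  # token logits + highlight scores
```

Under these assumptions, joint training would combine the usual cross-entropy loss on title tokens with a localization loss on `scores` against ground-truth highlight segments, while sentence-augmented batches simply pass `segments=None`.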


URL

https://arxiv.org/abs/1608.07068

PDF

https://arxiv.org/pdf/1608.07068

