ACM Transactions on the Web

BNoteToDanmu: Category-Guided Note-to-Danmu Conversion Method for Learning on Video Sharing Platforms

Publication type: Journal Article
Publication date: 2025-03-13
Scimago: Q2
WoS: Q2
SJR: 0.787
CiteScore: 4.9
Impact factor: 2.6
ISSN: 1559-1131, 1559-114X
Abstract

Danmu (or “bullet screen”), a popular feature on video sharing platforms, plays a crucial role in facilitating knowledge sharing and learning. In recent years, automatic danmu generation methods have drawn increasing attention. However, existing methods mostly rely on limited content sources, such as the video itself (e.g., subtitles) and neighboring danmus, while other valuable sources remain underexplored. To this end, this paper proposes a Category-Guided Note-to-Danmu conversion model (CG-NTD) that leverages user-generated notes. The model is designed to identify unique content within the notes and convert it into danmus, while also indicating the source note categories. CG-NTD first classifies the notes by fusing them with subtitle and neighboring-danmu features. It then uses a cross-attention mechanism to integrate the note’s category feature with the note, subtitle, and danmu contexts, identifying three keywords from the notes as the generated danmus. Using Bilibili as the research site, we implement a plugin prototype named BNoteToDanmu. Automatic and human evaluations reveal that CG-NTD outperforms BiLSTM, mT5, and BERT baselines in Precision, Recall, and F1-score, and generates more understandable and relevant danmus than ChatGPT. Moreover, the plugin demonstrates promising applications, such as assisting users in viewing videos, posting danmus, and recognizing high-quality notes. These findings offer insights into leveraging user creations to generate danmus and enhance their learning value on video sharing platforms.
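The cross-attention step described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch only: the dimension `d`, the single-vector category query, and the function names are assumptions for exposition, and the actual CG-NTD model presumably uses learned query/key/value projections and trained features rather than random ones.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, context, d_k):
    # query:   (1, d) feature acting as the attention query
    #          (here, the note's category feature).
    # context: (n, d) token features acting as keys and values
    #          (here, note, subtitle, or danmu context tokens).
    scores = query @ context.T / np.sqrt(d_k)  # (1, n) scaled dot-product scores
    weights = softmax(scores)                  # attention weights over context tokens
    return weights @ context                   # (1, d) category-guided fused representation

# Toy example with random stand-in features (hypothetical shapes).
rng = np.random.default_rng(0)
d = 8
category_feat = rng.standard_normal((1, d))   # note-category feature (query)
note_context = rng.standard_normal((5, d))    # note token features (keys/values)
fused = cross_attention(category_feat, note_context, d)
```

The fused `(1, d)` representation would then feed the downstream keyword-selection step that picks three keywords from the note as the generated danmu.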
