Trust it or not: Understanding users’ motivations and strategies for assessing the credibility of AI-generated information

Publication type: Journal Article
Publication date: 2024-11-08
Scimago: Q1
WoS: Q1
SJR: 2.400
CiteScore: 13.7
Impact factor: 4.3
ISSN: 1461-4448, 1461-7315
Abstract

The evolution of artificial intelligence (AI) facilitates the creation of multimodal information of mixed quality, intensifying the challenges individuals face when assessing information credibility. Through in-depth interviews with users of generative AI platforms, this study investigates the underlying motivations and multidimensional approaches people use to assess the credibility of AI-generated information. Four major motivations driving users to authenticate information are identified: expectancy violation, task features, personal involvement, and pre-existing attitudes. Users evaluate AI-generated information’s credibility using both internal (e.g. relying on AI affordances, content integrity, and subjective expertise) and external approaches (e.g. iterative interaction, cross-validation, and practical testing). Theoretical and practical implications are discussed in the context of AI-generated content assessment.


Top-30

Journals

Proceedings of the Association for Information Science and Technology
2 publications, 10.53%
Frontiers in Psychology
1 publication, 5.26%
Asia Pacific Journal of Tourism Research
1 publication, 5.26%
Discover Computing
1 publication, 5.26%
Systems
1 publication, 5.26%
Technology in Society
1 publication, 5.26%
Organizacija
1 publication, 5.26%
Information Processing and Management
1 publication, 5.26%
Scientific Reports
1 publication, 5.26%
Journal of Theoretical and Applied Electronic Commerce Research
1 publication, 5.26%
Electronic Library
1 publication, 5.26%
International Journal of Human-Computer Interaction
1 publication, 5.26%
Journal of Open Innovation: Technology, Market, and Complexity
1 publication, 5.26%
International Journal of Information Management
1 publication, 5.26%
Information Systems Engineering and Management
1 publication, 5.26%
Innovation in Language Learning and Teaching
1 publication, 5.26%
Telecommunications Policy
1 publication, 5.26%
Information Communication and Society
1 publication, 5.26%

Publishers

Elsevier
5 publications, 26.32%
Taylor & Francis
4 publications, 21.05%
Springer Nature
3 publications, 15.79%
MDPI
2 publications, 10.53%
Wiley
2 publications, 10.53%
Frontiers Media S.A.
1 publication, 5.26%
Walter de Gruyter
1 publication, 5.26%
Emerald
1 publication, 5.26%
  • We do not take into account publications without a DOI.
  • Statistics recalculated weekly.

Metrics
Citations: 19
Cite this

GOST
Ou M. et al. Trust it or not: Understanding users’ motivations and strategies for assessing the credibility of AI-generated information // New Media and Society. 2024.
GOST (all authors)
Ou M., Zheng H., Zeng Y., Hansen P. Trust it or not: Understanding users’ motivations and strategies for assessing the credibility of AI-generated information // New Media and Society. 2024.
RIS
TY - JOUR
DO - 10.1177/14614448241293154
UR - https://journals.sagepub.com/doi/10.1177/14614448241293154
TI - Trust it or not: Understanding users’ motivations and strategies for assessing the credibility of AI-generated information
T2 - New Media and Society
AU - Ou, Mengxue
AU - Zheng, Han
AU - Zeng, Yueliang
AU - Hansen, Preben
PY - 2024
DA - 2024/11/08
PB - SAGE
SN - 1461-4448
SN - 1461-7315
ER -
BibTeX
@article{2024_Ou,
author = {Mengxue Ou and Han Zheng and Yueliang Zeng and Preben Hansen},
title = {Trust it or not: Understanding users’ motivations and strategies for assessing the credibility of AI-generated information},
journal = {New Media and Society},
year = {2024},
publisher = {SAGE},
month = {nov},
url = {https://journals.sagepub.com/doi/10.1177/14614448241293154},
doi = {10.1177/14614448241293154}
}