Labeling Synthetic Content: User Perceptions of Warning Label Designs for AI-generated Content on Social Media
FOS: Computer and information sciences
Subjects: Computers and Society (cs.CY); Artificial Intelligence (cs.AI); Emerging Technologies (cs.ET); Human-Computer Interaction (cs.HC)
ACM classes: H.4.0; J.7; H.5.1
DOI:
10.31234/osf.io/p5t3v_v2
Publication Date:
2025-02-20T11:59:24Z
AUTHORS (4)
ABSTRACT
In this research, we explored the efficacy of various warning label designs for AI-generated content, such as deepfakes, on social media platforms. We devised and assessed ten distinct label designs that varied along the dimensions of sentiment, color/iconography, positioning, and level of detail. Our experimental study involved 911 participants randomly assigned to one of the ten label designs or a control group while evaluating social media content. We examined their perceptions with respect to 1) belief that the content is AI-generated, 2) trust in the labels, and 3) engagement with the content on social media. The results demonstrate that the presence of labels had a significant effect on users' belief that the content is AI-generated, deepfake, or edited by AI. However, their trust in the label varied significantly with the label design. Notably, the presence of labels did not significantly change engagement behaviors such as liking, commenting, and sharing, although engagement differed significantly between content types (political vs. entertainment). This investigation contributes to the field of human-computer interaction by defining a design space for label implementation and providing empirical support for the strategic use of labels to mitigate the risks associated with synthetically generated media.
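To make the between-subjects comparison described above concrete, the following is a minimal analysis sketch, not taken from the paper or its supplemental material: it simulates per-participant belief ratings for ten hypothetical label conditions plus a control (roughly 911 participants in total) and tests for condition differences with a one-way ANOVA, followed by a labels-vs-control contrast. All column names, group sizes, and simulated effects are illustrative assumptions.

# Hypothetical sketch of the study's between-subjects comparison: participants
# are randomly assigned to one of ten label designs or a control condition, and
# per-condition belief ratings are compared. Data and column names are illustrative.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

conditions = [f"label_{i}" for i in range(1, 11)] + ["control"]
rows = []
for cond in conditions:
    n = 83  # ~83 per condition approximates the reported N of 911 across 11 groups
    shift = 0.0 if cond == "control" else 0.6  # assume labels raise belief ratings
    rows.append(pd.DataFrame({
        "condition": cond,
        "belief_ai": rng.normal(loc=3.5 + shift, scale=1.0, size=n).clip(1, 7),
    }))
df = pd.concat(rows, ignore_index=True)

# One-way ANOVA: does mean belief that the content is AI-generated differ by condition?
groups = [g["belief_ai"].to_numpy() for _, g in df.groupby("condition")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Planned contrast: any label vs. control (Welch's t-test, unequal variances).
labeled = df.loc[df["condition"] != "control", "belief_ai"]
control = df.loc[df["condition"] == "control", "belief_ai"]
t_stat, p_contrast = stats.ttest_ind(labeled, control, equal_var=False)
print(f"labels vs. control: t = {t_stat:.2f}, p = {p_contrast:.4f}")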