图书情报知识 ›› 2025, Vol. 42 ›› Issue (2): 131-144. doi: 10.13366/j.dik.2025.02.131

• 情报、信息与共享 •

智能推荐用户的算法偏见感知影响机理研究

肖梓培1,2, 查先进1, 严亚兰3   

  1. 武汉大学信息管理学院,武汉,430072;
    2.武汉大学图书情报国家级实验教学示范中心,武汉,430072;
    3.武汉科技大学管理学院,武汉,430065
  • 出版日期:2025-03-10 发布日期:2025-05-03
  • 通讯作者: 查先进(ORCID: 0000-0001-6522-3414),博士,教授,研究方向:信息行为与信息治理、数字经济与数字信息资源管理、信息分析与竞争情报,Email: xianjinzha@163.com。
  • 作者简介:肖梓培(ORCID: 0009-0009-8904-4547),硕士研究生,研究方向:信息行为、信息系统,Email: 15079509025@163.com;严亚兰(ORCID: 0000-0003-4263-9278),博士,教授,研究方向:知识共享与知识管理、信息资源管理、信息系统,Email: yalanyan@163.com。
  • 基金资助:
    本文系国家社会科学基金重大项目“人工智能颠覆性应用的社会影响与信息治理研究”(23&ZD223)的研究成果之一。

The Influencing Mechanism of Intelligent Recommendation Users' Algorithmic Bias Perception

XIAO Zipei1,2, ZHA Xianjin1, YAN Yalan3   

  1. School of Information Management, Wuhan University, Wuhan, 430072;
    2. National Demonstration Center for Experimental Library and Information Science Education, Wuhan University, Wuhan, 430072;
    3. School of Management, Wuhan University of Science and Technology, Wuhan, 430065
  • Online: 2025-03-10 Published: 2025-05-03
  • Contact: Correspondence should be addressed to ZHA Xianjin, Email: xianjinzha@163.com, ORCID: 0000-0001-6522-3414
  • Supported by:
    This paper is an outcome of the Major Project "Research on Social Impacts of Disruptive Applications of Artificial Intelligence and Information Governance" (23&ZD223) supported by the National Social Science Foundation of China.

摘要: [目的/意义]智能推荐在减轻用户信息超载的同时,也让用户感受到了算法偏见。算法偏见感知反映了用户的主观感受,探索智能推荐用户的算法偏见感知影响机理,有助于减轻算法偏见带来的危害。[研究设计/方法]利用扎根理论探索智能推荐用户的算法偏见感知影响机理。在开放编码阶段,识别了175个初始概念和28个基本范畴。在主轴编码阶段,提取了10个主范畴。在选择编码阶段,确定“算法偏见感知”为核心范畴,构建了智能推荐用户的算法偏见感知影响机理模型。[结论/发现]算法素养、人格特质、心理状态、推荐窄化、差异比较、算法特性、社会环境可以直接影响算法偏见感知。同时,算法特性、智能推荐质量、社会环境可以通过心理状态中介影响算法偏见感知,推荐窄化对算法偏见感知的影响受到算法素养的调节。[创新/价值]结合用户体验来考察算法偏见,研究结果能够为用户抵抗算法偏见、平台纠正算法偏见等提供参考。

关键词: 智能推荐, 算法偏见感知, 用户体验, 影响机理, 扎根理论

Abstract: [Purpose/Significance] While intelligent recommendation systems alleviate users' information overload, they also lead users to perceive algorithmic bias. Because algorithmic bias perception reflects users' subjective feelings, exploring the influencing mechanism of intelligent recommendation users' algorithmic bias perception has important implications for reducing the harm caused by algorithmic bias. [Design/Methodology] Utilizing grounded theory, this study explored the influencing mechanism of intelligent recommendation users' algorithmic bias perception. In the open coding phase, 175 initial concepts and 28 basic categories were identified. In the axial coding phase, 10 main categories were extracted. In the selective coding phase, "algorithmic bias perception" was identified as the core category. Finally, a theoretical model of the influencing mechanism of intelligent recommendation users' algorithmic bias perception was developed. [Findings/Conclusion] The results indicate that algorithmic literacy, personality traits, psychological state, recommendation narrowing, difference comparison, algorithm characteristics, and social environment directly affect users' perception of algorithmic bias. Furthermore, algorithm characteristics, intelligent recommendation quality, and social environment influence algorithmic bias perception through the mediation of psychological state, and the impact of recommendation narrowing on algorithmic bias perception is moderated by algorithmic literacy. [Originality/Value] This study examines algorithmic bias from the perspective of user experience. The findings provide a reference for users seeking to resist algorithmic bias and for platforms seeking to correct it.

Keywords: Intelligent recommendation, Algorithmic bias perception, User experience, Influencing mechanism, Grounded theory