The Korean Society for Journalism & Communication Studies (KSJCS)
[ Article ]
Korean Journal of Journalism & Communication Studies - Vol. 68, No. 5, pp.165-195
ISSN: 2586-7369 (Online)
Print publication date 31 Oct 2024
Received 05 Jun 2024 Revised 16 Sep 2024 Accepted 25 Sep 2024
DOI: https://doi.org/10.20879/kjjcs.2024.68.5.005

유튜브 추천 알고리즘과 유튜브 지속적 이용의도의 관계 : 알고리즘의 투명성 원칙과 책무성 원칙에 대한 규제 태도의 매개효과를 중심으로

심홍진*** ; 박준혁****
***정보통신정책연구원 연구위원 hjshim@kisdi.re.kr
****펜실베이니아 대학교 박사과정 jhpark24@upenn.edu
Relationship Between YouTube’s Recommendation Algorithm and Continuous Usage Intention : The Mediating Effect of Regulatory Attitudes Toward Algorithmic Transparency and Accountability Principles
Hongjin Shim*** ; Joonhyeog Park****
***Research Fellow, Korea Information Society Development Institute, corresponding author hjshim@kisdi.re.kr
****Ph.D. Student, University of Pennsylvania jhpark24@upenn.edu

초록

본 연구는 유튜브 추천 알고리즘에 대한 이용자의 긍정적·부정적 인식이 유튜브 지속 이용 의도에 미치는 영향을 조사하고, 이들 변수 간의 관계에서 알고리즘 투명성과 책무성 원칙에 대한 이용자의 규제 태도의 매개효과를 검증하였다. 이러한 분석을 통해 유튜브 추천 알고리즘의 부정적 효과와 잠재적 역기능을 검토하고 이에 따른 학술적 시사점과 정책적 대응 방향을 제시하고자 하였다. 연구는 방송통신위원회와 정보통신정책연구원에서 수행 및 공개한 <2022년 지능정보사회 이용자 패널조사> 데이터(n=5,378)를 활용하여 분석을 수행하였다. 연구 결과는 다음과 같다. 첫째, 유튜브 추천 알고리즘에 대한 긍정적 인식은 유튜브 지속 이용의도에 직접적으로 정적 영향을 미칠 뿐 아니라, 알고리즘 투명성과 책무성 원칙에 대한 규제 태도를 통해 간접적으로도 긍정적 영향을 미치는 것으로 나타났다. 이는 유튜브 추천 알고리즘에 대한 긍정적 인식과 유튜브 지속 이용의도 간의 관계에서 알고리즘 투명성과 책무성 원칙에 대한 규제 태도의 매개효과가 유의미함을 시사한다. 둘째, 유튜브 추천 알고리즘에 대한 부정적 인식은 유튜브 지속 이용의도에 통계적으로 유의미한 영향을 미치지 않는 것으로 나타났다. 그러나 부정적 인식은 알고리즘 투명성과 책무성 원칙에 대한 규제 태도를 매개로 유튜브 지속 이용의도에 부정적인 영향을 미치는 것으로 확인되었다. 이와 같은 연구 결과를 토대로 본 연구는 유튜브 추천 알고리즘의 투명성과 책무성 원칙 강화를 통해 이용자의 긍정적 인식을 증진시키고, 부정적 인식을 완화하는 방향의 정책적 대응을 제언하였다. 나아가 유튜브 추천 알고리즘의 잠재적 역기능에 대한 지속적인 탐구와 검토의 필요성을 강조하였다.

Abstract

This study investigated the impact of users’ positive and negative perceptions of YouTube’s recommendation algorithm on their intention to continue using the platform. It also examined the mediating effects of regulatory attitudes toward the principles of algorithm transparency and accountability in the relationship between these perceptions and usage intention. The research aimed to explore both the positive and negative implications of YouTube’s recommendation system, offering academic and policy recommendations to improve user experience while mitigating potential risks. The study utilized data from the 2022 Intelligent Information Society User Panel Survey, conducted by the Korea Communications Commission and the Korea Information Society Development Institute, with a sample size of 5,378 respondents. This survey enabled a comprehensive assessment of users’ perceptions of YouTube’s recommendation system and their regulatory attitudes toward transparency and accountability. The key findings are as follows: First, positive perceptions of YouTube’s recommendation algorithm had a direct positive effect on users’ intention to continue using the platform, as well as an indirect effect mediated by regulatory attitudes toward transparency and accountability principles. This suggests that positive perceptions of the algorithm not only improve user satisfaction but also build trust in the platform when transparency and accountability principles are appropriately applied. Second, negative perceptions of YouTube’s recommendation algorithm had no significant direct effect on usage intention. However, when regulatory attitudes regarding transparency and accountability were included as mediators, a negative indirect effect on usage intention emerged. This implies that negative user perceptions of the algorithm may deter continued use if transparency and accountability are not adequately addressed. These findings highlight the dual role of transparency and accountability principles as both protective measures and trust-building factors. Enhancing transparency would ensure that users are informed about how recommendations are generated, increasing trust and satisfaction. Accountability, in turn, emphasizes corporate responsibility for the outcomes produced by the algorithm, which can help mitigate negative perceptions and maintain user engagement. The study stresses the importance of ongoing exploration and evaluation of potential dysfunctions of YouTube’s recommendation system, particularly the spread of misinformation, privacy issues, and the reinforcement of confirmation biases through filter bubbles. To address these challenges, the study recommends improving transparency and accountability within recommendation algorithms to boost users’ positive perceptions and reduce negative ones. Moreover, policymakers should consider user-centric regulatory frameworks that account for the complex dynamics between user perception, regulatory intervention, and technological affordances. This research provides insights into how algorithmic governance can be designed to enhance user trust and satisfaction, ensuring the sustainable and ethical deployment of AI-based recommendation systems on digital platforms.
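The mediation structure summarized above corresponds to a parallel multiple-mediator design of the kind estimated with Hayes’s PROCESS macro (Model 4, cited in the references), in which the two regulatory attitudes act as simultaneous mediators between algorithm perception and continued-usage intention, and indirect effects are tested with bootstrapped confidence intervals. As a minimal illustrative sketch only (the paper does not publish its estimation code), the following Python example fits such a model to synthetic data; all variable names and coefficients are hypothetical placeholders, not the survey’s actual items or results.

```python
# Minimal sketch of a parallel two-mediator model (PROCESS Model 4 style):
# X (perception of the recommendation algorithm)
#   -> M1 (attitude toward the transparency principle)
#   -> M2 (attitude toward the accountability principle)
#   -> Y (continued-usage intention).
# Data, names, and coefficients are hypothetical placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5378  # mirrors the panel survey's sample size

# Synthetic standardized variables with built-in indirect paths.
x = rng.normal(size=n)
m1 = 0.30 * x + rng.normal(size=n)
m2 = 0.25 * x + rng.normal(size=n)
y = 0.20 * x + 0.15 * m1 + 0.10 * m2 + rng.normal(size=n)

def path_estimates(x, m1, m2, y):
    """OLS estimates of a1, a2 (X -> M) and b1, b2 (M -> Y, controlling for X)."""
    a1 = sm.OLS(m1, sm.add_constant(x)).fit().params[1]
    a2 = sm.OLS(m2, sm.add_constant(x)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m1, m2]))).fit().params
    return a1, a2, b[2], b[3]  # b[1] would be the direct effect c'

# Percentile bootstrap of the specific indirect effects a*b
# (PROCESS defaults to 5,000 resamples; 1,000 keeps the sketch fast).
draws = []
for _ in range(1000):
    i = rng.integers(0, n, n)
    a1, a2, b1, b2 = path_estimates(x[i], m1[i], m2[i], y[i])
    draws.append((a1 * b1, a2 * b2))
draws = np.asarray(draws)

for label, col in zip(("transparency", "accountability"), draws.T):
    lo, hi = np.percentile(col, [2.5, 97.5])
    print(f"indirect effect via {label}: {col.mean():+.3f}, 95% CI [{lo:+.3f}, {hi:+.3f}]")
```

A bootstrap confidence interval that excludes zero indicates a significant indirect effect; this is the standard criterion behind mediation findings of the kind summarized above.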

Keywords:

YouTube Recommendation Service, Algorithmic Principles, Usage Intention, Intelligent Information Society User Panel Survey

키워드:

유튜브 추천 서비스, 알고리즘 기본원칙, 이용의도, 지능정보사회 이용자 패널조사

Acknowledgments

This study is based on data from the 2022 Intelligent Information Society User Panel Survey conducted by the Korea Communications Commission and the Korea Information Society Development Institute (이 논문은 방송통신위원회·정보통신정책연구원의 2022년 지능정보사회 이용자 패널조사 데이터를 분석하여 작성한 것임).

References

  • Bae, H. J., & Lee, S. W. (2020). A study on user recognition of personalized recommended service platforms by content characteristics. Korean Journal of Broadcasting and Telecommunication Studies, 34(3), 5-42.
    배현진‧이상우 (2020). 콘텐츠 특성에 따른 개인화 추천서비스 플랫폼에 대한 사용자 인식 연구. <한국방송학보>, 34권 3호, 5-42.
  • Buder, J., Rabl, L., Feiks, M., Badermann, M., & Zurstiege, G. (2021). Does negatively toned language use on social media lead to attitude polarization? Computers in Human Behavior, 116, 106663. [https://doi.org/10.1016/j.chb.2020.106663]
  • Choi, M. Y. (2020). A study of factors influencing the use of YouTube video recommendation service: Focus on the moderating effects of media literacy. Unpublished master’s thesis, Sogang University, Seoul, Korea.
    최민영 (2020). <유튜브 추천 동영상 이용에 영향을 미치는 요인 탐구: 미디어 리터러시의 조절효과를 중심으로>. 서강대학교 언론대학원 석사학위 논문.
  • Chung, S., & Moon, S. I. (2016). Is the third-person effect real? A critical examination of rationales, testing methods, and previous findings of the third-person effect on censorship attitudes. Human Communication Research, 42(2), 312-337. [https://doi.org/10.1111/hcre.12078]
  • Covington, P., Adams, J., & Sargin, E. (2016). Deep neural networks for YouTube recommendations. Proceedings of the 10th ACM Conference on Recommender Systems, 191-198. [https://doi.org/10.1145/2959100.2959190]
  • Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism, 3(3), 398-415. [https://doi.org/10.1080/21670811.2014.976411]
  • Diakopoulos, N., & Koliska, M. (2017). Algorithmic transparency in the news media. Digital Journalism, 5(7), 809-828. [https://doi.org/10.1080/21670811.2016.1208053]
  • Dormehl, L. (2016). Thinking machines: The inside story of Artificial Intelligence and our race to build the future. Random House.
  • Eagly, A. H., & Chaiken, S. (2005). Attitude research in the 21st century: The current state of knowledge. In D. Albarracín, B. T. Johnson, & M. P. Zanna (Eds.), The handbook of attitudes (pp. 743-767). Lawrence Erlbaum Associates Publishers.
  • Eslami, M., Vaccaro, K., Lee, M. K., Elazari Bar On, A., Gilbert, E., & Karahalios, K. (2019). User attitudes towards algorithmic opacity and transparency in online reviewing platforms. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. [https://doi.org/10.1145/3290605.3300724]
  • Gallup Korea. (2022). Market70 2022 (2): Usage rates of 18 types of media, content, and social network services #SNS. Gallup Report. Retrieved 4/22/24 from https://www.gallup.co.kr/gallupdb/reportContent.asp?seqNo=1323
    한국갤럽조사연구소 (2022). <마켓70 2022 (2) 미디어·콘텐츠·소셜네트워크 서비스 18종 이용률 #SNS>. 갤럽리포트.
  • Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-mediated Communication, 14(2), 265-285. [https://doi.org/10.1111/j.1083-6101.2009.01440.x]
  • Gross, R., & Acquisti, A. (2005). Information revelation and privacy in online social networks. Proceedings of the 2005 ACM Workshop on Privacy in the Electronic Society. [https://doi.org/10.1145/1102199.1102214]
  • Gunther, A. C., Bolt, D., Borzekowski, D. L., Liebhart, J. L., & Dillard, J. P. (2006). Presumed influence on peer norms: How mass media indirectly affect adolescent smoking. Journal of Communication, 56(1), 52-68. [https://doi.org/10.1111/j.1460-2466.2006.00002.x]
  • Han, J. Y., & Lee, K. W. (2008). The effect of perceived credibility on mobile user’s attitude and behavioral changes. HCI Korea Conference.
    한지연‧이경원 (2008). <인지된 신뢰성이 모바일 사용자의 태도 및 행동 변화에 미치는 영향>. 한국HCI학회 학술대회.
  • Hayes, A. F. (2012). PROCESS: A versatile computational tool for observed variable mediation, moderation, and conditional process modeling [White paper].
  • Huckfeldt, R., Mendez, J. M., & Osborn, T. (2004). Disagreement, ambivalence, and engagement: The political consequences of heterogeneous networks. Political Psychology, 25(1), 65-95. [https://doi.org/10.1111/j.1467-9221.2004.00357.x]
  • Kang, H. C., Han, S. T., Jeong, B. C., & Shin, Y. J. (2004). On the algorithm of recommendation system for personalization. Journal of the Korean Data Analysis Society, 6(4), 1043-1049.
    강현철‧한상태‧정병철‧신연주 (2004). 개인화를 위한 추천시스템 알고리즘에 관한 연구. <Journal of the Korean Data Analysis Society>, 6권 4호, 1043-1049.
  • Kang, M. H. (2021). Does YouTube reinforce confirmation bias?: A study on the political uses of YouTube and its effects. Journal of Speech, Media and Communication Research, 20(4), 261-288. [https://doi.org/10.51652/ksmca.2021.20.4.8]
    강명현 (2021). 유튜브는 확증편향을 강화하는가?: 유튜브의 정치적 이용과 효과에 관한 연구. <한국소통학보>, 20권 4호, 261-288.
  • Kim, D. J. (2020). An empirical study on filter bubbles in YouTube: Using social network analysis and text network analysis. Unpublished master’s thesis, Soongsil University, Seoul, Korea.
    김덕진 (2020). <유튜브 필터버블 현상에 대한 실증적 연구 : 사회연결망분석과 텍스트 네트워크 분석을 활용하여>. 숭실대학교 대학원 석사학위 논문.
  • Kim, I. S., & Kim, J. M. (2021). YouTube algorithm and confirmation bias. Paper presented at the 2021 Korean Association of Computer Education Winter Conference, 25(1), 71-74.
    김인식‧김자미 (2021). <유튜브 알고리즘과 확증편향>. 한국컴퓨터교육학회 학술발표대회논문집, 25권 1호, 71-74.
  • Kim, M. K. (2022). The effect of perception of the usefulness of YouTube algorithm recommendation on media trust on YouTube: Mediated effects of perceived harm, confirmation bias, and privacy concerns. Journal of Speech, Media and Communication Research, 21(4), 7-42. [https://doi.org/10.51652/ksmca.2022.21.4.1]
    김미경 (2022). 유튜브 알고리즘 추천의 유용성 인식에 따른 유튜브에 대한 미디어 신뢰도: 지각된 유해성, 확증편향, 프라이버시 염려의 매개 효과. <한국소통학보>, 21권 4호, 7-42.
  • Kim, R. G., Kim, S., & Ahn, J. J. (2022). A study on the bias and regulation of AI algorithm. International Telecommunications Policy Review, 29(2), 111-144. [https://doi.org/10.37793/ITPR.29.2.4]
    김로건‧김시원‧안정민 (2022). AI 추천 알고리즘 편향성과 규제에 관한 연구. <정보통신정책연구>, 29권 2호, 111-144.
  • Kim, Y. K., & Song, H. J. (2024). The impact of directional and accuracy goal orientation on the intention to continuously use YouTube’s recommendation algorithm: The mediating effects of perceived usefulness and perceived credibility. Journal of Communication Research, 61(1), 94-133.
    김유경‧송현진 (2024). 개인의 방향성 목표 성향 및 정확성 목표 성향이 유튜브 알고리즘 지속 이용의도에 미치는 영향 : 지각된 유용성과 지각된 신뢰성의 매개효과. <언론정보연구>, 61권 1호, 94-133.
  • Kizilcec, R. (2016). How much information?: Effects of transparency on trust in an algorithmic interface. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. [https://doi.org/10.1145/2858036.2858402]
  • Koenig, A. (2020). The algorithms know me and I know them: Using student journals to uncover algorithmic literacy awareness. Computers and Composition, 58, 102611. [https://doi.org/10.1016/j.compcom.2020.102611]
  • Komiak, S. Y., & Benbasat, I. (2006). The effects of personalization and familiarity on trust and adoption of recommendation agents. MIS Quarterly, 30(4), 941-960. [https://doi.org/10.2307/25148760]
  • Korea Communications Commission. (2021). Basic Principles for User Protection in AI-Based Media Recommendation Services.
    방송통신위원회 (2021). 인공지능 기반 미디어 추천 서비스 이용자 보호 기본원칙.
  • Korea Communications Commission & Korea Information Society Development Institute (2023). 2022 Intelligent Information Society User Panel Survey. (National Statistics Approval (Consultation) No. 164004). Retrieved 11/19/23 from https://www.kcc.go.kr/user.do;jsessionid=czN-30cfefz3NdtE4l99AznyfySrMZO0jvQsZ8d8.servlet-aihgcldhome10?mode=view&page=A02060400&dc=&boardId=1030&cp=1&boardSeq=55764
    방송통신위원회·정보통신정책연구원 (2023). 2022년 지능정보사회 이용자 패널조사. (국가승인(협의)통계·승인번호 제164004호).
  • Lee, B., & Tamborini, R. (2005). Third-person effect and Internet pornography: The influence of collectivism and Internet self-efficacy. Journal of Communication, 55(2), 292-310. [https://doi.org/10.1111/j.1460-2466.2005.tb02673.x]
  • Lee, J. H., Ko, K. A., & Ha, D. G. (2018). A study on motivations of viewers watching personal live streaming broadcast and the influences of motivation factors to satisfaction and continuance intention focused on Post Acceptance Model (PAM). The Korean Journal of Advertising and Public Relations, 20(2), 178-215. [https://doi.org/10.16914/kjapr.2018.20.2.178]
    이주희‧고경아‧하대권 (2018). 1인 미디어 이용자들의 라이브 스트리밍 방송 시청 동기 및 사용자 반응에 관한 연구. <한국광고홍보학보>, 20권 2호, 178-215.
  • Lee, S. E., & Choi, S. W. (2020). Content extremization on YouTube: Is the algorithm creating ‘reality’ or ‘myth’?. 2020 World Media Trends, Spring Issue.
    이소은‧최순욱 (2020). 유튜브의 콘텐츠 극단화: 알고리즘이 만드는 ‘현실’일까 ‘신화’일까?. 2020 해외 미디어 동향, 봄호.
  • Lee, S. R., & Kim, H. J. (2022). A study on the third-person effect toward citizen-generated political videos on YouTube. Korean Journal of Journalism & Communication Studies, 66(4), 72-106. [https://doi.org/10.20879/kjjcs.2022.66.4.003]
    이선량‧김효정 (2022). 일반인 제작 유튜브 정치 동영상에 대한 제3자 효과 연구. <한국언론학보>, 66권 4호, 72-106.
  • Lee, J. W. (2021). How has the entertainment news production practices changed since the portal site’s introduction of AI news curation? An exploratory study. Korean Journal of Broadcasting & Telecommunications Research, 113, 93-121.
    이재원 (2021). 포털 사이트의 인공지능 뉴스 큐레이션 도입과 뉴스 생산 관행 변화에 관한 연구: 네이버 연예뉴스를 중심으로. <방송통신연구>, 113호, 93-121.
  • Ma, L. Y., & Kweon, S. H. (2020). The effects of personalized service of YouTube on user’s continuous use intention: Based on modified technology acceptance model. Korean Journal of Communication & Information, 99, 65-95. [https://doi.org/10.46407/kjci.2020.02.99.65]
    마리야오‧권상희 (2020). 개인화 서비스요인이 사용자의 지속적인 이용의도영향에 미치는 연구 : 유튜브의 기술수용모델을 중심으로. <한국언론정보학보>, 99권, 65-95.
  • McLeod, D. M., Eveland, W. P., Jr., & Nathanson, A. I. (1997). Support for censorship of violent and misogynic rap lyrics: An analysis of the third-person effect. Communication Research, 24(2), 153-174. [https://doi.org/10.1177/009365097024002003]
  • Ministry of Science and ICT. (2020). Human Centered Ethical Standards for Artificial Intelligence (AI).
    과학기술정보통신부 (2020). 사람이 중심이 되는 인공지능(AI) 윤리기준
  • Moon, S. Y., & Kim, Y. A. (2019). A study of YouTube beauty contents, usage motivation, satisfaction and intention of continuous usage of beauty majors. Journal of the Korean Society of Cosmetics and Cosmetology, 9(3), 405-415.
    문서영‧김연아 (2019). 뷰티 전공자의 유튜브 뷰티 콘텐츠 이용동기, 만족도, 지속이용의도에 관한 연구. <한국화장품미용학회지>, 9권 3호, 405-415.
  • Mou, X., Xu, F., & Du, J. T. (2021). Examining the factors influencing college students’ continuance intention to use short-form video APP. Aslib Journal of Information Management, 73(6), 992-1013. [https://doi.org/10.1108/AJIM-03-2021-0080]
  • Mozilla. (2021). YouTube regrets: A crowdsourced investigation into YouTube’s recommendation algorithm. Retrieved 4/10/24 from https://assets.mofoprod.net/network/documents/Mozilla_YouTube_Regrets_Report.pdf
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220. [https://doi.org/10.1037/1089-2680.2.2.175]
  • Oh, S. U. (2018). Is AI a detector or producer of misinformation?: On the relationship between AI technology and misinformation. Press Arbitration, 149, 18-31.
    오세욱 (2018). AI는 허위정보의 감별사일까, 생산자일까: AI 기술과 허위정보의 상관관계에 관하여. <언론중재>, 149호, 18-31.
  • Oh, S. U., & Song, H. Y. (2019). YouTube algorithm and journalism (Research Report 2019-04). Retrieved 4/24/24 from https://www.kpf.or.kr/front/research/selfDetail.do?seq=575347&link_g_homepage=F
    오세욱‧송해엽 (2019). 유튜브 알고리즘과 저널리즘. (한국언론진흥재단 연구보고서 2019-04).
  • Oh, S. U., & Yun, H. O. (2022). ‘Algorithm’ approached with ‘media literacy’: Focusing on the case of ‘NewsAlgo’. Korean Journal of Broadcasting & Telecommunications Research, 2022 special issue, 7-37.
    오세욱‧윤현옥 (2022). ‘미디어 리터러시’로 접근한 ‘알고리즘’ : ‘뉴스알고(NewsAlgo)’ 사례를 중심으로. <방송통신연구>, 2022년 특집호, 7-37.
  • Park, W. S., Oh, Y. S., & Cho, J. H. (2023). A study on the intention to use Chat GPT service using Unified Theory of Acceptance and Use of Technology (UTAUT). Korean Journal of Broadcasting and Telecommunication Studies, 37(5), 52-97. [https://doi.org/10.22876/kab.2023.37.5.002]
    박우승‧오유선‧조재희 (2023). 통합기술수용모델(UTAUT)을 적용한 Chat-GPT 서비스 이용의도에 관한 연구 : 20-40대를 중심으로. <한국방송학보>, 37권 5호, 52-97.
  • Pavlou, P. A. (2002). Institution-based trust in interorganizational exchange relationships: The role of online B2B marketplaces on trust formation. The Journal of Strategic Information Systems, 11(3-4), 215-243. [https://doi.org/10.1016/S0963-8687(02)00017-3]
  • Peiser, W., & Peter, J. (2000). Third-person perception of television-viewing behavior. Journal of Communication, 50(1), 25-45. [https://doi.org/10.1111/j.1460-2466.2000.tb02832.x]
  • Rojas, H. (2010). “Corrective” actions in the public sphere: How perceptions of media and media effects shape political behaviors. International Journal of Public Opinion Research, 22(3), 343-363. [https://doi.org/10.1093/ijpor/edq018]
  • Rojas, H., Shah, D. V., & Faber, R. J. (1996). For the good of others: Censorship and the third-person effect. International Journal of Public Opinion Research, 8(2), 163-186. [https://doi.org/10.1093/ijpor/8.2.163]
  • Roose, K. (2019, June 8). The making of a YouTube radical. The New York Times. Retrieved 4/19/24 from https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html
  • Schmude, T., Koesten, L., Möller, T., & Tschiatschek, S. (2023). On the impact of explanations on understanding of algorithmic decision-making. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. [https://doi.org/10.1145/3593013.3594054]
  • Sharabati, A. A. A., Al-Haddad, S., Al-Khasawneh, M., Nababteh, N., Mohammad, M., & Abu Ghoush, Q. (2022). The impact of TikTok user satisfaction on continuous intention to use the application. Journal of Open Innovation: Technology, Market, and Complexity, 8(3), 125-145. [https://doi.org/10.3390/joitmc8030125]
  • Shin, D., & Park, Y. J. (2019). Role of fairness, accountability, and transparency in algorithmic affordance. Computers in Human Behavior, 98, 277-284. [https://doi.org/10.1016/j.chb.2019.04.019]
  • Shin, D., Zaid, B., & Ibahrine, M. (2020). Algorithm appreciation: Algorithmic performance, developmental processes, and user interactions. Paper presented at 2020 International Conference on Communications, Computing, Cybersecurity, and Informatics, Sharjah, United Arab Emirates. [https://doi.org/10.1109/CCCI49893.2020.9256470]
  • Shin, Y. J. (2020). Filter bubble phenomenon study on YouTube recommendation algorithm. Unpublished master’s thesis, Yonsei University, Seoul, Korea.
    신유진 (2020). <유튜브 알고리즘으로 인한 필터버블 현상 연구>. 연세대학교 정보대학원 석사학위 논문.