The Korean Society for Journalism & Communication Studies - Vol. 64, No. 4

[ Article ]
Korean Journal of Journalism & Communication Studies - Vol. 64, No. 4, pp. 436-470
Abbreviation: KSJCS
ISSN: 2586-7369 (Online)
Print publication date 31 Aug 2020
Received 10 Jun 2020; Revised 27 Jul 2020; Accepted 04 Aug 2020
DOI: https://doi.org/10.20879/kjjcs.2020.64.4.012


Measuring the Perceived Anthropomorphism of an AI Conversational Agent: Scale Development and Validation
Jongsoo Lim** ; Jinho Choi*** ; Hyemin Lee****
**Professor, Department of Media & Communication, Sejong University (jslim123@sejong.ac.kr)
***Postdoctoral Research Fellow, Center for Computational Social Sciences, Hanyang University, corresponding author (jinhochoi@hanyang.ac.kr)
****Doctoral Candidate, Department of Media & Communication, Sejong University (min8958@daum.net)


Abstract

This study develops a scale that measures people’s perception of an artificial intelligence (AI) conversational agent that uses a voice user interface (VUI), responding to the growth in communication with social robots driven by advances in AI, big data, and machine learning. The study focuses on the concept of anthropomorphism because an AI conversational agent, a type of social robot, is designed to simulate human voice and dialogue. Focusing on how people perceive the anthropomorphism of conversational agents while communicating with them, this study explores the dimensions of perceived anthropomorphism and the items that measure them. For this purpose, psychometric scale development procedures were followed, and a large pool of items was rigorously tested for reliability and validity. First, anthropomorphic factors were identified by reviewing the literature on anthropomorphism and interviewing experts in the area, and 35 measurement items were generated. The initial item pool consisted of items related to civility, rationality, refinement, intimacy, individuality, rational support, emotional support, and cognitive openness. To test the internal consistency of the initial item pool, 81 university students were surveyed in a preliminary study, and 35 measurement items were finalized. A second survey was then conducted to examine whether the scale adequately reflected the proposed dimensional structure of the construct. Considering gender, age, and residence, 484 people who had used smart speakers within the previous six months were surveyed and their responses analyzed. As a result, the perceived anthropomorphism of an AI conversational agent was found to comprise five factors (rational and emotional support, rationality, intimacy, civility, and cognitive openness), and 19 items were developed to measure them.
In particular, among the perceived anthropomorphism factors, rationality, intimacy, civility, and cognitive openness had a positive effect on satisfaction with smart speakers. Reliability and validity were verified in multiple ways to produce a perceived anthropomorphism evaluation scale for an AI conversational agent. The scale developed in this study can be used to measure how human an AI conversational agent feels to its users, and it should also help predict users’ attitudes and behaviors toward AI media that employ conversational agents. As communication with agents in AI media increases and the quality of interactive AI agents improves, such agents are producing increasingly better results. Consequently, this study is expected to contribute to understanding the process of communication with non-human actors and the strategies for communicating with them.
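The internal-consistency screening described in the abstract (testing whether the initial item pool holds together before factor analysis) is conventionally done with Cronbach's alpha. The sketch below, with illustrative data rather than the study's, shows the computation; the respondent matrix and cutoff are assumptions for demonstration only.

```python
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(responses):
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = len(responses[0])                       # number of items
    items = list(zip(*responses))               # column-wise item scores
    item_var = sum(variance(list(col)) for col in items)
    total_var = variance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Toy 5-point Likert responses from four hypothetical respondents
# (not data from the study).
data = [
    [5, 4, 5, 4],
    [3, 3, 4, 3],
    [4, 4, 4, 5],
    [2, 2, 3, 2],
]
print(round(cronbach_alpha(data), 3))  # → 0.946
```

Items (or whole pools) falling below a conventional threshold such as 0.7 would be flagged for revision before the main survey; the study's actual retention criteria are not stated in the abstract.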


Keywords: AI media, AI conversational agent, Human-robot interaction (HRI), Anthropomorphism, Scale development

Acknowledgments

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea under the Mid-career Researcher Program in the Humanities and Social Sciences, Track 3, second project year (NRF-2017S1A5A2A01026118).


