The Korean Society for Journalism & Communication (KSJCS)
[ Article ]
Korean Journal of Journalism & Communication Studies - Vol. 64, No. 5, pp.319-372
ISSN: 2586-7369 (Online)
Print publication date 31 Oct 2020
Received 12 Jun 2020; Revised 28 Sep 2020; Accepted 05 Oct 2020
DOI: https://doi.org/10.20879/kjjcs.2020.64.5.009

딥 러닝(Deep learning)기반 동영상 처리 알고리즘을 통한 19대 대선 TV토론 영상분석 : 후보자들의 등장빈도, 표정, 응시방향에 대한 분석

최윤정** ; 정유진*** ; 윤호영**** ; 김민정***** ; 김나영****** ; 첸 루******* ; 신주연******** ; 이주희********* ; 김나영********** ; 여은*********** ; 강제원************
**이화여자대학교 커뮤니케이션미디어학부 교수 yunchoi@ewha.ac.kr
***이화여자대학교 커뮤니케이션미디어학과 박사과정 chung.yoojin@ewhain.net
****이화여자대학교 커뮤니케이션미디어학부 교수 hoyoungyoon@ewha.ac.kr
*****이화여자대학교 커뮤니케이션미디어학과 박사과정 486teamo@hanmail.net
******이화여자대학교 에코크리에이티브협동과정 석사과정 nice_ny@ewhain.net
*******이화여자대학교 커뮤니케이션미디어학과 석사 bomilu327@gmail.com
********이화여자대학교 전자전기공학과 학사 sjy21580@gmail.com
*********이화여자대학교 전자전기공학과 석사과정 juhee69@ewhain.net
**********이화여자대학교 전자전기공학과 박사과정 12skdud21@ewhain.net
***********이화여자대학교 전자전기공학과 석사과정 silverykey@gmail.com
************이화여자대학교 전자전기공학과 교수 jewonk@ewha.ac.kr
Analysis of the 19th Presidential TV Debate Using Deep Learning-Based Video Processing Algorithms: Analysis of the frequency, facial expression and gaze
Yun-jung Choi** ; Yoojin Chung*** ; Ho Young Yoon**** ; Minjung Kim***** ; Na Young Kim****** ; Chen Lu******* ; Ju-yeon Sin******** ; Ju-hee Lee********* ; Na-young Kim********** ; Eun Yeo*********** ; Jewon Kang************
**Professor, Division of Communication·Media, Ewha Womans University yunchoi@ewha.ac.kr
***Doctoral Student, Division of Communication·Media, Ewha Womans University chung.yoojin@ewhain.net
****Assistant Professor, Division of Communication·Media, Ewha Womans University hoyoungyoon@ewha.ac.kr
*****Doctoral Student, Division of Communication·Media, Ewha Womans University 486teamo@hanmail.net
******M.S. Student, Interdisciplinary Program of EcoCreative, Ewha Womans University nice_ny@ewhain.net
*******Master's degree, Division of Communication·Media, Ewha Womans University bomilu327@gmail.com
********B.S. Student, Department of Electronic & Electrical Engineering, Ewha Womans University sjy21580@gmail.com
*********M.S. Student, Department of Electronic & Electrical Engineering, Ewha Womans University juhee69@ewhain.net
**********Doctoral Student, Department of Electronic & Electrical Engineering, Ewha Womans University 12skdud21@ewhain.net
***********M.S. Student, Department of Electronic & Electrical Engineering, Ewha Womans University silverykey@gmail.com
************Professor, Department of Electronic & Electrical Engineering, Ewha Womans University, corresponding author jewonk@ewha.ac.kr

초록

본 연구는 인공지능 딥 러닝 기술을 적용한 알고리즘을 구축해, 2017년 19대 대선 기간에 진행된 TV토론 자료 중 정치인들의 표정 및 응시방향을 분석했다. TV토론에서 후보들이 보여준 비언어적 메시지, 구체적으로 표정(분노, 짜증, 만족, 무표정)과 응시방향을 분석하기 위해 딥 러닝 기술을 이용해 분류 네트워크 알고리즘을 구축하고 6차례의 토론 데이터를 분석했다. 또한 영상분석결과를 패널 설문조사 데이터와 통합하여 TV토론에서 후보들이 보여준 표정의 비율에 따라 후보에 대한 호감도 평가가 어떻게 달라지는지 다층모형 분석을 실시했다. 연구 결과, 문재인 후보가 만족스러운 표정을 가장 많이 지었고 유승민 후보는 무표정한 얼굴로 감정을 잘 드러내지 않았다. 다층모형 분석결과, 긍정적 이미지인 만족스러운 표정은 후보의 호감도를 더 긍정적으로 평가하도록 유의한 영향을 미쳤는데, 변화의 폭은 투표를 하지 않은 그룹보다 투표를 한 그룹에서 더 크게 나타났다. 지상파, 종합편성채널, 온라인 미디어 등 주로 사용하는 미디어로 집단을 구분한 분석에서는 분노한 표정에 따라 호감도 변화가 유의하게 달랐다. 분노한 표정에 노출된 경우 후보에 대한 호감도가 유의하게 변화하는데 종편을 주로 사용하는 그룹은 지상파 주 이용 그룹보다 후보를 긍정적으로 평가하는 것으로 나타났다. 본 연구는 TV토론에서 정치인의 비언어적 메시지가 유권자에게 미치는 영향을 밝혔고, 이후 정치인들의 표정과 응시방향을 자동적으로 분류할 수 있는 알고리즘을 개발했다는 점에서 의미가 있다.

Abstract

This study analyzed the frequency of appearance and nonverbal messages of political candidates in the television debates held for the 19th presidential election in 2017. To analyze nonverbal messages, namely facial expressions (angry, irritated, satisfied, and neutral) and the direction of gaze in the six television debates, a classification network was constructed using deep learning technology. For this analysis, videos of the six television debates, each running about 120 minutes, were collected, and frames were extracted at a rate of 30 frames per second, yielding approximately 1.25 million image frames. This study then built a deep-learning-based image-analysis system that automatically recognizes and classifies the candidates appearing in the videos and labels each frame by facial expression and direction of gaze. The system counts, in seconds, how long each candidate appeared across all television debates and computes the proportion of each facial expression shown by each candidate over the entire debate series. The results showed that Sang-Jung Shim appeared the most in three of the debates, followed by Chul-soo Ahn, who appeared the most in two, and Jae-in Moon in one; Jun-pyo Hong appeared the least. As for facial expressions, Moon showed the satisfied expression most often, and Seung-min Yoo expressed his emotions the least. Hong showed the irritated expression most often, indicating that he had difficulty managing his facial expression. Additionally, this study conducted a multi-level analysis combining the image data with a panel survey, which measured respondents' preferences for the candidates before and after the presidential campaign and the number of debates they watched. The multi-level analysis confirmed that preferences for the candidates changed depending on exposure to the facial expressions the candidates displayed in the actual television debates. For the satisfied and neutral expressions, candidates were evaluated more positively as exposure increased, whereas candidate favorability decreased after exposure to the angry expression. These results suggest that viewers' evaluations of the candidates changed substantially according to the candidates' facial expressions in the debates. In an era when communication scholars are expanding their research areas by converging with new disciplines such as media engineering and data science, this study suggests a new research direction. We hope that this study lays the groundwork for research on media analysis using algorithms and deep learning.
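As a rough illustration of the video-processing pipeline summarized above, the sketch below reads a debate recording with OpenCV and tallies per-candidate screen time and expression frequencies. It is not the authors' implementation: `detect_faces` and `classify_face` are hypothetical stand-ins for the face-detection and expression/gaze classification networks described in the paper, and the bounding-box format is assumed.

```python
import cv2
from collections import Counter

def analyze_debate(video_path, detect_faces, classify_face):
    """Tally per-candidate screen time (in seconds) and expression frequencies."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0       # debates were sampled at roughly 30 fps
    appearance = Counter()                         # frames in which each candidate appears
    expressions = Counter()                        # (candidate, expression) frame counts

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Hypothetical detector returns (top, bottom, left, right) face boxes per frame.
        for top, bottom, left, right in detect_faces(frame):
            face = frame[top:bottom, left:right]
            candidate, expression, gaze = classify_face(face)
            appearance[candidate] += 1
            expressions[(candidate, expression)] += 1

    cap.release()
    # Convert frame counts to seconds of screen time, as reported in the study.
    screen_time = {name: n / fps for name, n in appearance.items()}
    return screen_time, expressions
```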
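The multi-level analysis linking expression exposure to favorability could be specified, for example, as a mixed-effects regression. The following is a minimal sketch using statsmodels under assumed variable names; `favorability`, `satisfied_ratio`, `voted`, `respondent_id`, and the input file are placeholders, not the study's actual dataset or model specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical merged file: one row per respondent-candidate pair, containing the
# share of "satisfied" frames that candidate showed in the debates the respondent watched.
df = pd.read_csv("debate_panel_merged.csv")

model = smf.mixedlm(
    "favorability ~ satisfied_ratio * voted",   # exposure effect moderated by turnout group
    data=df,
    groups=df["respondent_id"],                 # random intercept per panel respondent
)
result = model.fit()
print(result.summary())
```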

Keywords:

Presidential TV debate, Artificial intelligence video analysis, Mixed model analysis, Facial expression classification algorithm, Politician's direction of gaze

키워드:

대선 TV토론, 인공지능 영상분석, 다층모형분석, 표정분류 알고리즘, 정치인의 응시방향

Acknowledgments

We thank Professor Hyun-Woo Lee (이현우) of the Department of Political Science and Diplomacy at Sogang University for providing the panel survey data used in this study.

References

  • Abelson, R. P., Kinder, D. R., Peters, M. D., & Fiske, S. T. (1982). Affective and semantic components in political person perception. Journal of Personality and Social Psychology, 42(4), 619–630. [https://doi.org/10.1037/0022-3514.42.4.619]
  • Adorno, T. W., & Horkheimer, M. (1947). Dialektik der Aufklärung: Philosophische Fragmente. In Adorno, Gesammelte Schriften. Amsterdam: Querido.
  • Argyle, M., Alkema, F., & Gilmour, R. (1971). The communication of friendly and hostile attitudes by verbal and nonverbal signals. European Journal of Social Psychology, 1, 385–402. [https://doi.org/10.1002/ejsp.2420010307]
  • Argyle, M., & Trower, P. (1979). Person to person: ways of communicating. London: Multimedia Publications Inc.
  • Benoit, W. L., & Hansen, G. J. (2004). Presidential debate watching, issue knowledge, character evaluation, and vote choice. Human Communication Research, 30(1), 121-144. [https://doi.org/10.1111/j.1468-2958.2004.tb00727.x]
  • Benoit, W. L., Hansen, G. J., & Verser, R. M. (2010). A meta-analysis of the effects of viewing U.S. presidential debates. Communication Monographs, 70(4), 335-350. [https://doi.org/10.1080/0363775032000179133]
  • Birdwhistell, R. L. (2010). Kinesics and Context: Essays on Body Motion Communication. Philadelphia, PA: University of Pennsylvania Press.
  • Bucy, E. P. (2011). Nonverbal communication, emotion, and political evaluation. In K. Doveling, C. von Scheve, & E. A. Konijn (Eds.), The Routledge handbook of emotions and mass media (pp. 195-220). New York, NY: Taylor & Francis.
  • Burgoon, J. K., Buller, D. B., & Woodall, W. G. (1989). Nonverbal communication: The unspoken dialogue. New York, NY: HarperCollins College Division.
  • Cha, J., & Lee, C. (2009). 17th presidential election and the fairness of broadcasting: Broadcast news, TV debate, and opinion polls. Journal of Social Science, 20(1), 189-205.
  • Cho, J., Shah, D. V., Nah, S., & Brossard, D. (2009). “Split screens” and “spin rooms”: Debate modality, post-debate coverage, and the new videomalaise. Journal of Broadcasting & Electronic Media, 53(2), 242-261. [https://doi.org/10.1080/08838150902907827]
  • Cho, H., & Park, C. (2008). Well-mannered behavior and nonverbal communication. Journal of the Korean Society of Women's Culture, 17, 121-149.
  • Cho, W., Park, S., & Park, J. (2003). Methods for segmentation of video and extraction of representative frames using statistical characteristics. The Korean Institute of Information Scientists and Engineers, 30(1B), 295-297.
  • Choi, Y. (1999). Nonverbal Communication. Seoul: Communication Books.
  • Choi, Y. J., & Lee, J. H. (2007). Effects of image–issue and positive–negative scene orders in broadcast news. Mass Communication & Society, 10(1), 43-65. [https://doi.org/10.1080/15205430709337004]
  • D'Alessio, D., & Allen, M. (2000). Media bias in presidential elections: A meta‐analysis. Journal of Communication, 50(4), 133-156. [https://doi.org/10.1111/j.1460-2466.2000.tb02866.x]
  • Davis, D. F. (1981). Issue information and connotation in candidate imagery: Evidence from a laboratory experiment. International Political Science Review, 2(4), 461-479. [https://doi.org/10.1177/019251218100200406]
  • Dayan, D., & Katz, E. (1992). Media events: The live broadcasting of history. Cambridge, MA: Harvard University Press.
  • DeVito, J. (2000). Human communication. New York, NY: Longman, USA.
  • Dodd, C. H. (1998). Dynamics of intercultural communication (5th ed.). New York, NY: McGraw-Hill.
  • Druckman, J. N. (2003). The power of television images: The first Kennedy-Nixon debate revisited. The Journal of Politics, 65(2), 559-571. [https://doi.org/10.1111/1468-2508.t01-1-00015]
  • Ekman, P. (1965). Differential communication of affect by head and body cues. Journal of Personality and Social Psychology, 2(5), 726-735. [https://doi.org/10.1037/h0022736]
  • Ekman, P. (1992a). An argument for basic emotions. Cognition & Emotion, 6(3-4), 169-200. [https://doi.org/10.1080/02699939208411068]
  • Ekman, P. (1992b). Are there basic emotions? Psychological Review, 99, 550-553. [https://doi.org/10.1037/0033-295X.99.3.550]
  • Exline, R. V. (1985). Multichannel Transmission of Nonverbal Behavior and the Perception of Powerful Men: The Presidential Debates of 1976. In S. L. Ellyson & J. F. Dovidio (Eds.), Power, Dominance, and Nonverbal Behavior (pp. 183-206). New York, NY: Springer. [https://doi.org/10.1007/978-1-4612-5106-4_10]
  • Exline, R. V., Gottheil, I., Paredes, A., & Winklemeier, D. (1968). Gaze direction as a factor in judgment of non-verbal expressions of affect. 76th Annual Convention of the American Psychological Association, Washington, DC. [https://doi.org/10.1037/e473742008-209]
  • Gerbner, G. (1980). Aging with television: images on television drama and conceptions of social reality. Journal of Communication, 30(1), 37-47. [https://doi.org/10.1111/j.1460-2466.1980.tb01766.x]
  • Grabe, M. E., & Bucy, E. P. (2009). Image bite politics: News and the visual framing of elections. Oxford: Oxford University Press. [https://doi.org/10.1093/acprof:oso/9780195372076.001.0001]
  • Graber, D. A. (1990). Seeing is remembering: How visuals contribute to learning from television news. Journal of Communication, 40(3), 134-155. [https://doi.org/10.1111/j.1460-2466.1990.tb02275.x]
  • Hellweg, S. A. (1993). Campaigns and candidate images in American presidential elections. Lanham, MD: Rowman & Littlefield publishers.
  • Heritage, J., & Greatbatch, D. (1986). Generating applause: A study of rhetoric and response at party political conferences. American Journal of Sociology, 92(1), 110-157. [https://doi.org/10.1086/228465]
  • Hinton, G. E., & Salakhutdinov, R. R. (2006). Reducing the dimensionality of data with neural networks. Science, 313(5786), 504-507. [https://doi.org/10.1126/science.1127647]
  • imamun93 [GitHub]. (n.d.). Retrieved 11/18/19 from https://github.com/imamun93/animal-image-classifications
  • Jang, B., Hwang, D., & Choi, S. (2020). The analysis of screen composition method in 19th presidential candidate TV debates. Journal of Broadcast Engineering, 25(1), 67-82.
  • Jang, Y., & Chung, D. (2019). Technology trend for image analysis based on deep learning. Current Industrial and Technological Trends in Aerospace, 17(1), 113-122.
  • Kim, H., Jeong, B., & Park, H. (2010). Key frame detection using change of motion in video. Proceedings of the Korea Multimedia Society Conference, 105-108.
  • Kim, K. (2019). An analysis of video shots shown in presidential debates. Journal of Digital Contents Society, 20(5), 937-946. [https://doi.org/10.9728/dcs.2019.20.5.937]
  • Kim, K., & Park, Y. (2018). Analysis of candidate persuasion strategies in 19th presidential election broadcast debate. Korea Regional Communication Research Association, 18(2), 35-69. [https://doi.org/10.14696/jcs.2018.06.18.2.35]
  • Kim, M., & Na, Y. (2005). Effects of nonverbal communication in national assembly candidates' broadcasted speech on viewers depending on involvement: Voice, gaze and gesture. Korean Journal of Broadcasting and Telecommunication Studies, 19(3), 42-103.
  • Kim, S., & Yoon, K. (2018). Timeline synchronization of multiple videos based on waveform. Journal of Broadcast Engineering, 23(2), 197-205.
  • Kim, Y., Gil, W., & Lee, J. (2017). Agenda-setting and priming effects of emotion: A mixed model analysis with visual frames and survey data on the 19th Korean presidential election. Journal of Cybercommunication Academic Society, 34(4), 53-98.
  • Kim, D. (2016). Fourth Industrial Revolution. Seoul: Communication Books.
  • Kim, W., & Jang, S. (2004). Nonverbal Communication. Seoul: Nanam Publishing Co.
  • Kim, Y. (2005). Persuasion Communication. Seoul: Nanam Publishing Co.
  • Kinder, D. R. (1986). The continuing American dilemma: White resistance to racial change 40 years after Myrdal. Journal of Social Issues, 42(2), 151-171. [https://doi.org/10.1111/j.1540-4560.1986.tb00230.x]
  • Kinsey, G. B. (2010). The nonverbal advantage: Secrets and sciences of body language at work. Personnel Psychology, 63(2), 498-501. [https://doi.org/10.1111/j.1744-6570.2010.01178_6.x]
  • Kleinke, C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100(1), 78–100. [https://doi.org/10.1037/0033-2909.100.1.78]
  • Knapp, M., & Hall, J. (2002). Nonverbal communication in human interaction (5th ed.). Belmont, CA: Wadsworth.
  • Knapp, M. L. (1978). Non-verbal communication in human interaction (2nd ed.). New York, NY: Holt, Rinehart & Winston.
  • Kook, K. (2019). Examples of applications by AI technology and industry sectors. Weekly Technical Trends, 20, 15-27.
  • Kraus, S. (1996). Winners of the first 1960 televised presidential debate between Kennedy and Nixon. Journal of Communication, 46(4), 78-96. [https://doi.org/10.1111/j.1460-2466.1996.tb01507.x]
  • Kwon, S., Kwon, J., Kim, D., & Lee, W. (2013). A study on the fairness of the 18th KBS presidential election report. Seoul: KBS Broadcast Research Institute.
  • Kwon, H. (2006). The Theory and Practice of Media Elections. Seoul: Communication Books.
  • Lanzetta, J. T., Sullivan, D. G., Masters, R. D., & McHugo, G. J. (1985). Viewers’ emotional and cognitive responses to televised images of political leaders. In S. Kraus & R. Perloff (Eds.), Mass media and political thought (pp. 85-116). Beverly Hills, CA: Sage.
  • Lee, J., Jeon, J., Park, Y., & Kwon, I. (2016). On metaphoric gestures of candidates in the U.S. presidential debate. Discourse and Cognition, 23(2), 53-79. [https://doi.org/10.15718/discog.2016.23.2.53]
  • Lee, J. (2018). A method of eye and lip region detection using faster R-CNN in face image. Journal of the Korea Convergence Society, 9(8), 1-8.
  • Lee, J., & Lee, H. (2018). Analysis of discrepancy of popularity and votes of presidential candidate Ahn Cheol-soo: TV debate effect or regression of political orientation? Korean Political Science Review, 52(1), 101-123. [https://doi.org/10.18854/kpsr.2018.52.1.005]
  • Lee, S. (2017). Recent AI development trends and future evolution directions. LG Economic Research Institute, 12, 30-31.
  • Lewis, K. M. (2000). When leaders display emotion: How followers respond to negative emotional expression of male and female leaders. Journal of Organizational Behavior: The International Journal of Industrial, Occupational and Organizational Psychology and Behavior, 21(2), 221-234. [https://doi.org/10.1002/(SICI)1099-1379(200003)21:2<221::AID-JOB36>3.0.CO;2-0]
  • Maddox, W. S., & Nimmo, D. (1981). In search of the ticket splitter. Social Science Quarterly, 62(3), 401-408.
  • Masters, R. D., & Sullivan, D. G. (1989). Nonverbal displays and political leadership in France and the United States. Political Behavior, 11, 123-156. [https://doi.org/10.1007/BF00992491]
  • Masters, R. D., & Sullivan, D. G. (1993). Nonverbal behavior and leadership: emotion and cognition in political information processing. In S. Iyengar & W. McGuire (Eds.), Explorations in political psychology (pp. 150-182). Durham, NC: Duke University Press. [https://doi.org/10.1215/9780822396697-008]
  • Matsumoto, D. (1992). More evidence for the universality of a contempt expression. Motivation and Emotion, 16(4), 363-368. [https://doi.org/10.1007/BF00992972]
  • McHugo, G. J., Lanzetta, J. T., Sullivan, D. G., Masters, R. D., & Englis, B. G. (1985). Emotional reactions to a political leader's expressive displays. Journal of Personality and Social Psychology, 49(6), 1513-1529. [https://doi.org/10.1037/0022-3514.49.6.1513]
  • Mehrabian, A. (1968). Some referents and measures of nonverbal behavior. Behavior Research Methods & Instrumentation, 1(6), 203-207. [https://doi.org/10.3758/BF03208096]
  • Mehrabian, A. (1981). Silent messages: implicit communication of emotions and attitudes. Belmont, CA: Wadsworth.
  • Miller, A. H., Wattenberg, M. P., & Malanchuk, O. (1986). Schematic assessments of presidential candidates. American Political Science Review, 80(2), 521-540. [https://doi.org/10.2307/1958272]
  • Moon, H., Yang, A., & Kim, J. (2018). CNN-based hand gesture recognition for wearable applications. Journal of Broadcast Engineering, 23(2), 246-252.
  • Moon, S., Jang, S., Lee, J., & Lee, J. (2016). Trend of machine learning and deep learning technology. The Journal of The Korean Institute of Communication Sciences, 33(10), 49-56.
  • Mutz, D. C. (2007). Effects of "in-your-face" television discourse on perceptions of a legitimate opposition. American Political Science Review, 101(4), 621-635. [https://doi.org/10.1017/S000305540707044X]
  • Nam, J. (2004). The objectivity habitus of Korean newspapers: Focusing on the strategic ritual of formal realism (Unpublished doctoral dissertation). Korea University, Seoul, Korea.
  • Nam, J., & Choi, Y. (2010). Comparison of the Korean and the U.S. presidential election coverage: Scene unit analysis of structural and content balance. Korean Journal of Broadcasting and Telecommunication Studies, 24(4), 87-121.
  • Nam, Y. (2020). Analysis of emotions in broadcast news using convolutional neural networks. Journal of the Korea Institute of Information and Communication Engineering, 24(8), 1064-1070.
  • Oh, C. (2007). The meaning of media economicization from the perspective of liberalism and communalism: Focused on interpretation of public and private dualistic broadcasting systems. Political Communication Research, 6, 5-41.
  • O'Grady, C. R. (2014). Integrating service learning and multicultural education in colleges and universities. New York, NY: Routledge. [https://doi.org/10.4324/9781410606051]
  • O'Keefe, G. J. (1975). Political campaigns and mass communication research. In S. H. Chaffee (Ed.), Political communication: Issues and strategies for research. Beverly Hills, CA: Sage.
  • Olivola, C. Y., & Todorov, A. (2010). Elected in 100 milliseconds: Appearance-based trait inferences and voting. Journal of Nonverbal Behavior, 34(2), 83-110. [https://doi.org/10.1007/s10919-009-0082-1]
  • Park, H. (2018). Mediatization of politics: A theoretical review and prospect. Journal of communication research, 55(2), 5-44. [https://doi.org/10.22174/jcr.2018.55.2.5]
  • Park, J., & Kang, D. (2015). Traffic sign recognition with convolutional neural network. The Korean Society of Mechanical Engineers, 52-53.
  • Park, J., Kim, J., & Kim, D. (2019). An analysis of verbal messages and facial expressions of the candidates in the 2016 U.S. presidential TV debates. Locality & Communication, 23(4), 33-68. [https://doi.org/10.47020/JLC.2019.11.23.4.33]
  • Park, S., Han, K., & Jang, W. (2018). CNN deep learning acceleration algorithm for mobile system. The Journal of Korean Institute of Information Technology, 16(10), 1-9. [https://doi.org/10.14801/jkiit.2018.16.10.1]
  • Patterson, M. L., Churchill, M. E., Burger, G. K., & Powell, J. L. (1992). Verbal and nonverbal modality effects on impressions of political candidates: Analysis from the 1984 presidential debates. Communication Monographs, 59(3), 231-242. [https://doi.org/10.1080/03637759209376267]
  • Patterson, T. E. (2000). Doing well and doing good: How soft news and critical journalism are shrinking the news audience and weakening democracy—and what news organizations can do about it. Cambridge, MA: Joan Shorenstein Center.
  • Potter, W. J. (2014). A critical analysis of cultivation theory. Journal of Communication, 64(1), 1015–1036. [https://doi.org/10.1111/jcom.12128]
  • Rahn, W. M., Aldrich, J. H., Borgida, E., & Sullivan, J. L. (1990). A social cognitive model of candidate appraisal. In R. G. Niemi & H. F. Weisberg (Eds.), Controversies in voting behavior (pp. 187-206). Washington, DC: Congressional Quarterly.
  • Richmond, V. P., McCroskey, J. C., & Payne, S. K. (1991). Nonverbal behavior in interpersonal relations. Englewood Cliffs, NJ: Prentice-Hall.
  • Samovar, L., Porter, R., & Jain, N. (1981). Understanding Intercultural Communication. Boston, MA: Wadsworth Cengage Learning.
  • Scheufele, D. A., Kim, E., & Brossard, D. (2007). My friend's enemy: How split-screen debate coverage influences evaluation of presidential debates. Communication Research, 34(1), 3-24. [https://doi.org/10.1177/0093650206296079]
  • Schrott, P. R., & Lanoue, D. J. (2013). The power and limitations of televised presidential debates: Assessing the real impact of candidate performance on public opinion and vote choice. Electoral Studies, 32(4), 684-692. [https://doi.org/10.1016/j.electstud.2013.03.006]
  • Sears, D. O., & Chaffee, S. H. (1979). Uses and effects of the 1976 debates: An overview of empirical studies. In S. Kraus (Ed.), The great debates. Carter vs. Ford, 1976 (pp. 223-261). Bloomington, IN: Indiana University Press.
  • Seiter, J. S. (1999). Does communicating nonverbal disagreement during an opponent's speech affect the credibility of the debater in the background? Psychological Reports, 84(3), 855-861. [https://doi.org/10.2466/pr0.1999.84.3.855]
  • Seiter, J. S. (2001). Silent derogation and perceptions of deceptiveness: Does communicating nonverbal disbelief during an opponent's speech affect perceptions of debaters' veracity? Communication Research Reports, 18(4), 334-344. [https://doi.org/10.1080/08824090109384814]
  • Seiter, J. S., & Weger, H. (2005). Audience perceptions of candidates' appropriateness as a function of nonverbal behaviors displayed during televised political debates. The Journal of Social Psychology, 145(2), 225-236. [https://doi.org/10.3200/SOCP.145.2.225-236]
  • Shah, D. V., Hanna, A., Bucy, E. P., Wells, C., & Quevedo, V. (2015). The power of television images in a social media age: Linking bio-behavioral and computational approaches via the second screen. American Academy of Political and Social Science, 659(1), 225-245. [https://doi.org/10.1177/0002716215569220]
  • shamangary [GitHub]. (n.d.). Retrieved 11/18/19 from https://github.com/shamangary/FSA-Net
  • Shim, H., & Sim, G. (2019). Design of emotional state estimation system in continuous time using deep learning. Journal of Korean Institute of Intelligent Systems, 29(1), 76-81. [https://doi.org/10.5391/JKIIS.2019.29.1.76]
  • Son, Y. (2011). Public's perception on the fairness of television news. Korean Journal of Broadcasting and Telecommunication Studies, 25(5), 122-158.
  • Song, S. (2009). The effect of nonverbal cue in televised political debate: An experimental study. Korean Journal of Broadcasting and Telecommunication Studies, 23(4), 88-127.
  • Sullivan, D. G., & Masters, R. D. (1988). “Happy Warriors”: Leaders' facial displays, viewers' emotions, and political support. American Journal of Political Science, 32(2), 345-368. [https://doi.org/10.2307/2111127]
  • Todorov, A., Mandisodza, A. N., Goren, A., & Hall, C. C. (2005). Inferences of competence from faces predict election outcomes. Science, 308(5728), 1623-1626. [https://doi.org/10.1126/science.1110589]
  • Tuchman, G. (1972). Objectivity as strategic ritual: An examination of newsmen's notions of objectivity. American Journal of sociology, 77(4), 660-679. [https://doi.org/10.1086/225193]
  • Um, G. (2017, May 18). Beneficiaries of the TV debates: Yoo Seung-min > Hong Joon-pyo > Sim Sang-jung. Naeil. Retrieved 11/18/19 from http://m.naeil.com/m_news_view.php?id_art=237583
  • Won, H., & Youn, S. (2015). A content analysis on the fairness of the main news of Korean general programming TV stations: Centered around main news during the 18th pre-presidential election period. Korean Journal of Broadcasting and Telecommunication Studies, 29(1), 117-148.
  • Wyer, R. S., Budesheim, T. L., & Shavitt, S. (1991). Image, issues, and ideology: The processing of information about political candidates. Journal of Personality and Social Psychology, 61(4), 533–545. [https://doi.org/10.1037/0022-3514.61.4.533]
  • Yang, T. Y., Chen, Y. T., Lin, Y. Y., & Chuang, Y. Y. (2019, June). FSA-Net: Learning fine-grained structure aggregation for head pose estimation from a single image. Paper presented at the annual meeting of the IEEE Conference on Computer Vision and Pattern Recognition, CA. [https://doi.org/10.1109/CVPR.2019.00118]
  • Zhang, K., Zhang, Z., Li, Z., & Qiao, Y. (2016). Joint face detection and alignment using multitask cascaded convolutional networks. IEEE Signal Processing Letters, 23(10), 1499-1503. [https://doi.org/10.1109/LSP.2016.2603342]
  • Zhao, R., Ouyang, W., Li, H., & Wang, X. (2015, June). Saliency detection by multi-context deep learning. Paper presented at the annual meeting of the IEEE Conference on Computer Vision and Pattern Recognition, Boston. [https://doi.org/10.1109/CVPR.2015.7298731]

Appendix

부록

  • 국경완 (2019). 인공지능 기술 및 산업 분야별 적용 사례. <주간기술동향>, 20권, 15-27.
  • 권상희·권장원·김동윤·이완수 (2013). <제18대 KBS 대선보도 공정성 연구>. 서울: KBS 방송문화연구소.
  • 권혁남 (2006). <미디어 선거의 이론과 실제>. 서울: 커뮤니케이션북스.
  • 김경호 (2019). 샷 분석을 통한 대통령선거 TV 토론회 연구. <한국디지털콘텐츠학회 논문지>, 20권 5호, 937-946.
  • 김관규·박연진 (2018). 19대 대통령선거 선거방송토론에서 나타난 후보자의 설득전략 분석. <언론과학연구>, 18권 2호, 35-69.
  • 김대호 (2016). <4차 산업혁명>. 서울: 커뮤니케이션북스.
  • 김명주·나은영 (2005). 방송 연설 후보자의 비언어적 커뮤니케이션이 고ㆍ저관여 시청자에게 미치는 영향: 제17대 국회의원 후보자들의 목소리, 응시, 손동작을 중심으로. <한국방송학보>, 19권 3호, 42-103.
  • 김신·윤경로 (2018). 소리 파형을 이용한 다수 동영상 간 시간축 동기화 기법. <방송공학회논문지>, 23권 2호, 197-205.
  • 김영석 (2005). <설득 커뮤니케이션>. 서울: 나남출판.
  • 김우룡·장소원 (2004). <비언어적커뮤니케이션>. 서울: 나남출판.
  • 김윤환·길우영·이종혁 (2017). 미디어 정서의 의제설정과 점화 효과 연구: 제 19 대 대선 후보 영상분석과 설문 조사를 연결한 혼합모형 분석. <사이버커뮤니케이션학보>, 34권 4호, 53-98.
  • 김호근·김병정·박형섭 (2010). 비디오 영상에서 움직임 변화를 이용한 대표 프레임 검출. <한국멀티미디어학회 학술발표논문집>, 105-108.
  • 남영자 (2020). CNN을 활용한 방송 뉴스의 감정 분석. <한국정보통신학회논문지>, 24권 8호, 1064-1070.
  • 남장군 (2017). <딥 하이퍼네트워크를 이용한 TV 드라마의 멀티모달 학습>. 서울대학교 대학원 석사학위 논문.
  • 남재일 (2004). <한국 신문의 객관주의 아비투스: 형식적 사실주의의 전략적 의례를 중심으로>. 고려대학교 대학원 박사학위 논문.
  • 남지나·최윤정 (2010). 한국과 미국 TV 뉴스의 대선 보도 비교: 신 단위의 형식과 내용의 공정성을 중심으로. <한국방송학보>, 24권 4호, 87-121.
  • 문성은·장수범·이정혁·이종석 (2016). 기계학습 및 딥 러닝 기술동향. <한국통신학회지 (정보와통신)>, 33권 10호, 49-56.
  • 문현철·양안나·김재곤 (2018). 웨어러블 응용을 위한 CNN 기반 손 제스처 인식. <방송공학회논문지>, 23권 2호, 246-252.
  • 박성우·한경호·장우영 (2018). 모바일 시스템을 위한 CNN 딥 러닝 가속화 알고리즘. <한국정보기술학회논문지>, 16권 10호, 1-9.
  • 박제강·강동중 (2015). 컨볼루션 신경망을 이용한 교통 표지판 인식. <대한기계학회 춘추학술대회>, 52-53.
  • 박지혜·김재홍·김대중 (2019). 2016년 미국 대선 TV토론에서 나타난 후보자의 메시지와 얼굴표정의 특징 분석. <지역과 커뮤니케이션>, 23권 4호, 33-68.
  • 박홍원 (2018). 정치의 미디어화: 이론적 검토 및 전망. <언론정보연구>, 55권 2호, 5-44.
  • 손영준 (2011). TV 뉴스 공정성에 대한 시민 인식 조사: 시민은 동등 비중의 원칙을 더 원한다. <한국방송학보>, 25권 5호, 122-158.
  • 송석화 (2009). TV 토론에서 정치인의 비언어적 요소가 이미지, 투표 의사에 미치는 영향. <한국방송학보>, 23권 4호, 88-127.
  • 심희린·심귀보 (2019). 딥 러닝을 이용한 연속 시간 감정 상태 추론 시스템 설계. <한국지능 시스템학회논문지>, 29권 1호, 76-81.
  • 엄경용 (2017, 5, 18). TV토론 수혜 : 유승민 > 홍준표 > 심상정. <내일신문>. Retrieved 11/18/19 from http://m.naeil.com/m_news_view.php?id_art=237583
  • 오창우 (2007). 자유주의와 공동체주의적 관점에서 매체경제화가 지니는 의미: 공ㆍ민영 이원론적 방송체계에 대한 해석을 중심으로. <정치커뮤니케이션 연구>, 6권, 5-41.
  • 원희영·윤석민 (2015). 종합편성채널의 보도 공정성에 관한 연구: 제 18 대 대통령 선거에 대한 메인 뉴스 분석을 중심으로. <한국방송학보>, 29권 1호, 117-148.
  • 이승훈 (2017). 최근 인공지능 개발 트렌드와 미래의 진화 방향. <LG 경제연구원>, 12호, 30-31.
  • 이정은·전진리·박영은·권익수 (2016). 미국 대선 후보 공개토론회에 나타난 손짓 언어: 은유적 손짓을 중심으로. <담화와 인지>, 23권 2호, 53-79.
  • 이정환 (2018). 초고속 R-CNN을 이용한 얼굴영상에서 눈 및 입술영역 검출방법. <한국융합학회논문지>, 9권 8호, 1-8.
  • 이지선·이현우 (2018). 19대 대통령 선거에서 안철수 후보의 지지도 변화요인 분석. <한국정치학회보>, 52권 1호, 101-123.
  • 장병민·황동현·최성진 (2020). 제19대 대통령후보 TV 토론회 화면구성방법 분석. <방송공학회논문지>, 25권 1호, 67-82.
  • 장윤정·정대원 (2019). 딥 러닝 기반의 영상 분석 기술 동향. <항공우주산업기술동향>, 17권 1호, 113-122.
  • 조완현·박순영·박종현 (2003). 통계적 특성을 이용한 비디오의 분할 및 대표 프레임의 추출방법. <한국정보과학회 학술발표논문집>, 30권 1B, 295-297.
  • 조희진·박찬옥 (2008). 예절 바른 행동과 비언어적 의사소통. <한국여성교양학회지>, 17권, 121-149.
  • 차재영·이창현 (2009). 17대 대통령선거와 방송의 공정성. <사회과학연구>, 20권 1호, 189-205.
  • 최윤희 (1999). <비언어 커뮤니케이션>. 서울: 커뮤니케이션북스.
  • Adorno, T. W., & Horkheimer, M. (1947). Dialektik der Aufklärung: Philosophische Fragmente. In Adorno, Gesammelte Schriften. Amsterdam: Querido. 김유동 (역) (2001). <계몽의 변증법>. 서울: 문학과 지성.