Research Article

Duygusal Yalnızlığa Bir Çözüm Olarak Chatgpt: Kişilerarası İletişimin Yeni Aracı

Year 2024, Issue: Cumhuriyetin 100. Yılında Geleceğin İletişimi Özel Sayısı, 81-107, 18.03.2024
https://doi.org/10.17829/turcom.1360418

Abstract

Loneliness is one of the most widespread health problems of contemporary society. Although loneliness is addressed in different ways depending on its cause or on the symptoms it produces in the individual, this study examines it specifically as emotional loneliness. The study, which investigates participants' attitudes toward using a chatbot (ChatGPT) to cope with emotional loneliness, follows a phenomenological research design. Students of the Faculty of Communication of a foundation university in Istanbul were selected as the sample, and the data collected through in-depth interviews, a qualitative data collection tool, were analyzed with the content analysis method. It was found that, contrary to their positive attitudes and strong reflex to anthropomorphize, participants approach the use of ChatGPT as a tool for combating loneliness with caution. The great majority of participants hold a negative judgment about ChatGPT and technological tools meeting emotional and social needs. Nevertheless, most participants believe that the use of ChatGPT and similar tools for socialization will become widespread in the near future. Their concerns center on distrust of artificial intelligence technologies' capacity to meet emotional needs and on the harm such technologies could do to socio-cultural life if they were to meet those needs; another common theme of these concerns is ethical problems. In addition, when expressing their concerns, participants were observed to point to representations of artificial intelligence in popular culture (films, series, games, etc.).


ChatGPT as a Solution to Emotional Loneliness: A New Tool for Interpersonal Communication


Abstract

Loneliness stands as one of the most pervasive health issues in contemporary society. Although loneliness can be framed in different ways depending on its cause and on how it manifests in the individual, this study examines it specifically in terms of emotional loneliness. Employing a phenomenological research design, the study assessed participants' attitudes toward using a chatbot (ChatGPT) as a tool to combat emotional loneliness. Students from the Faculty of Communication of a foundation university in Istanbul were chosen as the sample, and the data collected through in-depth interviews, a qualitative data collection tool, were analyzed using content analysis. Despite their generally positive disposition and strong tendency to anthropomorphize, participants were found to approach the use of ChatGPT with caution. The majority harbored negative judgments about ChatGPT and other technological tools meeting emotional and social needs. Nevertheless, a significant portion believes that the use of ChatGPT and similar tools for socialization will become widespread in the near future. Concerns predominantly revolve around mistrust in artificial intelligence technologies' ability to meet emotional needs and the potential harms they could inflict on socio-cultural life, with ethical issues forming another major theme. Notably, when expressing their apprehensions, participants frequently referenced representations of artificial intelligence in popular culture, such as films, series, and games.


Details

Primary Language: Turkish
Subjects: Communication Studies, New Communication Technologies
Section: Research Articles
Authors

Elif Başak Sarıoğlu 0000-0001-5558-6596

Esra Pelin Güregen 0000-0003-3564-7560

Publication Date: March 18, 2024
Submission Date: September 14, 2023
Published Issue: Year 2024, Issue: Cumhuriyetin 100. Yılında Geleceğin İletişimi Özel Sayısı

How to Cite

APA: Sarıoğlu, E. B., & Güregen, E. P. (2024). Duygusal Yalnızlığa Bir Çözüm Olarak Chatgpt: Kişilerarası İletişimin Yeni Aracı. Türkiye İletişim Araştırmaları Dergisi (Cumhuriyetin 100. Yılında Geleceğin İletişimi Özel Sayısı), 81-107. https://doi.org/10.17829/turcom.1360418

All articles published in Türkiye İletişim Araştırmaları Dergisi are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.