Abstract
Chatbots are gaining popularity in the healthcare sector, as they provide easy access to information while supporting both users and medical personnel. Although some users may be reluctant to talk to a chatbot about their personal health, as this is a sensitive subject area, several factors can contribute to more successful interactions between chatbots and humans, one of which is self-disclosure. Previous studies in human-machine interaction have shown that self-disclosure by a chatbot or conversational agent improves relationship and rapport building in the interaction. In this work, we investigated whether this also holds true for chatbots in health-related applications, e.g., a chatbot uttering "I am scared of viruses, too". We conducted two studies with two different chatbots and specific use cases: a health assessment chatbot and a health information chatbot. For both use cases, we compared a non-self-disclosing version of each chatbot with a self-disclosing one. Overall, both studies show that integrating self-disclosure into conversations between chatbots and humans in health-related applications has a positive effect on rapport building between chatbot and human. We also found that the effect of the introduced self-disclosure differs depending on the specific context in which the chatbot is used.
Acknowledgements
We would like to thank Marcel Germer, Sebastian Hell, Leon Hommel, Martin Junge, Marvin Kindermann, Andrea Linden, Christina Löcker, Johannes Sittel and Tim Wittig for their invaluable contributions to the studies described in this paper.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Neef, C., Mai, V., Richert, A. (2022). "I Am Scared of Viruses, Too" - Studying the Impact of Self-disclosure in Chatbots for Health-Related Applications. In: Kurosu, M. (ed.) Human-Computer Interaction. User Experience and Behavior. HCII 2022. Lecture Notes in Computer Science, vol. 13304. Springer, Cham. https://doi.org/10.1007/978-3-031-05412-9_35
Print ISBN: 978-3-031-05411-2
Online ISBN: 978-3-031-05412-9