PATIENT PERSPECTIVES ON THE USE OF AI IN MEDICAL DECISION-MAKING: EXPLORING PATIENT TRUST AND ACCEPTANCE OF AI-DRIVEN HEALTHCARE SERVICES – A QUALITATIVE STUDY

Ihsan Ullah Khan
Urva Rehman
Sidra Hanif
Humaira Mehwish
Wesam Taher Almagharbeh
Muhammad Shahid
Muhammad Waleed Khan

Abstract

Background: As artificial intelligence (AI) becomes increasingly integrated into healthcare, patient perspectives on trust and acceptance are critical to its successful implementation. Despite technological advancements, there remains limited qualitative insight into how patients evaluate and respond to AI-driven medical decision-making systems, particularly in non-Western settings.


Objective: To explore patient trust and acceptance of AI-driven healthcare services within the sociocultural and clinical context of Islamabad, Pakistan.


Methods: A qualitative study design was employed, utilizing semi-structured in-depth interviews with 32 adult patients exposed to AI-supported healthcare tools. Participants were purposively sampled from both public and private hospitals over an eight-month period. Interviews were transcribed, translated, and analyzed using Braun and Clarke’s thematic analysis framework. Ethical approval was obtained, and informed consent was secured from all participants.


Results: Six major themes emerged: perceived trust in AI systems, comparative reliance on human versus AI decisions, emotional reactions and privacy concerns, transparency and understanding, cultural and religious influences, and willingness for future use. Patients viewed AI as potentially efficient but stressed the need for human oversight, emotional empathy, and system transparency. Trust was conditional and deeply influenced by previous healthcare experiences, data security concerns, and personal belief systems.


Conclusion: Patient trust in AI healthcare systems is multifaceted, shaped by technical, emotional, and cultural factors. Enhancing transparency, ensuring ethical safeguards, and maintaining human oversight are essential to increasing patient acceptance. These insights are crucial for designing AI systems that are both clinically effective and socially acceptable.

Author Biographies

Ihsan Ullah Khan, Burns and Plastic Surgery Center, Hayatabad, Peshawar, Pakistan.

Assistant Professor, Khyber Girls Medical College; Burns and Plastic Surgery Center, Hayatabad, Peshawar, Pakistan.

Urva Rehman, Bahria University, Karachi, Pakistan.

Undergraduate Researcher, Department of Management Information Systems, Bahria University, Karachi, Pakistan.

Sidra Hanif, Ibadat International University, Islamabad, Pakistan.

Assistant Professor, Department of Physical Therapy, Ibadat International University, Islamabad, Pakistan.

Humaira Mehwish, Foundation Public School, Karachi, Pakistan.

Computer Science Teacher, Foundation Public School, Karachi, Pakistan.

Wesam Taher Almagharbeh, University of Tabuk, Tabuk, Saudi Arabia.

Assistant Professor, Medical and Surgical Nursing Department, Faculty of Nursing, University of Tabuk, Tabuk, Saudi Arabia.

Muhammad Shahid, Millat Hospital, Lodhran, Pakistan.

Consultant Urologist, Department of Urology, Millat Hospital, Lodhran, Pakistan.

Muhammad Waleed Khan, National University of Sciences & Technology (NUST), Islamabad, Pakistan.

School of Interdisciplinary Engineering and Sciences (SINES), National University of Sciences & Technology (NUST), Islamabad, Pakistan.
