Review of chatbots in urogynecology
Purpose of review
Chatbots based on large language models have been rapidly incorporated into many aspects of medicine despite an incomplete understanding of their capabilities. This review focuses on ways these chatbots have been utilized in urogynecology.
Recent findings
Publications regarding chatbots in urogynecology have centered on patient education, scientific literature review, clinical decision-making, documentation, and research. Several authors have evaluated the ability of chatbots to generate accurate and complete information about prolapse and urinary incontinence. While chatbots can generate accurate information about pelvic floor disorders most of the time, the studies we review indicate that incomplete, misleading, or incorrect information is generated up to 33% of the time. Newer chatbots that are trained for medical applications may help to limit some of these problems. Using chatbots to assist with scientific literature review and research is currently hampered by unpredictable ‘hallucinations’, where the chatbot may generate information or references that sound plausible but are factually incorrect.
Summary
While chatbots are being rapidly integrated into many aspects of medicine, the research evaluating these tools in urogynecology is limited. Publicly available chatbots should be used for patient education, clinical decision-making, and research only with caution.
Ovid Technologies (Wolters Kluwer Health)

