11/18/2025 | 2:00 PM – 3:15 PM | Room 3
S76: CTRL+ALT+EQUITY: Rebooting Clinical System Design
Presentation Type: Oral Presentations
Navigating Variability in Prostate RT Planning: Real-Time Insights for Human-Centered CDS Design
2025 Annual Symposium On Demand
Presentation Time: 02:00 PM - 02:12 PM
Abstract Keywords: Clinical Decision Support, Patient Safety, Human-computer Interaction, User-centered Design Methods, Healthcare Quality, Qualitative Methods, Cancer Prevention, Informatics Implementation
Primary Track: Applications
Programmatic Theme: Clinical Informatics
Clinical variability in prostate radiation therapy (RT) planning is well documented, but little is known about how clinicians experience and adapt to the factors that drive it. This study explores variability as a human-centered design challenge, with the goal of informing clinical decision support (CDS) design through real-time insight into clinical decision-making. We conducted think-aloud interviews with five radiation oncologists while they contoured prostate cases. Using the Systems Engineering Initiative for Patient Safety (SEIPS) framework, we thematically analyzed the contributors to variability across tasks, technology, and organizational conditions. Results suggest that variability arises not only from anatomical or guideline ambiguity, but also from individual interpretations of inputs, variation in contouring decisions, and adaptive strategies such as reliance on prior experience and estimation under uncertainty. Our findings support context-sensitive CDS design that reflects real-world planning complexity and preserves necessary clinical flexibility.
Speaker:
Meagan Foster, Master of Professional Science in Biomedical and Health Informatics
Division of Healthcare Engineering
Authors:
Meagan Foster, Master of Professional Science in Biomedical and Health Informatics - Division of Healthcare Engineering; Elizabeth Byrd, MS - UNC-Chapel Hill; Elizabeth Kwong - UNC Chapel Hill; Anirudh Karunaker, MS - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Brian Anderson, PhD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Michael Repka, MD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Ross McGurk, PhD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Shiva Das, PhD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Lawrence Marks, MD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States; Lukasz Mazur, PhD - Department of Radiation Oncology, University of North Carolina, Chapel Hill, NC, United States;
Leveraging Provocative Design Methods to Address Implicit Bias in Clinical Interactions through Technology
2025 Annual Symposium On Demand
Presentation Time: 02:12 PM - 02:24 PM
Abstract Keywords: Artificial Intelligence, Health Equity, User-centered Design Methods
Primary Track: Applications
Implicit bias impacts the quality of patient-clinician interactions, influencing patient outcomes and trust in healthcare. Most interventions to mitigate bias rely solely on expensive human assessments, rather than leveraging AI technology with clinician input. To explore clinician-envisioned interventions, we conducted interviews with 16 primary care clinicians using provocative design methods to facilitate innovative ideation on using technology to address implicit bias. Themes from interviews included patient communication monitoring, clinician self-awareness, systemic solutions, optimizing workflow, clinician education, and patient feedback. These envisioned interventions provide design considerations for technology-based implicit bias feedback tools. The broad range of innovative solutions generated by clinicians at various career stages reflects the utility of provocative design methods in unlocking creative thinking among a population that is not often encouraged to think beyond structured real-world constraints.
Speaker:
Deepthi Mohanraj, B.S.
University of Washington Bioinformatics and Medical Education
Authors:
Andrea Hartzler, PhD - University of Washington; Raina Langevin, PhD - University of Washington; Janice Sabin - University of Washington; Nadir Weibel, PhD - UC San Diego; Wanda Pratt, PhD, FACMI - University of Washington; Libby Shah, RN, BSN - University of Washington; Brian Wood, M.D. - University of Washington;
Designing Technology-Assisted Interventions for Justice-Impacted Black American Women
2025 Annual Symposium On Demand
Presentation Time: 02:24 PM - 02:36 PM
Abstract Keywords: User-centered Design Methods, Personal Health Informatics, Mobile Health, Diversity, Equity, Inclusion, and Accessibility, Health Equity, Human-computer Interaction
Primary Track: Applications
Programmatic Theme: Consumer Health Informatics
This study explores the challenges faced by justice-impacted Black women during their reintegration into society, with a focus on mental health care access and the potential for technology-assisted interventions to address barriers. Participants from focus groups emphasized significant obstacles, including inadequate mental health resources during incarceration, insufficient post-release support, and barriers such as discrimination, lack of insurance, and transportation issues. Key recommendations for designing technology-assisted interventions, such as the Welcome Home app, include trauma-informed design, tiered support systems, integration with electronic health records, privacy protection, and culturally tailored content. The study underscores the importance of culturally relevant, user-centered digital solutions to improve health outcomes and facilitate the successful reintegration of Black women impacted by the criminal legal system. Apps that provide a sense of community promote engagement, which may improve health outcomes.
Speaker:
Terika McCall, PhD, MPH, MBA
Yale School of Public Health
Authors:
Amelea Lowery; Bria Massey, BS - Johns Hopkins School of Medicine; Meera Swaminath, MPH - Lumanity; Shamima Afrose, MS, MPA - Yale School of Public Health; Monya Saunders, AA - Yale School of Medicine; Karen Wang, MD - Yale School of Medicine;
Shedding Light on the Invisible Work of Peer Recovery Support Specialists through the Design of a Point of Care Technology
2025 Annual Symposium On Demand
Presentation Time: 02:36 PM - 02:48 PM
Abstract Keywords: User-centered Design Methods, Workflow, Human-computer Interaction, Qualitative Methods
Primary Track: Applications
Programmatic Theme: Clinical Informatics
Peer Recovery Support Specialists (PRSSs) are certified professionals who provide social, informational, and logistical support to people in recovery from substance use disorders. In this study, we report the findings from design work completed with a team of PRSSs to define workflow and informational needs for the development of a novel point of care software. Through our qualitative research we uncovered a significant amount of undocumented work that is necessary for a PRSS to provide individualized support to their clients. Thus, much PRSS work goes unquantified and is rendered “invisible.” We present technology design strategies that help quantify these tasks and illuminate the complexity of PRSS work.
Speaker:
Jessica Pater, PhD
Parkview Research Center
Authors:
Elisabeth Andrews, LMSW - Parkview Behavioral Health; Michelle Drouin, PhD - Parkview Research Center; Mindy Flanagan, PhD - Parkview Research Center; Dana Albriht, PhD - Parkview Research Center; Rachel Pfafman, MPH - Parkview Research Center; Jeanne Carroll, RN - Parkview Research Center; Erik Hess, MD, MSc - Vanderbilt University Medical Center; Tammy Toscos, PhD - Parkview Health;
Coding Fairness: Detecting Demographic-Related Coding Discrepancies in ICD Code Assignments
2025 Annual Symposium On Demand
Presentation Time: 02:48 PM - 03:00 PM
Abstract Keywords: Fairness and Elimination of Bias, Artificial Intelligence, Machine Learning, Data Mining
Primary Track: Foundations
Coded clinical data are crucial in biomedical informatics research. While it is well known that electronic medical records often contain coding errors, numerous studies rely on International Classification of Diseases (ICD) codes for phenotyping in cohort assembly, statistical analysis, and AI modeling. Although fairness has become an important focus in AI research, the potential biases embedded in coded clinical data have received less attention. In this study, we employed a race- and gender-agnostic AI phenotyping model to assess coding fairness across 203 ICD code blocks within the Veterans Health Administration Clinical Data Warehouse. Our findings revealed variability in coding consistency across demographic subgroups, including sex, race, and ethnicity. Notably, over 50% of the code blocks exhibited statistically significant differences in discrepancies between AI-generated and ICD-based phenotypes across these demographic groups. These results suggest the need to recognize and address demographic-related coding discrepancies to ensure coding fairness.
Speaker:
Ying Yin, Ph.D.
The George Washington University
Authors:
Ying Yin, Ph.D. - The George Washington University; Stuart Nelson, MD, FACP, FACMI - George Washington University; Yijun Shao, PhD - George Washington University; Charles Faselis, M.D. - Washington DC Veterans Affairs Medical Center; Ali Ahmed, MD, MPH - Washington DC Veterans Affairs Medical Center; Qing Zeng, PhD - George Washington University;
Multi-Adversarial Debiasing in Clinical Artificial Intelligence
2025 Annual Symposium On Demand
Presentation Time: 03:00 PM - 03:12 PM
Abstract Keywords: Fairness and Elimination of Bias, Artificial Intelligence, Health Equity
Primary Track: Applications
Programmatic Theme: Clinical Informatics
While multiple types of biases can occur in clinical machine learning, the status quo in algorithmic debiasing is to optimize a single fairness metric. We propose a multi-adversarial debiasing framework that builds on the established technique of adversarial debiasing to jointly optimize two or more fairness definitions. Our experiments use two adversaries corresponding to demographic parity (DP) and equalized mistreatment (EM). Evaluating four datasets, including two clinical datasets (UCI Heart Disease and a Parkinson’s Disease digital health dataset) and two algorithmic fairness benchmarks (COMPAS and Adult Income), we find that our multi-adversarial approach reduces DP by 0.03-0.22 and EM by 0.02-0.13 while maintaining the F1 score within 0-16% of the baseline models. Analyzing these performance variations, we find that adversarial debiasing is most effective in datasets with adequate representation of positive and negative labels across protected attribute values, but its effectiveness declines when this is not the case.
Speaker:
Md Rahat Shahriar Zawad, MS Student
University of Hawaii at Manoa
Authors:
Md Rahat Shahriar Zawad, MS Student - University of Hawaii at Manoa; Irene Y Chen, PhD - University of California, Berkeley; Peter Washington, PhD - University of California, San Francisco;
Shedding Light on the Invisible Work of Peer Recovery Support Specialists through the Design of a Point of Care Technology
Category: Paper - Regular
11/18/2025 03:15 PM (Eastern Time (US & Canada))