Welcome to HITea with Grace, where we spill the tea on all things health IT—and the moments shaping the future of care. In this episode, Grace sits down with Dr. Bridget Duffy, the nation’s first Chief Experience Officer, longtime patient experience leader, and co-chair of the AI Care Standard™, for a timely conversation at the intersection of AI and human-centered care.
Dr. Duffy shares the journey that led her from championing patient experience to helping define one of the first comprehensive standards for patient-facing AI. As health systems rapidly adopt chatbots, portals, and automated communication tools, she breaks down what prompted the creation of the AI Care Standard—and why existing governance frameworks simply weren’t built for this moment.
Together, they unpack the 10 Core Pillars of the Standard and the real-world challenges organizations will face in operationalizing them. From treating AI communication with the same rigor as clinician interactions to addressing bias, health literacy, and accessibility, this conversation gets into what responsible AI actually looks like in practice—not just in principle.
Looking ahead, Grace and Bridget Duffy, MD, board member of Vital.io and co-chair of the PatientAI Collaborative™, explore how the AI Care Standard could shape regulation, accreditation, and industry norms—and what success really looks like five years from now.
As always, the episode closes on a more personal note, with reflections on leadership, resilience, and advice for women building careers in healthcare and health IT. It’s thoughtful, forward-looking, and grounded in one core idea: technology should strengthen—not replace—the human experience in healthcare.
[00:00:04] Welcome to HITea with Grace, where we spill the tea on HIT. Today, I'm honored to welcome to the pod Dr. Bridget Duffy, the nation's first Chief Experience Officer, a Patient Experience Leader, and co-chair of the AI Care Standard. Thanks for joining me, Dr. Duffy. Thanks for having me, Grace. So tell me all about the career path that brought you to your role today.
[00:00:26] Well, I guess I have been an evangelist for humanizing the patient experience. And my belief was shaped early, probably from a couple of mentors: my parents and Earl Bakken, the co-founder of Medtronic, who really believed that we shouldn't treat people as diseased and fragmented body parts, that we should treat them as a whole human being. So my career in medicine took a turn that I least expected when I met this mentor.
[00:00:51] And I've been on a journey the last 20 plus years to humanize the way we deliver medical technology. And when I became the first Chief Experience Officer, this is after years of caring about the patient experience when nobody else did. And it took the government, shame on us, mandating that we measure patient satisfaction, publicly report it, and then we tied reimbursement to it. But thankfully, the world has caught up. But when I first started in this work, the number one complaint from patients was a breakdown in communication.
[00:01:19] Didn't know my name, didn't know me as a whole human being, didn't treat me with compassion, didn't listen to me, didn't help me understand my results. So I set about trying to help fix broken communication. And now it brings us to an era with artificial intelligence. And it's a whole new ballgame on what we need to do to protect the safety and the humanity of the people that we serve. But my early career was shaped by a couple of mentors to try to revolutionize and improve the patient experience.
[00:01:45] But along the way, I realized you also had to focus on the staff. The burnt-out doctors, nurses, and dentists that I've encountered were really what we needed to address first. So my journey is really trying to uplift humanity by focusing on the well-being of the staff, who then deliver a great patient experience. And we are so grateful for your patient experience evangelism that has led us to this moment.
[00:02:05] When you look at how AI is communicating directly with patients, what concerned you the most and ultimately compelled you and others to help lead the AI Care Standard? Well, I think I've been part of a couple of teams with AI-focused companies, Vital.io in particular, and its founder, Aaron Patzer.
[00:02:24] And as we started looking at this, I almost felt this fear for my loved ones who were not informed about how accurate and how safe the information was when they were plugging in their medical condition or symptoms. And so I think we as a company and as a team felt a sense of obligation to help protect humanity.
[00:02:45] And so we got a small coalition together of like-minded, diverse people from healthcare system leaders, from digital health, and most importantly, patients and patient consumers and advocates and tech leaders to say, there's no standard that exists out there. Why don't we create one for the world to use to safeguard our patients?
[00:03:09] We put a lot of work into coming up with 10 pillars of safety to assess or vet any technology or any tool a patient might seek to use. And one of the most important parameters we looked at: does it actually ease the burden of being a doctor or a nurse in healthcare today? Because we all cared fiercely about the mass exodus from the profession of both nurses and doctors, and also dentists and other professions.
[00:03:34] We felt an obligation to do it and it was a grassroots effort, volunteers not tied to any one vendor or company. And we are so excited to have launched this a couple of weeks ago through this AI collaborative and to really serve as a clearinghouse and as a place where others can learn and protect the patients that they serve. I just love it. I've seen it all over the healthcare news. So that was very exciting to see.
[00:03:58] Can you walk us through the 10 core pillars at a high level and explain what pillars you believe will be the most challenging for organizations to operationalize? Absolutely. And I think most importantly, and people can go check this out at AICareStandard.com, AICareStandard.com and actually see the 10 pillars. But I think most importantly, what makes this really unique is that it's actually an operating framework that defines what safe, accurate and clinically responsible is.
[00:04:27] And you can actually assess a tool or the system that you're building and give it a score, so that at the end of it, you'll have a score and say, does this actually meet the checklist of always events that will protect the people that you serve? So first and foremost, the number one pillar is around safe and clinically accurate. A lot of people, patient consumers, don't realize that things like ChatGPT are trained on information just from the internet and the general population.
[00:04:54] It's not trained on clinical, evidence-based scientific data. So you may get information back that's believable, but not truthful. And I think what accelerated our work on this was the number of human tragedies that occurred when younger kids would put information into the internet, get information back, and the suicides and deaths that followed. So we felt a sense of urgency to get this out to the masses.
[00:05:22] So that's number one, and clinicians must be involved in helping shape and feed and train the data. Number two, it should be situationally appropriate responses. Does it tailor the tone? Does it address and understand the patient's emotional needs, their cognitive level? And is it clinically correct? So it needs to be sensitive enough to know when to escalate that this person may have suicidal ideation and you need to do something about it.
[00:05:49] Number three, it has to be clear, closed-loop communication, so that there's actually an action or the patient is routed to somebody who can help manage their care. Number four, trust and patient-specific accommodation. It has to have cultural humility and sensitivity, understand the diversity of the patient population, and meet them where they are in their journey. Number five, patient autonomy and empowerment.
[00:06:15] We have to support and understand the patient's questions without bias or influence on what's being fed back to the patient. Number six, disclosure and training limitations. We have to let people know when they're actually interfacing with AI and when they're not. So it has to be very transparent and clear. Number seven, truth and evidence. As I mentioned before, every claim has to be verifiable with clinical data.
[00:06:43] Because so much that a patient may query is really believable, but it's not grounded in science. Number eight, as I mentioned, we have to optimize care team workflow and well-being. Many times a patient will come into an emergency room after already having put their information into ChatGPT.
[00:07:00] And one of my colleagues, Dr. Nick Sterling, said it added 30 to 50 minutes to his appointments because he had to disabuse patients of the believable but not truthful information that they found online. So we have to make sure that these tools protect the team. Nine, we have to acknowledge its limitations and confidence levels, and admit uncertainty when it exists and when the evidence is lacking.
[00:07:25] And number 10, we have to have continuous oversight and improvement, meaning clinicians and healthcare IT leaders have to monitor the performance and adjust the model, which is what we are doing with the AI Care Standard. As new things come out, we will continue to modify and adjust this. So those are the 10 pillars against which we vet all tools that have been created and that currently exist.
[00:07:48] And we're hopeful that any new technologies that come up from any company or any startup, that they will use our tool so that they can build a product that they know is safe, reliable, and accurate. That's fantastic. This is all amazing information. And I think, too, hearing these pillars and what the community is thinking about to keep these patients safe and improve their experience is just inspiring. I think there's so much bad news out there about how patients are being harmed.
[00:08:16] And to hear that changes are being made to impact the future in a positive way is just really encouraging. I know for me, and I'm sure for others listening in, who are the types of folks that are going to be utilizing this care standard, or have signed on to say, yeah, we're absolutely going to be moving forward with this? Well, our collaborative involves some of the top health systems in the country.
[00:08:38] And every health system is standing up an AI committee to assess, to vet, to safeguard any tools or technology or solutions that they're building or bringing into the system. So it's a no-brainer for all of these AI committees to be able to use the tool. I think it will be for risk teams, for insurance companies to be able to look and de-risk their situations, other vendors, as I mentioned. And then I think government and oversight committees.
[00:09:05] I think our dream is that someday, and I don't like mandates, but that perhaps there's a mandate by government or CMS that says this is a tool or a standard that any company should use, one that assures and protects the safety of everyone in healthcare. So it's our hope that it's for the masses. And then lastly, our dream is that in two or three or five years, this really could become the consumer resource.
[00:09:31] So it's also for consumers. This could become the Consumer Reports for these tools, a clearinghouse where patient consumers navigating care for their loved ones could come and say, I'm looking at this tool. Is it safe for me to use for my child or for my parents? And we will run it through the checklist of the pillars and see what score it gets.
[00:09:55] So most importantly, we did this for consumers, but also for healthcare so that we can be the bridge between those two worlds. So cool. I love this work that you are doing. Thank you so much for being a leader in the space. Now, while my listeners love to hear about trends in the space and different things that are happening in the industry, they also love to learn from women leaders like yourself about what drives you and helps you just keep going. So, you know, in your personal life, as you are a leader, you're a physician, you are so many things.
[00:10:25] What are the things you do to help you work at your best and make a difference? I think what drives me and why I get up every day is my father, who passed away not too long ago. He said to me, you can never retire. Healthcare and dentistry are still so broken. He was a periodontist.
[00:10:42] And as I witnessed myself and my family navigating a graceful exit from this planet for him, I realized you shouldn't have to be someone or have credentials or know someone to get access to and navigate the healthcare system. So I'm motivated by the inequities that I see and the tools and technology that get in the way of reaching the masses. So one, I want to level the playing field and give a voice to people who have no voice.
[00:11:10] For example, in this particular project, communication is so broken. I have so many stories of loved ones and patients who say they would get breast or prostate tissue biopsied on a Tuesday, then live in limbo through the weekend, not knowing whether they had cancer. And then, while they're driving their car, they get a call from the doctor saying, are you driving? Pull over to the side of the road. And by themselves, they're told that they have cancer. And now you advance that to a world of AI. So this is why we built the pillars.
[00:11:38] So now, you know, you can get an alert from an app to look at your biopsy report coming from the electronic medical record by yourself, reading it in a language you don't understand. And I can't tell you the calls I get every other week saying, do I have cancer or don't I? What motivates me to do this work is really to restore humanity back to healthcare and to make sure all technology has that human component. So that's why I get up every day. I want to serve as the bridge between technology and the way we deliver care.
[00:12:08] And second, I'm most passionate about creating an environment where doctors, nurses, and dentists want to spend the rest of their career. We have systems and technology that are burning them out on a daily basis. I want to help fix that. So those two things motivate me and keep me going. And I think the word retire should be expunged from the English language. I love it. And I thank you for sharing your story too about your father.
[00:12:30] And it's amazingly inspiring to hear about your North Star and how critical it is for all of us to keep that North Star when we're doing the work that we're doing. It's not easy. It's hard work. But we got to do it because patients and caregivers and care partners, they deserve it. If you could give your younger self a piece of advice moving into healthcare IT, what would you tell yourself? I would have sought mentors sooner.
[00:12:54] I would have used my voice and spoken up in a room where I didn't have the confidence to do it. And three, public speaking was my number one fear in my life. I would have figured out how to address that sooner and used my voice to shape change in the world. And lastly, get a community, a posse of other women, to support you. Often women take each other out in the business world.
[00:13:18] And I think it's so important that we women mentor the next generation of women to be leaders in healthcare and healthcare IT. Because so often I was the only woman sitting at the table. And on my watch, that's not going to happen. So my passion right now is mentoring the next generation of female leaders to have a voice at the table. I love it. Now to finish this conversation off right, where can our listeners find you online? My LinkedIn would be a great spot. That's terrific. And before I forget, did you happen to bring tea with you today?
And if so, let me see your mug and tell us about it. Oh my goodness. And the tea is happening after I had my first good strong cup of coffee, though. But I love tea. So this is chamomile tea, because my voice is a little off today. And this is my Gabby, the good girl, who also has her own Instagram. One of the loves of my life, my pup. So it's just a treat to be having tea with you.
[00:14:06] And I'm so grateful to you, Grace, for telling stories that help lift up the world and women and other healthcare leaders, just to do better and make the world better for others. So thank you for having tea with me. That is an absolutely beautiful dog on your mug. And I'm so glad to learn from you. Thanks for joining me. Thank you, Grace. And thanks to you folks for joining us too. Check out the HITea with Grace podcast for more interviews with great guests like Dr. Duffy today. Cheers. Like a Girl Media is more than a media network.
[00:14:36] It's a community. We want to meet you and amplify your voice and the voices of outstanding women innovating in healthcare. Interested in starting your own podcast or hosting an event near you? Connect with us online or in person. We're here to support and empower you.


