Ensuring Techquity in Healthcare Innovation and Implementation
Video Transcription
Hello and welcome everyone to the Green Channel. We're going to go ahead and get started now. A few housekeeping things: if you have any questions or anything else you want the speakers to comment on, please put your question or your comment in the Q&A box on the app. I will be checking it throughout, and we'll be taking questions that way as well. If you would rather stand up and ask your question, we have someone in the back who can hand you a microphone. But the easiest way is probably to put it in the app, and then I can pick it up and talk about it.

So the topic that we are going to be discussing is ensuring tech equity in health care innovation and implementation. As I'm sure you've heard, one of the key themes of HRX 2024 is how we get these novel technologies that we've been talking about over the last couple of years into clinical practice. So we'll be focusing on discussing challenges as well as opportunities when it comes to moving health innovation into the implementation setting, into the real world, into clinical practice. I have with me today two fantastic panelists, and I'm going to have them introduce themselves, but I'll start with me. I am Dr. Demilade Adedinsewo. I'm a non-invasive cardiologist, and I practice at Mayo Clinic in Jacksonville, Florida. A lot of my interest is in heart disease in women, and on the research side, the use of novel digital tools and artificial intelligence for cardiovascular screening in women, particularly during pregnancy and postpartum. So we're going to start with you, Yvonne. Go ahead and introduce yourself.

Thank you so much. Good afternoon, everyone. My name is Dr. Yvonne Commodore-Mensah. I'm a nurse and cardiovascular disease epidemiologist at the Johns Hopkins School of Nursing with a joint appointment in the Department of Epidemiology at the Bloomberg School of Public Health.
I'm a nurse by training, but I'm a researcher, and I focus on what I call glocal cardiovascular health equity. And usually when I use the term, people ask, is that a typo? No. Glocal, implying that a lot of the issues we talk about have local as well as global health implications. So how do we use technology as a tool to eliminate these disparities in cardiovascular health? Thank you.

All right, then. We're going to go to Aubrey. Go ahead.

Thank you, guys. My name is Dr. Aubrey Grant, and I am a clinical sports cardiologist. I direct the sports cardiology program at MedStar Health, have an appointment at Georgetown University, and am a team cardiologist for the Baltimore Ravens and Baltimore Orioles. In that role, I think I've truly come to appreciate the bounds of supraphysiological training in athletes and how that affects the cardiac adaptations that we see. In the job that I'm representing today, I also serve as chief equity officer for Equity Commons. Equity Commons is a virtual reality innovation company that uses VR technology to help clinicians understand the lived experiences of, and develop empathy for, the patients that we take care of. We've known that implicit bias is a big driver of negative clinical outcomes for some time, but we haven't really had mitigation strategies that successfully allow clinicians to appreciate these experiences of our patients. The work that we are doing has been birthed out of a crisis of health disparity, recognizing that this is a big driver of the inequities that we see today. And so we're using innovation, we're using virtual reality, to go beyond the traditional mitigation strategies and provide a sexier, different alternative to traditional implicit bias training.

Excellent. I'm so excited to hear from our two panelists. So I have a number of questions. Remember, if questions come up during this, please put them in the Q&A and I'll take them.
So why have we decided to talk about this? Over the last few years or more, there has really been an explosion of digital technologies in the healthcare space, particularly in cardiology, and not just in heart rhythm. We've seen a number of studies that have already been published showing the effectiveness of these tools and how well they're doing. We see, you know, fantastic AUC values. And the question we get all the time when we read these papers is, how can we get this? When can we start to use this? Unfortunately, there are fewer studies that have started to evaluate these tools in the context of implementation. There are also fewer clinical trials asking: what is the impact of using some of these technologies on clinical outcomes? And as we start to think about moving to that point, ethical concerns come up about how these technologies were developed. Are we sure that these technologies will not increase health disparities? Because we know right now that health disparities remain a pervasive issue, despite the fact that we've known about them for several years, and a number of mitigation strategies have yet to address them. And if we're using existing data to develop some of these new technologies and we want to put them back into health care, how are we sure that these technologies are going to work the same for everyone? Are these technologies going to promote equity? Or are they going to further a divide, so that we start to see more inequity in health care with the use of these technologies? That's what this session is about, and we want to try to understand it a little bit more. So I'm going to start with a common theme in this field, which is something we call algorithmic bias or AI bias. You may have heard this term before.
And this has to do with bias that results from the use of specific algorithms or specific artificial intelligence tools due to what we consider inherent bias. We believe that these biases do not occur just by the function of the device, the machine, or the computer; essentially, these tools learn existing bias and then they can propagate it. And this could include things like social inequities, gender- or sex-based inequities, and multiple different facets of this. So now I'm going to move this question to our panelists to talk a little bit about what examples of bias they have seen in their work or heard about, and just give us practical examples of how this exists when it comes to digital innovation. And I'll start with Aubrey.

Happy to jump in. In my clinical practice, I'm a sports cardiologist. And as a sports cardiologist, one of the rites of passage is being handed a stack full of wearable device information from your patient and trying to sift through and understand it, to recognize the clinical importance of all of this information that your patient has thoughtfully and meticulously gathered and brought to you. Unfortunately, what we know and understand is that there is disparity in the accuracy of this wearable data based on skin tone and the amount of melanin within your skin, because these wearable devices use PPG (photoplethysmography) technology, which unfortunately was not validated in diverse patient populations. And as a lot of these wearable device companies use a come-to-market strategy of bringing a device out first and validating it later, instead of intentionally recruiting diverse cohorts, we now see a disparity in real-time clinical practice: inaccuracies related to wearable device technology.
And so, unfortunately, bad data in can oftentimes lead to misinformation, poor diagnostic accuracy for the patient, and ultimately long-term consequences. So this is a prime example that if you're not intentional about how you're designing studies, if you're not intentional about having equity within the diversity of the cohort that you're studying, you can see real-world implications on the back end.

Excellent. Go ahead, Yvonne.

Just to build on that, I'll take it back a little further: even before we get to the populations in which we validate or derive some of these models, I think it's important to consider who is at the table when these models are being developed. And we can take it all the way back to who's present in the technology space and health care. So I think there's an opportunity for us to improve STEM education and also make sure that we are educating everyone, including Black and brown populations, so that they too can be part of the technological revolution, so that they can be at the table when some of these technologies are being developed and ideated. Because waiting till the validation stage may be too late. So for me, that's one of the things I'm very passionate about: ensuring diversity in the health care profession, but as it applies to health technology as well.

Excellent. Thank you so much, Yvonne, for making us see this from two different aspects. So now I'm going to take it outside of what we consider algorithmic bias, bias that can happen as a result of us using digital technologies. Why don't we talk about bias and inequities that already exist today? The other question we get is, can we leverage novel tools and technologies to address bias? While we know that they can make bias worse, can we leverage them to actually improve health equity? And that's the concept of techquity.
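[Editor's note] The subgroup-validation point the panelists make can be made concrete: before deploying a screening model or device, teams commonly audit its error rates separately for each demographic group rather than relying on one overall accuracy number. Below is a minimal, illustrative Python sketch of that audit pattern. The group labels, outcomes, and predictions are entirely hypothetical toy data, invented only to show the mechanics.

```python
def confusion_rates(y_true, y_pred):
    """Return (sensitivity, specificity) for binary labels 0/1."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sens = tp / (tp + fn) if (tp + fn) else float("nan")
    spec = tn / (tn + fp) if (tn + fp) else float("nan")
    return sens, spec

def audit_by_group(records):
    """records: iterable of (group, y_true, y_pred) triples.
    Returns {group: (sensitivity, specificity)} so performance gaps
    between subgroups are visible before deployment."""
    groups = {}
    for g, t, p in records:
        ys, ps = groups.setdefault(g, ([], []))
        ys.append(t)
        ps.append(p)
    return {g: confusion_rates(ys, ps) for g, (ys, ps) in groups.items()}

# Hypothetical toy data: two subgroups with the same disease prevalence,
# but the model misses more true cases in group "B".
records = [
    ("A", 1, 1), ("A", 1, 1), ("A", 0, 0), ("A", 0, 0),
    ("B", 1, 1), ("B", 1, 0), ("B", 0, 0), ("B", 0, 0),
]
report = audit_by_group(records)
# Group A catches every true case; group B misses half of them,
# the kind of gap that should block deployment until it's understood.
```

In this toy example, sensitivity is 1.0 for group A but only 0.5 for group B; a gap like that, found during validation, is exactly the signal that a tool was not derived or validated equitably and needs re-derivation in a more diverse cohort.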
So I want our panelists to talk a little bit about some of the cardiovascular disparities that exist today that they know about in their work and how they're using innovation to address these disparities. And anyone can go first.

I'm happy to jump in. One case study that I will highlight is the clear gender bias that exists in diagnostic testing in the United States. From a population level, we recognize that there are clear disparities between men and women in who receives certain testing, who receives more invasive testing, and that can have downstream effects on outcomes. So we recognize that there is gender bias within the healthcare space, and in particular in cardiovascular disease. And as Demi so eloquently stated, we've known about this for probably the last 30 or 40 years, so much so that many of our state boards are mandating some level of implicit bias training in an attempt to mitigate it. And that has been happening in iterations and in variable forms for some time. However, if you study a lot of these implicit bias trainings, they don't actually do what they're designed to do, and they're not changing clinical patterns. They're not increasing equity in the space of diagnostic testing, for example, in the space of gender bias. And so a frustration that I have is, why would we continue to do the same thing over and over again without, number one, studying it in detail and then recognizing that it is not having the effect we want it to? In recognizing this, that is where I thought there was a huge market opportunity for us to utilize innovation and technology in the form of virtual reality. One of the prime mitigation strategies for bias awareness and mitigation is perspective taking.
And so in the conversation of gender bias, I'm a man, and I will never understand fully what it's like to be a woman trying to navigate the healthcare space to get simple things such as healthcare or maternal care, et cetera. But in a virtual space, in VR, perhaps we can create scenarios that allow me to step outside my own traditional identity to then feel in a real haptic sensation and truly be immersed in a world that is outside the bounds of my traditional realities. And in that space, now we're using innovation to up-level and to highlight things that we traditionally are failing at in this space of equity, in the space of implicit bias management and mitigation strategies. And so I think there's a lot of creative ways to be intentionally aware of the pain points that we're having in a lot of these discussions about bias or health disparities, and then potentially apply really awesome innovative strategies to find solutions. That is wonderful. I feel like now I need to try it out. I want to get a chance to walk in someone else's shoes and feel what it is like to be a man, maybe, navigating the healthcare system. So really, really exciting work. So we would love to hear some more about it. But Yvonne, I want you to also talk about some of the work that you're doing. Sure. So as I mentioned earlier, we do some work locally addressing disparities in cardiovascular health. But my journey in the tech equity space actually started in Ghana. So collaborating with some colleagues in Ghana, we started talking about the challenge of getting people to the clinic to have their blood pressure treated and controlled. And as you can imagine, there are people who live 100 miles away from the health centers and still need their blood pressure controlled. So we asked the question, what's the simplest form of technology we can use to eliminate disparities and hypertension outcomes looking at it from a global perspective? And so look no further than cell phones. 
So that's how it started. We said, most people have smartphones, so how do we leverage this as a tool to improve blood pressure control? We also gave them home blood pressure devices. When I visited Ghana, I had the opportunity to visit one of the clinics, and you see 200 to 300 people sitting there waiting for a doctor or a nurse to check their blood pressure and titrate their medications. And we said, this makes no sense. How do we leverage technology so they are not traveling long distances to get their treatment intensified? So with the combination of home blood pressure monitoring, smartphone technology, and an app, we were able to get people to blood pressure control. In this trial, we randomized people to get a multi-level intervention that included team-based care. We trained nurses, because routinely in some of these settings there aren't enough physicians, right? And that's why team-based care was one of the opportunities where we could also leverage technology. So we trained nurses, but we also gave the patients home blood pressure devices. We gave them smartphones. People were shocked we did that. From an equity perspective, we wanted to make sure everyone would qualify for the study regardless of whether they owned a smartphone. They got the phones and they downloaded the app. Everyone started off with uncontrolled blood pressure. After six months, 80% of those in the intervention arm achieved blood pressure control. We did not titrate their medications; the nurses couldn't do that because it's not allowed. But just through the ability to remotely monitor their blood pressure at home, self-manage their condition, and have that bi-directional communication with a nurse, we were able to achieve blood pressure control. And so we took that model and brought it to Baltimore.
And so when we think about innovation, we usually think it comes from the global north, the U.S., the U.K., and then we transport it to Africa and other low- and middle-income countries. As we said, if it works in Ghana, it could work in Baltimore. So we're currently conducting two trials where we're using a similar approach, with home blood pressure monitoring plus smartphones and an app as well. The trials are ongoing and we don't have results yet, but we're hoping to demonstrate that, again, when you think about health equity and social justice, these principles cut across all borders, right? People who don't have access in Ghana may face similar issues as people in Baltimore who may not have access to health care.

That's fantastic. I can't wait to see the results from the Baltimore trial. But just hearing about the amazing work that both of you are doing is really, really inspiring. I also wanted to touch on the question of incentives: what are the incentives for us to actually address health inequities? I know we've talked about this being a problem that has been here for a while. And I want to highlight the fact that it's more than just a moral issue, making sure that everyone has equal access to care. There's actually a strong financial incentive to address health equity. There was a report by the American Public Health Association published in 2018 looking at the financial gains from eliminating health disparities. It was based on a 2006 estimate: if we completely eliminated all health disparities in the United States, we would have cost savings from a health care standpoint of almost $300 billion. And remember, that is based on 2006 estimates. So it's not a moral issue alone. There's a huge waste in the health care system from not addressing health inequities.
So we have to really be motivated to address these issues and ensure that we are moving more towards equity. So now the next question I want to ask is-

Can I just pop in there for a second?

Go ahead.

No, I completely agree that it goes beyond just a moral imperative. There is a real financial benefit to actually investing in these communities. One quick anecdotal story I will tell: I was finishing my fellowship, and it was a busy Friday night in the middle of a pandemic, and a Black woman presented to our local emergency department. She had several intersections, being unhoused and also dealing with alcohol use dependency. When you have intersections such as these, in a busy environment, it oftentimes creates a recipe for implicit bias to rear its head and allows us to take mental shortcuts that cut across the clinical decision-making tree. And so, unfortunately, she presented with a posterior MI that was ultimately missed because things were busy. She had been to the ED before; she was a quote-unquote frequent flyer. And she sat in our emergency department for three, four hours before it was ultimately recognized, after she had a cardiac arrest. So she had a cardiac arrest, we resuscitated her, sent her to the cath lab, full mechanical support, all the business in order to save her life. Ultimately, the next morning, once we took her off life support and got the breathing tube out, she just kept saying, why was no one listening to me? That tugged at my heartstrings. But also, from a financial perspective, think of the cost of keeping this woman alive when, with a little intention and really checking ourselves on our implicit biases, we could have shortened the door-to-balloon time. That would have addressed not only a financial issue but also the moral failure of not giving this woman the proper care.

That is such a strong practical example. Thank you for sharing that with us.
But yes, I think we've well established that there is a moral issue here, but there's also a financial reason for us to address health inequities. As we start to think about some of these novel tools coming into healthcare, and with the work that you're both doing using novel technologies, VR, and blood pressure monitoring with a smartphone and an app, we have to think about how we actually implement this in the real world. While sometimes these things can work in the context of a study, the question remains: can we take them to the real world? And what are the challenges that we need to think about when it comes to implementing this technology? You've demonstrated how well this works in Ghana. Hopefully we see similar results in Baltimore, but what are the challenges? Where are the opportunities? Where do we have trouble translating innovation into clinical practice?

I'm personally really fascinated by the field of implementation science. I hope everyone is familiar with that field, because like you alluded to, the evidence exists. I don't think we need any more pilot studies to demonstrate the effectiveness of blood pressure control techniques, but as you alluded to, there are gaps with implementation. What works in Ghana, in Accra, in Kumasi, may not work in Baltimore, and that's why we have to consider what's in the inner context and what's in the outer context. In Ghana, there are things that a nurse may not be able to do, things that only a physician may be able to do. There are issues surrounding who has access to certain technologies. So I think that the field of implementation science forces us to take time to consider the context in which we are implementing a solution, right? And one of the things I want to stress is also who's at the table when we're considering how to implement some of these strategies, right?
A strategy can work well, but we need all players at the table to understand it from your perspective as a physician, as a pharmacist, as a nurse, right? As a biomedical engineer, right? What are all the ways that we can come together to solve this problem? And I appreciate that implementation science gives us a common language: this is the problem that we want to solve; how do we take this knowledge and get it into practice in real-world settings? Because we conduct all of these amazing trials, get these amazing p-values published in these fancy journals, and then we know that there's a 14-year gap, right? Between when some of these data become publicly available and when they're actually implemented in clinical practice. So we really need to eliminate that gap. And the way we do that is by making sure that as we're designing some of these trials, the end users, right, patients, are at the table. Human-centered design is one of the ways that, when we are thinking about developing technologies, we can get the end users at the table: healthcare professionals, patients, communities. At Hopkins, Dr. Nino Isakadze and her team have used human-centered design to design an app for cardiac rehab, right? And that takes time. So that's the other thing I wanted to stress: sometimes we want shortcuts. We want to just get right to solving the problem. But if you want to thoughtfully and meaningfully develop technology that is more likely to be implemented and more likely to be sustained, it requires time, effort, investment, and engaging with the right people to get to that solution.

Okay, go ahead, Aubrey.

Yeah, a couple of things come to mind, to be honest. One thing that we truly value as we're creating our virtual reality scenarios is authenticity. We want the narratives and stories that we tell to be as realistic and culturally sensitive and culturally appropriate as possible.
And so to that end, we are intentional that for every new virtual reality scenario we create, we're involving community partners to test it. Does this feel authentic? Have you had this experience before? Just to be sure that this is a true experience that highlights the lived situations that a lot of our patients are dealing with. And what we have found, actually, through that process is that our community partners are then going to other stakeholders to advocate for this technology to be brought into the practice of medicine. We've had community partners that we worked with then go to healthcare systems and say, you really need to be including this in your training programs. So that multi-pronged approach, us certainly creating, but also having the community involved from the start, from the boots on the ground, to create this product, has really, really been helpful. Beyond that, I will say that we have found the most support when you have top-down support. When the guidance to add something into the lexicon of the healthcare ecosystem comes from the C-suite, that goes a long way. And if you're able to garner that level of support, I think it really helps ease the process of incorporating new technologies into practice. So I think there are lots of ways to do it, to be honest. One essential aspect for me, though, is making sure that even these processes themselves are as inclusive as possible.

Excellent. I love the examples that both of you gave. I'm gonna go back to one that Yvonne mentioned earlier, which is the context, right? Maybe something that you do in Ghana might not be directly applicable to what is happening in Baltimore.
And how do you understand that context if you don't actually design something to gather that information? So I'd like to share an example of a study from some of the research that I do as well, which is screening for cardiovascular disease during pregnancy and postpartum. I'm sure you probably know this, but if you don't, maternal mortality has actually been on the rise in the United States over the last three decades. It's surprising, really, because this is a high-income country, and when it's compared to other similar countries, the US has the worst maternal mortality rates. And then when we start to look at differences by race as well, there's a significant inequity there, where Black women are three times more likely to die during pregnancy or within the first year of having a baby compared to white women. So another huge issue. If we have access to healthcare, and we have state-of-the-art diagnostic equipment, and we have very well-trained physicians, why are we seeing our maternal mortality rates go up? And why are we seeing this huge disparity for Black women compared to white women? When we start to tease out what conditions lead to death, cardiovascular disease is actually one of the leading causes of death during pregnancy and postpartum, which is shocking, because most people think heart disease is a disease of the old, right? Not a young woman who's about to have a baby, which is also a concern. And then when we tease out what types of heart disease, heart failure is actually one of the top ones. And what we found out is that it is very challenging to diagnose heart failure during pregnancy and postpartum, because a lot of the symptoms that come with normal pregnancy due to physiologic changes overlap with heart failure symptoms.
So if you've ever seen a pregnant woman, or if you've ever been pregnant, it is not unusual for a pregnant woman to complain about swelling in the legs. It is not unusual for a pregnant woman to get short of breath over minimal distances or going up a flight of stairs, or to get short of breath just lying in bed at night on her back, because of the effect of that gravid belly on the lungs. So if you tell your physician about these things, they're like, well, you know what? You're pregnant. You shouldn't expect to feel the same way. But if you are not pregnant and you tell a physician any of these symptoms, those are huge red flags for heart failure. So how do you tell which pregnant woman has heart failure? A lot of studies are showing that there's a delay in the diagnosis of heart failure during pregnancy, and this leads to poor outcomes. In fact, heart failure is the number one cause of death in the late postpartum period, that is, after that six-week visit, when you no longer have contact with a physician, through 12 months postpartum. So we started to look at some of the existing technologies we have that analyze data from an ECG to estimate the likelihood of heart failure: simple, noninvasive, safe for mother and baby. Can we detect heart failure earlier, plug this woman into care, and reduce cardiovascular disparities during this time window? We were able to evaluate this retrospectively as well as prospectively, and showed that it actually had strong performance. Then we moved to a different context and decided to evaluate this technology in Nigeria. Nigeria has the highest reported incidence of peripartum cardiomyopathy, a unique type of heart failure that happens during pregnancy and postpartum, estimated at about one in 100 cases. And we tested this technology two ways. Remember, context is important.
We looked at a 12-lead ECG as well as a portable technology, a digital stethoscope, which can record the ECG and still provide you with this prediction. The first thing we learned from our pilot study was that a 12-lead ECG is not necessarily always available in the OB setting. For cardiologists, it's a no-brainer; we have this all the time. But for OBs, it's a different story, right? Where are we gonna get a 12-lead ECG? Who's gonna interpret it? That was the bottleneck that we needed to address. But having this technology embedded in something portable, like a digital stethoscope, solves that access problem in the OB space. Also, in a country like Nigeria, where there are concerns about a stable electric power supply, having to plug in a device is a challenge. But if it's battery operated, then you don't have to worry about that, because you can use it in different settings. And we actually just completed the clinical trial from that study; the paper was published about two or three days ago. What we were able to demonstrate is that screening women with this AI-powered digital stethoscope doubled our detection of cases of cardiomyopathy compared to usual care. And we did not see that with the 12-lead ECG. There were other challenges with that, right? With the stethoscope, we're able to get the AI predictions in real time, because the result is available right away. For the 12-lead ECG, we had to upload the ECG, and sometimes it takes a while to analyze in the cloud before the result comes back and is actionable. So learning all of those different contexts into which we are deploying this technology helps us understand how best to take it from the development phase into an implementation phase. So I wanted to share that example, and I love all the other examples so far, but please feel free to jump in.
I'm so glad you shared that example, because our team is currently working on a pitch to train midwives in Ghana to screen pregnant women for peripartum cardiomyopathy. And that statistic that you just cited, the one in 100, is one of the things that we noted. So what I hear you saying loud and clear is that we also have to rethink: what is the simplest form of technology that will still get us to the same outcomes? And if we are successful, well, we know that there aren't cardiologists in many rural parts of Ghana. So we're looking at portable ultrasound devices, and we're trying to train the nurses and midwives to use these devices, also using AI. And one of the things we noted is that when you think of these AI models and the data that already exist, very few African women are included in some of these models. So that's one of the ways that we want to contribute to more diverse data sets. So thanks for sharing that.

So, Aubrey, I don't know if you wanna add something.

Yeah, I would just say, the work that both of you are doing is phenomenal, to be honest. Speaking from the American experience, we recognize here that when you think about outcomes, social determinants of health can make up about 80% of how someone will experience life. And the outcomes are so clearly tied to certain determinants that affect how people are going to receive care, how they're going to be able to follow up with their cardiologist or their primary care doctor. And so what's so clearly evident to me is that you both have used innovation and technology to step beyond some of these traditional barriers that have stopped a lot of our patients from having success in the healthcare space. That application of technology to solve real-world problems, that's where we're going. That's where we're trying to get to.
And it's also impressed upon me that when you have diverse creators, when you have diverse innovators, we can really be thoughtful about these things. And so it's important to me, number one, that when we're having conversations about innovation and the uses of innovation, the room must be diverse so we can have, you know, a diverse way to think about solving a lot of these real-world, global, and certainly local problems.

Excellent. I love it. I'm gonna take a quick pause here from the questions that we had initially talked about discussing to allow the audience to participate a little bit more. I see a question coming through the app. I'll ask that one. But if anyone else wants to ask a question, please raise your hand. There's someone at the back that can give you a microphone so that you can ask a question. So this question came through, and it says: as doctors, we're very proud of independent decision making, but too many of us make the wrong choices for or with the patients that do not have access. When and how will algorithms be inserted into patient management processes and decision making to reduce disparities in quality of care? Good question. I'll open it to our panelists. Who wants to take this one?

You know, I was just talking to my co-panelists. I am very clinical. I see patients just about every day of the week. And so I certainly appreciate, you know, my autonomy in being able to practice medicine. In that, though, I also recognize that everyone has bias. You know, everyone is biased in some way, just based off of existing in the United States and being a member of the society. And so what's important is not necessarily just to reflect on the fact that I have bias, I have bias, we all have it, but then how do we mitigate that bias? How do we check our biases every day in the patients that we're taking care of? Because to your point, I think a lot of times, things can be somewhat subjective.
And so we need to, you know, have at least a sort of system of checks and balances that allows us to be able to say that, you know, we're doing the best for our patients, even if maybe their lived experiences or their challenges can be a little bit different based on each clinical scenario. And so, you know, I think the other caveat to this, too, is recognizing that sometimes in some of the algorithms that have been created in clinical decision-making, the use of race is somewhat complicated, to be honest. You know, we have known, and it's clear, that race is really just a social construct, and it's not any sort of driver of, you know, genetic diversity. However, in some clinical calculators and clinical formulas, race has been used as some sort of surrogate for genetics. You know, you think about the VBAC equation, you think about the use of BiDil, you think about, you know, other implications, the GFR equation. I mean, certainly race has now come up, you know, in sort of a renaissance way of how we're thinking about the utilization of that and how we take care of our patients. And so I think there's opportunity to think about the inclusion of algorithms. However, you know, who's creating those algorithms? Are we being intentional about, you know, managing our bias in the creation of these algorithms? And then beyond that, are we thinking about race in a really sort of thoughtful, nuanced, 21st century way, about it just being a social construct and a vehicle to recognize maybe a shared lived experience?

Go ahead. To build on that, my practice is in the community. So I used to work as a cardiac nurse, and now I focus more on reaching communities to improve awareness of cardiovascular disease in the community. And one of the points that you made reminded me of an experience I had where we were screening at a church and there was an African-American woman. We used point-of-care testing to check her cholesterol level. Her LDL was about 250.
So at the last station, people get to me and we talk about the numbers. And she gave me this look; she was not aware of what the numbers meant, the difference between LDL and HDL. And I asked her, have you ever been told that your cholesterol may be a little high? She said, no, never. Well, it turns out she was a Hopkins patient. And so she pulled up her MyChart and we looked at her LDL numbers over the past six years. Consistently high: 250s, 260, 270. For six years, her LDL had remained high without intervention. So the question becomes: why was it not addressed appropriately? But if there was a clinical decision support tool that would notify or nudge the provider that this person's cholesterol level is elevated, what's the decision you're going to make? How are you using shared decision-making to come up with a treatment plan? But inaction for six years is unethical. And that's what happened. So for me, seeing or hearing these examples in the community reinforces the notion that we have a lot of work to do, right? We failed a lot of patients in terms of how we are practicing evidence-based care. And technology may be a tool in case a provider forgot. It's hard to also think, did you forget for five or six years, right? Or maybe there were different providers. Not to blame anyone, but it's simply unacceptable for patients to not get the high-quality care they need. And clinical decision support tools may be appropriate. But like you said, we have to be thoughtful about how they are developed. But there's definitely a need for them.

I love the two examples and the two different angles that you've addressed. Like, the issue of race-based medicine is one that has come under scrutiny more recently: exactly how are we using this, and is this actually helping the patients or making disparities worse? So just to reemphasize what Aubrey said, race is a social construct and does not imply any biological differences.
So making decisions, clinical care decisions, based solely on race is problematic. And we're learning about that more. And that's why, as we start to think about novel technologies and innovation, we really need to be very thoughtful about how we are using them, how we are applying them, and how we are even incorporating race into these algorithms. And I also love the other example that Yvonne gave, which is talking about clinical decision support, which goes back to the question that we got from the audience. And you can look at this both ways. Yes, maybe this tool can alert the physician and make sure that they act on that LDL earlier. But what about if this was a different context, where the use of artificial intelligence can give a prediction for something and it's hard for you to figure out how it came up with that? This is a very objective example. You can see the LDL is clearly high, and you have guidelines on how to treat and manage that. But what about if it's predicting future disease? How do you deal with that? How does that help you with clinical decision-making? So there are three different ways I like to look at this when it comes to the use of technology or AI or innovation: what are we using it for? Is it for decision support? Is it for decision augmentation? Or is it for decision automation, which means that you're just relying completely on whatever this device or technology is telling you and just letting it make the next step for that patient? And I think that the highest risk comes to our patients when we start looking at that highest level of it, which is decision automation. I don't think we're at that point where physicians will be removed from the care of our patients. The goal of using technologies and AI is to improve the care quality, to improve the care that we're providing to that patient, not to exclude the physician from making the correct decision for the patient.
And we also have to think about, like, what are the potential risks, right? Who takes the blame when an algorithm is incorrect? As much as we want to use this to improve patient care, we also have to balance what are the important things to think about. The responsibility right now still falls on the physician. We cannot take an algorithm or a device to jail if it does the wrong thing for your patients, which is why I think that the future of medicine really is going to be that, you know, human-computer-patient interaction. It's not just going to be moving everything to devices or technology, but using technology to actually make us a bit more human. At least this is how I see it, and this is where I think the future of healthcare is going. Any other thoughts, comments from our speakers? I know we have just a few more minutes. If there are questions, don't forget to raise your hand. You can ask a question, okay? I see your hand in front, and you'll get a microphone. But I don't know if you had a thought, okay.

Thank you so much for this wonderful session, you know, insightful, and thank you for all the work you do at the community level. A question for you is, you know, you've been doing a lot of work in the cardio OB space, you know, pregnant women. We know we have gaps. We know we have poor outcomes. A couple of things I see in my practice, because of cardio OB, is women are unable to travel to come to see us, even if you make it a very collaborative model, because of poverty. You know, they just don't have resources. They have other children. Philips, as you know, has come out with these fetal monitoring patches, you know, these companies and all these. How do we start partnering with them? Because that puts extra expense on the patient. So I struggle with this. How do we partner with these companies to close the gaps at the community level, especially for the cardio OB patients?

So, excellent question.
I know that cardio OB is my space, so I can take that one first. And I think that's one of the big benefits of a meeting like this, where you're essentially able to bring innovators together with industry partners and regulators for us to figure out how we close these gaps. I remember when I attended this meeting last year, there was a question about why do we think that, you know, technology has to come from the top down. And, you know, there's this concern about, if we are letting, you know, richer technology companies, technology giants, and huge research institutions take the lead, what happens when these things don't trickle down, you know, to the community, and how do we close that? But the truth is that we know that there's a power dynamic when it comes to getting technologies into the hands of people who don't have access to them. That's just the unfortunate, you know, situation that the world is in. There's this huge gap between the haves and the have-nots. How do we get things that are accessible to those who have into the hands of those who do not? And I think that is going to be a multi-stage type of intervention. There really needs to be investment in that space. Like I said, health equity is a problem for all of us, not just those who are experiencing inequities. So we need to think about, like, potential reimbursement options. For instance, if we're able to demonstrate that the use of this technology is likely to improve outcomes, then we don't end up spending as much money on that patient, and technologies like this can be reimbursed. A lot of these companies that are innovating now are looking at what are the options to make sure that we can get CPT codes to bill for this, so that we can get insurance to cover this, so that this is not coming out of pocket. And those are questions that we haven't really addressed, because there are not many studies looking at implementation. And that is why this is so important to discuss now.
And if you have thoughts, feel free to jump in. And I have a few more questions I can take.

I'll just quickly add: we know the perils of technology, but the example that you just provided is the promise of technology. One of the things I appreciate is that technology can make care more convenient. I think we all have to acknowledge that the healthcare system now, and how we provide care, is not convenient for the average American, right? So there are certain instances when we know that someone is at high risk and they have a lot of things going on, right? Between work, other kids. How can we be more creative and deliver care in the convenience of their home? And that's one of the things I'm so passionate about, because the populations that are disproportionately affected by these disparities, these are the same people who are at highest risk for these adverse outcomes. So how do we make care convenient? By partnering with a lot of these companies, but ensuring that cost, right, is not a deterrent. And like you said, there can be creative models where there's reimbursement, but usually they wanna see data. I know, it's like, which comes first? But I think it's intuitive that if you're providing care that is still high quality, but more convenient for someone, you're more likely to prevent an adverse outcome. So we should be investing more in what? Prevention, right? But unfortunately, from a payment approach, that's not where we invest our dollars.

So actually, there are a number of questions that are coming in super last minute, 20 seconds to the end. It's okay, I'm gonna try and lump them into one. We might just have to extend this by a few minutes, but a lot of them really center around the financial part of this. What is the financial impact? How do you see these technologies being funded? What are the options in other countries? Once we demonstrate that, okay, this intervention works, then what is the next step?
So great, great questions. I'll let Aubrey jump in here with potential thoughts about the financial solutions. What are the options?

I mean, that's honestly the holy grail. And with the partners that we've had, we've had conversations where we're able to say that we will come in and save your organization money, or we're able to highlight and say, this is how we can benefit the bottom line. And that's where we've had the most success with business development. And to your example, with the Cardio Obstetrics example, and being able to have home monitoring: if at some point you're able to prove that this saves dollars for the department, that is where you will see success, and you'll be able to have implementation, and that's where we're going. I think for us, I mean, the work that we do is proprietary. And so we bootstrapped a lot of this in starting the foundations of the work that we've done. And then beyond that, we've been able to have grant support to then gather that data, and then have some longitudinal data on the back end to say how we're able to improve patient outcomes, which translates to dollars, right? And so ultimately, when you think about innovation, the start point is creating the product. And then the last conversation is, okay, now how are we going to implement this to save money?

Excellent. And I would just like to wrap up, because they would like for us to stay on time. So you have the opportunity to network with our panelists and ask them questions after this. But one of the questions that came up was the idea of sustainability. After you've developed this technology and you've now deployed it, how do you ensure it's going to be sustainable? And I want to just circle back to what Yvonne mentioned earlier, which is what we learn from implementation science. These are really the hard outcomes.
We look at things like affordability: how much is it going to cost? Things like feasibility, and also acceptability. The fact that you develop something fantastic doesn't mean that people want to use it. Is it clunky? Does it make the physician work harder? Is it adding more to their plate, so they're just not interested? And one other outcome is sustainability. After all is said and done, can this technology continue to be used? All great questions. But if we don't study the context, if we don't actually work to figure this out, then it's hard. And sometimes you'll find that the deployment strategy is different for different technologies in different contexts. But this was great. I'm glad to see all of the engagement coming in on the app. Please meet with our panelists after this to ask your questions, and make sure you network with others. Thank you so much for joining us. Thank you.
Video Summary
The Green Channel's latest session covered the critical topic of ensuring technology equity in healthcare innovation and implementation, focusing on how to integrate novel technologies into clinical practice. Dr. Demi-Ladia Deidenshiwa, a non-invasive cardiologist at Mayo Clinic, along with Dr. Yvonne Commodore-Menza, a nurse and cardiovascular epidemiologist, and Dr. Aubrey Grant, a clinical sports cardiologist, were the key speakers.

Key points discussed included:
1. **Importance of Technology in Healthcare**: Emphasis on the significant advancements in digital technologies and their effectiveness in cardiology.
2. **Health Disparities and Equity**: Recognition that health disparities persist despite awareness and mitigation strategies, highlighted by studies showing increased maternal mortality rates in the U.S., particularly among Black women.
3. **Algorithmic Bias**: Discussion of AI bias arising from non-diverse datasets, exemplified by inaccuracies in wearable technology data across different skin tones.
4. **Human-Centered Design**: The need for diverse creators and inclusive development processes to ensure technology applications are equitable.
5. **Implementation Challenges and Sustainability**: The importance of ensuring technologies are user-friendly, accepted, affordable, and sustainable post-deployment.

Dr. Demi-Ladia and the panelists advocated for careful study of the context in which technologies are deployed, involving all stakeholders in the design and implementation processes to ensure technologies do not exacerbate existing disparities but rather bridge gaps and improve health outcomes equitably.
Keywords
technology equity
healthcare innovation
cardiology advancements
health disparities
algorithmic bias
human-centered design
implementation challenges
sustainable technology
HRX is a Heart Rhythm Society (HRS) experience. Registered 501(c)(3). EIN: 04-2694458.
Vision:
To end death and suffering due to heart rhythm disorders.
Mission:
To Improve the care of patients by promoting research, education, and optimal health care policies and standards.