Addressing Health Inequities in Healthcare: A Thought Leadership Roundtable
By Candace Stuart
The COVID-19 pandemic has highlighted racial inequities and their devastating impact on communities of color and other marginalized groups in our society. Healthcare leaders recognize that racial disparities contribute to a public health crisis that shortens lives and reduces quality of life. Unintentional bias in the selection of data used to design digital technologies may be exacerbating these inequities. With efforts now under way to vaccinate people nationwide against COVID-19, it is important that the populations that have borne the brunt of the pandemic are not underserved.
To address those concerns and explore actions digital healthcare leaders can take to help close the disparities gap, five members of the College of Healthcare Information Management Executives (CHIME) joined Dick Flanigan, Senior Vice President of Cerner, in a virtual thought leadership roundtable. CHIME President and CEO Russell Branzell moderated. CHIME participants included:
- Chani Cordero, Chief Operating Officer, Presidio of Monterey Health Services
- Adnan Hamid, Vice President and CIO, Good Samaritan Hospital
- Shafiq Rab, MD, Chief Digital Officer and CIO, Wellforce
- Jeffrey Sturman, Senior Vice President and CIO, Memorial Healthcare System
- William Walders, CIO, Health First
A Call to Action
An analysis released by the Centers for Disease Control and Prevention in December 2020 found that rates of COVID-19 infection, hospitalization and death were higher for African American, Hispanic and Native American communities in the U.S. than for other racial and ethnic groups. The agency cited “long-standing health and social inequities” for the disparities, noting that lack of access to quality jobs, housing, education, healthy environments and healthcare contributes to these gaps. Now, with vaccines being distributed nationwide, many question whether those same groups will remain at a disadvantage.
Some digital healthcare leaders see the moment as a call to action: an opportunity to raise awareness about racial and ethnic disparities, examine health IT’s role as an unintentional contributor and create resources to eliminate inequities now and in the future. The question is how to start and where to focus. Cerner’s Dick Flanigan offered context to help frame the discussion, and CHIME’s Russell Branzell led the questioning. Below are highlights of the conversation.
DF: When we look at data, when we look at outcomes, we see big differences in how people of color are receiving services in the United States. Sometimes the algorithms or the care management approaches we have put together, while not intended to, have reinforced some of these disparities in care and access to care.
We began to ask the question: How could we work together to ensure that the systems we put in place, the data we collect, the algorithms we use and the interventions we evoke don’t further perpetuate what we think are some less-than-desirable outcomes of the healthcare system?
CHIME is a really good place to have this discussion. When you think back to the Opioid Task Force that some of you participated in, we had a direct role to play. We could (use IT to help) change the prescribing habits and it worked. This one’s a little more insidious.
RB: This is the hypothesis: There’s a natural and historical bias in the systems we use. Both the data we put in and the insights we pull out are just as flawed. We don’t have a checklist against bias. Do you agree with that premise?
AH: One example that came to my mind was when I was at Henry Mayo (Henry Mayo Newhall Hospital). We were trying to figure out how to deal with patients who were transitioning from male to female and how to configure the systems to manage that preference for the patient. The systems only gave us male, female and maybe unknown. We were talking about patient satisfaction, patient centricity and patient empowerment, but we had no way of configuring the system to adapt to those choices and those requirements. The systems are standard, which I get, but not flexible. I agree with that hypothesis, that systems need to do better.
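The gap Hamid describes is, at bottom, a data-modeling one: a single sex field cannot carry administrative sex, gender identity and the name and pronouns a patient asks staff to use. The sketch below shows one way a patient record might separate those elements; the field names and value set are illustrative assumptions, not the schema of any particular EHR.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative value set; a real system would use a maintained, extensible
# code set (and allow free-text self-description) rather than a fixed list.
GENDER_IDENTITIES = {
    "male", "female", "transgender male", "transgender female",
    "non-binary", "other", "prefer not to say",
}

@dataclass
class PatientDemographics:
    # Administrative sex recorded at registration (often required for billing).
    administrative_sex: Optional[str] = None
    # Self-reported gender identity, kept separate so it can differ from
    # administrative sex and be updated over time.
    gender_identity: Optional[str] = None
    # Name and pronouns the patient asks staff to use.
    preferred_name: Optional[str] = None
    pronouns: Optional[str] = None

    def quality_warnings(self) -> list[str]:
        """Flag unexpected values for review instead of rejecting the record."""
        warnings = []
        if self.gender_identity and self.gender_identity not in GENDER_IDENTITIES:
            warnings.append(f"Unrecognized gender identity: {self.gender_identity!r}")
        return warnings
```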
CC: Sometimes when we look at the data and the metrics, we make assumptions, and at times those assumptions are inherently false. I read about a study looking at how high-risk patients are selected for a care management program. The assumption is that high-cost patients are high-risk patients, because a higher cost of care means they need more services. But when you start peeling back the onion, you see that minority communities don’t necessarily go into the facility as often as other communities. As a result, the opportunity to manage chronic care for Black patients is overlooked. Oftentimes they had to be a lot sicker to be considered high risk. These types of considerations in algorithms are missed.
The part that makes it very difficult in healthcare is that the social determinants of health are so squishy. It is very hard to narrow it down. If you have two people who look alike, sound alike and live in the same community, why are their health outcomes different? That is because we don’t think about the environmental or genetic factors that play a part. Until we are able to nail that down, we are always going to have a little bit of this disparity.
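The pattern Cordero describes matches widely reported findings on commercial risk scores: when prior cost stands in for health need, groups that have historically received less care look lower-risk than they are. Below is a minimal sketch of one way to check for that proxy effect; the column names, the direct measure of need and the enrollment cutoff are assumptions made for illustration, not a reference to any specific product.

```python
import pandas as pd

# Hypothetical columns: 'group' (self-reported race/ethnicity), 'risk_score'
# (output of an algorithm trained on prior cost) and 'chronic_conditions'
# (a direct, if imperfect, measure of health need).
def check_proxy_bias(df: pd.DataFrame, top_pct: float = 0.03) -> pd.DataFrame:
    """Compare health need among patients selected for care management.

    If the score reflected need rather than cost, patients above the
    enrollment cutoff should be roughly equally sick across groups. A large
    gap suggests the cost proxy understates need for groups that have
    historically used less care.
    """
    cutoff = df["risk_score"].quantile(1 - top_pct)
    selected = df[df["risk_score"] >= cutoff]
    return (
        selected.groupby("group")["chronic_conditions"]
        .agg(["mean", "count"])
        .rename(columns={"mean": "avg_conditions_selected", "count": "n_selected"})
    )
```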
JS: In healthcare, usually that first point of contact is a registration or scheduling person. Often, if I’m standing in front of them, an assumption is made about my background without their even asking me a question. Without asking the questions, we are making some really broad-based assumptions, which can be dangerous. I think systems can help facilitate this, but at the end of the day, some interaction and some relationship building need to come into play to make it all happen better.
SR: When people talk about inequities, systemic racism and other things, we have to go through a very painful dialogue. A lot of talking has to happen for the healing to happen and for us to understand it. Once we understand it, then the systems will change. That awareness, that understanding, can only happen when we are willing to listen and try to understand the cultural background the person is coming from. The most important thing is to understand from the perspective and experience of the other person.
WW: (In roles in the U.S. Navy and Department of Defense) I had the great luxury of designing hospitals in Central America. What did we do? We took a United States model of a hospital and put it in Central America. What happened? Well, little did we know that when someone is sick in a Latin country, their entire family occupies the bedside with them, and those rooms couldn’t support that and, frankly, didn’t work. To Shafiq’s point: listening, understanding, learning, adapting. As we listen, we will change.
RB: With the algorithms you are building and the people who build those reports for you, are we by definition propagating a second layer of flaws here?
WW: We build systems to maximize revenue, not maximize participation. We have accounts for bad debt, we do outreach, we do all those right things for the community, but we tailor services to foster and facilitate margin. I think that’s part of the problem. Making folks well is important and a priority of our industry, but I don’t think we commit as much time as we should to some of these softer elements around outreach, particularly around social determinants of health.
SR: We already have a chief health equity officer (at Wellforce). Everything we do is now based on frictionless care, based on experience for the patient, for the employee, and for the staff. On that note, we are saying that any data we collect, whether it is race, ethnicity, experience, language preference, is based on that awareness. We are aware of it and we are including people to help us guide the information technology on a health equity basis.
CC: (Referring to Emily Chang, a keynote speaker at the 2018 CHIME Fall CIO Forum, and her book Brotopia) It talked about a certain type of male stereotype, for lack of a better word, where the (men) of Silicon Valley designed these applications in their own likeness. Knowing this, at DOD (Department of Defense), we purchase nothing but commercial, off-the-shelf products, and it never really crossed my mind that the data we are pulling is flawed in a sense because it doesn’t have all those other factors that we talked about.
I think one of the (actions) that maybe we, as a panel, or this CHIME collective can do is go back to the vendor community and say, “When you are designing these systems, do you have diverse teams that ask these tough questions? Are people of difference represented in the algorithms they’re creating?”
RB: Let’s say you are no longer called chief information officer. You now are the chief data information equity officer. What is your responsibility?
AH: If we want to be patient-centric, then we really have to focus on collecting information that is about the patient, not about what we’re going to bill or be reimbursed for. Chani brought up a great concept: What do the teams building the systems look like? Are they incorporating their life experiences into the system so that when we roll out the system, it truly captures the patient’s story? We talk about patient satisfaction and we talk about patient empowerment, but is the system telling the patient’s story? That is key for me.
DF: It starts with a sincere attempt, both at the systems and at the practices level, to solicit and to seek completeness in that intake record and to do so in a way that is culturally appropriate.
RB: People are already lining up for their COVID-19 vaccinations. Has this ship sailed, and is there nothing we can do about it?
CC: I would definitely say for the Black community that we are aware of the Tuskegee experiment. The trust factor is going to be huge. If you don’t see someone in a facility who looks like you or has the same type of culture as you, you will not share your information with them because you don’t trust them. How do we build that trust? That’s a huge task to take on.
SR: Trust means that you need to know the cultural aspect of that place and it has to be represented by the people. It has to be a safe place.
RB: We all agree there is a problem. There are multiple reasons why. It can be from software development to software configuration to data entry to data utilization. We also all recognize, at least in positions of authority, we have responsibilities to fix it. I don’t think any of you would disagree, based on what we just talked about. Now the question is, what do we do?
SR: What in data or in machine learning do we need to collect that affects healthcare, that affects reimbursement, that affects life? Those things have to be talked out, and that’s one important thing that CHIME can provide.
WW: I fully endorse anything with the name task force; the name itself implies action. We saw that the Opioid Task Force was able to change policy and knock down some barriers on data sharing at the state and federal levels. Imagine, if you will, the thing we’re all motivated by, CMS (Centers for Medicare & Medicaid Services) reimbursement, now including these variables as a requirement for reimbursement. A task force could take a stab at some of that low-hanging fruit, help gain momentum where we tend to have influence and start shaping some policy around how we can better represent our underserved constituents.
CC: One of the areas we have always struggled with in the health IT community is that there is no standard for demographics in an EHR. There may be some influence we can have within the vendor space to capture not just race, but gender identity, language and ZIP code. There are so many areas we can talk about, like disability. All those factors make a difference in a person’s healthcare, and we are not necessarily capturing them equitably across all the different EHR vendors out there. That may be one area where we could have some influence.
AH: I look at the endgame. I think it is equipping our CHIME members to have those critical conversations, not only with their executive team, but also with the board of the organizations they serve about how important it is to tell the patient’s story. In doing so, it will help us achieve equity.
DF: It starts with data; it starts with intake. It ends with equity, but along the way is something called access. People have to have access to our capabilities.
I hope we also come up with some way to test the data we’re using to build our algorithms and that when we build our algorithms, that we test them for bias, and we look early on in their implementation to test them for bias again. Otherwise, we do what we’ve been doing the last five years. We are perpetuating some long-standing, embedded bias in our health systems about what people will respond to, how people react differently to care, and all those other misconceptions that we have.
I hope we can work that into a task force or work group. As the health IT leaders in America and around the world, I think we can play this unique role.
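Flanigan’s suggestion, testing algorithms for bias when they are built and again once they are in use, can take a simple first form: compare error rates across patient groups on held-out data before go-live, then repeat the comparison periodically on live predictions. The sketch below shows one such check; the column names and choice of metrics are assumptions for illustration, not a validated audit standard.

```python
import pandas as pd

def subgroup_error_rates(df: pd.DataFrame,
                         group_col: str = "group",
                         label_col: str = "outcome",
                         pred_col: str = "predicted") -> pd.DataFrame:
    """Report false-negative and false-positive rates per patient group.

    Intended to run on held-out data before an algorithm goes live and then
    on production predictions at regular intervals, so drift in any one
    group's error rate becomes visible early.
    """
    def rates(g: pd.DataFrame) -> pd.Series:
        false_neg = ((g[label_col] == 1) & (g[pred_col] == 0)).sum()
        false_pos = ((g[label_col] == 0) & (g[pred_col] == 1)).sum()
        positives = (g[label_col] == 1).sum()
        negatives = (g[label_col] == 0).sum()
        return pd.Series({
            "false_negative_rate": false_neg / positives if positives else float("nan"),
            "false_positive_rate": false_pos / negatives if negatives else float("nan"),
            "n_patients": len(g),
        })

    return df.groupby(group_col).apply(rates)
```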
Conclusion
The COVID-19 pandemic laid bare inequities in healthcare that have disproportionately affected people of color in America. In a healthcare ecosystem that has become increasingly digital, unintentional bias in data collection and use may be perpetuating these disparities. Thought leaders from CHIME and the CHIME Foundation agreed that, to provide equitable care, healthcare organizations and industry suppliers need to capture social determinants of health and other patient-relevant data elements and to diversify the teams that design algorithms and other digital tools. As next steps, they suggested leveraging the knowledge and expertise of the CHIME and CHIME Foundation community to address bias, shape policies and create resources that foster equitable healthcare in the U.S.