Health Equity Examples From Acclinate

Written by Acclinate | December 20, 2024

Key Takeaways and Learnings 

  • Inclusive clinical trials are essential to addressing health disparities in underrepresented communities.
  • Building trust through engagement and transparency fosters participation in healthcare initiatives.
  • AI and data-driven approaches enhance diverse population outreach and decision-making.
  • Learn how one of our collaborations helped support more equitable inflammatory bowel disease (IBD) outcomes. 

 

Whether we’re talking about emerging MedTech, new drugs, or cutting-edge care, equitable access is a problem. Approaches that effectively address it must begin with attracting better representation of marginalized communities in research. Hear from leaders in technology, governance, investment, and community engagement about how they are improving health equity through increased diversity and representation in research.

You can watch the video or read the transcript below:

Z: 

Good morning. Thank you guys so much for being here. I am Z. I've had the great pleasure to organize the sessions that are happening in this room, all weekend. I want to offer first a thanks to Advanced Technology International, ATI, who is sponsoring not only this panel, but everything happening here this weekend.

You'll see some QR codes around, including one on the way out. You can scan that for information on the other panels. And then we'll also have recordings of all of the sessions for any that you might miss. That'll be up on the website a couple days after South by is over. But with that, let me turn it over here and invite the panel up onto the stage with Amy.

Please join me in welcoming them for Representation Matters: Improving Health Equity

Amy:

Thank you so much Z and welcome everyone to this excellent panel. I'm telling you excellent in advance. But I know it will be because I have an excellent roster here of panelists and also because we'll be counting on you to also contribute to the conversation. So I'll note now that we'll have time at the end for questions and answers and there's a mic here.

So as we get closer to that session, think of your questions and be ready to walk to the mic. And with that, I wanted to open the panel, Representation Matters: Improving Health Equity Through Inclusive Research. My name is Amy Lin. I am the director of the ARPANET-H Health Innovation Network at ARPA-H, the Advanced Research Projects Agency for Health.

To my right, we have Del Smith of Acclinate. To his right, we have Anya Schiess of J. P. Morgan. And to her right, we have Bennett Graham of Main Street Health. I will encourage you all to look at their esteemed bios in the agenda so that we can dive soon into the conversation here. Right before we do that, I did want to get a sense of who we have here in the room and what drew you to this topic of improving health equity in research and development.

I'm curious if any of you play a role, or how you might play a role in R&D. Raise your hand if you are currently a funder in any way of health R&D efforts. All right. We have a few here. Are any of you researchers, investigators actually doing the research? All right. A couple of the practitioners.

And then are any of you advocates for making the case for why R&D matters? Okay, several. So it's a good mix. We've got some funders. We've got some researchers and quite a few advocates in the room. I also wanted to reflect, then, on your different roles: whether you're funding or researching or advocating for R&D, what do you think is the greatest challenge for why we haven't achieved health equity?

What do you think is really holding us back? I'll just give you a few seconds to think about it. 

And now, as you were just thinking that over, going through your mind, was the issue that you were thinking about related to a lack of money? No. Okay. Was it related to a lack of expertise, of know how, of how to get this done right? Oh, I am curious to see what you think are the biggest challenges. Was it related to trust with communities? 

All right. A little bit on trust. Now, shout it out: what did I not touch on? I heard access. Anything else? Okay. So it wasn't related to money. Not related to know-how. A little bit on trust. Maybe access, and then perhaps we'll hear more in the Q&A. So that's an interesting backdrop for our discussion as we dive in.

Okay. Wonderful. Well, in a second, I'm going to ask each of my panelists how their organization works on health equity and on achieving those goals of making health research more inclusive. I will model it by starting with a little bit of what ARPA-H does here at my own agency, and just note that it's a really important value for us as an agency.

And we'll continue constantly looking at how can we improve health equity through all of the ways that we operate. And so we are an agency that's investing in R&D breakthroughs. And one of the ways that we're making sure that those breakthroughs reach the end users is by building a health innovation network that we call ARPAnet H, and it's anchored by two hubs.

One is the Customer Experience Hub, managed by ATI, and the other is the Investor Catalyst Hub. And together, those two hubs and this network overall are looking at how we can transition or move innovations from proving that something is possible to actually making it useful, in the hands of end users, and available in the marketplace.

So the Customer Experience Hub particularly is really looking at how we can make sure a diversity of users is being considered as we think about the design of those innovations. Our first program is called NITRO. It's looking at osteoarthritis and addressing osteoarthritis with tissue regeneration techniques.

Osteoarthritis affects women twice as often as men, and the populations that are most affected are Black and Hispanic populations. So one element of this program is making sure that the clinical trials that test any innovation coming out of it will be as diverse as the underlying population that needs it.

And then one more plug, and then I'm going to turn it to my panelists: you may have seen that last week we issued a solicitation posting for a $100 million commitment to women's health R&D. And so we're really trying to shine a bright spotlight and put our money where our mouth is to make sure that we're truly investing in women's health issues and areas of concern as well.

So with that, let me start with Del and ask: how are he and his organization working on addressing these types of health equity challenges that have traditionally excluded some populations?

Del:

Yeah, absolutely. And we appreciate the invitation to be on the panel as well. Good morning, everyone. My name is Del Smith.

As you heard, I'm the co-founder and CEO of Acclinate. We specialize in accessing and engaging communities of color so they can make informed decisions about clinical trial participation. And I want to pause there for a second, because what I didn't say is that we're a clinical trial recruitment company.

And I draw that distinction, and I see a lot of faces in the audience saying "we understand," because the traditional model is going to folks and saying, "Hey, you don't know me, but would you like to be part of research?" And we know that doesn't work for a lot of people, but particularly for our communities of color, who, for various reasons, are really thinking, "Hell no, you're not going to get me to take part." They may not tell you that to your face, but that's the thought. And so what we are as a company is a tech-enabled service company, and the service part of what we do is actually community engagement, and we do that both on the ground and digitally.

We had a team that was in Houston and Dallas last week activating those communities, and we are there in front of people to help access them, to your point, but also to engage them, and then to educate and ultimately empower them to be able to make decisions. So I'll stop there and just say, as we lean into that topic: access is important, but particularly for our underrepresented communities, historically access by itself has not solved issues, whether financial or economic, and it is not going to solve the health issue if access is treated as the central fix, because we have to engage and we have to build trust.

Amy:

Thank you for sharing that and already touching on some of the pieces that have come out from the audience poll. Anya, how does J. P. Morgan think about health equity and your investment decisions?

Anya:

Yeah. Well, thanks, Amy. And thanks for having me here. It's really interesting coming now from J. P. Morgan, where I've been about a year and a half, because we have two very different parts of the organization focused on this area.

One is my colleagues who sit in Morgan Health, which is an effort from J. P. Morgan corporate, so balance sheet investing focused on employees and health equity, health access, et cetera. And so their lens of the world is looking at the 500,000 covered lives, the employees and their dependents of J. P. Morgan, and how we can make sure that health equity is much more democratized. And interestingly, because we're all one payer, right, we're one company, being able to dig into our own data gives us a sense of what happens if you hold payer constant. Theoretically there should be no difference, but we see tremendous differences, and not only geographically; race, ethnicity, sexual orientation, things like that have a significant impact.

So that's one part and one lens through which you can look at it. But I'm the managing partner of our life science venture funds, right? So from an investing perspective, how do we look at it? And how do we think about the therapies that we're funding, how do we get the therapies that we think will benefit the world approved?

So whether it's osteoarthritis, oncology, whether it's whatever, how do we make sure we're funding companies that are doing the right things both in general and the right things by patients?

Amy: 

Excellent. Bennett, can you tell us more about Main Street Health and how you're looking at health equity from that lens?

Bennett:

Sure. I'm Bennett. Thank you again for having me. I'm the president of a company called Main Street Health. We're a healthcare delivery company that is focused exclusively on rural health across the country. So the 20% of the country that makes up about 60 million Americans.

And really thinking about how we deliver a different way of both buttressing existing infrastructure and improving access within these rural communities. The genesis of Main Street was really my co-founders, one of whom I had done previous entrepreneurial ventures with, who had seen how business models were working to drive more value-based models and incentivize not just doing the care, sort of per click, but actually thinking about the outcomes of that care. We had seen that work across all these other communities, and yet the penetration of value-based care in rural areas was really missing.

You just didn't see those same resources flowing into rural communities. Our third co founder was a practicing primary care physician in central rural West Virginia, where she ran a large federally qualified health center that she had grown from two clinics on paper to 70 clinics on very sophisticated EMRs and using advanced technology and when you would go up to one of those, that clinic in Clay County, West Virginia.

It does not look anything like the Mayo Clinic, or St. David's here in Austin. It is a very simple brick building, and yet the caliber of care that's being delivered inside of that clinic is world class. It's really, really high caliber quality. But the thing that Dr. Shannard, my business partner, really ran into was this challenge of the moment that that patient walked out of that clinic.

What happens then? That patient didn't have access to a lot of the other screening tools or acute care that you might have access to in a city like Austin. But there are also real challenges in just terminology and literacy: ask anyone out on the street today what a rheumatologist is.

They can't tell you, right? Let alone navigate some of the idiosyncrasies of health insurance, which in and of itself, I mean, takes all of us time each year just to figure out how to re-enroll in our health benefits because it's so confusing. And so, really starting to think about how you actually improve the care between the visits, because that's really where so many of the challenges happen in regard to navigating your health.

Amy:

Excellent. And we're hearing several themes come out here of different ways that we're thinking about health equity. So certainly by where people live, rural versus urban, access could be harder, but also demographics. And you were mentioning communities and the skepticism they might have when you ask them to participate in clinical trials or something along those lines.

So maybe, Del, to pick up on that: as you were speaking about Acclinate, you mentioned that artificial intelligence and technology can really be useful here. AI and community engagement don't often come up in the same breath, so I'm curious, especially picking up on this theme of trust: how do you use AI to build that trust and reach those communities?

Del:

Yeah, that's a great question. So internally as a company, our platform is a combination, as I said, tech and enabled services. What our tech does is we make predictions as to, of all the people that we engage with, when is the most appropriate time to present an opportunity for someone to make a decision about taking part in clinical research.

So it's something we call a participation probability index, which we have a patent on, and we use it that way. So instead of engaging with all these communities and then, next thing you know, sending them all these messages, "Don't you want to take part in research?", our primary objective is to engage and empower people to make healthier decisions overall.

But it's that process that generates two things for us: trust, because it builds trust with that individual, and data. And it's that trust and that data. So we use the AI in that sense. But I will tell you, it's interesting, we just finished up a project with NIH where we proposed that we need to get the community involved in discussions about AI, otherwise it's just going to increase their mistrust of it.

And so we proposed a participatory action research model where we said, let's go and actually get the community to understand about AI and actually put their hands on it. And then let's assess that and see if it actually decreases their mistrust and increases their use of it as it comes to advocating and empowering them to make health decisions.

So we did that project for a year with some astounding results and what we hypothesized was true. That when you started with the process of engaging with the communities, and we would go and do focus groups and community groups, and we introduce AI and people are like, "Oh, my gosh, what is this thing? I'm scared of it."

But at the end of that process, they were leaning into ChatGPT and other AI functionality, trying to get answers about health, trying to figure out how they could create a health campaign and how they could communicate it to their neighbors and their community members. That type of involvement and integration with AI, being transparent, having people be involved in the process, is what's going to keep this from being an issue.

I read an article yesterday that said that in the United States alone, about 50 percent of people had a mistrust of AI, and that the share who trust AI has actually dropped to only 35 percent. And that was as of yesterday. So you can imagine that if we don't figure out a way to not use AI to do something to someone, but instead use AI to bring them in and have them be part of the process, we're going to be in trouble.
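To make the idea of a participation probability index a little more concrete, here is a minimal sketch of how an engagement-based readiness score could work in principle. This is not Acclinate's patented model; the signals, weights, and threshold below are hypothetical placeholders, and a real system would learn them from data and pair them with the on-the-ground engagement Del describes.

```python
# Hypothetical sketch: score a community member's engagement signals and only
# surface a clinical trial opportunity once predicted readiness crosses a
# threshold. Features, weights, and threshold are illustrative, not Acclinate's.

from dataclasses import dataclass
from math import exp


@dataclass
class EngagementSignals:
    months_in_community: float   # how long the person has been opted in
    events_attended: int         # on-the-ground engagement touchpoints
    content_interactions: int    # educational content opened or shared
    questions_asked: int         # health questions asked in the community


def readiness_score(s: EngagementSignals) -> float:
    """Map engagement signals to a 0-1 readiness probability (simple logistic model)."""
    z = (-3.0
         + 0.10 * s.months_in_community
         + 0.40 * s.events_attended
         + 0.05 * s.content_interactions
         + 0.30 * s.questions_asked)
    return 1.0 / (1.0 + exp(-z))


def should_present_opportunity(s: EngagementSignals, threshold: float = 0.6) -> bool:
    """Only present a trial opportunity once engagement (a proxy for trust) is established."""
    return readiness_score(s) >= threshold


if __name__ == "__main__":
    new_member = EngagementSignals(1, 0, 2, 0)
    engaged_member = EngagementSignals(10, 4, 30, 5)
    for member in (new_member, engaged_member):
        print(round(readiness_score(member), 2), should_present_opportunity(member))
```

The point of the sketch is the ordering: engagement and education generate the data, and the score only decides when an invitation is appropriate, not who gets pressured into one.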

Amy:

That's really interesting. And I wonder, especially in that participatory research approach that you took, was there some insight or change or improvement in how you use AI that came from the community? How did their very different perspectives help expand the more expert way you think about it?

Del:

Yeah, absolutely. I'll use the Instagram example. I'm not a huge user or even fan. I know this is being recorded, but you know, there's so many algorithms there that when you look at things, it knows what to put in front of you based upon your use.

And we utilize a similar type of technology and model, so that when we're engaging with communities, particularly digitally, we use that same type of information and AI to determine what's best to put in front of someone. But we're also very clear and transparent about that process with them. So they opt into our community.

We've got about 100,000 engaged members right now in our community. They opt in, and then we let them know we're using your activity, your engagement, to help us better put content in front of you that might educate and empower you. But it's a tricky process, and I think anything that has that level of involvement with AI in a community needs to have a very similar type of approach.

So letting people know we're utilizing this technology for what we think is good for you and what we think we're going to provide for you, but we need to let you know in the process. And then the last thing I'll say is again, going back to the participatory action research, whenever we implement an AI solution in our company, we always bring in the community to say from the very beginning, "This is what we want to use AI for, this is what the process is going to, what we want it to be, and here's what the outcome is. Tell us if you're okay with that and if you have suggestions for improvement." That's how we handle integration of AI. 

Amy:

Excellent. You're taking consent even earlier in the process and really sharing that transparency.

Anya, I have a trade-offs question for you. Often, investing in health, certainly from large capital deployment firms like J. P. Morgan, is positioned as a trade-off between accomplishing health equity goals and getting a higher return on investment, making more money, getting your profits.

How do you balance that tension between trying to include new populations, but also get the return that you're seeking? 

Anya:

Yeah, it's a great question. And I think it's often misunderstood, in the sense that there's not necessarily a trade-off actually there. When you look at what's going on in health care and why it's fundamentally going to change over the next 10 years versus the last 10, there are really three factors.

One is just the biological and clinical understanding that we have of the body, right? The flywheel of knowledge keeps increasing. The second is the age of AI, so what AI itself is allowing us to do. And then the third is value-based care. So we're all aligned in terms of what our companies are looking at here.

And when you combine all three, the access piece and the equity piece are actually driving a lot of the returns. And here's what I mean by this. Say we have a drug and the addressable patient population is 10 million people, but it works in 30 percent of people. Okay? Or we have a drug and the addressable population is a fraction of that size, but it works in 90 percent of people. Right? When you think about it commercially, if you just did the expected value and assumed you'd get paid for it, sure, you'd go after the bigger population with the lower percentage.

But as we evolve in both value-based care and the pressures around pricing, being able to better select the patient population, even if it's a narrower population with a much bigger effect size, is actually, from an investor's perspective, what's going to lead to the better returns. And so we think a lot about that. And there are tremendous advances in things like spatial biology, the ability to see what's happening between cells in space, or metabolomics, seeing what the metabolites of different processes are in cells, and other things where we can actually see differences at the patient level.

For the longest time, healthcare was practiced at the population level, which was appropriate because that was the level of fidelity we could get to. But today we have the technology, both scientifically and socially (we talked a little about what social media and things like that are doing in terms of data aggregation and what you can know about patient populations), to actually start to get down to the individual level, the personal level, of how these drugs or different treatments work. And that's allowing us, too slowly, I admit, to start to break down those barriers.
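To make that population-versus-effect-size comparison concrete, here is a rough back-of-the-envelope calculation. The 10 million population and the 30 percent versus 90 percent response rates echo Anya's example; the smaller population size, the price, and the payer logic are hypothetical assumptions added purely for illustration, not figures from the panel.

```python
# Illustrative only: compare a broad, low-response drug with a narrower,
# high-response drug from a payer's cost-per-outcome point of view.

def cost_per_responder(price_per_patient: float, response_rate: float) -> float:
    """What a payer effectively spends per patient who actually benefits."""
    return price_per_patient / response_rate


PRICE = 10_000.0  # hypothetical flat price per treated patient

drugs = [
    {"name": "broad, 30% response", "population": 10_000_000, "response_rate": 0.30},
    {"name": "targeted, 90% response", "population": 2_000_000, "response_rate": 0.90},  # assumed size
]

for d in drugs:
    responders = d["population"] * d["response_rate"]
    cpr = cost_per_responder(PRICE, d["response_rate"])
    print(f"{d['name']}: {responders:,.0f} expected responders, ${cpr:,.0f} per patient helped")

# At the same cost per outcome as the broad drug, the targeted drug could command
# a roughly 3x higher price, which is the kind of math outcomes-focused payers reward.
matched_price = cost_per_responder(PRICE, 0.30) * 0.90
print(f"price the targeted drug could charge at the same cost per outcome: ${matched_price:,.0f}")
```

Under flat per-patient pricing the bigger population still looks attractive, but as payment shifts toward outcomes, the cost-per-responder column is what payers push on, which is the dynamic Anya describes.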

Amy:

It's helpful to know that it's not just simple arithmetic that gets you to a yes or no decision on an investment; there's a lot more nuance there. And I really appreciate you pointing out that the old paradigm of large population versus small isn't the only factor, and instead that flywheel can get activated and generate a high return as well. So you don't have to choose between profit and equity.

Bennett, Anya just mentioned value-based care, and I know that's a foundational value for how Main Street Health is looking at reaching rural populations. Similarly, it's sometimes said that reaching rural populations is too expensive, too costly, too hard.

So, in order to be profitable or to make something commercially sustainable, we're going to have to just target the urban populations more. How has Main Street Health overcome that apparent trade-off in a way that's been really successful in reaching rural populations and getting to scale?

Bennett:

I want to build off of what Anya was saying, because I think it's interesting that oftentimes in health care the biosciences or life sciences world operates very separately from the delivery world, and they end up meeting in the middle on the science. But as far as the innovation sectors go, even the investors are not the same: whoever is investing in heavy R&D and things that are going to go through clinical trials is a very different investor than one backing a value-based company that's going to go work with a bunch of primary care clinics.

But I think what's interesting is that this discipline around understanding the effectiveness of the impact is starting to converge more, because the historical way that a lot of population health work has been done is that we put a lot of data together.

We sit around a table and have a really interesting academic conversation around wouldn't it be cool if we could do something with this diabetic population. And then when you actually go out and start doing it, you can only actually engage maybe 10 or 20 percent of those patients.

And even then, as far as the discipline around measuring the actual impact you're having goes, the reality is that a lot of care management has a real problem with regression to the mean: you pour a lot of dollars into something that feels really good, and there may be good reasons to do that beyond just the clinical outcomes, reasons that have more to do with patient experience, et cetera.

But I think there is an increasing discipline, especially in the value-based care world, where we are measuring things really closely around both the clinical outcomes and the financial outcomes, and trying to do that with a level of rigor that I think is going to separate the wheat from the chaff.

In regard to being able to scale that: for us, we didn't discover some secret, and we didn't invent a new business model. This business model has been around for the last decade, and you have incredible companies like Oak Street and Agilon and Aledade and all these companies who've been doing a lot of this work.

Not to mention the health systems themselves here in town. But what we saw was that there was just this gap, and no one was really focused on rural because of that problem. Usually, honestly, it's a windshield problem, because it just takes more time and it costs more to send a provider out to a rural community. And it is a very different culture; a small town can still be very distrustful, and honestly, as Del's been talking about, that's been an earned distrust, over decades and decades of being taken advantage of.

So for us, as we've scaled, think of us as covering basically half the country, and we're taking care of basically 1 percent of the entire U.S. population in our clinics. The thing that I think has allowed us to scale as quickly as we have is that we really have taken the time to get to know those communities, to enter in and really earn the trust on the front end. That is both hiring locally and spending the time to really get to know what matters to that community, and not just rushing past and saying, "Hey, this is what somebody in an academic center said is the right thing to do," but actually listening and asking, well, how does this actually work for you today?

And so as we work with our clinic partners, think of the core thing that we do as embedding an additional person; we're in a thousand primary care clinics across the country, and we give them an extra staff person. That person's job, we call them a health navigator, but I think of them as the niece or nephew we all hope to have when we're 80. This is not someone with a white coat, this is not someone who is necessarily clinically credentialed, but they are trained enough to be able to navigate you through the healthcare system, both inside of that building and when we step out. The literature says that we retain about 40 percent of what doctors tell us.

Most doctors will say that sounds really high. And so a lot of what this navigator is trying to do is say, "Hey, let's just make sure that you understand what that doctor just said." So it's about taking the time, and I think it's very similar to what Del's talking about: there's this idea of coming in over the top and saying, "Hey, this is just the right thing to do."

Unfortunately, that is the way a lot of medicine works, because we want expertise; when we're going to the doctor, we want somebody who really knows what they're doing. The problem is how that is received a lot of the time as a patient, if it's not communicated well and we just rush right through lots of Latin and acronyms, with a lack of awareness around, well, how are you going to actually get to that doctor? And are you going to be able to pay for that drug? And do you understand what it means to take this twice a day? All of those things actually become major impediments to actually delivering care.

And so I think a lot of what has allowed us to scale is actually measuring it on the front end, making sure that we're not just throwing stuff out there because it feels good, but also taking the time to really understand how things work today, so that we're not coming in without listening first in regard to the design.

Amy:

Del, do you want to add on that?

Del:

I can. I mean, I really love what Bennett's saying. And, you know, the example I tell people sometimes to really try to understand that, like I was in New York city and I caught COVID very early. It was like February and I went back to Alabama. My friend told me the other day, he goes, "I think you were ground zero, a patient zero for Alabama."

But our headquarters is there, and I was sick. I went to my doctor, and my doctor gave me this whole slew of things it could be, as a primary care physician would, right? You could have all these things, a viral infection. And I talked to my grandmother that night, because I try to do that about once a week.

I call her Big Mama. That's kind of sometimes what black families do when they call their grandmother Big Mama. So I called her, "Big Mama, I'm not feeling well." And she says, "You know what you need to do? Put some Vicks on your chest, wrap up in a blanket, turn your heater on and sweat it out." That's all.

And I remember sitting there thinking to myself: my doctor went to school for 12 years plus, right, but probably spent five minutes with me and didn't think about me after I left; my grandmother didn't finish the sixth grade, but loves me and would never say or do anything to hurt me. Right? And I was going back and forth between those things. And maybe I'm a little shy to say it, but that night I put Vicks on my chest.

And we actually use that experience as a fundamental part of our company. We call it the difference between affective versus cognitive trust. I had cognitive trust in my doctor, right? I mean, I know he knows his stuff.

He went to school. Yeah. But I have affective trust in my grandmother, affective, A-F-F-E-C-T-I-V-E, right? I know emotionally she loves me, she cares for me. I think what we're trying to do in this scalable solution is combine both this cognitive and affective piece. So we say to people, listen, this isn't about just giving you more information so you can make a decision.

This is about letting you know that we care about you, and we're going to do our best to protect you and make sure that we don't put anything in front of you or present anything to you that could be detrimental to you or your family. And when healthcare does a little bit more of that, I think we'll see a little bit better results.

We'll have more opportunity to scale and bring in groups that historically have been very hesitant to come into the fold.

Anya:

And can you scale affective trust? I mean, that feels like where the disconnect is.

Del:

A question from an investor, which is what I've heard before. Yeah, we've pitched to investors and we've gotten a couple rounds of funding, and that's the question.

How do you scale this thing, right? And I think that's part of the challenge, but that's also the opportunity. I think we've shown some initial success and traction with utilizing technology to do some of this cognitive work, but then also recognizing that there's no switch, there's no tool, there's no website that's going to give somebody information that all of a sudden has them trust and want to move forward. So you have to do what Bennett is doing and what we're doing with our company; it has to be part of the solution, and we need dollars.

Bennett:

Yeah, I mean, I would say yes, you can do it, but I think that, like, the incentive structure is not built for it. The incentive structure is all around cognitive trust.  Because, I mean, even if you think about why do we have fewer doctors becoming primary care doctors?

Well, you get paid a lot more if you're a specialist or a surgeon. And so just the earning potential is a lot higher. I think that whether it's that, or when you actually look at how does a doctor's office get paid? They get paid on the amount of time that they spend in that room, and the number of things they do to you.

And so the problem is, especially when you look at that, you know, for us, most of the clinics that we work in, it's one or two providers. Oftentimes it's a nurse practitioner, a front desk, and maybe an M.A., a medical assistant. And all of those people, in order to keep the lights on in that clinic, have got to see at least 20, if not 30, patients a day, so it's in and out. And so the problem is that none of those people is actually incentivized to stop and have the affective conversation, to say, well, how do you feel about this? And do you understand any of this? Do you know what to do next? Because there isn't an incentive for it.

Because the way that doctors are incentivized is not on, do they keep you healthy? It is just how many times do they see you. And so yeah I think, and the majority of health care is still structured that way. 

Anya:

Yeah, so with Medicare Advantage, I mean I know you're not - are you in the Medicare Advantage space?

Bennett:

We're in - all seniors.

Anya:

Yeah. Medicare Advantage has the highest penetration now of value-based care. On the commercial side it's like 12 percent, it's really low, but in Medicare Advantage it's in the 30s or 40s. Now, do we see that difference? Because theoretically, you know, it's the old adage: show me the incentives and I'll tell you what the system looks like. Right? So theoretically the incentives there are such that we should see more of that. Do we?

Bennett:

You do. The time that a patient actually gets to spend with their doctor and with the broader system is a lot longer, even if you just look at it as time. But I think the challenge does get back to a data measurement one, because the reason why you have more penetration of value-based care in the Medicare population is that a patient is on that particular insurance plan for a longer time, right?

Because you have so much churn, whether it's in Medicaid or commercial people nowadays, I don't know what the average now is, but for a millennial in a job, it's no more than two or three years, right? And so, as opposed to a Medicare patient who's going to be in that same plan for a decade plus. And so, you actually have the time to be able to invest in that and measure it over time.

Now, are there ways to start thinking about how you do that? Sure, but it's complicated: how do you actually start to track a patient across a commercial plan, and then they lose their job and they're on Medicaid, and then they age into Medicare? There's a lot of data degradation that happens across that, which makes it much harder for anyone to say, well, what happened a decade ago? Did you actually influence the savings?

Amy:

So Anya, if you could take that timeframe that Bennett was just walking through and put it into the investor timeframe, which may not have the patience to see how two different kinds of patients might move through a system and shift from place to place, from plan to plan, how do you calculate that? How do you take that into, I know it's not just one formula, but how do you take that into the roster of criteria that you're weighing to think through: is this worth investing in or not? How can we afford affective care as well as cognitive care?

Anya:

Yeah,  I mean, it's a great question. And if I had the answer, I probably shouldn't be doing this, I should probably be sort of influencing health care at a broader scale, but I would say the first thing that we need to figure out in the US is we spend the most by far of any country on health care and get mediocre returns.

And health care is fundamentally still a human-mediated sector. And you know, to get affective trust and to have these care navigators and everything, we're coming up with solutions that perpetuate the human-mediated-ness, which, if there's anything to say about human-mediated care, is that it doesn't scale; or I mean, it scales, but it scales linearly, right? It won't scale super-linearly. And so figuring out a way to solve that problem in general, as we think about the puts and takes and the payments in healthcare, is going to be really important. And I think it can be done, and I think a lot of what Del's doing, and others, is using AI to help solve some of these problems, because we're not going to get away from human-mediated care.

I mean, we're not going to be cared for by a bunch of robots with no people there, at least certainly not in my lifetime or my children's lifetime. Maybe in the future. So that's number one. But then number two, to your question about the timeline of funds: as an investor, our funds are 10 years.

That's a very typical length of time for investors. And that's not only when you make an investment in an early-stage company; that's all the way through selling it, right? So you have 10 years to give your returns back to your investors. And so for us, it's less about "Oh, we can only invest in products that are used in a patient population that has a three-year time frame." That isn't the lens through which we think about it. It's more about how the health system is going to evolve to pay for these things.

So think about gene therapy as an example, right? Gene therapy is curative, or should theoretically be curative. But okay, you give a gene therapy once, maybe a few times, depending on the dosing, but within a relatively short timeline, within a couple years, and a patient is now rid of the disease for life.

Well, does the plan that the patient happens to be part of for that couple of years need to bear the overall burden, or should that be shared, and how can it be shared, et cetera? And that's all stuff we're working through right now. If there was perfect mobility, it wouldn't matter, because the plans would pay their fair share as people moved around, but that is not the case, right?

Obviously, most of the health care in this country is commercial pay, and different companies have different plans and different longevity and whatever. So, getting back to incentives, figuring out how to do this so that we can deal with these things at a population level, and again, you have to think of things at a population level and a personal level, is something that we think a lot about. Another example is polypharmacy: patients that are on multiple products. It's an analog of what I was just talking about, but if one product will let you live two years longer, and another product will let you live two years longer, but the two together will let you live three years longer, right?

Payers are sitting there going, "Okay, but I'm not paying for this and this. Can we figure out how I can pay for the three years, and you companies figure out how to divide it amongst yourselves?" So it's a different problem, but it's an analogous problem, and we're working through that now.

And I think we will have a solution for it. So as an investor, when we look at products like a gene therapy product, there's a bit of faith, frankly, that we have to take, that says, "You know what? I think by the time this is on the market, 5 or 10 years from now, we'll have a solution."

Amy:

Interesting. I'm going to put the audience on notice: think of your questions, and if you're interested, come up to the mic to ask. We'll have one more point here, and then we'll start taking questions from the audience. Del, before we do that, I wanted to pick up on a couple of the themes that have just been mentioned.

I think Bennett, you were mentioning that oftentimes when you're looking at reaching out to new patients, you might only be able to reach 10 to 20 percent. And then Anya, you were really speaking to the systemic barriers that might prevent you from seeing the full returns because there are blockages in the way things are set up.

And so I was hoping you could speak to this, since you're focused on clinical trial research and recruitment and engaging communities in that: what do we lose as a society when we don't have a fully representative research population at that early design stage? We've been talking a lot about services and payment - that's after the solutions are developed. But what do we lose at that beginning stage, at the clinical trial stage, if those solutions are not tested on and getting data from an inclusive population?

Del:

We lose a lot, and I probably don't have time to cover all the things that I think we lose as a society, but I'll touch on the high points.

I mean, clearly from a pharma standpoint, we've got regulatory agencies, or a regulatory agency to be more specific, that wants to ensure that when a drug gets to market it has a level of efficacy and doesn't have side effects based upon a person's demographics, right: race, ethnicity, age, whatever the case may be.

And the only way we're going to know that is to make sure that there's inclusive representation in the trial. So you have a regulatory piece of it. But even aside from regulatory piece, right, we have disease states that we've been talking about that have a higher prevalence in certain communities.

Think about the occurrence of breast cancer and how it occurs in greater numbers for African American women, three to four times in a lot of cases, and prostate cancer, and yet you're seeing studies that have barely a few people of color in them. And so you really start scratching your head: the people that are impacted by this the most are the ones that are the least involved in the research.

So one, we are questioning the efficacy and the, you know, the safety. But then we have to think about, you know, how much are we contributing to this idea of minimizing health inequities. And we tell people sometimes clinical trials are, are good for you to be part of, because it could be a treatment option you wouldn't otherwise have.

So there are elements there that we're missing out on as well. And then it's just the overall trust in the system. Something I read a week ago listed the industries that Americans trust the least, and I was looking through the list: government was second to last. You know who was last? The pharma industry. And so part of being inclusive, being representative, making sure that it's intentional, is getting people to trust more and be more willing to lean in and take part, irrespective of the structural barriers that exist.

I think the industry, and we, can do a better job of letting people know we welcome them and we want them to be part of the solution. While we may not be able to directly relate to their lived experiences, we want to understand them, and we want to make sure we take them into account from the time we're developing the protocol for a drug all the way to when the drug is on the market, and then some.

Amy:

Thank you for bringing that back and really to the theme of, you know, that early stage research just informs the solutions that we can then deliver and pay for and invest in later on as well. And that that participation itself builds trust for all of the ongoing stages.

For questions, maybe we'll take two and then we'll keep going down so you could say your name and where you're from. 

Bill:

My name is Bill Howell. I work with ATI, but I was formerly doing R&D for the army for many, many years. It's interesting to hear that, culture, money, really do drive a whole lot of this in the sense of "how do you get to these people?"

And there's no financial incentive for companies to go there to start off with. And then government, which is probably one of the few players who can go out there and try to address that mistrust, is no longer trusted after COVID and the rest. So you hit on my topic. Is there another methodology?

Would something as simple, well, it doesn't sound simple really to do, but avatars or something of that nature that gives you a front end - could those be used to potentially break down sub sectors of culture? We know rural is, hell, they never get health care anyway, so why am I going to want some now?

You're going to have to teach them. I think that was a very good statement earlier about they just don't know what is available and in some cases you have to bring it to them because they're not going to drive 125 miles to go find it. Anyway, what's another methodology to get in the front door to build that trust?

Del:

Yeah, I'll be brief with my response and appreciate your service as a fellow army veteran. I appreciate that. You know, it's funny you talk about avatars. I'll say that real quick. Just yesterday, our team was really leaning into the use of avatars for engaging our communities.

And so we don't have the metrics yet to know how different it is in terms of its own level of efficacy and reaching people, particularly from an affective trust, but we are exploring it because it kind of gets to the issue. How can we reach more people and access them more and still be true to understanding what it really takes?

We know something's missing. The avatar can look like the person, the avatar might even have an accent, the background might be something that we feel our patients and communities can relate to, but there's still something that's missing.

We haven't gotten our hands on what that secret sauce is yet. But at the end of the day, not to sound too philosophical, I think it's the human being that's missing, and being actually present in the moment.

Amy:

And Bennett, do you have any ideas, just, maybe not avatars, but just new ways of reaching people?

Bennett:

So, I think what's been so interesting is that I've been so privileged: literally for the last three or four years I've gotten to go to more of rural America than most anybody else in the whole country. And what's so interesting is that that trust does exist, in a deep way, with the existing provider. Usually that country doctor who's there has a really important role to play in the community.

And sometimes they sort of underplay it, but usually this is one of the wealthiest people in town, they're an elder at church, they're the team doctor on Friday night, and even if you don't feel like you have a deep, close relationship with that doctor, you definitely know who the doctor is.

And the doctor takes care of half a dozen patients when he goes to the pharmacy or to the grocery store, right? And I think that one of the things that we found is that there is this incredible social capital that that physician has in the community, but it really has not been activated. Because they only have so much time, they're just trying to live a little bit of a life; if you talk to any rural doctor, they've not taken a vacation in 20 years.

So the opportunity to extend that doctor has been really influential because, getting to when I had built kind of care management models in the past, the problem was we really would top out in the low thirties - if you gave me a hundred patients, I could get engaged with 30 of them if I did a really good job. 

If I call from the rural doctor's office. I'm gonna engage 95 percent of those patients. And that is really different. And so I think it's this idea of you got to meet people where they are. And I think that, yes, like telemedicine and data and AI, all these things are great tools, but it does not replace the human element.

Amy:

Great. Here, let's do the next question. 

Adam:

Adam Grant from Total Reentry Solutions. I've got two questions. One's kind of easy and it's for the entire panel, and I want to know your understanding of what equity is, because a lot of times I hear these discussions and I hear equality. I don't necessarily hear equity, but I have one for Del that actually came from the last panel that I was sitting in.

You actually helped me to understand this a little bit better and feel a little bit better about it, but I want to know what your thoughts are. And I said, artificial intelligence relies primarily on existing data - are there concerns in the medical field that the lack of diversity of data in the past will lead to new generation of biases going forward?

And since that primary question is going to be based on what's happened in the past, how do you counter that with new information that may be contrary to what these systems have read before?

Amy:

Right before you answer I'm gonna ask the questioner behind you to also ask and then we'll bundle them together.

Raj:

Thanks, I'll do my best to connect the dots. My name is Raj, I'm the head of strategy for Culture Plus Group, and the reason I say that is our purpose is health equity; to your point, yes, health equity and health equality always get mixed up. My job is threefold.

Half of my time within my work is to educate the healthcare side on the importance of culture and beliefs, all that. The second half of my job is to tell people the importance of the science, keeping the culture aside. And to your example of grandma and the Vicks, I was chuckling back there because I hear it all the time in the interviews that I do.

Then there's a third part of it, which is the investment. I always have to look at where the areas are where investment is being made to solve certain problems. So the question to the panel is: gene therapy, immunology, these areas are huge, we're always looking at biotech companies, we're looking at new things that are coming in, investors are putting money in - but what about the things that have already been there? Where are the investments when you talk about health navigators, when you talk about health deserts out there in the rural areas? And I'll give you a very specific example. Recently I interviewed a couple of specialists on HIV.

I'm just using it as an example. And I asked them, the 50-50-50 rule, by 2030, are we gonna hit it? Unanimously, in the United States, all the health specialists, particularly, they said, nope, not gonna happen in our lifetime. Where UK, Australia, Africa, for all the reasons, have hit those numbers already and surpassed that.

So I'm trying to understand from an investment perspective. Technology plays an important role, and to your point, yes, maybe not in my lifetime or my kid's lifetime, but down the road AI is going to make a difference, and I'm fighting every day for data inclusivity in clinical trials. So I would love to hear, on the investment side of things, what to expect in the next five years.

Perhaps someone like me can get excited and bring it back to the people out there when we communicate. Thank you. 

Amy:

All right. Excellent. Thank you for those questions. Del, we'll start with you. All right. 

Del:

I'll hit the AI equity piece, and I know you'll jump on the investor question. When we developed our algorithms, our conceptual algorithms, about four years ago for our model, we started taking in initial data, preliminary data, to try to create this participation probability index.

You can imagine our algorithm was producing outcomes that were not very diverse, very homogenous in terms of being able to say, these people are going to be willing to take part in studies. And we're like, "Well, those are people that look like the first people that take part." So we had to go in and pull some levers, and at first we're like, "This feels kind of uneasy," because the system is saying these factors and the algorithms are running, but we're going in and overriding them to get to the outcome that we're looking for.

I think we've later come to find out that that's really what people need to be thinking about. Don't just let the algorithms run free like kids in a room and accept whatever they come up with. And the last thing I'll say is I was very encouraged by what I know Johnson and Johnson did.

In this case, we interact with Johnson and Johnson, and they have a tool now that says, if you're a provider, a vendor, and you have any element of AI in the solution set you bring to us, we're going to ask you questions to see how much you've mitigated the risk of bias in it. And it's a pretty extensive tool that I know they had us go through as a vendor, and I'm assuming they have other vendors go through it.

But that encourages me to think that if we adopt that type of practice, not just accepting whatever AI tool and solution set, but asking questions about what data was involved and how you tried to mitigate bias, I think that's going to help.

Anya:

Yeah, so to answer sort of the investment side of it, but also touch on that.

I think data is critical. Until you can measure things, and until you actually know where your starting point is, you can't really improve. And if you look at health outcomes on a population level, I'll take J. P. Morgan, for example, with all of our covered lives.

If we said, "Hey, how good are we as an organization doing across the board?" for the entirety of the patient population that we are responsible for paying, which is almost half a million lives, how do we do on these different things? We might pat ourselves on the back and go along our merry way.

But once you start to look at, okay, let's sub segment these people, whether it's, our employee base, whether it's geographically, whether it's socioeconomically, whether it's racially, whether it's ethnically, you start to see pretty significant differences. And until you have the courage to look it's easy not to. It's sort of like you have a lot of bills in your mail, and you're just sort of putting off opening the bills. You know what's gonna be there, but you just kind of don't want to see it. It's a little bit like that.

There's also a problem of data collection, right? Even asking some of those demographic questions of your employee base, or your patient population, or whatever context you're coming at this from, potentially opens you up; if you're a lawyer, you'd say it opens you up to risk. I don't know about risk, but at least, you know, that's PHI, even that demographic information, right? So there's a whole rubric around what you can do with that data that a lot of people don't want to get into. So I think first you do have to know it. But to the point, yes, AI, literally the way it's built, just sort of recapitulates patterns that it picked up on.

And so if you're beginning with a data set that's not diverse enough, you are going to have problems with that. I like to use this example: if you look at the number of parameters that ChatGPT was trained on, and we were to do the same thing with health data, how many patients do you think we'd need data for? Any guesses? I mean, I'm setting you up, I do realize, but it's 14 billion. All right. We all might realize there are not 14 billion people alive today, or who have been alive in the era of medical records being kept.

So we just can't. AI is going to play a super important role, but we have to understand where we are with healthcare.

Amy:

Great. And actually, maybe I'll turn it to Bennett. What is your definition of equity to get to that foundational question that was raised at the beginning?  

Bennett:

I mean, I think when we think about it, it's really sort of like, regardless of where you live, you should still be able to get good care, right?

That should not be the thing; the fact that you live in a rural community should not mean you get any less than if you lived here in Austin. That's probably how we think about it.

Amy:

No, I think that's great. And I think the question of, you know, how is that different from equality, I've sometimes heard it described as equality is everybody literally gets the same thing, and equity is everyone's getting what they need to get to the same outcome.

So if you imagine people of different heights, the shorter person might get a stepstool to reach the top shelf, and the taller person doesn't because they don't need it. They're both treated equitably, but not necessarily equally, if that helps. Right before we go, since we only have three minutes left, I just wanted to pull out some of the themes that I've been hearing today, and really thank my panelists, before I ask them to give us a call to action to close this out.

And one is how important this topic is, how important equity is in making sure that those who want to participate can, and that they're represented in our research, which then informs the solutions that can be delivered and offered later on. And so I think this theme, and what drew you all here, is really grounded in that spirit of making sure that we're including a diverse population there.

We heard some of the reasons why that's not always there and a lot of it actually did center on trust, affective versus cognitive trust, maybe sometimes from a grandmother, but sometimes from patient navigators who can help break down the Latin and the jargon to something more digestible. And also to think about different time scales and the financial incentives that underpin those and how can we see through systemic blockages to get to financial incentives that go towards this goal of health equity.

And then finally, thinking about scale. What we have talked about here is achieving health equity without sacrificing scale, and I think that's a really important piece of inspiration for future efforts. We may not have 14 billion people today, but we might someday. And so how do we think about reaching as many as possible with solutions that are equitable and fair?

Two minutes left, Del, Anya, Bennett, call to action. What would you say to this audience?  

Del:

I would say a call to action is if you're working in this space of health equity, the landscape is changing, the political environment is changing, the culture is changing. When you say things like diversity and equity and inclusion, we all know that that's being viewed in a different way.

We have to move these discussions from being something that's right to do, and I think Bennett talked about this, to being metrics-driven. So if you're in this space, OKRs, KPIs, ROI, these are all terms that you should be well versed in. Whatever initiative you're around, make sure that you have those things you can show, so that when people start feeling less comfortable with these issues, we have data to say this makes sense for us.

Anya:

Building on that, I would say collect the data, share the data.  We can't improve until we have it all and that can be real world evidence, you know, it doesn't necessarily need to be super, super, super tightly controlled, but we're only as good as what we sort of record and what we measure.

Bennett:

So whenever I go to the doctor, and I'm sitting on that piece of paper in a backless gown, like my palms start to sweat, right?

And all of a sudden the three glasses of wine that I have a week turns into one, because I want to impress the doctor. And I think the general experience of going to the doctor is really, really not good. And honestly, I think the challenge of being a doctor is increasingly not a good experience either, because, yes, we've designed the system to capture all this data, but the UI of those systems is really not very good. I mean, there's not a single doctor who will tell you they like their electronic medical record. And yet we're trying to build this system to capture all this data, and we need the data to be able to improve the system. But we're going to run out of doctors, because we're going to burn straight through them, because they hate the EMRs.

And so I think that one of the design principles that we've disciplined ourselves in our business model is one that I don't think we often ask in health care, which is how is this gonna feel? How is this gonna feel to the patient? How's this gonna feel to the doctor? How's it gonna feel to the health insurance company? Etc.

And I think what that has done is it's forced us to design what end up being much simpler workflows than they would have been otherwise, which is actually a lot harder, I think. But that feeling piece is actually going to determine whether people follow through, whether they'll actually go do a clinical trial, or take the pill, or go see the doctor, because we continue to cut corners around that experience piece in healthcare, and that's something we've got to fix.

Amy:

All right, get data, collect data, use it well to get a simpler system, try not to lie to the doctor. Thank you so much everyone. Please join me in giving a round of applause to our panelists.

Want to hear the full discussion? Watch the SXSW 2024 Health Equity Panel.