In this LEI webinar video clip, Dr. Monique Smith, executive director of Health DesignED, Emory University’s Acute Care Design and Innovation Center, explains why traditional customer analysis no longer works and illustrates the challenges with two specific examples. Find a lightly edited transcript below.
There are a number of different experiences that we’re all bringing to the table when we think about the products and services we design. And there are certainly a lot of intersectionalities that we all live with in terms of who we are. [For example], I’m certainly an emergency doctor, but I’m also a mom, I’m a daughter, I’m a wife. I also think about innovation and design for a big academic institution. At one point, I was a start-up founder, and my family is from Jamaica. I’m an immigrant, but I grew up in the States. So across all of these statements [from earlier in the webinar], I have two that are true and five that are false.
There’s the constant recognition that, throughout my life, I’ve been exposed to multiple perspectives and, gratefully, continue to be. But even with all those perspectives that I bring to the table, there’s still a certain amount of bias that I bring into the design process. And that bias is something I continuously have to challenge in order to iterate, innovate, and, fundamentally, make better things for people.
Designing for Diversity in the Automobile Industry
When you think about the things that we’ve created across different industries — think about motor vehicle safety — there are assumptions that we make in the design process. [For example], think about how you situate yourself in a car, where the headrest sits, where the seatbelt comes across your body. As an emergency doctor, I see a lot of people who come into my department after motor vehicle accidents, and it’s something I had never put much thought into. But there’s very clear evidence out there that women are 47% more likely to suffer injuries in motor vehicle accidents than men. And that’s even after you adjust for the things you think you should adjust for, like weight and height. What it comes down to are elements of design that weren’t considered: it isn’t just weight and height; there are also differences in female neck strength and musculature, the way you sit within the seat, and where head restraints are positioned.
This situation becomes an opportunity for partnerships [that enable us to] think through how we can design for those different elements in the manufacturing process — and to think about, when we have those high-activation traumas, how we ingest that kind of data at the individual level to design better products and services for what is a fundamentally diverse population.
Designing for Diversity in Healthcare
That extends into healthcare as well. Fairly early on in the pandemic, as a healthcare system, we were expecting large numbers of people to show up without knowing how to fundamentally triage them. One of the ways we triage is getting very close and personal and taking some vital signs: a temperature, your oxygen levels, things like that. And we were fortunate to work with some of our colleagues at big academic centers to think through how we [could] leverage facial recognition software and AI [artificial intelligence] algorithms to predict who has a temperature, who has low oxygen levels, and who we should put in different spaces to optimize safety. That [could] also maybe [help us] use less of our personal protective equipment and keep enough distance so that we’re treating patients in appropriate zones, thinking through, again, how we best provide care while also protecting those who might not yet be infected with Covid.
As we came through with this low-fidelity prototype solution, we acknowledged that our training dataset had some diversity, but it didn’t have enough diversity. We were using facial recognition software, which is very sensitive to skin tone. And so that aspect of recognizing low oxygen levels and blueness around the lips was much harder on darker skin tones. So we made the algorithm publicly available. We published it. And a few months later, I was fortunate to see one of our big industry partners come back to me and say, “Hey, we’ve got this great new tool that’s going to allow you to triage for vital signs. It’s something you use alongside your virtual care. You can use it in person. It’s really incredible.”
It sounded a lot like what we had done before, so the first question I asked was: Tell me about the training dataset. Tell me what it looks like. How did it respond across populations and different skin tones? What did the validation look like? And there were no answers; no one in the room knew the answer to the question. First red flag.
A few weeks later, I connected back with the same partners. We were talking about something else, and I said, “Hey, whatever happened with that triage [solution]? We’re super interested in it and want to know more.” And it turned out that they had just never tested for it. They’d never thought of it.
And part of it is that piece of experience: when you’re thinking through what you’re building your technology on, you have to be intentional about making sure that it serves diverse populations.
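To make the validation questions in that exchange concrete, here is a minimal, hypothetical sketch of the kind of stratified check being asked about: reporting a model’s accuracy separately for each skin-tone group in a test set rather than as a single overall number. None of this code comes from the webinar; the field names, groups, and example records are illustrative assumptions only.

```python
# Hypothetical sketch (not from the webinar): checking whether a triage model's
# accuracy holds up across skin-tone subgroups instead of one overall figure.
from collections import defaultdict

def stratified_accuracy(records):
    """records: iterable of dicts with 'skin_tone' (e.g., Fitzpatrick I-VI bands),
    'predicted_low_oxygen' (bool), and 'actual_low_oxygen' (bool)."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        group = r["skin_tone"]
        total[group] += 1
        if r["predicted_low_oxygen"] == r["actual_low_oxygen"]:
            correct[group] += 1
    # Accuracy per subgroup; large gaps between groups signal a biased model.
    return {group: correct[group] / total[group] for group in total}

# Illustrative data only; a real validation would use a clinically labeled test set.
test_records = [
    {"skin_tone": "I-II", "predicted_low_oxygen": True, "actual_low_oxygen": True},
    {"skin_tone": "I-II", "predicted_low_oxygen": False, "actual_low_oxygen": False},
    {"skin_tone": "V-VI", "predicted_low_oxygen": False, "actual_low_oxygen": True},
    {"skin_tone": "V-VI", "predicted_low_oxygen": True, "actual_low_oxygen": True},
]

for group, acc in stratified_accuracy(test_records).items():
    print(f"Skin tone {group}: accuracy {acc:.2f}")
```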