Soojin Jun, PharmD
4 min read · Sep 12, 2020


Computers are stupid.

This line cracked me up this morning. It came from a nurse who brought her computer to the pharmacy to document a narcotic waste. When I signed my name electronically with a password, the program briefly said it could not find my name after I typed the first few letters. She said, "Your name is like my name. C***** (the program we use at the hospital) cannot find me quickly when I type my name, while others have no problem being found after typing a few letters." As she was leaving the pharmacy, she said, "Computers are stupid." I replied, "Well, they are designed by humans." Now this line really cracked her up. We had a good laugh together. "Wow, that is so true," she said, as if I had said something revelatory.

If you pay close attention to what she said, computers are no longer just "a thing." They are "thinking" machines. They are supposed to know what we expect and to be consistent, because they are supposed to be quick and smart. Yet computers are not perfect, because they are designed by humans, and we are very imperfect. Imperfect in the way that we look for our phones while holding them in our hands. Imperfect in the way that we cannot remember what we ate yesterday. Imperfect in the way that a pharmacy technician and a pharmacist both miss a mistake while double-checking the final product against the prescription before it is dispensed to a patient. Imperfect in the way that nurses document under the wrong patients. Imperfect in the way that surgeons face lawsuits for wrong-site surgeries.

My thoughts wandered to the intersection of healthcare, artificial intelligence (AI), and empathy. Avishek Choudhury, my PhD candidate mentee, tells me the definition of AI can be as broad as any computation that represents human-like action. As my title shows, I call myself an empathy enthusiast: I relate every relationship to empathy, no matter what the medium is. A medium, like the name of the platform this writing is on, is a tool to communicate: a piece of writing, an email, an artwork, a video, a brochure, or anything that helps at least two entities understand each other. As we rely on AI to handle more healthcare communication for patients, poorly designed tools mean patients may not only miss out on proper care; they may even end up dying.

I asked a lot of questions in this interaction. Here is some food for thought for AI in healthcare.

1. As AI in healthcare becomes the norm, we expect AI to be like a person and even call it stupid. How can AI designers make AI more person-like? My answer: by growing their empathy.

2. The above leads to the second question: how do you grow empathy? I get this question a lot from many people, including students. My answer is by creating something, no matter what medium you use. Creative people make objects to communicate in a different format, whether it is a painting, a program, a brochure, you name it. In that process, and this is the crucial point, think about how viewers will use your creation while you are creating it. This empathy, combined with a passion for the art, is a beautiful pairing that will naturally enhance the growth of empathy.

3. This is a true story from an undisclosed hospital that shows how AI can hurt patients. A patient who was a Japanese speaker tried to communicate his symptoms in English to his doctor through an electronic portal. He did not receive proper treatment in time during their back-and-forth exchange, and he ended up dying of pneumonia. Although other factors may have contributed to his death, the point of the story is that patients use imperfect tools to handle even life-threatening health issues, hoping the tools will help. Healthcare also encourages patients to rely on AI by directing them to electronic portals, websites, apps, and many other tools, both to manage workflow and to provide solutions for patients. The intentions are good, but greater reliance on imperfect tools, in the hope that they are perfect, can cause devastating results like the ending of this story.

4. The patient above also had to use AI written in a language different from the one he normally used. Could the portals offer at least Google Translate functionality? If legal reasons prevent that, then failing to provide any proper way for patients who speak other languages to communicate should raise the same legal concern. Why isn't healthcare AI designed for patients who speak other languages, when 67.3 million people in the US speak a language other than English at home? As healthcare relies more on AI, how is the industry taking care of non-English speakers? Isn't the gap in care for this population widening because of that greater reliance on AI?

5. Speaking English does not mean patients can describe their symptoms well. I experienced this myself while trying to describe my symptoms to my English-speaking doctor. I had never had trouble communicating with him otherwise. When it came to describing the positional vertigo I sometimes got, however, it was so hard, because my first language had no direct equivalents in English. Until I went to pharmacy school, I really did not know how to describe my symptoms. I see this even among native English speakers. Describing symptoms is a hard task for patients.

I hope these points give you a perspective on AI, healthcare, and empathy. AI designers, please take the time to think, because these can be life-and-death issues for some people, like the Japanese patient in the story.


Soojin Jun, PharmD

Cofounder of Patients for Patient Safety US. Owner of I am Cheese pub. Pharmacist and empathy believer. Check out www.pfps.us and www.npsb.org & be a hero!