
Artificially Intelligent, Naturally Flawed

A nurse clicks on a robot's screen that shows a heart and an EKG.

Even if artificial intelligence hasn’t yet touched your practice, the chances are that it will in the foreseeable future. A recent National Academy of Medicine report recommends that all healthcare professionals learn more about the potential of AI — and its significant limitations.

Promise and Risk

Anyone who’s ever watched “Star Trek” has at least a basic understanding of artificial intelligence: computer programs that can perform complex analysis to supplement — and even supplant — human judgment.

If you believe the tech industry hype, sophisticated neural networks will soon be able to detect and treat disease as readily as current algorithms predict your online shopping habits.

The new report from the National Academy of Medicine (NAM), released December 17, is cautiously enthusiastic about AI’s potential to transform everything from radiology to medical coding.

However, the report also offers some sobering warnings about the limits and risks of healthcare AI.

When Computers Take Shortcuts

If you’ve ever asked a child to clean their room and come back to find clothes and toys shoved under the bed rather than put away properly, you’ve already had a preview of how real-world neural networks approach decision-making.


Artificial intelligence may be logical and efficient, but its strategies and conclusions aren’t always what its users had in mind. One example the NAM report cites is a recent PLOS Medicine study that trained neural network algorithms to detect pneumonia from chest X-rays.

During the study, the trainers were puzzled to find that their AI was significantly better at detecting pneumonia from X-rays taken at the hospital where the AI was originally trained than from X-rays taken elsewhere.

Eventually, the trainers realized the AI was using a trick they hadn’t taught it: At the training hospital, the equipment with which an X-ray was taken (which the AI could determine from the image data) gave a good indication of the department the patient was in and thus their acuity, which affected their pneumonia risk. If the X-rays were taken with unfamiliar equipment, the neural network couldn’t reliably make that correlation and its accuracy suffered.

It’s easy to see how scenarios like this could be dangerous in clinical practice. For example, the AI might dismiss early signs of pneumonia if a patient were in a unit where pneumonia was uncommon. Worse, clinicians relying on the AI’s assessment might not know exactly how or why it reached its faulty conclusion.
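For readers curious about the mechanics, here is a minimal, purely illustrative sketch of this kind of "shortcut learning." It is not taken from the PLOS Medicine study; the data, the "scanner type" feature and the thresholds are all invented to show how a model can lean on a spurious clue that only holds at the training site and then lose accuracy elsewhere.

```python
# Hypothetical illustration of shortcut learning (invented data, not the
# actual study): a spurious "portable scanner" feature tracks the label at
# the training hospital but not at an outside hospital.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_cases(n, shortcut_follows_label):
    """Simulate cases with one weak genuine finding and one scanner-type
    feature. At the training hospital the portable scanner is used mostly
    for high-acuity patients, so scanner type leaks the label; at an
    outside hospital it carries no information."""
    pneumonia = rng.integers(0, 2, n)                        # true label
    finding = pneumonia + rng.normal(0, 1.5, n)              # weak real signal
    if shortcut_follows_label:
        portable = (pneumonia + rng.normal(0, 0.3, n)) > 0.5 # strong proxy
    else:
        portable = rng.integers(0, 2, n).astype(bool)        # uninformative
    X = np.column_stack([finding, portable.astype(float)])
    return X, pneumonia

X_train, y_train = make_cases(5000, shortcut_follows_label=True)
X_internal, y_internal = make_cases(2000, shortcut_follows_label=True)
X_external, y_external = make_cases(2000, shortcut_follows_label=False)

model = LogisticRegression().fit(X_train, y_train)
print("internal accuracy:", model.score(X_internal, y_internal))
print("external accuracy:", model.score(X_external, y_external))
# The model leans on the scanner-type shortcut, so its accuracy drops
# sharply once that correlation disappears at the outside hospital.
```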


Bias In, Bias Out

The authors of the NAM report say that the biggest limitation of any AI is not the algorithms, but the data used to train them. For example, a neural network trained exclusively on data from preteen patients might leap to strange conclusions when asked to evaluate an adolescent, while an AI trained mostly on data from men might flounder in evaluating female patients.

“If the training data are systematically biased due, for example, to underrepresentation of individuals of a particular gender, race, age or sexual orientation, those biases will be modeled, propagated and scaled in the resulting algorithm,” the authors warn. “The same is true for human biases (intentional and not) operating in the environment, workflow and outcomes from which the data were collected.”
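A similarly hedged sketch, with invented numbers rather than anything from the report, shows how under-representation alone can skew results: if one group dominates the training data, the model fits that group's pattern and systematically misses cases in the smaller group.

```python
# Hypothetical "bias in, bias out" sketch (invented data and thresholds):
# one model trained on two groups, one of which is badly under-represented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, threshold):
    """Both groups share the same lab value, but disease appears at a
    different threshold in each group."""
    lab = rng.normal(10, 3, n)
    disease = (lab > threshold).astype(int)
    return lab.reshape(-1, 1), disease

# Training set: 95% group A (threshold 12), only 5% group B (threshold 8)
Xa, ya = make_group(9500, threshold=12)
Xb, yb = make_group(500, threshold=8)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate each group separately on fresh data
Xa_test, ya_test = make_group(2000, threshold=12)
Xb_test, yb_test = make_group(2000, threshold=8)
print("group A accuracy:", model.score(Xa_test, ya_test))
print("group B accuracy:", model.score(Xb_test, yb_test))
# The learned decision boundary sits near group A's threshold, so group B
# patients who fall between the two thresholds are systematically missed.
```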

Keep Your Eyes Open

No one expects nurses to analyze flaws in algorithmic design or AI training methodology, but these problems emphasize the importance of human clinical judgment in the deployment and use of artificial intelligence.

A nurse who understands the ways biased data or faulty algorithms can produce misleading conclusions — and who speaks up when something doesn’t seem to make sense — can help prevent costly or even deadly errors.

You can download the full report at nam.edu/artificial-intelligence-special-publication.


