Dangers of AI tops health tech hazards list for 2025

The use of artificial intelligence (AI) models in health care settings without proper oversight is the most significant health technology hazard for 2025, according to nonprofit patient safety organization ECRI. The organization’s 18th annual report of the top 10 health technology hazards placed unmet technology support needs for home care patients in second place and vulnerable technology vendors and cybersecurity threats in third.

While the full document is available only to ECRI members, journalists can download an executive brief summarizing the main highlights. The report provides health journalists with a good primer on concerns to watch and can be a rich source for story ideas. It identifies potential sources of danger the organization believes warrant the greatest attention, and offers recommendations to reduce the risks.

ECRI’s top 10 health tech hazards for 2025

  1. Risks with AI-enabled health technologies.
  2. Unmet technology support needs for home care patients.
  3. Vulnerable technology vendors and cybersecurity threats.
  4. Substandard or fraudulent medical devices and supplies.
  5. Fire risk in areas where supplemental oxygen is in use.
  6. Dangerously low default alarm limits on anesthesia units.
  7. Mishandled temporary holds on medication orders.
  8. Infection risks and tripping hazards from poorly managed infusion lines.
  9. Skin injuries from medical adhesive products.
  10. Incomplete investigations of infusion system incidents.

ECRI compiled the list based on responses to member surveys, literature reviews, testing of medical devices in its laboratory and investigations of patient safety incidents.

Dangers of AI

While AI promises to increase the efficiency and precision of medical diagnoses and treatments, the authors note, there is a potential for preventable harms. Biases that exist in data used to train AI models can lead to disparate health outcomes or inappropriate responses, the authors said. Additionally, AI systems can produce “hallucinations,” or false responses to some prompts, and the applications’ performance could worsen over time.

AI offers “tremendous potential value” as a tool to assist clinicians and health care staff, they said, but only if human decision-making remains at the core of the care process.

“Placing too much trust in an AI model — and failing to appropriately scrutinize its output — may lead to inappropriate patient care decisions,” they wrote in the executive brief. “Leveraging AI to improve patient care requires that organizations define clear goals, assess and manage risks, evaluate options, develop effective implementation plans, manage expectations, and monitor performance for signs of degradation over time.”

Home care tech support

Regarding the second item, technology support needs for home care patients, the authors noted that delivering care in the home has “unique concerns,” particularly when a patient or family member is responsible for operating a complex medical device such as an infusion pump, ventilator or dialysis machine.

Health-care-at-home models, which are growing in popularity, take machines that would traditionally be used under clinical supervision and bring them into a home environment, said Priyanka Shah, M.S., a principal project engineer for ECRI, during a Dec. 5 webinar covering the report’s findings.

“The end users are different,” she said. “These are patients, caregivers or lay users who are now tasked with maintaining the devices, troubleshooting, setting them up, etc.”

Health care institutions offering these programs need to select devices that match a patient’s needs, abilities and environment, Shah said. They should also ensure that instructions are easily understandable and available in a patient’s preferred language, and that patients receive verbal and visual training on how to use the equipment. Ideally, Shah said, end users should have a chance to practice using the equipment.

Inattention to such practices can lead to errors or care delays and other harms from unresolved device malfunctions, the authors wrote. ECRI has encountered “numerous examples of patient harm” from improper setup or lack of familiarity with medical devices used in the home setting.

Cybersecurity threats

Vulnerable technology vendors and cybersecurity threats were third on the list of concerns. Many health systems today rely on technology products hosted by external vendors, ranging from scheduling and billing services to electronic health records and other programs. While there are benefits to using third-party tools, the authors said, hospital operations can be jeopardized by cyberattacks, data breaches or service disruptions affecting the vendor.

Case in point: the February 2024 cyberattack on Change Healthcare, one of the nation’s largest clearinghouses for insurance billing and payments. The attack severely disrupted operations for the company’s hospital, medical office and pharmacy customers nationwide.

Health care organizations can mitigate these risks by thoroughly vetting vendors at the start of the service acquisition process, running simulations of how care processes would continue if a particular program or service went offline, and developing procedures to recover from such events, the authors said.

Too often, businesses focus only on breaches that affect companies in their own field, said Kallie Smith, ECRI’s vice president and information security officer, during the webinar. “One of the best ways to look at cybersecurity incidents and general IT incidents is to assume that anything like that could happen to any type of organization,” she said.

Additional resources