Nonprofit raises concerns over health care technology that could impact homecare providers
By Meg Herndon

WILLOW GROVE, Pennsylvania—Artificial intelligence (AI) chatbots, digital darkness events and counterfeit medical products are the most significant health technology hazards, according to the Emergency Care Research Institute (ECRI). ECRI, an independent nonprofit organization, emphasized the importance of “prevention over reaction” in its “Top 10 Health Technology Hazards for 2026” report, released Jan. 21.

Several of the issues highlighted are connected to home medical equipment and in-home care. According to the report, the top 10 threats associated with health care technology this year, in ranked order, are:

  1. Misuse of AI chatbots in health care
  2. Unpreparedness for a "digital darkness" event, or a sudden loss of access to electronic systems and patient information
  3. Substandard and falsified medical products
  4. Recall communication failures for home diabetes management technologies
  5. Misconnections of syringes or tubing to patient lines, particularly amid slow ENFit and NRFit adoption
  6. Underutilizing medication safety technologies in perioperative settings
  7. Inadequate device cleaning instructions
  8. Cybersecurity risks from legacy medical devices
  9. Health technology implementations that prompt unsafe clinical workflows
  10. Poor water quality during instrument sterilization

According to ECRI, the risks were chosen based on what the health care industry should prioritize now to protect patient safety, not on how frequently they are reported or how severe their consequences are.

“Reducing preventable harm requires more than just vigilance on the part of technology managers and device users,” the report read. “The medical device industry also has a role to play.”

Misuse of AI Chatbots

According to OpenAI, more than 40 million people use ChatGPT daily for health information, and AI services and chatbots are being used more often in health care settings. While AI can provide assistance, it can also provide false and misleading information, which could result in patient harm. ECRI advised using caution when turning to a chatbot for patient care information; the nonprofit’s experts said chatbots have suggested incorrect diagnoses, recommended unnecessary testing, promoted subpar medical supplies and even invented body parts in response to medical questions.

On top of AI “hallucinations,” ECRI said chatbots can exacerbate health disparities. Biases already present in the data used to train chatbots can distort how AI interprets information, leading chatbots to respond in ways that reinforce stereotypes and inequities.

“Medicine is a fundamentally human endeavor,” said Marcus Schabacker, president and chief executive officer of ECRI. “While chatbots are powerful tools, the algorithms cannot replace the expertise, education and experience of medical professionals. Realizing AI's promise while protecting people requires disciplined oversight, detailed guidelines and a clear-eyed understanding of AI's limitations.”

Unpreparedness for ‘Digital Darkness’

A digital darkness event, a sudden loss of access to electronic systems and patient information, is a growing risk for health care organizations, ECRI said. These events can disrupt care delivery and treatment and compromise patient safety.

Examples of digital darkness events include:

  • Cyberattacks
  • Natural disasters
  • Vendor outages
  • Internal system failures

The homecare industry, and health care as a whole, saw just how disruptive one of these events can be during the 2024 Change Healthcare data breach. The attack disrupted claims and payment processing and left many health care providers without adequate cash flow to operate.

Substandard & Falsified Medical Products

Counterfeit and substandard medical products are a continuing problem, one also featured in ECRI’s 2025 report. ECRI said the rate at which these products are making their way into the U.S. market is “alarming.”

These substandard products threaten patient safety, as they could fail to function as intended.

“We challenge government agencies and manufacturers to strengthen efforts to prevent defective products from entering the market,” ECRI said.

CGM Recall Communication Failures

Home diabetes management technologies, including continuous glucose monitors (CGMs), continue to make strides. But if recall and product update information does not reach patients, their health is at risk.

For example, an issue with certain FreeStyle Libre 3 and FreeStyle Libre 3 Plus glucose monitor sensors led to incorrect low glucose readings. If undetected, these false lows could lead to the wrong treatment and possibly to injury or death.

ECRI said home medical equipment (HME) providers should have processes in place to communicate clearly with users.

Tubing Misconnections

The Luer-lock design makes inappropriate tubing connections a threat, ECRI said. A connection gone wrong could introduce medication, nutrition or gas into the wrong line, leading to patient injury or death.

The Luer-lock’s compatibility with a wide range of devices and lines is what allows misconnections to occur. To combat this hazard, ECRI recommended:

  • ENFit connectors (enteral fitting)
  • NRFit connectors (neuraxial and regional block fit)

ECRI’s full report is available on the organization’s website.