The Protecting and Transforming Cyber Health Care Act of 2022 (PATCH Act) comes into force on October 1, 2023. The Act strengthens cybersecurity requirements for medical device premarket submissions and post-market surveillance. Companies must develop a product monitoring plan, a cyber-anomaly response plan, a process for coordinated disclosure of cyber vulnerabilities, and a Software Bill of Materials (SBOM), and must demonstrate the ability to release critical vulnerability patches ‘as soon as possible.’
In accordance with the PATCH Act, the FDA announced that it may Refuse to Accept (RTA) premarket submissions that do not meet these requirements, beginning on October 1, 2023.
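Of the new requirements, the SBOM is the most concrete artifact. As a rough illustration only, the Python sketch below assembles a minimal machine-readable SBOM loosely following the CycloneDX JSON format; the device and component names are hypothetical and not drawn from any real submission.

```python
# A minimal, illustrative SBOM loosely following the CycloneDX JSON format.
# All names and versions here are hypothetical.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    # The device the SBOM describes.
    "metadata": {
        "component": {"type": "device", "name": "acme-infusion-pump", "version": "3.2.0"}
    },
    # Every third-party component shipped inside the device software.
    "components": [
        {
            "type": "library",
            "name": "openssl",
            "version": "1.1.1k",
            "purl": "pkg:generic/openssl@1.1.1k",  # package URL used to cross-reference advisories
        },
        {
            "type": "operating-system",
            "name": "linux-kernel",
            "version": "5.10.120",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```

The point of a machine-readable format is retrievability: when a vulnerability is announced in, say, openssl, the affected devices and versions can be found by query rather than by archaeology.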
To learn more about the challenges medtech manufacturers and developers face in meeting these new requirements, we spoke with Erez Kaminski, former head of AI at Amgen and founder of Ketryx, an MIT-funded startup with an AI developer tool that helps safety-critical medtech software teams develop safer software, and with Paul Jones, Executive Vice President of Ketryx and a former FDA official who contributed to the development of IEC 62304 and founded the FDA’s software engineering lab.
Are companies prepared for the October 1, 2023, implementation of the PATCH Act?
Kaminski: I don’t think people necessarily understand the urgency. The PATCH Act requirements are an obligation for premarket submissions, but they could become an obligation in the case of an audit as well. We’ve definitely seen an uptick in companies asking us for help in complying, but I’m not sure that everyone’s regulatory intelligence systems are sounding the alarm just yet.
Jones: When you look at the medical device and medtech industry, it’s a very heterogeneous group of actors. When you combine that with the fact that the PATCH Act and new guidances for software are changing the type of documentation companies need to submit for premarket clearances, it’s a huge change for the industry, and a lot of people do not know what to do. Some larger companies are nimble enough to comply quickly and are largely set at this point, but large portions of the industry are going to need a lot of help.
What are the key changes companies will need to adapt to?
Kaminski: One is the ability to move fast enough to meet the PATCH Act requirements for an off-cycle release. Then, from an SBOM and cybersecurity perspective, it is making sure that you have all the required information in a form that corresponds to the guidances and standards relevant to your device, and that the information is accessible and retrievable. Companies, in general, don’t do a great job at post-market cybersecurity monitoring. So the biggest change, and the biggest challenge, is looking at products that have been on the market for one, two or three years to see if they have vulnerabilities and, if they do, contacting hospitals and patients to make sure they don’t continue using them. Companies, particularly the larger ones, are making a lot of effort to cover this requirement, but it’s a big challenge for the industry.
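To make the monitoring point concrete: once an SBOM exists, checking fielded components against public vulnerability databases can be automated. The sketch below queries the public OSV.dev API for a hypothetical component list; a production process would also cover the NVD, vendor advisories and commercial feeds.

```python
# A minimal sketch: check each SBOM component against the public OSV.dev
# vulnerability database. The component list is hypothetical.
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

# Hypothetical components pulled from a fielded device's SBOM.
components = [
    {"name": "jinja2", "ecosystem": "PyPI", "version": "2.4.1"},
    {"name": "lodash", "ecosystem": "npm", "version": "4.17.15"},
]

def known_vulnerabilities(component):
    """Return the OSV vulnerability IDs recorded for one component/version."""
    body = json.dumps({
        "package": {"name": component["name"], "ecosystem": component["ecosystem"]},
        "version": component["version"],
    }).encode()
    request = urllib.request.Request(
        OSV_QUERY_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    # OSV returns an empty object when no vulnerabilities are recorded.
    return [vuln["id"] for vuln in result.get("vulns", [])]

for component in components:
    ids = known_vulnerabilities(component)
    if ids:
        print(f"{component['name']} {component['version']}: {', '.join(ids)}")
```

Running a check like this on a schedule, against the SBOM of every fielded product version, is one plausible starting point for the post-market monitoring obligation Kaminski describes.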
The FDA announced that as of October 1, 2023, it may “refuse to accept” (RTA) premarket submissions that do not meet requirements under section 524B of the FD&C Act. Are there new regulations within the PATCH Act that pertain to existing products as well?
Jones: It pertains to new products arriving at the FDA after that date of October 1, 2023. The FDA will probably provide some slack for 90 days or so, maybe as much as half a year, and then it will get really serious and start rejecting submissions if the devices are not properly constructed.
Kaminski: A lot of this has already been required, because it depends on what is reasonable for your device. If you have a connected device, it’s very reasonable to expect you to have an SBOM and to do validation. The additional requirements, the ones unique to medicine versus run-of-the-mill cybersecurity monitoring, have been around for a while; they include medical device standards such as IEC 62304.
The one difference between the RTA guidance and the PATCH Act is that the PATCH Act also works retrospectively. If you have a device that has been on the market for a long time, while you can’t patch all of the devices, the FDA expects you to have the ability to do so. So if you have an app that was connected four years ago, the FDA expects you to be able to fix any vulnerabilities within about 30 days.
Has the FDA issued any warning letters related to cybersecurity vulnerabilities or events, and what can companies glean from those warnings?
Kaminski: There has not yet been a warning letter related solely to cybersecurity. We have seen three warning letters this year that mention software, and they are very interesting because two of them were issued to very small manufacturers rather than to organizations developing high-risk devices. The letters reflect that the FDA is seeing a lot of deficiencies in practices around software validation, including the validation of non-product software used in manufacturing processes, in quality processes and to record complaints. A lack of general process control in companies that develop software is the common theme of these letters.
Are there any changes in terms of reporting if you have a breach within your manufacturing or product software?
Jones: Absolutely, there are processes for reporting vulnerability events or attacks on a system. If an event affects a particular device, companies are required to keep track of it and manage it, and if there are any serious injuries or deaths, they are required to report it to the FDA with a plan for how they’re going to fix it and distribute the fixes to their customers.
Kaminski: There is also a lot more focus from the FDA on whether the company had control over specific devices in the field, and on which version and in which location the cybersecurity issues occurred. There is a very well-founded fear that if one device is vulnerable, someone could use that device to get into other devices or systems on the network. And while a lot of manufacturers make sure they have proper risk controls so a bad actor cannot get into their device from the hospital network, many are finding it challenging to deal with that level of complexity using their traditional methods of manufacturing and design. In the past 10 years, software complexity has grown by 32%, while productivity for engineers has grown by only 2%. This raises the question: how can we do all of the things we need to do now if we are using the same processes?
Is there any clarification, through the PATCH Act and new guidances and standards, on who is responsible for the patches and how they are communicated between device manufacturers and healthcare delivery organizations?
Jones: The FDA guidance documents are clear on the manufacturer’s responsibility to address vulnerabilities. The challenge is that different healthcare providers may have older versions of a company’s device running on different operating systems, and somehow the manufacturer must resolve all those differences. They may have to support a device for 15 or 20 years, because the hospital doesn’t want to change its device. Plus, healthcare providers may have their own networks and systems that they use to connect devices to each other and to their business systems. So getting fixes out to customers and making sure the devices are performing as intended is a very complicated and scary situation.
The connectivity question is interesting because there is a need for better connectivity, yet it does create a scary situation for the device manufacturer because you’re not in control of other software that’s interacting with your device. Do you think that creates a hesitancy within companies to create more connected devices?
Kaminski: In a previous role, I worked on high-risk devices. If you’re working on high-risk devices, you should be sure that all of the functions related to an immediate patient event cannot be controlled by the patient or by other software. That doesn’t mean you can’t give patients the ability to monitor their device from a phone, for example; there are very standard and commonplace ways to prevent that from giving them access to the device itself.
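One of the “standard and commonplace ways” Kaminski alludes to is making the patient-facing surface read-only by construction, so no code path exists from the monitoring app back to device control. The sketch below illustrates the pattern only; all class and field names are hypothetical.

```python
# A minimal sketch of separating read-only patient monitoring from device
# control. All names here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)  # immutable snapshot: the app cannot mutate device state
class TelemetrySnapshot:
    battery_percent: int
    dose_rate_ml_per_hr: float
    alarm_active: bool

class DeviceController:
    """Lives inside the device's trust boundary; never exposed to the app."""
    def __init__(self) -> None:
        self._dose_rate = 1.2

    def set_dose_rate(self, rate: float) -> None:
        self._dose_rate = rate

    def snapshot(self) -> TelemetrySnapshot:
        return TelemetrySnapshot(battery_percent=87,
                                 dose_rate_ml_per_hr=self._dose_rate,
                                 alarm_active=False)

class PatientMonitoringFacade:
    """The only surface reachable from the phone app: read-only by construction."""
    def __init__(self, controller: DeviceController) -> None:
        self._controller = controller

    def current_status(self) -> TelemetrySnapshot:
        return self._controller.snapshot()
    # Note: no method on this class forwards writes to the controller.

pump = DeviceController()
app_view = PatientMonitoringFacade(pump)
print(app_view.current_status())
```

The design choice is that safety does not depend on the phone app behaving well: even a fully compromised app can only read snapshots, because the write path simply is not exposed to it.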
There are misconceptions in the industry about the Cloud and using things that are made by other people. The reality is, if you host your own servers or build software yourself—these are two different scenarios but similar stories—it would be less safe than if someone else did it for you and for 100 million other people. Companies like Amazon, Google and Microsoft have really robust security processes—better than what someone who is not engaged full time in that specific business will have. The Cloud is safer than almost any other option, because the companies that manage it are so equipped to do this.
Another misconception is that companies can avoid connectivity, data-driven software and automation in order to reduce these risks and requirements and to keep AI software from impacting their devices. That is not in the cards anymore. This technology is providing better access to care and bringing down costs. The question is no longer, am I going to have a connected device or not? It is, are you going to connect your device to an app now, or a few years after your competitors have done it? There is no way to prevent the move to connectivity; companies instead need to learn how to organize their processes for this new environment.