Innovate or Die
Peter Drucker, considered the father of modern management, popularized (or perhaps originated) the phrase “Innovate or die.”35 What he meant was that companies need to stay ahead of the pace of change in the market or face obsolescence. In business today, that means continually changing and updating your products to ensure marketability. As a former salesperson, I know that if a competitor has an innovation you lack, that alone can be enough to give them the competitive edge to stay in business. Innovation is here to stay.
Perhaps the most poignant example of this is Blockbuster Entertainment Inc., more commonly known as Blockbuster. It had a brilliant business model in the 1990s: it offered a very convenient way to rent movies, video games, and so on. But it did not innovate. Blockbuster held the philosophy that people enjoyed going to a store to rent movies and did not anticipate that people would prefer the advantages of a modern streaming service. That philosophy was its undoing, and the company is now out of business. The world is full of similar examples. Polaroid, Compaq, Borders, Tower Records, Atari, Kodak, and Xerox were all big, recognizable names 20 years ago, but they did not innovate with the times and suffered or went out of business as a result. The same is true for medical device companies.
Numerous studies have been done on companies that have survived for decades or longer. The number-one trait they all share is adaptability: a willingness to change with the times. Interestingly, this “innovate or die” attitude has taken on another life in Silicon Valley, which is now well known for both its innovation and its technology disruption. Deloitte stated it best: “More than one-third of the 141 companies in the Americas, Europe, and Asia Pacific that grew to a valuation of greater than $1 billion between 2010 and 2015 were located in the Bay Area.”36 CEOs and entrepreneurs know this. While there are countless keys to success, one of them is making companies more agile by shortening the time to market. Indeed, many companies have had to shift their overarching philosophy. Large companies used to take the time to create products that were fully ready for the market, and billions of dollars were lost that way as smaller, more agile, more innovative companies brought something to market more quickly. The old ways of doing business simply do not apply anymore.
In short, innovation is not just a random concept that is haphazardly thrown into business. It is the basis for business models. Companies are shaped around this philosophy, and the transformations that accompany the drive to innovate often leave them far more successful than they were before. As you can imagine, the pressure to innovate is a very strong driving force in companies today. The CEOs who presided over the downturn (and sometimes demise) of companies came under very harsh criticism for failing to turn those companies around or for losing business due to a lack of innovation. This kind of pressure leads people, and thus companies, to make decisions that do not always have security in mind.
This kind of pressure also affects medical device manufacturers—so much so that there is a new name for the medicine coming out of connected medical devices: Medicine 2.0. Medicine 2.0 utilizes digital diagnostic capabilities, including wireless devices, mobile health solutions, data, smartphone apps, wearable devices, and remote monitoring. Other technologies such as cloud and artificial intelligence allow for greater innovation and more rewards for both the medical industry and the patient. These technologies are only accelerating as the pace of innovation changes and grows.37
Imagine the kind of pressure CEOs, product marketers, and others face when creating products. Where does security fit into this kind of world? Is it surprising that we are seeing as many vulnerabilities in products as we do today? The sad answer is no, and for multiple reasons. If we sidestep to nonconnected devices for a brief minute, the Associated Press reported in 2018 that medical devices for pain have caused more than 80,000 deaths since 2008.38 The same Associated Press article talked about how little testing there is for those pain stimulators. To be fair, it also pointed out that there are more than 190,000 devices on the market and that devices very rarely need to be pulled. Yes, occasionally bad products get through, but in the FDA's estimation the system is working remarkably well.39 The central point here is that despite having processes in place, those processes are far from perfect and could use greater transparency so we as consumers can make better decisions.
To that point, Kaiser Health News (KHN) published a fantastic report in late 2019 about patient deaths related to heart devices. In this case, the report discussed more than 5,800 deaths reported in connection with the Medtronic heart valve since 2014. The FDA made it sound as though the deaths were related to the heart valve. As it turns out, the FDA was not as transparent as it could have been about those deaths. KHN reported that many of them were due to how fragile the patients receiving the valve were, not to the valve itself. Many of the device injury reports were kept effectively hidden from the public; even safety experts were not aware of the problems.40 What this points to is that we do not always have enough information to make a strong judgment call about the risks pertaining to internet-connected medical devices.
There is the obvious feature and hardware side of connected medical devices, but the more important part from this perspective is the software that is written in and around those devices. Just as businesses must evolve or die, software must live within the same paradigm. Yet software security has often taken a back seat compared to other disciplines. There is an old joke, circulating since 1997, about how unstable Windows is, comparing its operability to that of a car.41 The joke is much more involved than that, but it serves to illustrate the problems with software development even back then, not just from a security standpoint but from an overall operability standpoint.
Automobile engineers have to meet high safety standards. People can die if, for example, a car's brakes fail. With software, it has historically been acceptable to have flaws because, generally speaking, lives are not on the line. The discipline never developed the kind of rigor typically found in automobile engineering. Software coding is a bit more art than science in many cases because there can be numerous ways of achieving an objective. The end result is systems that are riddled with vulnerabilities. Windows 10, at the time of this writing, has 1,111 known vulnerabilities.42 The more complex a system is, the more likely there will be risks associated with those vulnerabilities.
What is almost as concerning as the number of vulnerabilities is how we got here. In 2016, Global Newswire published the results of a CloudPassage study on how United States universities are failing at cybersecurity. The key findings may be jaw-dropping for the uninitiated, but they are not surprising to those in the cybersecurity profession. The most startling finding was the almost complete absence of security coursework required for graduation from computer science programs. Of the top 36 computer science programs (according to U.S. News and World Report), only one required a cybersecurity course. Of Business Insider's top 50 list, only three programs required cybersecurity coursework to graduate.43
When it comes to cybersecurity, universities are quite often not the place to get that education. People walk out of school barely cybersecurity literate, yet eager to start building IT systems. How secure do you think those systems will be if no one teaches them how to build secure systems? While there are certainly exceptions, and dedicated degree programs in cybersecurity do exist, the overall picture shows an extreme deficit in education about the methods for protecting organizations. People who are interested in cybersecurity need to learn on the job, go to a very specialized school, or earn cybersecurity certifications.
From a software development perspective, organizations need to supplement their workers' understanding to get them on board. Further, the lack of cybersecurity education contributes to a lack of understanding of cybersecurity within organizations. That, in turn, affects organizational culture and ultimately the organization's cybersecurity posture. Only companies with strong regulatory requirements, or those that have gone through a breach, feel that they need a team to get them up to speed. Some cybersecurity requirements may even appear bizarre to those who are cybersecurity illiterate.
We could stop here, but the story is really more complex than that. Todd Fitzgerald, in his book CISO Compass, brilliantly lays out the course of cybersecurity over the last 30 years. He points out that from the 1990s to 2000, security was seen as an IT problem: basically login security, passwords, antivirus, and the firewall. From 2000 to 2004, we started to see more regulatory security practitioners as regulations began kicking in, so the emphasis was on the regulatory landscape. From 2004 to 2008, there was a turn toward a more risk-oriented approach to information security, and from 2008 to 2016 the move was toward understanding threats and the cloud. After 2016, the move was to privacy and data awareness as another aspect of security's evolution.44
Having worked in IT and security for roughly 25 years, I have seen firsthand the evolution that Todd Fitzgerald is talking about. All of these are extremely valuable insights that demonstrate the growth and change within the information security landscape. Given that company culture can take up to five years to change45 (depending on the size of the organization and the speed of cultural change), businesses are often well behind the curve of what they ought to be doing from a cybersecurity perspective. That said, the right momentum is required to make the appropriate changes. For areas of resistance or where there is a lack of knowledge, such as in software development, changes can take quite a bit longer.
Another side of the equation is business perception of value. When IT was an up-and-coming phenomenon, many businesses perceived it as a cost center and did not want to put time and effort into supporting the people in that department. IT is now seen as a business enabler. Information security, according to some businesses, still has one foot in the cost-center arena. Higher-risk and more highly regulated businesses elevated security more quickly as a business enabler, partially because it was one: various vendor risk programs required that security be heightened before business could commence. Companies in heavily regulated environments also tended to suffer costlier data breaches, so security was given more clout to get the work done. Security teams were not merely a cost center; they were protecting the business, and they were seen as enabling it. In these environments, despite the lack of education, organizations are generally able to form stronger cybersecurity practices.
Microsoft spends over a billion dollars a year on security, and some of that is devoted to patching the vulnerabilities in its operating systems.46 Not all companies are as diligent, but this demonstrates that even a security-conscious company like Microsoft still finds vulnerabilities in its code. In the IoMT world, not all companies will invest in legacy systems to the degree Microsoft does, which can leave vulnerabilities on those systems. What is worse, many internet-connected medical devices cannot have agents installed on them.47 On Microsoft Windows, there are hundreds if not thousands of agents that can be installed to help protect the system. That single fact alone makes protecting internet-connected medical devices more challenging, and alternative strategies must often be utilized.
All of these factors influence the security of internet-connected medical devices, but they still do not tell the whole story. Internet-connected medical devices are obviously influenced by the prevailing culture, but the security behind these devices has often lagged well behind the security of other software. This does not mean, of course, that other software isn't riddled with flaws, but it does mean that medical device security has often taken a back seat to other requirements, when security is recognized at all. Combined with the constant drive to innovate, this only exacerbates the security challenges.
To top it all off, manufacturers are doing what they can to limit their liability in case things go awry. In March 2019, Bethany Corbin brilliantly and clearly elucidated the legal challenges affecting internet-connected medical devices:
“Compounding this issue of non-regulation and insecure code is the lack of a comprehensive and workable liability framework for consumers to follow if their IoMT device malfunctions or is hacked. The application of tort principles to evolving IoMT devices is imperfect, creating challenges for plaintiffs seeking damages. In particular, the numerous actors in the IoMT supply chain make it difficult to apportion liability, with no clear boundaries establishing which party is at fault for a hack or breach. Similarly, defect-free software does not exist, which complicates the application of strict products liability. Further, end-user licensing agreements contractually limit manufacturer liability for defective devices, shifting the risk of harm to consumers and eliminating manufacturer incentives to comply with cybersecurity best practices.”48