You CAN Stop Stupid - Ira Winkler

Culture


Establishing a great process is awesome. However, as stated in the famous Peter Drucker quote, “Culture eats strategy for breakfast.”

Consider all of the security rules that exist in an organization, and then consider how many are actually followed. Generally, some security rules are universally followed and others are universally ignored.

As consultants, we are frequently issued badges when we arrive at a client's facility. We diligently don the badges, at least until we walk around and discover that we are the only people actually wearing one. While we intend to adhere to security policies, we also need to fit in with and relate to the people inside the organization. Widespread failure to wear badges is a symptom of a culture in which security policies inspire people to ignore them.

Conversely, if many people in an office lock their file cabinets at the end of the day or whenever they leave their desks, most of their colleagues will do the same. Culture is essentially peer pressure about how to behave at work. No matter what the defined parameters of official behavior are within the organization, people learn their actual behavior by mirroring the behavior of their peers.

Culture is very powerful, enabling vast amounts of UIL and facilitating losses in other categories. If your culture doesn't adequately support and promote your processes, training, and technology implementation, then crime, physical losses, and user errors all increase as a consequence. Let's consider some examples where culture can be shown to have a direct relationship to UIL.

When the Challenger space shuttle exploded, the explanation given to the public was that O-rings, cold weather, and a variety of other factors were the combined cause. However, internal investigations also revealed that there was a culture that was driven to take potentially excessive risk to stay on schedule. (See “Missed Warnings: The Fatal Flaws Which Doomed Challenger,” Space Safety Magazine, www.spacesafetymagazine.com/space-disasters/challenger-disaster/missed-warnings-fatal-flaws-doomed-challenger/.) Despite many warnings about safety concerns relevant to the Challenger launch, NASA executives chose to downplay the warnings and continued with the launch. Even if the Challenger explosion was due to a mechanical failure, it was clearly a UIL because someone made the conscious decision to ignore warnings and proceed despite the risks.

While it shouldn't take the crippling of an entire space program to initiate culture fixes, NASA subsequently issued engineers challenge cards that they could place on the table in the middle of a discussion to demand that their concerns be heard.

In perhaps one of the most iconic cases of culture-based UIL, in 2017, the U.S. Navy destroyer USS Fitzgerald collided with a large freighter, resulting in severe damage and 7 deaths. Nine weeks later, another destroyer, the USS John S. McCain, collided with another large ship, resulting in massive damage and 10 deaths. Investigations determined that there were major failures in leadership and communications aboard the individual vessels. (See “Worse Than You Thought: Inside the Secret Fitzgerald Probe the Navy Doesn't Want You to Read,” Navy Times, www.navytimes.com/news/your-navy/2019/01/14/worse-than-you-thought-inside-the-secret-fitzgerald-probe-the-navy-doesnt-want-you-to-read/.) Essentially, the investigators determined that a culture on each ship created those failures. Further studies found that the problems resulting in these collisions stemmed from a culture that created systemic failings throughout the 7th Fleet: poorly trained sailors, failing equipment, and other deficiencies. (See “Years of Warning, Then Death and Disaster,” ProPublica, features.propublica.org/navy-accidents/us-navy-crashes-japan-cause-mccain/.)

In the cybersecurity world, many massive failings have cultural causes. The Equifax hack demonstrated systemic failures that went beyond a straightforward failure of technology. (See “Data Protection: Actions Taken by Equifax and Federal Agencies in Response to the 2017 Breach,” U.S. Government Accountability Office, www.warren.senate.gov/imo/media/doc/2018.09.06%20GAO%20Equifax%20report.pdf.) The technological failures could not have occurred without a management infrastructure that allowed them to take place. Some people claimed that the Equifax CISO, who was fired, was made a scapegoat, but it is clear that there were broader management failings.

There is a wide spectrum to the forms that culture-based UIL can take, from small behaviors to widespread negligence. Strong cultures tend to be consistently strong, although they may have isolated shortcomings to address. Likewise, a culture that is weak with regard to controlling loss will be consistently weak. Even so, culture is not static; it evolves and changes. If you let culture change on its own, loss is more likely to increase. If you take an active role in understanding and influencing culture by determining the desired user behaviors, taking steps to architect those behaviors, establishing open channels of communication for feedback, initiating and improving training, and so on, you can improve your culture and reduce UIL.

Chapter 9, “Security Culture Defined,” explores culture further.

