
Part I
An Introduction to Risk Assessment – Its Uses, Processes, Approaches, Benefits and Challenges
CHAPTER 1
The Context and Uses of Risk Assessment
1.1 Risk Assessment Examples


This section presents some simple examples of the use of risk assessment in everyday situations. From these, we aim to draw some general conclusions, including that conducting a risk assessment is quite natural to most of us (and not something unusual, in principle). Indeed, in situations that are fairly simple or that are encountered frequently, the process is usually implicit; our plans automatically incorporate some element of risk mitigation and contingency planning based on experience, without us being particularly aware of it. For situations faced less frequently (or where the situation does not closely fit a recognised pattern), the process is generally slightly more explicit.

We also aim to show that a risk assessment process – whether explicit or implicit – may result in modifications to original (base case) plans in several possible ways:

• It may result in no change to the underlying plan or project, but simply in the adaptation of targets or objectives to make them more realistic or achievable, such as the addition of contingency, whether it be extra time, resources or budget.

• It may lead to moderate changes to the initial plan or project, by prompting one to look for responses to risks, such as mitigation or exploitation measures.

• It could result in more fundamental changes to the project, such as the requirement for it to be re-scoped or changed in a major way, or for completely new structural or contextual possibilities to be developed.

We also show that the results of the process often depend on personal judgement, rather than robust analysis and criteria. In particular, we typically make a number of judgements in ways that are neither explicit nor formalised, and these depend on our experiences, personal situations, preferences and biases. Although, in personal situations, we often have discretion as to which decision option or mitigation measure to implement, and the consequences are borne directly by us, in some cases, consultation and agreement with others may nevertheless be required.

1.1.1 Everyday Examples of Risk Management

The following describes some simple examples, each of which aims to demonstrate some of the above points.

When planning to cross a road, in normal circumstances, one first looks each way. This can be considered as risk mitigation behaviour that has been instilled in us since a young age, and has become a natural reflex: it is clear that the benefits of looking are significant when compared to the cost of doing so; the small investment in time and effort is easily outweighed by the reduction in the risk of having an accident. However, when the circumstances are a little different to normal (e.g. the road is particularly busy, or the traffic signals are broken), one tends to naturally take extra precautions: one may look more carefully than usual, or walk more cautiously. Under more unusual circumstances (e.g. if considering crossing a very busy multi-lane highway), one would tend to try to identify risks explicitly, and to reflect even more carefully on possible risk mitigation measures: if it had been foreseen in advance that one may face such a situation, one may already have put on sports shoes or a coloured reflecting jacket before setting out on the journey. If such precautions had not been taken, and time were available, one may return home in order to change into the appropriate shoes and jacket. One may even wish to be able to build a bridge, if only time and money would allow! However, if all of the possible mitigation approaches are judged insufficient, impractical, too costly or too time-consuming, one would consider whether to abandon the plan to cross, and thus to have to develop completely new options or to revise one's objectives and targets.

When planning a major business trip, one could simply book an air ticket for the dates concerned. On the other hand, one would often naturally consider (the risk) that the dates of the trip may need to be changed, and take this into account in some way. In particular, one may consider a range of possible options, each with different costs, benefits and risks:

• Buy (now) a non-flexible ticket: This would generally be the cheapest option but also would result in the whole investment being lost if the trip were rescheduled. As a variation, one may be able to buy trip-cancellation insurance (thus increasing the cost slightly): indeed, there may be a range of such insurance types available, at different prices, with different levels of reimbursement, and different general terms and conditions.

• Buy (now) a fully-flexible ticket: This would generally be a more expensive option than purchasing a non-flexible ticket, but at least trip-cancellation insurance would not be required, and the cost would have been fixed.

• Delay the purchase of the ticket until the dates are fixed with more certainty; at that future point, make a final decision as to whether to buy a fixed-date ticket or a flexible one, possibly also with trip-cancellation insurance.

• One could think of an even wider set of decision options of a more structural nature that are fundamentally different to the originally planned actions, and which nevertheless aim to achieve the desired objectives; for example, one may conduct a series of video conferences coupled with electronic document sharing, instead of having an in-person meeting.
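
To make the comparison of such options slightly more explicit, one could estimate the expected cost of each under an assumed chance that the trip is rescheduled. The short sketch below (in Python) does this; all of the prices, the insurance terms and the probability of rescheduling are invented assumptions used purely for illustration, not figures taken from the text.

```python
# Illustrative sketch only: every figure below (prices, insurance refund,
# probability of rescheduling) is an invented assumption for this example.

def expected_cost(upfront, loss_if_rescheduled, p_reschedule):
    """Expected total cost of a ticket option, given the chance the trip is rescheduled."""
    return upfront + p_reschedule * loss_if_rescheduled

p = 0.30  # assumed probability that the trip dates change

options = {
    # non-flexible ticket: cheapest up front, but its full cost is lost if the dates change
    "non-flexible":             expected_cost(300, loss_if_rescheduled=300, p_reschedule=p),
    # non-flexible ticket plus cancellation insurance: small premium, most of the fare assumed refundable
    "non-flexible + insurance": expected_cost(330, loss_if_rescheduled=60, p_reschedule=p),
    # fully flexible ticket: more expensive, but nothing is lost if the dates change
    "fully flexible":           expected_cost(450, loss_if_rescheduled=0, p_reschedule=p),
}

for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name:26s} expected cost: {cost:6.0f}")
```

Even this very simple calculation shows that the preferred option depends on the assumed likelihood of rescheduling and on one's attitude to the possible loss, not only on the headline ticket prices.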

When planning a major building or renovation project (for example, of an old apartment that one has just bought), one may estimate a base budget for the works and then add some contingency to cover “unexpected” issues: these could include that materials or labour costs may be higher than expected, or that asbestos would be discovered in currently hidden (or inaccessible) wall or ceiling cavities, or that supporting structures would not be as solid as expected, and so on. This process would result in a revised figure that may be sufficient to cover the total project costs even when several of the risks materialise. If this revised budget is covered by available funds, one would presumably proceed with the project as originally conceived. However, if this revised budget exceeds the funds available, one may have to develop further decision options, such as:

• To continue the project as originally planned and “hope for the best” (whilst potentially looking for other possible mitigation measures, such as borrowing money from a family member if required, and taking in a lodger to repay the borrowings more quickly).

• To re-scope the project (e.g. use less expensive finishings).

• To restructure the project into phases (e.g. delay for several years the renovation of the spare bathroom until more funds are available).

• To cancel the project entirely.
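
One simple way to make such a contingency assessment explicit is to list the individual risks with rough probabilities and cost impacts, and then to simulate which of them materialise in each scenario. The sketch below does this for a hypothetical renovation; the base cost, the risks, their probabilities and impacts, and the available funds are all invented assumptions, intended only to illustrate the kind of calculation that is developed more fully later in the text.

```python
# Illustrative sketch only: every figure below is an invented assumption.
import random

BASE_COST = 50_000        # assumed base estimate for the works
AVAILABLE_FUNDS = 65_000  # assumed funds available

# (description, probability of materialising, extra cost if it does)
risks = [
    ("materials or labour cost more than expected", 0.40, 8_000),
    ("asbestos found in hidden cavities",           0.15, 12_000),
    ("supporting structures weaker than expected",  0.10, 15_000),
]

def simulated_total_cost():
    """One random scenario: the base cost plus whichever risks happen to materialise."""
    return BASE_COST + sum(impact for _, prob, impact in risks if random.random() < prob)

trials = 10_000
outcomes = [simulated_total_cost() for _ in range(trials)]

expected = sum(outcomes) / trials
p_over_funds = sum(cost > AVAILABLE_FUNDS for cost in outcomes) / trials

print(f"Expected total cost: {expected:,.0f}")
print(f"Chance that the cost exceeds the available funds: {p_over_funds:.0%}")
```

If the resulting chance of exceeding the available funds were judged too high, one would then weigh up the re-scoping, phasing and cancellation options listed above.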

When planning to travel from home to the airport, if one has already conducted such a journey many times, one would know from experience how much travel time to allow: this “base case” plan would implicitly already take into account to some extent that there may be unforeseen events that can materialise en route. In other words, the base plan would have some contingency (time) built in. On the other hand, where the journey is new (e.g. one has recently moved into the area), one may do some explicit research to estimate the base journey time, and then perhaps add some extra contingency time as well.

When planning a journey that will be undertaken with another person, each person's desired contingency time would typically differ: each will have a different tolerance for risk, a different perceived cost of excess waiting time (e.g. at the airport) and a different view of the consequences of missing the plane.

Of course, in general, these informal processes can be very valuable; indeed, they may often be sufficient to ensure that an adequate decision is taken. In other cases, they will be insufficient.

1.1.2 Prominent Risk Management Failures

Clearly, in both the public and private sectors there have been many projects in which significant unexpected delays or cost overruns occurred, most especially in the delivery of major infrastructure, transportation and construction projects. An example (chosen only as it appeared in the general press around the time of the writing of this text) was the project to deliver a tramway in Edinburgh (Scotland), which was due to cost around £400 million when announced in 2003, but rose to around £800 million by the date of project completion in 2014.

In fact, it is probably fair to say that most failures (and many successes) of risk management in business contexts are not publicly observable, for many reasons, including:

• They are of a size that does not impact the aggregate business performance in a meaningful way (even if the amounts concerned may be substantial by the standards of ordinary individuals), and the losses are absorbed within a general budget.

• They are not openly discussed, and the failure is not objectively investigated (nor the results made public).

• It is challenging to demonstrate that risks that did materialise could and should have been mitigated earlier: in other words to distinguish the “benefits of hindsight” from what should reasonably have been known earlier in the process.

However, occasionally there have been major cases that have been of sufficient size and public importance that their causes have been investigated in detail; some of these are briefly discussed below:

• The Financial Crisis. The financial crisis of the early 21st century led to the creation of the Financial Crisis Inquiry Commission, whose role was to establish the causes of the crisis in the United States. Although its report, published in January 2011, runs to hundreds of pages, some key conclusions were:

• “… this financial crisis was avoidable … the result of human action and inaction, not of Mother Nature or computer models gone haywire. The captains of finance and the public stewards of our financial system ignored warnings, and failed to question, understand, and manage evolving risks.”

• “Despite the view of many … that the crisis could not have been foreseen … there were warning signs. The tragedy was that they were ignored or discounted.”

• “Dramatic failures of corporate governance and risk management at many systemically important financial institutions were a key cause of this crisis …”

• The Deepwater Horizon Oil Spill. In April 2010, the Macondo oil well being drilled in the Gulf of Mexico suffered a severe blowout, costing the lives of 11 men, and resulting in the spillage of millions of barrels of crude oil. This disrupted the region's economy, damaged fisheries and habitats, and led to BP's having to pay large sums in compensation and damages. A commission was set up by President Obama to investigate the disaster, its causes and effects, and recommend the actions necessary to minimise such risks in the future. The Report to the President, issued in January 2011, runs to several hundred pages. Some key conclusions include:

• “The loss … could have been prevented.”

• “The immediate causes … a series of identifiable mistakes … that reveal … systematic failures in risk management.”

• “None of [the] decisions … in Figure 4.10 [Examples of Decisions that Increased Risk at Macondo while Potentially Saving Time] appear to have been subject to a comprehensive and systematic risk-analysis, peer-review, or management of change process.”

• The Columbia Space Shuttle. On 1 February 2003, space shuttle Columbia broke up as it returned to Earth, killing the seven astronauts on board. The Accident Investigation Board reported in August 2003, and showed that a large piece of foam had fallen from the shuttle's external tank during launch, striking and breaching the leading edge of the wing; the damaged wing then failed on re-entry. The report also noted that:

• The problem … was well known and had caused damage on prior flights; management considered it an acceptable risk.

• “… the accident was probably not an anomalous, random event, but rather likely rooted … in NASA's history and … culture.”

• “Cultural traits and organizational practices detrimental to safety were allowed to develop, including … a reliance on past success as a substitute for sound engineering … [and] … organizational barriers that prevented effective communication and stifled professional differences of opinion.”

