Demand Driven Material Requirements Planning (DDMRP), Version 2 - Carol Ptak


CHAPTER 1

Planning in the New Normal

To truly understand where industry is today, it is necessary to discuss the history behind conventional planning. Where did it come from? What did it replace? What circumstances was it developed for? Is it still relevant and appropriate for the environment of today?

The Material Requirements Planning Revolution

Today most midrange and large manufacturing enterprises throughout the world use a planning method and tool called Material Requirements Planning (MRP). This method and tool was conceived in the 1950s with the increasing availability, promise, and power of computers. Computers enabled rapid, complex calculations of what needed to be bought and made, and in what quantities, given a demand input.

The more complex the products, the more powerful the promise of MRP. The APICS Dictionary1 defines MRP as:

A set of techniques that uses bill of material data, inventory data, and the master production schedule to calculate requirements for materials. It makes recommendations to release replenishment orders for material. Further, because it is time-phased, it makes recommendations to reschedule open orders when due dates and need dates are not in phase. Time-phased MRP begins with the items listed on the MPS and determines (1) the quantity of all components and materials required to fabricate those items and (2) the date that the components and material are required. Time-phased MRP is accomplished by exploding the bill of material, adjusting for inventory quantities on hand or on order, and offsetting the net requirements by the appropriate lead times. (p. 103)
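The time-phased logic in this definition can be illustrated with a minimal single-level sketch. The bill of material, quantities, and lead times below are invented for illustration only, and on-order quantities, lot sizing, and multilevel explosion are deliberately omitted:

```python
# Toy sketch of time-phased MRP netting: explode the BOM, net against
# on-hand inventory, and offset by lead time. All data here is hypothetical.

BOM = {"chair": {"leg": 4, "seat": 1}}         # parent -> {component: qty per unit}
LEAD_TIME = {"chair": 1, "leg": 2, "seat": 3}  # in weeks
ON_HAND = {"leg": 10, "seat": 2}

def explode(mps):
    """mps: list of (item, qty, due_week). Returns planned component orders."""
    planned = []
    for item, qty, due in mps:
        start = due - LEAD_TIME[item]  # the parent order must be released here
        for comp, per in BOM.get(item, {}).items():
            gross = qty * per                           # (1) explode the BOM
            net = max(0, gross - ON_HAND.get(comp, 0))  # adjust for on-hand stock
            if net > 0:
                # (2) offset the net requirement by the component lead time
                planned.append((comp, net, start - LEAD_TIME[comp]))
    return planned

print(explode([("chair", 5, 10)]))  # → [('leg', 10, 7), ('seat', 3, 6)]
```

For an order of 5 chairs due in week 10, the 20 legs required net to 10 after on-hand stock, released in week 7; the 5 seats net to 3, released in week 6.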

By 1965 the modern acronym “MRP” was in existence. Then in 1972 capacity reconciliation was incorporated into MRP. This was called closed-loop MRP. The year 1980 saw the significant incorporation of cost accounting into MRP, transforming it into a system known as Manufacturing Resources Planning (MRP II). Finally, by 1990, as client-server architecture became available, MRP II had evolved into Enterprise Resources Planning (ERP). Throughout this progression the definition of the MRP portion of the information system has remained unchanged.

While this is not a book about MRP, a basic level of understanding of MRP will be helpful to the reader. This basic explanation, and even a demonstration of MRP, is included in Chapter 3 and Appendix A, respectively.

Perhaps the most recognized leader of the MRP charge was Joe Orlicky. His 1975 seminal work Material Requirements Planning: The New Way of Life in Production and Inventory Management provided the blueprint and codification of MRP that is still the standard today. Consider that when this book was written, only 700 companies or plants in the world had implemented MRP, almost all located in the United States:

As this book goes into print, there are some 700 manufacturing companies or plants that have implemented, or are committed to implementing, MRP systems. Material requirements planning has become a new way of life in production and inventory management, displacing older methods in general and statistical inventory control in particular. I, for one, have no doubt whatever that it will be the way of life in the future. (p. ix)

MRP did become the way of life in manufacturing. The codification and subsequent commercialization of MRP fundamentally changed the industrial world, and it did so relatively quickly. Orlicky, along with others at the time, recognized the opportunity presented by changing manufacturing circumstances and the invention of the computer that enabled a planning approach never before possible:

Traditional inventory management approaches, in pre-computer days, could obviously not go beyond the limits imposed by the information processing tools available at the time. Because of this almost all of those approaches and techniques suffered from imperfection. They simply represented the best that could be done under the circumstances. They acted as a crutch and incorporated summary, shortcut and approximation methods, often based on tenuous or quite unrealistic assumptions, sometimes force-fitting concepts to reality so as to permit the use of a technique.

The breakthrough, in this area, lies in the simple fact that once a computer becomes available, the use of such methods and systems is no longer obligatory. It becomes feasible to sort out, revise, or discard previously used techniques and to institute new ones that heretofore it would have been impractical or impossible to implement. It is now a matter of record that among manufacturing companies that pioneered inventory management computer applications in the 1960s, the most significant results were achieved not by those who chose to improve, refine, and speed up existing procedures, but by those who undertook a fundamental overhaul of their systems. (p. 4)

In his book, Orlicky made the case for a fundamental reexamination of how companies planned and managed inventory and resources. This case was so compelling that the concepts that he brought to the table proliferated throughout the industrial world within two decades. That proliferation remains largely unchanged in the present. Today we know that nearly 80 percent of manufacturing companies that buy an ERP system also buy and implement the MRP module associated with that system.

Perhaps the most interesting and compelling part of the passage from the original Orlicky book is the sentence that is italicized. This was simply common sense that was easily demonstrable with the results of precomputer inventory management systems. Yet could this same description be applied to the widespread use of MRP today? Could it be that conventional planning approaches and tools are:

Acting as a crutch?

Incorporating summary, shortcut, and approximation methods based on tenuous assumptions?

Force-fitting concepts to reality so as to permit the use of a technique?

In the authors’ 60+ years of combined manufacturing experience across a wide array of industries, the answer is a resounding yes to all these points. By the end of this book, the reader will also be able to understand why the answer is yes to all these points. Indeed if the answer is yes to these points, there should be evidence to support the assertion that MRP systems are not living up to their billing—that they are in fact guilty as charged in the previous three bullet points.

Before we review the evidence, let’s start with two basic observations about rules:

Observation 1. Most rules are life limited. Rules are instituted most often based on assumptions about the environment at the time they are made. Rules are often made to accommodate certain limitations. When those assumptions or limitations change, the rules must be reexamined to determine whether they are still appropriate. Souder’s law states that “repetition does not establish validity.” Simply continuing to do something that has always been done does not define whether it is or ever has been the appropriate thing to do. Worse yet, the longer the repetition, the more invalid or inappropriate the rule may be.

Observation 2. “Optimizing” inappropriate rules is counterproductive. Attempts and investment meant to enable or accelerate compliance to rules that are inappropriate can be devastating to an organization. If the rule is not only inappropriate but also damaging, then the organization is at risk to do the wrong things faster.

Evidence of a Problem

There are three areas that point to major issues with the rules and tools of conventional planning featuring MRP.

Return on Asset Performance Degradation

As described above, the United States led the adoption of manufacturing information systems starting with MRP in the 1960s. These systems are expensive to purchase, to implement, and to maintain. The value of these formal planning systems has always been based on the ability to better leverage the assets of a business. Did the widespread adoption of MRP and subsequent information systems enable the U.S. economy to better manage assets?

In late 2013 Deloitte University Press released a report by John Hagel III, John Seely Brown, Tamara Samoylova, and Michael Lui that is quite eye-opening when considered against the progression and adoption rates of information systems.2 Figure 1-1 is a chart from the report that depicts the return on asset performance of the United States economy since 1965.

There is a steady decrease in return on assets for the U.S. economy from 1965 to 2012. Furthermore, during this time period the same report shows that labor productivity (as measured by Tornqvist aggregation) more than doubled! What is most interesting about this graphic in relation to information systems is that by 1965 we had the modern acronym MRP, but massive proliferation of information systems did not occur until after 1975 and, in particular, after 1980 with MRP II.

Obviously there are many factors at play with this decrease in return on assets, but this report would certainly lead one to realize that the impact of the widespread adoption of MRP, MRP II, and ERP systems (at least in the United States) has not significantly helped companies manage themselves to better returns on asset performance. Indeed, when this decline is taken in combination with the increase in labor productivity, it actually suggests that companies are accelerating their mistakes.

But this is just one point of data, a high-level view with many unrelated factors contributing to these effects. What additional evidence indicts the efficacy of the conventional planning approach?


FIGURE 1-1 Return on asset performance for the U.S. economy

Work-Around Proliferation

In addition to examining the performance of an entire economy over a period of time, consider the day-to-day actions of the people charged with making decisions about how to utilize assets. One hallmark of supply chains is the presence of supply orders. Supply orders are the purchase orders, stock transfer orders, and manufacturing orders that dictate the flow and activities of any supply chain.

The very purpose of a planning system is to ultimately determine the timing, quantity, and collective synchronization of the supply orders up, down, and across the levels of the network. Inside most manufacturers there are tiers within the planning system where stock transfer orders could prompt manufacturing orders that in turn would prompt purchase orders. Additionally, within most supply chains there are tiers of different planning systems at each organization linked together by these orders and communicating through these supply order signals. For example, purchase orders from a customer can prompt stock transfers or manufacturing orders at suppliers.

Perhaps the biggest indictment of just how inappropriate modern planning rules and tools are can be observed in how frequently people choose to work around them. The typical workaround involves the use of spreadsheets. Data are extracted out of the planning system and put into a spreadsheet. The data are then organized and manipulated within the spreadsheet until a personal comfort level is established. Recommendations and orders are then put back into the planning system, essentially overriding many of the original recommendations.

Consider polling on this subject by the Demand Driven Institute from 2011 to 2014. With over 500 companies responding, 95 percent claim to be augmenting their planning systems with spreadsheets. Nearly 70 percent claim these spreadsheets are used to a heavy or moderate degree. The results of this polling are consistent with other surveys by analyst firms such as Aberdeen Group. This reliance on spreadsheets has often been referred to as “Excel hell.” Validation for this proliferation can be easily provided by simply asking the members of a planning and purchasing team what would happen to their ability to do their job if their access to spreadsheets were taken away.

But why have planners and buyers become so reliant on spreadsheets? Because they know that if they stayed completely within the rules of the formal planning system, approving all recommendations, it would be very career limiting. Tomorrow they would undo or reverse half the things they did today because MRP is constantly and dramatically changing the picture. This phenomenon, known as “nervousness,” is explained in Chapter 3.

So what do they do instead? They work around the system. They each have their own ways of working with tools that they have crafted and honed through their years of experience. These ways of working and tools are highly individualized with extremely limited ability to be utilized by anyone but the originator. This is a different, informal, highly variable, and highly customized set of rules.

Worse yet, there is no oversight or auditing of these side “systems.” There is no “vice president of spreadsheets” in any company the authors have ever worked in or visited. Everyone simply assumes that the people who created these spreadsheets built and maintain them properly. Consider an article in the Wall Street Journal’s “Market Watch” in 2013:

Close to 90% of spreadsheet documents contain errors, a 2008 analysis of multiple studies suggests. “Spreadsheets, even after careful development, contain errors in 1% or more of all formula cells,” writes Ray Panko, a professor of IT management at the University of Hawaii and an authority on bad spreadsheet practices. “In large spreadsheets with thousands of formulas, there will be dozens of undetected errors.” (Jeremy Olshan, April 20, 2013)

As an example of how disastrous spreadsheet errors can be, consider the role a spreadsheet error played in a $6 billion disaster for JP Morgan in 2012. The following is an excerpt from the zerohedge.com article “How a Rookie Excel Error Led JPMorgan to Misreport Its VaR for Years”3:

Just under a year ago, when JPMorgan’s London Whale trading fiasco was exposed as much more than just the proverbial “tempest in a teapot,” Morgan watchers were left scratching their heads over another very curious development: the dramatic surge in the company’s reported VaR, which as we showed last June nearly doubled, rising by some 93% year over year, a glaring contrast to what the other banks were reporting to be doing.

Specifically, we said that “in the 10-Q filing, the bank reported a VaR of $170 million for the three months ending March 31, 2012. This compared to a tiny $88 million for the previous year.” JPM, which was desperate to cover up this modelling snafu, kept mum and shed as little light on the issue as possible. In its own words from the Q1 2012 10-Q filing: “the increase in average VaR was primarily driven by an increase in CIO VaR and a decrease in diversification benefit across the Firm.” And furthermore: “CIO VaR averaged $129 million for the three months ended March 31, 2012, compared with $60 million for the comparable 2011 period. The increase in CIO average VaR was due to changes in the synthetic credit portfolio held by CIO as part of its management of structural and other risks arising from the Firm’s on-going business activities.” Keep the bolded sentence in mind, because as it turns out it is nothing but a euphemism for, drumroll, epic, amateur Excel error!

How do we know this? We know it courtesy of JPMorgan itself, which in the very last page of its JPM task force report had this to say on the topic of JPM’s VaR:

“. . . a decision was made to stop using the Basel II.5 model and not to rely on it for purposes of reporting CIO VaR in the Firm’s first-quarter Form 10-Q. Following that decision, further errors were discovered in the Basel II.5 model, including, most significantly, an operational error in the calculation of the relative changes in hazard rates and correlation estimates. Specifically, after subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR.... it also remains unclear when this error was introduced in the calculation.”

In other words, the doubling in JPM’s VaR was due to nothing but the discovery that for years, someone had been using a grossly incorrect formula in their Excel, and as a result misreporting the entire firm VaR by a factor of nearly 50%! So much for the official JPM explanation in its 10-Q filing that somewhat conveniently missed to mention that, oops, we made a rookie, first year analyst error. (Tyler Durden, February 2, 2013)
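The arithmetic behind the error is easy to reproduce. Using hypothetical rate values (the actual figures in JPM's model are not public), dividing a rate change by the sum of the old and new rates instead of their average understates the result by exactly a factor of two:

```python
# Hypothetical hazard rates chosen purely for illustration.
old_rate, new_rate = 0.02, 0.03

change = new_rate - old_rate
intended = change / ((old_rate + new_rate) / 2)  # divide by the average, as the modeler intended
erroneous = change / (old_rate + new_rate)       # the spreadsheet divided by the sum

# The erroneous formula mutes the computed change by a factor of two,
# matching the task force's description of the volatility-muting effect.
print(intended, erroneous)
```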

Perhaps a more interesting question is why personnel are allowed to use these ad hoc approaches. From a data integrity and security perspective, this is a nightmare. It also means that the fate of the company’s purchasing and planning effectiveness is in the hands of a few essentially irreplaceable personnel. These people can’t be promoted or get sick or leave without dire consequences to the company. This also means that due to the error-prone nature of spreadsheets, globally on a daily basis there are a lot of wrong signals being generated across supply chains. Wouldn’t it be so much easier to just work in the system? The answer seems so obvious. The fact that reality is just the opposite shows just how big the problem is with conventional systems.

To be fair, many executives are simply not aware of just how much work is occurring outside the system. Once they become aware, they are placed in an instant dilemma. Let it continue, thus endorsing it by default, or force compliance to a system that your subject-matter experts are saying is at best suspect? The choice is only easy the first time an executive encounters it. The authors of this book have seen countless examples of executives attempting to end the ad hoc systems only to quickly retreat when inventories balloon and service levels fall dramatically. They may not understand what’s behind the need for the work-arounds, but they now know enough to simply look the other way. So they make the appropriate noises about how the entire company is on the new ERP system and downplay just how much ad hoc work is really occurring.

The Inventory Bimodal Distribution

Another piece of evidence to suggest the shortcomings of conventional MRP systems has to do with the inventory performance of the companies that use these systems. To understand this particular challenge, consider the simple graphical depiction in Figure 1-2. In this figure you see a solid horizontal line running in both directions. This line represents the quantity of inventory. As you move from left to right, the quantity of inventory increases; right to left the quantity decreases.


FIGURE 1-2 Taguchi inventory loss function

A curved dotted line intersects the inventory quantity line at two points:

Point A, the point where a company has too little inventory. This point would be a quantity of zero, or “stocked out.” Shortages, expedites, and missed sales are experienced at this point. Point A is the point at which the part position and supply chain have become too brittle and are unable to supply required inventory. Planners or buyers who have part numbers past this point to the left typically have sales and operations screaming at them for additional supply.

Point B, the point where a company has too much inventory. There is excessive cash, capacity, and space tied up in working capital. Point B is the point at which inventory is deemed waste. Planners or buyers who have part numbers past this point to the right typically have finance screaming at them for misuse of financial resources.

If we know that these two points exist, then we can also conclude that for each part number, as well as the aggregate inventory level, there is an optimal range somewhere between those two points. This optimal zone is labeled in the middle and colored green. When inventory moves out of the optimal zone in either direction, it is deemed increasingly problematic.

This depiction is consistent with the graphical depiction of a loss function developed by the Japanese business statistician Genichi Taguchi to describe a phenomenon affecting the value of products produced by a company. This made clear the concept that quality does not suddenly plummet when, for instance, a machinist slightly exceeds a rigid blueprint tolerance. Instead “loss” in value progressively increases as variation increases from the intended nominal target.

The same is true for inventory. Chapter 2 will discuss how the value of inventory should be related to the ability of inventory to help promote or protect flow. As the inventory quantity expands out of the optimal zone and moves toward point B, the return on working capital captured in the inventory becomes less and less as the flow of working capital slows down. The converse is also true: as inventory shrinks out of the optimal zone and approaches zero or less, then flow is impeded due to shortages.
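Taguchi's idea can be sketched as a quadratic loss function applied to an inventory position; the target quantity and scaling constant below are invented for illustration:

```python
def inventory_loss(qty, target=30.0, k=0.1):
    """Taguchi-style quadratic loss: zero at the target quantity,
    growing smoothly as the position drifts toward stock-out or excess."""
    return k * (qty - target) ** 2

# Loss grows in both directions away from the optimal zone:
print(inventory_loss(30))  # at the target: 0.0
print(inventory_loss(0))   # stocked out (point A): 90.0
print(inventory_loss(80))  # deep excess (point B): 250.0
```

The key property is that loss does not jump at some rigid threshold; it rises continuously with the deviation from the nominal target, on both sides.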

When the aggregate inventory position is considered in an environment using traditional MRP, there is frequently a bimodal distribution noted. With regard to inventory, a bimodal distribution can occur on two distinct levels:

1. A bimodal distribution can occur at the single-part level over a period of time, as a part will oscillate back and forth between excess and shortage positions. In each position, flow is threatened or directly inhibited. The bimodal position can be weighted toward one side or the other, but what makes it bimodal is a clear separation between the two groups—the lack of any significant number of occurrences in the “optimal range.”

2. The bimodal distribution also occurs across a group of parts at any point in time. At any one point, many parts will be in excess while other parts are in a shortage position. Shortages of any parts are particularly devastating in environments with assemblies and shared components because the lack of one part can block the delivery of many.


FIGURE 1-3 Bimodal inventory distribution

Figure 1-3 is a conceptual depiction of a bimodal distribution across a group of parts. The bimodal distribution depicts a large number of parts that are in the too-little range while still another large number of parts are in the too-much range. The Y axis represents the number of parts at any particular point on the loss function spectrum.

Not only is the smallest population in the optimal zone, but the time any individual part spends in the optimal zone tends to be short-lived. In fact, most parts tend to oscillate between the two extremes. The oscillation is depicted with the solid curved line connecting the two disparate distributions. That oscillation will occur every time MRP is run. At any one time, any planner or buyer can have many parts in both distributions simultaneously.
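The shape in Figure 1-3 can be reproduced with synthetic data. The part counts, cluster parameters, and "optimal zone" bounds below are invented purely to illustrate the pattern of two populated extremes and a nearly empty middle:

```python
import random

random.seed(42)

# 500 synthetic part positions: one cluster near stock-out,
# one cluster in deep excess (all units are hypothetical).
too_little = [random.gauss(5, 3) for _ in range(240)]
too_much = [random.gauss(80, 10) for _ in range(260)]
positions = too_little + too_much

# Assumed "optimal zone" between the two modes.
in_optimal = [q for q in positions if 20 <= q <= 40]
print(f"{len(in_optimal)} of {len(positions)} parts fall in the optimal zone")
```

With two well-separated modes, almost no parts land in the middle band, which is exactly the "clear separation between the two groups" that defines the bimodal pattern.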

This bimodal distribution is rampant throughout industry. It can be very simply described as “too much of the wrong and too little of the right” at any point in time and “too much in total” over time. In the same survey noted earlier, taken between 2011 and 2014 by the Demand Driven Institute, 88 percent of companies reported that they experienced this bimodal inventory pattern. The sample set included over 500 organizations around the world.

Three primary effects of the bimodal distribution are evident in most companies:

1. High inventories. The distribution can be disproportionate, as many planners and buyers will tend to err on the side of too much. This results in slow-moving or obsolete inventory, additional space requirements, squandered capacity and materials, and even lower margin performance as discounts are frequently required to clear out the obsolete and slow-moving items.

2. Chronic and frequent shortages. The lack of availability of just a few parts can be devastating to many manufacturing environments, especially those that have assembly operations and common material or components. The lack of any one part will block any assembly. The lack of common material or components will block the manufacture of all parent items calling for that common item. This means an accumulation of delays in manufacturing, late deliveries, and missed sales.

3. High bimodal-related expenses. This effect tends to be undermeasured and underappreciated. It is the additional amount of money that an organization must spend in order to compensate for the bimodal distribution. When inventory is too high, third-party storage space may be required. When inventory is too low, premium and fast freight are frequently used to expedite material. Overtime is then used to push late orders through the plant. Partial shipments are made to get the customers some of what they ordered, but with significantly increased freight expenses.

Why the bimodal distribution occurs is explained in Chapter 3. It is a combination of basic MRP traits, the type of demand signal that is typically used in conjunction with MRP, and the complex volatile supply chain environment within which companies now must operate.

The New Normal

Experienced planning and purchasing personnel know that if they simply follow what MRP recommends, they will be in big trouble. Shortages will increase. Excess inventory will increase. Expedites will increase. Intuitively, planners understand that materials and inventory management, under conventional practices, places them in a no-win situation. What happened to the promise of MRP as verbalized by Joe Orlicky in the beginning of this chapter? The answer is exceedingly simple: the world changed and MRP did not.

The circumstances under which Orlicky and his cadre developed the rules behind MRP have dramatically changed. Customer tolerance times have shrunk dramatically, driven by low information and transactional friction largely due to the Internet. Customers can now easily find what they want at a price they are willing to pay and get it in a short period of time.

Ironically, the planning complexity is largely self-induced in the face of these shorter customer tolerance times. Most companies have made strategic decisions that have directly made it much harder to do business. Product variety has risen dramatically. Supply chains have extended around the world, driven by low-cost sourcing. Product complexity has risen. Outsourcing is more prevalent. Product life and development cycles have been reduced.

Add on top of this an increased amount of regulatory requirements for consumer safety and environmental protection, and there are simply more complex planning and supply scenarios than ever before. The complexity comes from multiple directions: ownership, the market, engineering and sales, and the supply base. While this complexity has risen, the potential of technology has progressed and accelerated. The lack of significant financial return on technology investments would strongly suggest that this potential, up to this point, has largely been squandered.

Figure 1-4 is taken from Demand Driven Performance: Using Smart Metrics by Debra Smith and Chad Smith. The figure shows the tremendous difference in supply chain circumstances between 1965 and 2015.


FIGURE 1-4 Changing supply chain circumstances

From Debra Smith and Chad Smith, Demand Driven Performance: Using Smart Metrics, McGraw-Hill, 2013, p. 9.

Summary

We appear to have come full circle as MRP, according to observable, prevailing, and widespread effects across the world, now appears to be guilty of the same deficiencies as the techniques that preceded it. Software is simply a tool that translates and reinforces rules into a routine. If the rules behind the software are inappropriate and outdated, then the rules must change before the tools can change. In recent years, however, industry and software providers have attempted to combat increasing complexity with more sophisticated software applications, applications with the old rules still embedded at their core. The net effect is that we have improved the efficiency of doing the wrong or inappropriate things. Money and energy spent to optimize antiquated rules with increasingly sophisticated tools are wasteful, distracting, and counterproductive. Given the current world of increased variability and volatility, conventional planning logic now requires a fundamental overhaul. The authors think Joe Orlicky would agree.

The authors’ self-imposed mission was to stand on the shoulders of Joe Orlicky’s incredible vision in order to see further. This book proposes elegant and intuitive alternative planning rule sets to address the volatile twenty-first-century landscape. Complexity cannot be combated with more complexity. Effective and simplified rules and subsequent tools are necessary for a company’s resources to work more closely in alignment with the market, enabling a demand driven world. There can be no more lip service to small incremental changes that may or may not improve a company’s performance; concrete and proven tactics are required that drive sustainable bottom-line results. Where to start?
