Part One
Overview of Customer-driven Derivative Business
Chapter 2
Pillars in Structured Derivative Business
Model and Product Development Process
Financial derivative products are not tangible; ultimately they are based on models. Quantitative analysts (quants) must be a risk-conscious business group whose roles encompass developing derivative pricing and hedging models, providing quantitative support, and formulating and developing model-related trading and risk systems. Quants should be one of the drivers along the industrialized production line for derivatives.
Derivative pricing models are vital in the structured derivatives and risk management business. Many client-driven derivative products have no directly traded markets for benchmarking and have to be marked to models, although the vanilla markets are used for calibration. In such a (de facto) marking-to-model business environment, the quality of the models is paramount, as it impacts not only P&L, but also day-to-day hedging and risk management activities. Banks must establish and standardize a process for developing quality pricing and hedging models, as a key part of an efficient and reliable production line that also minimizes model risk.
Principles of Model and Product Development
Model specification, its numerical implementation and development testing need to follow a number of critical principles. Independent model validation is also an essential part of the model and product development process.
Model Specification
Speculation is human, hedge is divine. The central part of the no-arbitrage derivative pricing framework rests on the divine principle of hedging. The model specification must comply with this general framework. The model's mathematical formulation, scope, applicability range and any limitations should be clearly specified. Any model assumptions deviating from those defined in the model framework should be thoroughly assessed with the business. Assumptions and the potential implications for hedging should be explicitly explained. The bank must seek to eliminate or minimize model mis-specification risks at source.
It is very important that models are specified and implemented as closely as possible to the real world, and that they are suitable for day-to-day business use. Quants should be aware of common and best market practices, remembering that models are not only used for pricing, but also for risk analysis and hedging. As a general guideline, good model specifications aim to achieve the following qualities:
• capturing market risks which matter from a hedging perspective;
• calibrating reliably to the markets to enable reliable hedging;
• numerical stability in pricing and in computing risk sensitivities (Greeks), as illustrated in the sketch after this list;
• computational efficiency for front office pricing as well as downstream risk calculations.
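As a simple illustration of the numerical stability point, the following sketch computes a delta by bump-and-revalue with central finite differences. It is not taken from any particular library; the function name, signature and bump size are illustrative assumptions.

```cpp
#include <functional>

// A minimal sketch of bump-and-revalue delta via central finite differences;
// 'price' stands for any pricing function of spot.
double centralDelta(const std::function<double(double)>& price,
                    double spot,
                    double relBump = 1e-4)
{
    // The bump size is a stability trade-off: too small amplifies numerical noise
    // from the pricer, too large introduces truncation error in the Greek.
    const double h = spot * relBump;
    return (price(spot + h) - price(spot - h)) / (2.0 * h);
}
```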
Model Implementation Process
Model implementation is an interactive process among quants, traders, IT and risk managers. It entails the following stages:
• Quants develop pricing models including all the necessary calibration routines in a quant library. It is vital that the quant library is structurally well-designed and object-oriented.
• Quants, working with IT, develop system and user interfaces for the trading and risk systems.
• Quants conduct model development testing to examine the validity and implementation of the model. The model test scope and results should be documented.
• IT develops the downstream applications, including the relevant Risk and Back Office requirements.
• Risk conducts independent model validation.
A typical model/product implementation flow chart is shown in Figure 2.2.
Figure 2.2 Typical model/product implementation flow chart
Note that the model trading/risk system integration should be accomplished during the model development stage as a parallel task, rather than afterwards. This is because most of the model integration and interfacing work is not specific to a particular model. In a well-designed object-oriented quant library, this parallel approach can greatly enhance overall model and product development efficiency.
Model Testing
Quants' model testing ensures that the model is implemented properly. It aims to minimize implementation risks, which constitute a very large portion of the model risks incurred in real life. Model testing should include development testing and system testing.
Development testing checks the fundamental mathematical and numerical implementation. Whenever possible, an alternative model should be developed for comparison purposes. The comparison between the models can reveal differences and deficiencies. Differences should be thoroughly examined and understood; some are due to legitimate differences in numerical methods and some due to implementation bugs.
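As a concrete illustration of testing against an alternative model, a Monte Carlo implementation of a European call can be compared with the closed-form Black-Scholes price; a discrepancy well beyond the Monte Carlo standard error points to an implementation bug. The sketch below is purely illustrative, with hypothetical function names and parameters.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <random>

// Closed-form Black-Scholes price for a European call (no dividends).
double bsCall(double S, double K, double r, double sigma, double T)
{
    const double d1 = (std::log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * std::sqrt(T));
    const double d2 = d1 - sigma * std::sqrt(T);
    auto N = [](double x) { return 0.5 * std::erfc(-x / std::sqrt(2.0)); };  // standard normal CDF
    return S * N(d1) - K * std::exp(-r * T) * N(d2);
}

// Independent Monte Carlo implementation of the same payoff, used for comparison.
double mcCall(double S, double K, double r, double sigma, double T, std::size_t paths)
{
    std::mt19937_64 rng(42);
    std::normal_distribution<double> z(0.0, 1.0);
    double sum = 0.0;
    for (std::size_t i = 0; i < paths; ++i) {
        const double ST = S * std::exp((r - 0.5 * sigma * sigma) * T + sigma * std::sqrt(T) * z(rng));
        sum += std::max(ST - K, 0.0);
    }
    return std::exp(-r * T) * sum / static_cast<double>(paths);
}

int main()
{
    const double analytic = bsCall(100.0, 100.0, 0.02, 0.2, 1.0);
    const double mc       = mcCall(100.0, 100.0, 0.02, 0.2, 1.0, 1000000);
    // A difference well beyond the Monte Carlo error signals an implementation bug.
    std::printf("analytic %.4f  mc %.4f  diff %.4f\n", analytic, mc, analytic - mc);
    return 0;
}
```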
Model system testing ensures that the implemented model performs as expected in the production environment. Testing within the trading and risk systems allows an assessment of the model's pricing stability and of its ability to generate sensible risk sensitivities for hedging, over a wide range of real market data at portfolio level. It is beneficial to run system tests using the live system's overnight daily risk reports and analysis tools. The impact of the new model on the live portfolio, in terms of P&L and risk sensitivities, should be assessed and fully understood.
System testing is vital for new model development. It is also essential for model change control. Live models may be changed and updated, and they should be subject to a release change-control procedure, including thorough system regression tests. All live models should be version-controlled, which can usually be done easily via the source code repository. Accompanying each model release, there should be a release note explaining any model changes, together with the version number, date, etc.
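A minimal sketch of such a regression check is given below, assuming that baseline prices for a reference book have been stored from the previous release; the function name, data layout and tolerance are illustrative assumptions rather than a prescribed implementation.

```cpp
#include <cmath>
#include <cstdio>
#include <map>
#include <string>

// Hypothetical regression check at model release: reprice a reference book with the
// candidate library and compare against stored baseline prices from the previous release.
bool regressionPass(const std::map<std::string, double>& baseline,
                    const std::map<std::string, double>& candidate,
                    double tolerance = 1e-8)
{
    bool pass = true;
    for (const auto& [tradeId, oldPrice] : baseline) {
        const double newPrice = candidate.at(tradeId);
        if (std::fabs(newPrice - oldPrice) > tolerance) {
            // Intentional model changes must be explained in the release note, not released silently.
            std::printf("Trade %s moved by %.2e\n", tradeId.c_str(), newPrice - oldPrice);
            pass = false;
        }
    }
    return pass;
}
```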
Independent Model Validation
Independent Model Validation (IMV) is based on the four-eyes principle to verify the model theory and test the model implementation. In practice, IMV develops its own equivalent models independently to conduct model comparison and testing. The actual model testing tends to be a large part of the work, as many implementation details need to be verified.
Once the models have been tested and approved by IMV, they can be released into production for pricing and hedging. It is best practice for IT to carry out the model release into trading and risk production systems independently of quants and trading. IT should manage and maintain production systems following a standard but independent procedure.
Quants should communicate effectively with the IMV team to facilitate its model validation, and more importantly model testing tasks. Some of the key information is listed in Table 2.1.
Table 2.1 Key model information
IMV is a very important development and control function, and its focus should be on mathematical verification and actual model testing. It should avoid spending time going through front office quants' source code, for the following reasons:
• The amount of tiny detail in the source code is overwhelming. Going through source code does not help with independent mathematical or numerical verification.
• It does not help either with the most important part of IMV: the actual thorough model testing.
• It can potentially compromise IMV's “independent” validation.
• It substantially increases the bank's security risk of model source code leaking out.
• Overall it consumes valuable resources and prolongs the validation process, with little real benefit to control or the business.
Object-Oriented Quant Library
A quant library consisting of implemented models is the engine in the modern derivatives business. It should be scalable, simple and transparent, allowing generic, efficient and user-friendly modular interfaces to the pricing tools, trading and risk systems. The quant library requires a well-designed architecture at the outset as well as ongoing enhancement to survive and succeed.
The quant library should be written in an object-oriented framework. Object-oriented programming and design have many advantages. At the programming level, the (C++ or C#) programs are well-structured and modular. At the practical level, the approach permits orthogonal combinations of objects. For example, by keeping the instrument/product objects distinctly separate from the valuation/model objects, the orthogonal combination allows one to price a particular instrument/product with any suitable model using any suitable numerical approach. This can be done at trade as well as portfolio level, reusing the same objects without code repetition.
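A minimal sketch of this orthogonal separation is shown below. The class names are hypothetical and the interfaces are deliberately stripped down; a real library would carry market data, numerical settings and results objects as well.

```cpp
#include <memory>

// Product/payoff description only: no pricing logic lives here.
class Instrument {
public:
    virtual ~Instrument() = default;
};

// Valuation object (model plus numerical method): no product-specific details live here.
class Model {
public:
    virtual ~Model() = default;
    virtual double price(const Instrument& instrument) const = 0;
};

// Any suitable model can price any suitable instrument, at trade or portfolio level,
// without either class knowing the other's internals.
double presentValue(const Instrument& instrument, const Model& model)
{
    return model.price(instrument);
}
```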
Key Objects in a Quant Library
Table 2.2 lists some examples of the key objects or components in a quant library.
Table 2.2 Key objects in a quant library
When all the required objects are properly coded up in the quant library, they allow efficient and flexible interactions in the process of developing new models and products. A generic description of a product can be constructed naturally by connecting the relevant objects together. For example: a swap consists of legs, a leg consists of cash flows, and a cash flow consists of various attributes including currency, notional, pay/receive and auxiliary information. All the required details are wrapped up in an organized way that permits easier understanding and repeated use without code repetition.
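A sketch of this composition might look as follows; the field names are illustrative assumptions rather than the actual class design.

```cpp
#include <string>
#include <vector>

struct CashFlow {
    std::string currency;     // e.g. "EUR"
    double      notional;
    bool        isPay;        // pay or receive
    std::string auxiliary;    // placeholder for dates, day counts and other details
};

struct Leg {
    std::vector<CashFlow> cashFlows;   // a leg consists of cash flows
};

struct Swap {
    std::vector<Leg> legs;             // a swap consists of legs
};
```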
Objects Interconnection and Architecture
Figure 2.3 illustrates how the objects are interconnected from the architecture perspective.
Figure 2.3 Object Interconnection and Architecture
A generic interface should be very thin, and its sole task is to transfer and map data, reformatting it as necessary. Interfacing is extremely important, as the quant library must be integrated into trading and risk systems to be of value to the business. A badly designed interface will significantly increase the time and cost of developing new products. In the following, the “attributes table” approach is explained as an example of a generic system interface.
In trading and risk systems, common attributes such as spot, notional, currency, yield curve, etc. are readily available and a quant developer can simply pull them out and group them into objects that are fed into the pricing models and/or risk engines. For the exotic (or even common) attributes, an attributes table can be created inside the trading system. An example can be seen in Table 2.3.
Table 2.3 Example attributes table
Once the attributes table is set up inside the trading system, the quant developer can simply loop through the table and pass all of its attributes into the model interface. The model interface should be designed so that it recognizes the attributes and maps them into the relevant objects. The beauty of this approach is that the looping code is simple and does not change, no matter what the attributes are. This makes the quant developer's job much easier and more standardized. For risk engines or back office systems that sit outside the trading system, a risk developer can use similar looping code to read the attributes and call the same model interface. The attributes table approach makes it possible for the same looping code to be used in the trading and downstream systems, for many different products. It is therefore feasible that once quants have developed and added a new product to the trading system, all downstream systems will automatically work.
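A minimal sketch of the looping code and model interface described above is given below, assuming the trading system exposes the attributes table as simple name/value pairs; all names are hypothetical.

```cpp
#include <map>
#include <string>

// Attributes table as exposed by the trading system: attribute name -> value.
using AttributesTable = std::map<std::string, std::string>;

// The model interface recognizes each attribute by name and maps it into the
// relevant library object before pricing.
class ModelInterface {
public:
    virtual ~ModelInterface() = default;
    virtual void setAttribute(const std::string& name, const std::string& value) = 0;
    virtual double price() const = 0;
};

double priceFromSystem(const AttributesTable& table, ModelInterface& model)
{
    // The looping code never changes, whatever the product or attributes;
    // risk engines and back office systems can reuse the same loop.
    for (const auto& [name, value] : table) {
        model.setAttribute(name, value);
    }
    return model.price();
}
```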
Object-oriented quant library architecture is fundamental to meeting the challenges of the modern derivatives business. Many banks have had to rewrite their quant libraries every few years, wasting a huge amount of time and resources, because their prevailing libraries were not properly designed and constructed, or simply became too complex to handle.
A quant library should be a child born from the marriage of brilliant mathematical modelling and skilful IT programming. A well-designed and constructed, object-oriented quant library can offer:
• Integrated business efficiency and much-enhanced productivity, including streamlined interfacing to systems and infrastructures.
• Standardization of model development and testing process, and minimization of model implementation risks.
• Application of higher-quality operational control procedures, allowing four eyes to watch a centralized piece.
Finally, an object-oriented quant library should be kept simple. Overly complicated object structures are tempting, but they may in fact defeat the purpose of having an efficient quant library. So keep it simple and object-oriented (KISOO).
Quantitative Documentation
Derivative models developed by Quants must be documented comprehensively. The key quantitative documents are listed in Table 2.4.
Table 2.4 Key quantitative documents
Table 2.5 and Table 2.6 show examples of the model technical and testing documentation templates respectively. These tables illustrate the scope and detail required to achieve a high documentation standard.
Table 2.5 Technical documentation template
Table 2.6 Model testing documentation template
Model technical and testing documents also serve as the audit trail of the quantitative work done during model/product development. This work is essential in pursuing the highest possible quality for the model and the quant library as a whole.