5.3.2 DSA Cloud Services Metrology
Metrology is the scientific method used to create measurements. With cloud services, we need to create measurements to:
1. quantify or measure properties of the DSA cloud services
2. obtain a common understanding of these properties.
A DSA cloud services metric provides knowledge about a DSA service property. Measurements collected for that metric help the DSA cognitive engine estimate the property during runtime, and post-processing analysis can provide further knowledge of it.
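As a concrete illustration of how collected measurements can feed a runtime estimate, the Python sketch below maintains a running estimate of a single metric using an exponentially weighted moving average. The class name, the smoothing approach, and the sample values are illustrative assumptions, not part of the book's design.

```python
# Hypothetical sketch: maintaining a runtime estimate of a DSA service
# metric from collected measurements. Names and the smoothing approach
# are illustrative assumptions only.

class MetricEstimator:
    """Tracks a running estimate of one DSA cloud service metric."""

    def __init__(self, name: str, alpha: float = 0.2):
        self.name = name
        self.alpha = alpha        # EWMA smoothing factor (assumed value)
        self.estimate = None      # current runtime estimate of the property
        self.samples = []         # raw measurements kept for post-processing

    def record(self, measurement: float) -> float:
        """Fold a new measurement into the runtime estimate."""
        self.samples.append(measurement)
        if self.estimate is None:
            self.estimate = measurement
        else:
            # Exponentially weighted moving average: recent samples
            # dominate, which suits a cognitive engine reacting at runtime.
            self.estimate = (self.alpha * measurement
                             + (1 - self.alpha) * self.estimate)
        return self.estimate


# Runtime use by the cognitive engine (values are made up):
response_time = MetricEstimator("dsa_response_time_ms")
for sample in [12.0, 15.5, 11.2, 40.3]:
    response_time.record(sample)
print(response_time.estimate)   # runtime view of the metric's property

# Post-processing can revisit response_time.samples for deeper analysis.
```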
It is important not to view DSA cloud services metrics as measurements of software properties. DSA cloud services metrology measures the physical aspects of the services, not their functional aspects. The designer of DSA as a set of cloud services should be able to provide measurable metrics such that a service agreement can be created and evaluated both during runtime and in post-processing. Since the model used here is hierarchical, a metric used at different layers of the hierarchy is evaluated differently at each layer. For example, as explained in Chapter 1, the response time when providing DSA as a local service should be less than the response time when providing DSA as a distributed cooperative service, which in turn should be less than the response time when providing DSA as a centralized service.
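To make the layered evaluation concrete, the hypothetical sketch below binds the same response-time metric to a different service-agreement bound at each layer of the hierarchy. The numeric thresholds are invented for illustration and only preserve the ordering described above.

```python
# Hypothetical sketch of how one metric (response time) could carry a
# different service-agreement bound at each layer of the DSA hierarchy.
# The threshold values and names are illustrative assumptions only.

RESPONSE_TIME_BOUND_MS = {
    "local": 10.0,          # DSA as a local service: tightest bound
    "distributed": 50.0,    # distributed cooperative service
    "centralized": 200.0,   # centralized service: loosest bound
}

def meets_agreement(layer: str, measured_ms: float) -> bool:
    """Evaluate the same metric against the bound for its layer."""
    return measured_ms <= RESPONSE_TIME_BOUND_MS[layer]

# The same 30 ms measurement passes at one layer and fails at another:
print(meets_agreement("local", 30.0))        # False
print(meets_agreement("distributed", 30.0))  # True
```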
With the concept of providing DSA as a set of cloud services, the design should go through an iterative process before the model is deemed workable. This process includes the following steps (a minimal code sketch follows the list):
1. Create an initial service agreement derived from requirements and design analysis.
2. Run scripted scenarios to evaluate how the agreement is met during runtime through the created metrics.
3. Run post-processing analysis of these scripted scenarios to gain further knowledge of the properties of the selected metrics.
4. Refine the service agreement.
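A minimal sketch of this loop, assuming a single response-time metric and placeholder scenario and analysis functions (none of these names or values come from the text), might look as follows:

```python
# Hypothetical sketch of the four-step iterative process. The scenario
# runner, analyzer, and refinement rule are placeholders standing in for
# the real design activities.

def create_initial_agreement() -> dict:
    # Step 1: agreement derived from requirements and design analysis.
    return {"response_time_ms": 100.0}

def run_scripted_scenarios(agreement: dict) -> list:
    # Step 2: runtime evaluation against the agreement via created metrics.
    # Canned measurements stand in for real scenario runs.
    return [80.0, 95.0, 130.0, 88.0]

def post_process(measurements: list) -> float:
    # Step 3: gain further knowledge of the metric's property, e.g. its
    # worst observed value across the scripted scenarios.
    return max(measurements)

agreement = create_initial_agreement()
for iteration in range(5):
    measurements = run_scripted_scenarios(agreement)
    worst_case = post_process(measurements)
    if worst_case <= agreement["response_time_ms"]:
        break                                   # agreement is workable
    # Step 4: refine the agreement and iterate again (10% margin assumed).
    agreement["response_time_ms"] = worst_case * 1.1

print(agreement)   # the agreement a deployed system is expected to meet
```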
Figure 5.7 illustrates this iterative concept. The outcome of this process is a defined service agreement with measurable metrics that a deployed system is expected to meet.
Figure 5.7 Iterative process to create a workable DSA service agreement.
With standard cloud services, a customer should be able to compare two service agreements from two different providers and select the provider that best meets their needs. The provider of an IaaS attempts to dynamically optimize the use of infrastructure resources in order to create an attractive service agreement. If the scripted scenarios in Figure 5.7 are selected to represent deployed scenarios accurately, and if the iterative process in Figure 5.7 is run through enough iterations and with enough samples, the resulting service agreement should be met by the deployed system. However, there should still be room to refine the cognitive algorithms, policies, rule sets, and configuration parameters after deployment if post-processing analysis necessitates such a change. A good system design should require refining only the policies, rule sets, and configuration parameters, without any software modification. Such a design allows the deployed cognitive engine to morph based on post-processing analysis results.
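One common way to achieve this separation, sketched below under the assumption of an external JSON policy file (the file layout, key names, and class are hypothetical, not taken from the book), is to keep every policy, rule set, and configuration parameter outside the compiled software so the cognitive engine can reload them at runtime:

```python
# Hypothetical sketch of keeping policies, rule sets, and configuration
# parameters outside the software, so post-processing results can refine
# them without code changes. File name and keys are assumptions.

import json

class CognitiveEngine:
    """Engine whose behavior is driven by an external policy file."""

    def __init__(self, policy_path: str):
        self.policy_path = policy_path
        self.policy = {}
        self.reload()

    def reload(self) -> dict:
        # Re-reading the file lets the deployed engine "morph" when
        # post-processing analysis rewrites the policy, with no rebuild.
        with open(self.policy_path) as f:
            self.policy = json.load(f)
        return self.policy

    def allowed_to_transmit(self, sensed_power_dbm: float) -> bool:
        # A rule-set parameter, not a hard-coded constant, decides.
        return sensed_power_dbm < self.policy["interference_threshold_dbm"]

# Post-processing analysis would rewrite the JSON file, e.g.:
#   {"interference_threshold_dbm": -85.0}
# and the deployed engine picks it up with engine.reload().
```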