1.5. Interval analysis
As we have seen in the previous sections, one of the major difficulties in modeling epistemic uncertainties lies in determining the probability distribution (or its bounding) that a given physical quantity follows. In a large majority of cases, however, the uncertainty can at least be bounded. A range with very wide bounds can often be obtained by considering physical constraints on the quantities of interest. For example, physical considerations restrict the Poisson ratio of a homogeneous isotropic material to values between –1 and 0.5. Even though it is relatively narrow, such a range is not very informative. Expert opinion, on the other hand, can reduce this range significantly further: for the material under consideration, an expert could state, for instance, that the Poisson ratio lies between 0.2 and 0.4. Note that, based on this information alone, we cannot characterize how the uncertainty varies within this range (are the boundary values as plausible as the central values?). Nonetheless, there are situations where a range is the only information available.
Interval analysis can be used to model such cases, where the uncertainty is given by bounds on the quantity of interest. The uncertainty on the quantities x1, …, xn is then described by their lower bounds x̲1, …, x̲n and their upper bounds x̄1, …, x̄n. A large body of research has focused on propagating these uncertainties from x1, …, xn to a quantity y = g(x1, …, xn), where g is any function modeling the relation between the input and output variables. Determining the lower and upper bounds on y is equivalent to solving a global optimization problem, so many global optimization algorithms can be applied to it (Hansen and Walster 2003). Note that when the function g is monotonic in each input variable over its range of variation (that is, over the hypercube defined by the bounds), the lower and upper bounds on y can be determined very efficiently: owing to the monotonicity, y only needs to be evaluated at the 2ⁿ vertices of the hypercube formed by the bounds of the input variables. The lower and upper bounds on y are then necessarily the minimum and the maximum of these values, respectively. This method is often known as the vertex method. Other techniques use Taylor expansions to approximate the bounds on the output variable. Sampling techniques, such as Monte Carlo simulation, can also be used, but they quickly become prohibitively time-consuming, and methods based on global optimization algorithms are usually more efficient. For a review of algorithms for efficient interval analysis, the reader may refer to Kreinovich and Xiang (2008).
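As an illustration, the following Python code is a minimal sketch of the vertex method just described. It assumes that g is monotonic in each input over its range; the cantilever bending-stress function and the numerical bounds used in the example are illustrative assumptions, not values taken from the text.

```python
import itertools

def vertex_method(g, lower, upper):
    """Propagate interval bounds through g via the vertex method.

    Valid when g is monotonic in each input over its range: the extrema
    of y = g(x1, ..., xn) are then attained at one of the 2^n vertices
    of the hypercube defined by the lower and upper bounds.
    """
    # Enumerate all 2^n combinations of lower/upper bounds.
    vertices = itertools.product(*zip(lower, upper))
    values = [g(*v) for v in vertices]
    return min(values), max(values)

# Illustrative example: maximum bending stress of a cantilever beam of
# rectangular cross-section under a tip load, sigma = 6*F*L / (b*h^2),
# which is monotonic in each variable over positive ranges.
def stress(F, L, b, h):
    return 6.0 * F * L / (b * h ** 2)

lo = [900.0, 1.9, 0.045, 0.09]   # lower bounds on F [N], L [m], b [m], h [m]
hi = [1100.0, 2.1, 0.055, 0.11]  # upper bounds
y_min, y_max = vertex_method(stress, lo, hi)
print(f"stress in [{y_min:.3e}, {y_max:.3e}] Pa")
```

Note that the vertex method requires 2ⁿ evaluations of g, so it remains cheap only for a moderate number of input variables; for non-monotonic functions g, the global optimization or sampling approaches mentioned above must be used instead.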
A major disadvantage of interval approaches is the lack of a measure of uncertainty analogous to probability in probability theory. In interval analysis, uncertainty is characterized by the bounds of the interval only; no information is available on the likelihood of different values within that interval. In probabilistic approaches, the likelihood of different values is characterized by the PDF, which defines the probability that the quantity of interest lies within a certain range (for example, an interval). No such measure is available in the interval approach.
Because interval analysis lacks such an uncertainty measure, interval arithmetic and the associated uncertainty propagation methods will not be discussed in more detail here. This lack reduces the usefulness of the method for reliability- or robustness-based applications, where the risk associated with various decisions must be quantified. Nevertheless, the interval approach remains useful in situations where only the worst case needs to be considered.