
Example 1.3A


Let us apply Newton’s method to find the minimizer of the function

(1.3A-1)

This example uses an arbitrary mathematical function; we will consider how to construct f(x) so that it serves a design purpose in Section 1.9. Inspection of (1.3A-1) reveals that the global minimum is at x1 = 2 and x2 = ln(2). Nevertheless, let us apply Newton's method to find the minimum. Our first step is to obtain the gradient and the Hessian. From (1.3A-1), we have

(1.3A-2)

and

(1.3A-3)

Table 1.1 Newton’s Method Results

k     f(x[k])
1     43
2     14.2
3     9.25
10    8.00

Let us arbitrarily take our initial estimate of the solution to be . Table 1.1 lists the numerical results from the repeated application of (1.3-7). As can be seen, the value of the function decreases rapidly during the first three iterations; thereafter, the rate of reduction slows. Observe that by the 10th iteration the value of the objective function has reached the minimum value to three significant digits, though there is still some discrepancy in the estimate of the minimizer. In this problem, the minimum is quite shallow, which reduces the speed of convergence.
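To make the iteration concrete, the following Python sketch carries out the Newton update of (1.3-7), x[k+1] = x[k] - F(x[k])^(-1) ∇f(x[k]). The function used here is only an illustrative stand-in that shares the minimizer x1 = 2, x2 = ln(2) and has a shallow (quartic) minimum; it is not the book's (1.3A-1), and the starting point is likewise an assumption of the sketch.

import numpy as np

# Illustrative stand-in objective; NOT the book's (1.3A-1).
# Its minimizer is also at x1 = 2, x2 = ln(2), and its minimum is
# shallow (quartic), so Newton's method slows down near the solution.
def f(x):
    return (x[0] - 2.0)**4 + (np.exp(x[1]) - 2.0)**4

def gradient(x):
    g1 = 4.0 * (x[0] - 2.0)**3
    g2 = 4.0 * (np.exp(x[1]) - 2.0)**3 * np.exp(x[1])
    return np.array([g1, g2])

def hessian(x):
    e = np.exp(x[1])
    h11 = 12.0 * (x[0] - 2.0)**2
    h22 = 12.0 * (e - 2.0)**2 * e**2 + 4.0 * (e - 2.0)**3 * e
    return np.array([[h11, 0.0], [0.0, h22]])

# Newton iteration of (1.3-7): x[k+1] = x[k] - F(x[k])^{-1} grad f(x[k])
x = np.array([0.0, 0.0])   # arbitrary initial estimate (assumed, not the book's)
for k in range(1, 11):
    x = x - np.linalg.solve(hessian(x), gradient(x))
    print(k, x, f(x))

Running the sketch shows the same qualitative behavior as Table 1.1: large reductions in f over the first few iterations, followed by slow refinement of the minimizer because the Hessian becomes nearly singular near the shallow minimum.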

Newton’s method can be extremely effective on some problems but can prove problematic on others. For example, if f(x) is not twice differentiable at some x, difficulties arise, since Newton’s method requires the function, its gradient, and its Hessian. Many optimization methods require similar information and share similar drawbacks. There are optimization methods that do not require derivative information; one example is the Nelder–Mead simplex method. Even so, such an algorithm can still become trapped at a local minimizer if the function is not convex.
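As a point of comparison, the sketch below applies a derivative-free Nelder–Mead search to the same illustrative stand-in function used above. The use of SciPy's minimize routine and its tolerance options is an assumption of this example, not something prescribed by the text.

import numpy as np
from scipy.optimize import minimize

# Same illustrative stand-in objective as before (not the book's (1.3A-1)).
def f(x):
    return (x[0] - 2.0)**4 + (np.exp(x[1]) - 2.0)**4

# Nelder-Mead uses only function evaluations: no gradient or Hessian is needed.
result = minimize(f, x0=np.array([0.0, 0.0]), method='Nelder-Mead',
                  options={'xatol': 1e-8, 'fatol': 1e-8})
print(result.x)   # should approach [2, ln(2)] ~ [2.0, 0.693]

Because the search uses only function values, it applies even when derivatives are unavailable, but like Newton's method it refines a single simplex of nearby points and so can still converge to a nonglobal minimizer of a nonconvex function.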

One feature that makes these methods susceptible to becoming trapped at a local minimum is that they start with a single estimated solution and attempt to refine that estimate. If the single estimate is close to a local extremum, the method will tend to converge to that extremum. There is another class of optimization methods that are based not on a single estimate of the solution but on a large number (a population) of estimates. These population‐based methods are less susceptible to convergence to a nonglobal local extremum because there is a multitude of candidate optimizers.

Genetic algorithms (GAs) are population‐based optimization algorithms that have proven very effective in solving design optimization problems. Other population‐based optimization methods, such as particle swarm optimization, have also been used successfully. While one can engage in a lengthy debate over which algorithm is superior, such a debate is unlikely to be fruitful. The focus of this text is on posing the design problem as a formal optimization problem; once the problem is so posed, any optimization algorithm can be used. A discussion of GAs is included herein in order to provide the reader with a background in at least one method that can be used for the optimization process.
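For readers who want a concrete picture of the population-based idea before the detailed treatment, here is a minimal, generic real-coded GA sketch (tournament selection, blend crossover, Gaussian mutation, and simple elitism). It is only an illustration of the concept applied to the stand-in function above; it is not the specific GA developed in this text, and the population size, mutation strength, and search bounds are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in objective (same as above); GAs maximize fitness,
# so the fitness is taken as the negative of the function to be minimized.
def f(x):
    return (x[0] - 2.0)**4 + (np.exp(x[1]) - 2.0)**4

def fitness(x):
    return -f(x)

pop_size, n_gen, n_var = 40, 100, 2
lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

# Random initial population spread over the search domain
pop = rng.uniform(lower, upper, size=(pop_size, n_var))

for gen in range(n_gen):
    fit = np.array([fitness(x) for x in pop])

    # Tournament selection: each parent is the better of two random individuals
    def select():
        i, j = rng.integers(pop_size, size=2)
        return pop[i] if fit[i] > fit[j] else pop[j]

    children = []
    while len(children) < pop_size:
        p1, p2 = select(), select()
        alpha = rng.uniform(size=n_var)                 # blend crossover
        child = alpha * p1 + (1.0 - alpha) * p2
        child += rng.normal(0.0, 0.05, size=n_var)      # Gaussian mutation
        children.append(np.clip(child, lower, upper))

    children[0] = pop[np.argmax(fit)].copy()            # elitism: keep the best
    pop = np.array(children)

best = pop[np.argmax([fitness(x) for x in pop])]
print(best, f(best))   # should approach [2, ln(2)]

Because many candidate solutions are maintained and recombined at once, the search is far less dependent on any single starting point than the Newton and Nelder–Mead iterations shown earlier.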

