03 Improving Search
Chapter 3: Improving Search
Tianping Shuai, School of Science, BUPT (tpshuai@sohu.com)

3.1 Improving Search, Local and Global Optima

Solutions. Solutions are the points visited in a search. A solution is a choice of values for all decision variables. For example, in the Two Crude model, which has two decision variables x1 and x2, a solution is a choice of two nonnegative quantities x1 and x2. We can represent a solution as a vector: for a model with decision vector x, the first solution visited by a search is denoted x(0), the next x(1), and so on.

Example 3.1: DClub Location. Consider the fictitious problem of choosing a location for the latest DClub discount department store. Dots on the map in Figure 3.1 show the three population centers of the area to be served. Population center 1 has approximately 60,000 persons, center 2 has 20,000, and center 3 has 30,000. DClub wishes to locate one new store somewhere in the area in a way that maximizes business from the three populations.

Clearly, the decision variables are x1 and x2, the coordinates of the chosen location. The new store can be located anywhere except in the congested areas within 1/2 mile of each population center.
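The iterate notation x(0), x(1), ... maps directly onto code: each solution is simply a vector. A minimal sketch (the objective f and the iterates below are made-up illustrations, not the DClub data):

```python
import numpy as np

def f(x):
    # Toy maximize objective (illustrative only, not the DClub model).
    return -(x[0] - 3.0) ** 2 - (x[1] - 1.0) ** 2

# Iterates x(0), x(1), x(2) visited by a search, each a decision vector:
iterates = [np.array([0.0, 0.0]), np.array([2.0, 0.5]), np.array([3.0, 1.0])]

# An improving search requires each iterate to beat the previous one.
values = [f(x) for x in iterates]
assert all(values[t + 1] > values[t] for t in range(len(values) - 1))
print(values)
```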
That is, writing (a_i, b_i) for the map coordinates of population center i (from Figure 3.1), the constraints of the model are

    (x1 − a_i)^2 + (x2 − b_i)^2 ≥ (1/2)^2,   i = 1, 2, 3

For an objective function, assume that experience shows the business attracted from any population follows a "gravity" pattern: proportional to the population (here in thousands) and inversely proportional to 1 + the square of its distance from the chosen location. The objective function therefore becomes

    max f(x1, x2) = 60 / (1 + d1^2) + 20 / (1 + d2^2) + 30 / (1 + d3^2)

where d_i^2 = (x1 − a_i)^2 + (x2 − b_i)^2. The full model maximizes this objective subject to the three congestion constraints above.

Example of an Improving Search. The search begins at x(0) and advances through solutions x(1), x(2), ... to an optimum.

[3.3] Improving searches are numerical algorithms that begin at a feasible solution to a given optimization model and advance along a search path of feasible points with ever-improving objective function value.

Neighborhood Perspective
[3.4] The neighborhood of a current solution x(t) consists of all nearby points, that is, all points within a small positive distance of x(t).
[3.5] A solution is a local optimum (local maximum for a maximize problem, local minimum for a minimize problem) if it is feasible and if sufficiently small neighborhoods surrounding it contain no points that are both feasible and superior in objective value.
[3.6] A solution is a global optimum (global maximum for a maximize problem, global minimum for a minimize problem) if it is feasible and no other feasible solution has a superior objective value.

Local vs. Global Optima
[3.7] Global optima are always local optima.
[3.8] Local optima may not be global optima.

Local Optima and Improving Search
[3.9] Improving searches stop if they encounter a local optimum.

Dealing with Local Optima. The most tractable optimization models for improving search are those whose mathematical forms assure that every local optimum is a global optimum.
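The gravity objective and the 1/2-mile congestion constraints can be evaluated in a few lines. The center coordinates below are hypothetical placeholders (Figure 3.1 is not reproduced in this text); only the populations in thousands (60, 20, 30) and the 1/2-mile radius come from Example 3.1:

```python
import numpy as np

# Hypothetical center coordinates -- placeholders for the missing Figure 3.1.
centers = np.array([[-1.0, 3.0], [1.0, 3.0], [0.0, -4.0]])
pops = np.array([60.0, 20.0, 30.0])  # populations in thousands (Example 3.1)

def objective(x):
    """Gravity objective: sum of p_i / (1 + squared distance to center i)."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return float(np.sum(pops / (1.0 + d2)))

def feasible(x):
    """The store must be at least 1/2 mile from every population center."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return bool(np.all(d2 >= 0.25))

x = np.array([0.0, 0.0])
print(feasible(x), objective(x))
```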
When models have local optima that are not global, the most satisfactory available analysis is often to run several independent improving searches and accept the best local optimum discovered as a heuristic or approximate optimum.

3.2 Search with Improving and Feasible Directions

Direction-Step Paradigm. Each move of an improving search passes from the current solution x(t) to the next in a direction Δx with a step size λ > 0:

    x(t+1) = x(t) + λ Δx

Improving Directions. Vector Δx is an improving direction at current solution x(t) if the objective function value at x(t) + λΔx is superior to that of x(t) for all λ > 0 sufficiently small.

Feasible Directions. Vector Δx is a feasible direction at current solution x(t) if the point x(t) + λΔx violates no model constraint for all λ > 0 sufficiently small.

Step Size: How Far? Once an improving feasible move direction has been discovered at the current solution, how far should we follow it? That is, what step size λ should be applied? Improving searches normally apply the maximum step λ for which the selected move direction continues to retain feasibility and improve the objective function.

Algorithm 3A: Continuous Improving Search. In outline:
  Step 0 (initialization): choose a feasible starting solution x(0) and set t = 0.
  Step 1 (direction): if no improving feasible direction exists at x(t), stop; otherwise choose one, Δx(t+1).
  Step 2 (step size): take λ(t+1) as the maximum step for which x(t) + λ Δx(t+1) stays feasible and keeps improving (if no such limit exists, the model is unbounded).
  Step 3 (advance): set x(t+1) = x(t) + λ(t+1) Δx(t+1), increment t, and return to Step 1.

Example: Search of the DClub Location. The first move is λ1 Δx(1) = x(1) − x(0) = (5.5, −2.75). The iteration process uses directions Δx(1) = (2, −1), Δx(2) = (−4, −1), Δx(3) = (1, −1) with step sizes λ1 = 2.75, λ2 = 0.25, λ3 = 0.5.

When Does an Improving Search Stop? No solution of an optimization model at which an improving feasible direction is available can be a local optimum. Conversely, when a continuous improving search terminates at a solution admitting no improving feasible direction, and mild assumptions hold, the point is a local optimum. (Non-optimal points admitting no improving feasible direction do exist, however, which is why the mild assumptions matter.)

Detecting Unboundedness. An optimization model is unbounded if it admits feasible solutions with arbitrarily good objective value. If an improving search discovers an improving feasible direction that can be pursued forever without ceasing to improve or losing feasibility, the model is unbounded. For example, max f = x1 over a feasible set that lets x1 grow without limit is unbounded along Δx = (1, 0).

3.3 Algebraic Conditions for Improving and Feasible Directions

Gradients. Gradients show graphically as vectors perpendicular to contours of the objective function, pointing in the direction of most rapid objective value increase.

Gradient Conditions for Improving Directions. Suppose that our search of objective function f has arrived at current solution x. Then the change associated with a step of size λ in direction Δx can be approximated (to first order) as

    f(x + λΔx) ≈ f(x) + λ ∇f(x)·Δx

Theorem: Direction Δx is improving for maximize objective function f at point x if ∇f(x)·Δx > 0; on the other hand, if ∇f(x)·Δx < 0, Δx does not improve at x.
Theorem: Direction Δx is improving for minimize objective function f at point x if ∇f(x)·Δx < 0; on the other hand, if ∇f(x)·Δx > 0, Δx does not improve at x.
The case ∇f(x)·Δx = 0 cannot be resolved without further information.

Note that ∇f(x)·∇f(x) = ||∇f(x)||^2 > 0 whenever ∇f(x) ≠ 0, so we need only choose Δx = ∇f(x) for a maximize problem (Δx = −∇f(x) for a minimize problem).
Theorem: When the objective function gradient ∇f(x) ≠ 0, direction Δx = ∇f(x) (respectively Δx = −∇f(x)) is an improving direction for a maximize (respectively minimize) objective f.

Active Constraints and Feasible Directions. Turning now to conditions for feasibility of directions, we focus our attention on constraints. Constraints define the boundary of the feasible region for an optimization model, so they are the source of conditions for feasible directions. Not all the constraints of a model are relevant to whether a direction Δx is feasible at a particular solution x.
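The direction-step paradigm and the gradient theorems combine into a minimal sketch of a continuous improving search. The concave toy objective and the simple step-halving line search below are assumptions for illustration, not the DClub model or the book's exact Algorithm 3A:

```python
import numpy as np

def f(x):
    # Toy concave maximize objective (illustrative only).
    return -(x[0] - 2.0) ** 2 - 2.0 * (x[1] + 1.0) ** 2

def grad_f(x):
    return np.array([-2.0 * (x[0] - 2.0), -4.0 * (x[1] + 1.0)])

def improving_search(x0, tol=1e-8, max_iter=1000):
    """Follow dx = grad f (improving for a maximize objective whenever the
    gradient is nonzero), halving the step lambda until the move improves."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        dx = grad_f(x)
        if np.linalg.norm(dx) < tol:      # no improving direction remains
            return x
        lam = 1.0
        while f(x + lam * dx) <= f(x):    # shrink until the step improves
            lam *= 0.5
            if lam < 1e-12:
                return x
        x = x + lam * dx                  # x(t+1) = x(t) + lambda * dx
    return x

x_star = improving_search([0.0, 0.0])
print(x_star)  # approaches the unconstrained maximum at (2, -1)
```

Because the objective is concave (hence unimodal), stopping when no improving direction exists delivers the global maximum here.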
For example, in the Two Crude model at point x = (4, 4), every direction is feasible; but at point x = (7, 0), any feasible direction Δx = (Δx1, Δx2) must satisfy Δx2 ≥ 0, because the constraint x2 ≥ 0 limits the feasibility of directions there.

Whether a direction is feasible at a solution x depends on whether it would lead to immediate violation of any active constraint (also called tight or binding constraints) at x, i.e., any constraint satisfied as an equality at x. Equality constraints are active at every feasible point.

Conditions for Feasible Directions with Linear Constraints. Linear constraints take the general forms a·x ≥ b, a·x ≤ b, or a·x = b.

Theorem: Direction Δx = (Δx1, Δx2, ..., Δxn) is feasible for a linearly constrained optimization model at solution x = (x1, x2, ..., xn) if and only if
    a·Δx ≥ 0 for every active ≥ constraint a·x ≥ b,
    a·Δx ≤ 0 for every active ≤ constraint a·x ≤ b,
    a·Δx = 0 for every equality constraint a·x = b.

3.4 Unimodal and Convex Model Forms Tractable for Improving Search

Tractability in a model means convenience for analysis. To obtain a model that is tractable enough to yield useful insights, we sometimes have the choice of making simplifying assumptions or other compromises. What forms are best for improving search? If the mathematical form of the model guarantees that every local optimum is a global optimum, we can pursue an improving search without being much concerned about stopping short of the truly optimal solution we desire.

Unimodal Objective Functions. Definition: An objective function f(x) is unimodal if the straight-line direction from every point in its domain to every better point is an improving direction.
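The feasible-direction theorem for linear constraints checks only the active rows. A sketch for a system written entirely as A x ≥ b (nonnegativity bounds included as rows), mirroring the x = (7, 0) discussion above:

```python
import numpy as np

def is_feasible_direction(A_ge, b_ge, x, dx, tol=1e-9):
    """dx is feasible at x for the system A_ge @ x >= b_ge iff
    a . dx >= 0 for every row a that is active (a . x == b) at x."""
    A_ge, b_ge = np.asarray(A_ge, float), np.asarray(b_ge, float)
    x, dx = np.asarray(x, float), np.asarray(dx, float)
    active = np.isclose(A_ge @ x, b_ge, atol=tol)   # tight constraints only
    return bool(np.all(A_ge[active] @ dx >= -tol))

# Just the nonnegativity constraints x1 >= 0, x2 >= 0, written as rows:
A = [[1.0, 0.0], [0.0, 1.0]]
b = [0.0, 0.0]

# At x = (7, 0) only x2 >= 0 is active, so feasibility requires dx2 >= 0.
print(is_feasible_direction(A, b, [7.0, 0.0], [-1.0, 1.0]))  # True
print(is_feasible_direction(A, b, [7.0, 0.0], [1.0, -1.0]))  # False
```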
That is, for every x(1) and every x(2) with a better objective function value, direction Δx = x(2) − x(1) should be improving at x(1).

Theorem: Linear objective functions are unimodal in both maximize and minimize optimization models.

Unimodal Objectives and Unconstrained Local Optima. Unconstrained local optima are solutions for which no point in some surrounding neighborhood has a better objective function value; that is, they are local optima caused entirely by the objective function. An unconstrained global optimum is a solution yielding a better objective value than any other in the domain of the objective function.

Theorem: If the objective function of an optimization model is unimodal, every unconstrained local optimum is an unconstrained global optimum.

Convex Feasible Sets. Definition: The feasible set of an optimization problem is convex if the line segment between every pair of feasible points falls entirely within the feasible region. Discrete feasible sets are never convex (except in the trivial case where there is only one feasible point).

Linear Constraints and Convexity. The line segment between vector solutions x(1) and x(2) consists of all points of the form x(1) + λ(x(2) − x(1)) with 0 ≤ λ ≤ 1.

Theorem: If all constraints of an optimization model are linear (both main and variable-type), its feasible set is convex.
Lemma: If the feasible set of an optimization model is convex, there is a feasible direction leading from any feasible solution to any other.

Models for Which Local Optima Must Be Global. Theorem: If the objective function of an optimization model is unimodal and the constraints produce a convex feasible set, every local optimum of the model is a global optimum.

3.5 Search for Starting Feasible Solutions

Two-Phase Method. Idea: to find a feasible solution, we first introduce artificial variables, forming a new, associated mathematical program with an apparent starting feasible solution. We then use improving search to find an optimal solution of this Phase I program. If its optimal value is zero, we have obtained a feasible solution for the original problem; otherwise, the original problem is infeasible.

Algorithm 3B: Two-Phase Improving Search.

Two Crude Model Example Revisited. In the linear program of the Two Crude example, if we choose x = (0, 0), it does not satisfy the main constraints. We deal with constraints unsatisfied at our arbitrary starting solution by introducing artificial variables to absorb the infeasibility.

Phase I Models. Phase I constraints are derived from those of the original model by considering each in relation to the starting solution chosen. Satisfied constraints simply become part of the Phase I model. Violated ones are augmented with a nonnegative artificial variable to permit artificial feasibility. Introducing artificial variables x3, x4, x5 into the three violated main constraints (those with right-hand sides 2, 1.5, and 0.5) yields the Phase I constraints.

With these constraints, x1 = x2 = 0, x3 = 2, x4 = 1.5, x5 = 0.5 is clearly a feasible starting solution for Phase I (though not feasible in the original problem). The next task is to drive out all the infeasibility, that is, to find a solution with x3 = x4 = x5 = 0. We do this by applying improving search to the Phase I objective

    min x3 + x4 + x5

That is, the objective function of Phase I minimizes the sum of the artificial variables. In our Two Crude example, this yields the full Phase I model.

Phase I Outcomes. How can the Phase I search end? Artificial variables are restricted to be nonnegative, so their sum must be nonnegative.
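Phase I for the Two Crude example can be set up and handed to an off-the-shelf LP solver. The coefficient matrix and variable bounds below are the standard textbook data for this example, an assumption here; the text above fixes only the right-hand sides 2, 1.5, 0.5 via the starting artificials x3, x4, x5:

```python
import numpy as np
from scipy.optimize import linprog

# Assumed Two Crude data: main constraints A @ (x1, x2) >= b,
# with 0 <= x1 <= 9 and 0 <= x2 <= 6. Right-hand sides match the
# starting artificials x3 = 2, x4 = 1.5, x5 = 0.5 quoted above.
A = np.array([[0.3, 0.4], [0.4, 0.2], [0.2, 0.3]])
b = np.array([2.0, 1.5, 0.5])

# Phase I variables (x1, x2, x3, x4, x5): minimize x3 + x4 + x5 subject to
# A @ (x1, x2) + artificials >= b, rewritten as -(A|I) @ x <= -b for linprog.
c = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
A_ub = -np.hstack([A, np.eye(3)])
b_ub = -b
bounds = [(0, 9), (0, 6), (0, None), (0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.fun)    # Phase I optimum 0: the original model is feasible
print(res.x[:2])  # feasible (x1, x2) for the original Two Crude model
```

A Phase I value of 0 certifies feasibility, and the first two components of the solution start Phase II.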
For the same reason, the optimal value of the Phase I problem must be nonnegative and bounded. There are three possibilities:

Theorem: If Phase I terminates with a solution having Phase I objective function value = 0, the components of the Phase I solution corresponding to original variables provide a feasible solution for the original model.
Theorem: If Phase I terminates with a global minimum having Phase I objective function value > 0, the original model is infeasible.
Theorem: If Phase I terminates at a local minimum that may not be global but has Phase I objective function value > 0, we can conclude nothing; the Phase I search should be repeated from a new starting solution.

Big-M Method. Two-Phase Algorithm 3B deals with feasibility and optimality separately: the Phase I search tests feasibility, and Phase II proceeds to an optimum. The Big-M method combines these activities in a single search. Artificial variables are included in constraints exactly as in Phase I of two-phase search, but the effort to drive their total to 0 is combined with a search for an optimal solution to the original model. The key to combining feasibility and optimality considerations is a composite objective function.

The Big-M method uses a large positive multiplier M to combine feasibility and optimality in a single objective function of the form

    max (original objective) − M (artificial variable sum)

for an originally maximize problem, or

    min (original objective) + M (artificial variable sum)

for an originally minimize problem. We illustrate with the Two Crude example.
The Big-M form of the Two Crude example minimizes its original cost plus M times the artificial sum x3 + x4 + x5, subject to the same artificial-augmented constraints as in Phase I.

Big-M Outcomes.
Theorem: If a Big-M search terminates with a locally optimal solution having all artificial variables = 0, the components of the solution corresponding to original variables form a locally optimal solution for the original model.
Theorem: If a Big-M search terminates with a global optimum having some artificial variables > 0, the original model is infeasible.
Theorem: If a Big-M search terminates with a locally optimal solution having some artificial variables > 0, either the solution may not be globally optimal or the multiplier M may not be large enough; we can conclude nothing. The search should be repeated with a larger M and/or a new starting solution.
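The Big-M composite objective fits in a single LP solve. As in the Phase I sketch, the numeric data (the costs 100 and 75, the coefficient matrix, and the bounds) are the standard textbook values for the Two Crude example and are an assumption here, since the text above omits them:

```python
import numpy as np
from scipy.optimize import linprog

M = 10_000.0  # large positive multiplier penalizing artificials

# Assumed Two Crude data: min 100 x1 + 75 x2, A @ (x1, x2) >= b, bounds below.
A = np.array([[0.3, 0.4], [0.4, 0.2], [0.2, 0.3]])
b = np.array([2.0, 1.5, 0.5])

# Big-M variables (x1, x2, x3, x4, x5):
#   min 100 x1 + 75 x2 + M (x3 + x4 + x5)
c = np.array([100.0, 75.0, M, M, M])
A_ub = -np.hstack([A, np.eye(3)])   # A @ (x1, x2) + artificials >= b
b_ub = -b
bounds = [(0, 9), (0, 6)] + [(0, None)] * 3

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # artificials driven to 0; (x1, x2) solves the original model
```

With M this large, any solution keeping an artificial positive is priced out, so a single search lands on a feasible optimum of the original model with all artificials at 0.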