A Brief Introduction to Global Optimization
I. What is Optimization?
The term “optimization” refers to the process of searching for the optimal solution to a problem of interest from a set of candidates, based on certain performance criteria. Optimization has found broad applications in manufacturing, engineering, finance, logistics and many other fields. Generally speaking, any problem to be optimized should be formulated as a system whose status is controlled by one or more input variables and whose performance is specified by a well-defined objective function. The goal of optimization is to find the best value for each variable in order to achieve satisfactory performance. In practice, this could mean accomplishing a predefined task in the most efficient way or at the highest quality, or producing maximum yield from limited resources.
II. Real World Problems
Optimization problems are ubiquitous in our everyday life and may come in a variety of forms. For example, a fund manager may need to decide on an appropriate investment portfolio according to a specific investment objective, which is often represented by the tradeoff between profit and volatility. Usually, funds invested mainly in assets such as cash and property are likely to yield stable but relatively low returns, while an aggressive portfolio could contain a significant portion of high-risk assets such as shares, whose value may fluctuate widely, especially in the short term. In this scenario, the variables to be optimized are the percentages of each asset class, which of course must add up to one.
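To make the variables concrete, the sketch below scores two hypothetical portfolios by their expected return. The asset classes, weights and return figures are purely illustrative assumptions, not real data.

```python
def portfolio_return(weights, expected_returns):
    """Expected portfolio return as the weighted sum of asset returns.

    The weights are the decision variables and must sum to one.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to one"
    return sum(w * r for w, r in zip(weights, expected_returns))

# Illustrative annual returns for cash, property and shares (made-up figures).
returns = [0.02, 0.05, 0.10]

conservative = [0.6, 0.3, 0.1]   # mostly cash and property
aggressive = [0.1, 0.2, 0.7]     # mostly shares

print(portfolio_return(conservative, returns))  # lower but steadier
print(portfolio_return(aggressive, returns))    # higher but more volatile
```

An optimizer would search over the weight vector itself, subject to the sum-to-one constraint, and a realistic objective would also penalize volatility.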


On the other hand, if this manager needs to visit his clients in different locations, he may be interested in planning his journey in advance so as to visit all clients at the minimum cost in time and petrol, which often implies choosing the shortest route. Suppose there are n clients in total; each possible route can be encoded by n variables, with each variable indicating the position of a certain client in the visiting order. If he is reluctant to drive past locations that he has already visited, there are n! possible routes to choose from. In this problem, the number of candidates grows very quickly with the number of locations. For instance, there are 3,628,800 possible routes for n=10 and 2,432,902,008,176,640,000 for n=20.
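The factorial growth above is easy to verify, and for very small n the shortest route can still be found by brute force. The distance matrix below is a made-up toy example with four locations.

```python
import math
from itertools import permutations

# The number of distinct routes visiting n clients exactly once is n!,
# which explodes quickly.
for n in (5, 10, 20):
    print(n, math.factorial(n))

# Brute-force search over all routes on a toy symmetric distance matrix
# (illustrative numbers); this is only feasible for very small n.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]

def route_cost(route):
    """Total distance of visiting the locations in the given order."""
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

best = min(permutations(range(4)), key=route_cost)
print(best, route_cost(best))
```

With 4 locations there are only 4! = 24 routes to check, but the same loop over 20 locations would already need to examine more than two quintillion candidates.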


In the meantime, people have long been enthusiastic about discovering the underlying principles behind data, in the hope of, for example, better understanding the movement of market prices or helping diagnose illness based on patient symptoms. These problems are generally specified as regression problems, in which one wants to approximate the distribution of the data, or classification problems, in which data points are to be assigned different labels, and a variety of learning systems have been developed to model the important features embedded in the data. Since most systems have multiple parameters to be adjusted, and their performance depends heavily on choosing the right parameter values, optimization techniques can also play a significant role in these areas.
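As a toy illustration of parameter tuning, the snippet below treats the single threshold of a trivial classifier as the variable to optimize, scoring each candidate value by accuracy on labelled data. The data points and the grid of candidates are made up for illustration.

```python
# Made-up data: (symptom score, has_illness) pairs.
data = [(0.2, 0), (0.5, 0), (1.1, 0), (1.8, 1), (2.4, 1), (3.0, 1)]

def accuracy(threshold):
    """Fraction of points classified correctly by 'score >= threshold'."""
    correct = sum((score >= threshold) == bool(label) for score, label in data)
    return correct / len(data)

# A crude grid search over candidate thresholds; real learning systems have
# many interacting parameters and call for far more sophisticated optimizers,
# but the principle is the same: parameters in, performance out.
candidates = [i / 10 for i in range(0, 40)]
best_threshold = max(candidates, key=accuracy)
print(best_threshold, accuracy(best_threshold))
```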


III. How to Solve Them?
While the best solutions may be worked out manually or through exhaustive search in some simple situations, computerized optimization techniques are required for most nontrivial problems, mainly due to the size and multimodality of the search space. Furthermore, many problems have inherent constraints that must be satisfied and multiple criteria to be met. As a result, good optimization techniques should not only be able to evaluate candidate solutions hundreds of thousands of times faster than humans, thanks to the power of modern computer systems, but also employ intelligent mechanisms to guide the search so that challenging problems can be solved in practical time.
Despite the existence of a wide variety of optimization problems that may look quite different from one another, the good news is that many of them can be tackled using techniques under a unified framework. This means it is not compulsory to invent a brand-new approach for each new problem, although a certain level of customization may be helpful. The reason is that, during optimization, each problem is transformed into a landscape in a mathematical space, similar to its counterpart in the physical world, with each dimension representing a variable and an additional dimension indicating the performance. Consequently, the process of optimization is like wandering on this landscape in search of the highest peak, without worrying about the real-world meaning of the problem.
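As a sketch of this landscape view, the function below turns two variables into a "height". The particular function is an illustrative choice (a ripple with many rings of local peaks), not tied to any specific problem.

```python
import math

def objective(x, y):
    """A toy two-variable landscape: many local peaks, global peak at (0, 0)."""
    r = math.sqrt(x * x + y * y)
    return math.cos(r) * math.exp(-0.1 * r)

# Evaluating candidate points "walks" the landscape without any knowledge
# of what x and y mean in the original problem.
print(objective(0.0, 0.0))   # the global peak, height 1.0
print(objective(3.0, 4.0))   # a lower point elsewhere on the landscape
```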


However, landscape walking is not necessarily an easy task, as the search space often grows exponentially with the number of variables. Furthermore, for landscapes with a large number of peaks, locating the best one can be very challenging. Just imagine the difficulty of finding a way to the top of the highest hill in a mountain range extending hundreds of miles with numerous peaks, valleys and plateaus. In fact, even deciding whether you are on top of the highest hill can sometimes be difficult. Note that you DO NOT have the big picture of the landscape, only the performance of each single point or solution. Unfortunately, with current computing technology, it is nearly impossible to guarantee finding the optimum solution to such problems in polynomial time.
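A minimal hill-climbing sketch, using a made-up one-dimensional function with two peaks, shows how a purely local view can strand the search on a minor peak:

```python
def height(x):
    # Two peaks (illustrative): a minor one near x = -2 (height ~1.0)
    # and the global one near x = 3 (height ~2.0).
    return max(1.0 - (x + 2) ** 2, 2.0 - 0.5 * (x - 3) ** 2, 0.0)

def hill_climb(x, step=0.1, iters=1000):
    """Move to the higher neighbour until no neighbour is higher."""
    for _ in range(iters):
        best = max((x - step, x, x + step), key=height)
        if best == x:
            break  # a peak, but not necessarily the highest one
        x = best
    return x

# Starting on the wrong slope strands the climber on the minor peak near -2;
# the climber only ever sees the heights of neighbouring points.
print(hill_climb(-3.0))
```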
In order to find solutions as good as possible to these hard optimization problems, a number of metaheuristic algorithms have been developed since the 1970s, many of which are loosely based on fundamental principles of nature such as population, selection and mutation. Famous examples include Genetic Algorithms, Evolution Strategies, Genetic Programming, Ant Colony Optimization, Estimation of Distribution Algorithms, Particle Swarm Optimizers, Memetic Algorithms and Differential Evolution, to name a few. The basic idea behind all these algorithms is to assume that there exists some general structure in the landscape, which can be exploited to explore the search space efficiently. Despite the different heuristics used by these algorithms, their major advantage is that they are usually population-based and have inherent parallel mechanisms, which makes them less likely to get stuck on local optima. Also, many of them are relatively easy to implement and can be modified to better suit different problems.
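The ideas of population, selection and mutation can be sketched in a few lines. The following toy genetic algorithm is an illustrative sketch, not a tuned implementation: it maximizes a simple one-dimensional function whose optimum is at x = 3.

```python
import random

def fitness(x):
    """Toy objective with a single maximum at x = 3."""
    return -(x - 3.0) ** 2

def evolve(pop_size=30, generations=100, sigma=0.5, seed=0):
    rng = random.Random(seed)
    # Population: a set of candidate solutions, initialized at random.
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the better half of the population.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        # Mutation: offspring are parents perturbed by Gaussian noise.
        children = [p + rng.gauss(0, sigma) for p in parents]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(round(best, 2))  # typically very close to the optimum at 3.0
```

Because the best candidates survive unchanged while their mutated copies probe nearby points, the population as a whole explores many regions of the landscape in parallel rather than following a single trajectory.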
IV. Summary
Optimization is an active and fast-growing research area with a great impact on the real world. Despite the enormous amount of theoretical and empirical work that has been conducted, and the huge success achieved in many aspects, developing competent techniques that can effectively solve large-scale optimization problems remains an ongoing, long-term task. After all, solving challenging optimization problems is not as simple as plugging in an off-the-shelf optimization routine and hoping it will do the job. Instead, it still requires significant human expertise in the analysis, specification and decomposition of the problem, as well as in choosing and customizing the right technique.
Contact Author: Bo Yuan