Optimization algorithms, which find the minimum values of mathematical functions, are widely used for such purposes as evaluating design tradeoffs, assessing control systems, and finding patterns in data. At a conference in mid-January, the Massachusetts Institute of Technology's Hossein Mobahi and John Fisher presented a way to generate a sequence of progressively simplified versions of an optimization problem that guarantees the best approximation the method can offer.
Their approach produces a convex approximation of an optimization problem through Gaussian smoothing, which converts the cost function into a related function that gives a weighted average of all the surrounding values. The smoothing also evens out any abrupt dips or spikes in the cost function's graph. The weights assigned to the surrounding values are determined by a Gaussian function, or normal distribution, popularly known as the bell curve. The width of a Gaussian function is determined by a single parameter.
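As a rough illustration (not the authors' implementation), the smoothed value at a point can be computed as a Gaussian-weighted average of the cost function over nearby points; the example function f and the sampling details below are assumptions chosen for demonstration.

```python
import numpy as np

def gaussian_smooth(f, x, sigma, n_samples=2001, span=6.0):
    """Approximate the Gaussian-smoothed value of f at x, i.e. a weighted
    average of f over surrounding points, with weights from a bell curve
    of width sigma (a discrete stand-in for the exact convolution)."""
    # Sample points within +/- span standard deviations of x.
    t = np.linspace(x - span * sigma, x + span * sigma, n_samples)
    # Gaussian (bell-curve) weights, controlled by the single width parameter sigma.
    w = np.exp(-0.5 * ((t - x) / sigma) ** 2)
    w /= w.sum()
    return np.sum(w * f(t))

# A wiggly, non-convex cost function: smoothing averages out its dips and spikes.
f = lambda t: (t - 1.0) ** 2 + 2.0 * np.sin(5.0 * t)
print(gaussian_smooth(f, 0.0, sigma=2.0))   # wide Gaussian: heavily smoothed
print(gaussian_smooth(f, 0.0, sigma=0.01))  # narrow Gaussian: close to f(0)
```

As the width parameter shrinks, the weights concentrate at the point itself, so the smoothed function approaches the original cost function.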
Mobahi and Fisher initially use a very wide Gaussian, which, under certain conditions, yields a convex function, and then gradually contract the width of the Gaussian to generate a series of intermediate problems. By the time the width of the distribution reaches zero, the original cost function is recovered, because every value is simply the average of itself.
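A minimal sketch of this continuation strategy follows, building on the smoothing helper above. The paper's contribution is a principled way to generate the sequence of smoothed problems; the hand-picked schedule of widths here is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def smoothed(f, sigma, n_samples=801, span=6.0):
    """Return a Gaussian-smoothed version of a scalar function f."""
    z = np.linspace(-span, span, n_samples)
    w = np.exp(-0.5 * z ** 2)
    w /= w.sum()
    return lambda x: np.sum(w * f(x + sigma * z))

def graduated_minimize(f, x0, sigmas):
    """Minimize a sequence of progressively less-smoothed versions of f,
    warm-starting each stage at the previous stage's solution."""
    x = x0
    for sigma in sigmas:
        g = smoothed(f, sigma)
        x = minimize(g, x, method="Nelder-Mead").x[0]
    return x

f = lambda t: (t - 1.0) ** 2 + 2.0 * np.sin(5.0 * t)  # non-convex example
sigmas = [2.0, 1.0, 0.5, 0.25, 0.1, 0.01]  # gradually contract the Gaussian width
print(graduated_minimize(f, x0=-3.0, sigmas=sigmas))
```

Starting wide makes the first subproblem easy to solve, and each subsequent solution guides the search as the smoothing is withdrawn and the original, rugged cost function reemerges.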
From MIT News
Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA