A numerical method for maximizing a function by always moving upwards along the slope of steepest ascent from the current position. The method is not guaranteed to reach the global maximum: like a mountaineer who tries to reach the highest point by always climbing the slope of steepest ascent from wherever she happens to be, and who becomes trapped on a local mound separated by valleys from higher peaks, the process may become trapped at a local maximum nowhere near the global maximum or summit. It is used in connectionism (1) and parallel distributed processing to maximize the correlation between the output of a network model and the desired state for a given input. Also called steepest ascent. See also annealing (1), back-propagation algorithm. Compare gradient descent.
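The procedure, and the way it can become trapped at a local maximum, can be sketched in a few lines of code (a minimal one-dimensional illustration, not drawn from the entry itself; the function `f`, the step size, and the starting points are arbitrary choices for demonstration):

```python
def hill_climb(f, x, step=0.1, iterations=1000):
    """Steepest-ascent hill climbing on a one-dimensional function f.

    At each step, evaluate the two neighbours x - step and x + step and
    move to whichever has the higher value of f; stop as soon as neither
    neighbour improves on the current position (a local maximum).
    """
    for _ in range(iterations):
        best = max([x - step, x + step], key=f)
        if f(best) <= f(x):
            return x  # trapped at a local (not necessarily global) maximum
        x = best
    return x

# A function with a local peak near x = -0.69 and a higher,
# global peak near x = 1.44.
f = lambda x: -x**4 + x**3 + 2 * x**2

print(round(hill_climb(f, -2.0), 2))  # → -0.7 (stuck on the local mound)
print(round(hill_climb(f, 2.0), 2))   # → 1.4 (reaches the global summit)
```

Which peak the climber ends on depends entirely on the starting position, which is exactly the limitation the entry describes; techniques such as annealing (1) address it by occasionally allowing downhill moves.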