
# Optimization In Multiple Dimensions QuickStart Sample (IronPython)

Illustrates unconstrained optimization in multiple dimensions using the optimizer classes in the Extreme.Mathematics.Optimization namespace in IronPython.

```
import numerics

# The optimization classes reside in the
# Extreme.Mathematics.Optimization namespace.
from Extreme.Mathematics.Optimization import *
# Function delegates reside in the Extreme.Mathematics
# namespace.
from Extreme.Mathematics import *
# Vectors reside in the Extreme.Mathematics.LinearAlgebra
# namespace.
from Extreme.Mathematics.LinearAlgebra import *

#/ Illustrates unconstrained optimization in multiple dimensions
#/ using classes in the Extreme.Mathematics.Optimization
#/ namespace of the Extreme Optimization Numerical Libraries
#/ for .NET.

#
# Objective function
#

# The objective function must be supplied as a
# Func<Vector, double> delegate. This is a method
# that takes one Vector argument and returns a real number.
# See the end of this sample for definitions of the methods
# that are referenced here.

# The famous Rosenbrock test function.
def fRosenbrock(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

# Gradient of the Rosenbrock test function.
def gRosenbrock(x, f):
    # Always assume that the second argument may be null:
    if f is None:
        f = Vector.Create(2)
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    f[0] = -2*p - 420*x[0]*q
    f[1] = 210*q
    return f
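# --- Illustrative aside, not part of the original sample: a quick
# sanity check of the analytic gradient above against central finite
# differences. Plain Python lists stand in for the library's Vector
# type, so this sketch runs on any Python interpreter; the functions
# are redefined here so that it is self-contained.

def _rosenbrock(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

def _rosenbrock_grad(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return [-2*p - 420*x[0]*q, 210*q]

def _central_differences(func, x, h=1e-6):
    # Approximate each partial derivative by (f(x+h) - f(x-h)) / (2h).
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((func(xp) - func(xm)) / (2 * h))
    return g

exact = _rosenbrock_grad([-1.2, 1.0])
approx = _central_differences(_rosenbrock, [-1.2, 1.0])
# The two gradients should agree to several decimal places.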

# The gradient of the objective function can be supplied either
# as a Func<Vector, Vector> delegate, or as a
# Func<Vector, Vector, Vector> delegate. The former takes
# one Vector argument and returns a Vector. The latter
# takes a second Vector argument, which is an existing
# vector that is used to return the result.

# The initial values are supplied as a vector:
initialGuess = Vector.Create(-1.2, 1)
# The actual solution is [1, 1].

#
# Quasi-Newton methods: BFGS and DFP
#

# For most purposes, the quasi-Newton methods give
# excellent results. There are two variations: DFP and
# BFGS. The latter gives slightly better results.

# Which method is used is specified by a constructor
# parameter of type QuasiNewtonMethod:
bfgs = QuasiNewtonOptimizer(QuasiNewtonMethod.Bfgs)

bfgs.InitialGuess = initialGuess
bfgs.ExtremumType = ExtremumType.Minimum

# Set the ObjectiveFunction and the gradient:
bfgs.ObjectiveFunction = fRosenbrock
bfgs.FastGradientFunction = gRosenbrock
# The FindExtremum method does all the hard work:
bfgs.FindExtremum()

print "BFGS Method:"
print "  Solution:", bfgs.Extremum
print "  Estimated error:", bfgs.EstimatedError
print "  # iterations:", bfgs.IterationsNeeded
# Optimizers return the number of function evaluations
# and the number of gradient evaluations needed:
print "  # function evaluations:", bfgs.EvaluationsNeeded
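# --- Illustrative aside, not part of the original sample: the BFGS
# iteration sketched in pure Python with a backtracking (Armijo)
# line search. This is a minimal sketch of the algorithm, not the
# library's implementation; it uses plain lists instead of Vector
# and redefines the Rosenbrock function so it is self-contained.
import math

def rosen(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

def rosen_grad(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return [-2*p - 420*x[0]*q, 210*q]

def bfgs_sketch(func, grad, x, iterations=500):
    n = len(x)
    # Inverse Hessian approximation, started at the identity.
    H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    g = grad(x)
    for _ in range(iterations):
        # Search direction d = -H g.
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        # Backtracking line search enforcing the Armijo condition.
        slope = sum(g[i] * d[i] for i in range(n))
        t, fx = 1.0, func(x)
        while func([x[i] + t * d[i] for i in range(n)]) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-12:
                break
        x_new = [x[i] + t * d[i] for i in range(n)]
        g_new = grad(x_new)
        s = [x_new[i] - x[i] for i in range(n)]
        y = [g_new[i] - g[i] for i in range(n)]
        sy = sum(s[i] * y[i] for i in range(n))
        if sy > 1e-12:
            # BFGS update of the inverse Hessian:
            # H <- H - rho*(s*(Hy)' + (Hy)*s') + rho*(1 + rho*y'Hy)*s*s'
            rho = 1.0 / sy
            Hy = [sum(H[i][j] * y[j] for j in range(n)) for i in range(n)]
            yHy = sum(y[i] * Hy[i] for i in range(n))
            for i in range(n):
                for j in range(n):
                    H[i][j] += (rho * (1 + rho * yHy) * s[i] * s[j]
                                - rho * (s[i] * Hy[j] + Hy[i] * s[j]))
        x, g = x_new, g_new
        if math.sqrt(sum(gi * gi for gi in g)) < 1e-8:
            break
    return x

# Should converge close to the true minimum [1, 1]:
bfgs_solution = bfgs_sketch(rosen, rosen_grad, [-1.2, 1.0])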

#
# Conjugate gradient methods
#

# Conjugate gradient methods exist in three variants:
# Fletcher-Reeves, Polak-Ribiere, and positive Polak-Ribiere.

# Which method is used is specified by a constructor
# parameter of type ConjugateGradientMethod:
cg = ConjugateGradientOptimizer(ConjugateGradientMethod.PositivePolakRibiere)
# Everything else works as before:
cg.ObjectiveFunction = fRosenbrock
cg.FastGradientFunction = gRosenbrock
cg.InitialGuess = initialGuess
cg.FindExtremum()

print "Conjugate Gradient Method:"
print "  Solution:", cg.Extremum
print "  Estimated error:", cg.EstimatedError
print "  # iterations:", cg.IterationsNeeded
print "  # function evaluations:", cg.EvaluationsNeeded
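# --- Illustrative aside, not part of the original sample: nonlinear
# conjugate gradient with the positive Polak-Ribiere update, sketched
# in pure Python with an Armijo backtracking line search. A minimal
# sketch only; the library's implementation is more sophisticated.
def rosen(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

def rosen_grad(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return [-2*p - 420*x[0]*q, 210*q]

def cg_sketch(func, grad, x, iterations=20000, tol=1e-8):
    n = len(x)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iterations):
        gg = sum(gi * gi for gi in g)
        if gg < tol * tol:
            break
        slope = sum(g[i] * d[i] for i in range(n))
        if slope >= 0:
            # Not a descent direction: restart with steepest descent.
            d = [-gi for gi in g]
            slope = -gg
        # Backtracking (Armijo) line search along d.
        t, fx = 1.0, func(x)
        while func([x[i] + t * d[i] for i in range(n)]) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-14:
                break
        x = [x[i] + t * d[i] for i in range(n)]
        g_new = grad(x)
        # Positive Polak-Ribiere formula for beta:
        beta = max(0.0, sum(g_new[i] * (g_new[i] - g[i]) for i in range(n)) / gg)
        d = [-g_new[i] + beta * d[i] for i in range(n)]
        g = g_new
    return x

# Should end up close to the true minimum [1, 1]:
cg_solution = cg_sketch(rosen, rosen_grad, [-1.2, 1.0])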

#
# Powell's method
#

# Powell's method is a conjugate direction method that
# does not require the derivative of the objective function.
# It is implemented by the PowellOptimizer class:
pw = PowellOptimizer()
pw.InitialGuess = initialGuess
# Powell's method does not use derivatives:
pw.ObjectiveFunction = fRosenbrock
pw.FindExtremum()

print "Powell's Method:"
print "  Solution:", pw.Extremum
print "  Estimated error:", pw.EstimatedError
print "  # iterations:", pw.IterationsNeeded
print "  # function evaluations:", pw.EvaluationsNeeded
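# --- Illustrative aside, not part of the original sample: the idea
# behind Powell's method -- successive line minimizations along an
# evolving direction set, using no derivatives -- sketched in pure
# Python. The crude scan-plus-golden-section line search and the
# periodic direction reset are simplifications; expect much weaker
# convergence than the library's PowellOptimizer.
def rosen(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

def line_minimize(func, x, d):
    # Crude 1-D minimization of t -> func(x + t*d): coarse scan over
    # [-2, 2], then golden-section refinement around the best point.
    phi = lambda t: func([x[i] + t * d[i] for i in range(len(x))])
    best = min((i / 10.0 - 2.0 for i in range(41)), key=phi)
    a, b = best - 0.1, best + 0.1
    w = (5 ** 0.5 - 1) / 2
    for _ in range(60):
        c1, c2 = b - w * (b - a), a + w * (b - a)
        if phi(c1) < phi(c2):
            b = c2
        else:
            a = c1
    t = (a + b) / 2
    if phi(best) < phi(t):
        t = best          # never accept a worse point
    return [x[i] + t * d[i] for i in range(len(x))]

def powell_sketch(func, x, iterations=50, tol=1e-12):
    n = len(x)
    dirs = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    fx = func(x)
    for k in range(iterations):
        x0 = list(x)
        for d in dirs:
            x = line_minimize(func, x, d)
        # Replace the oldest direction with the overall move
        # (basic Powell), then minimize along it once more.
        move = [x[i] - x0[i] for i in range(n)]
        if sum(m * m for m in move) > 0:
            dirs = dirs[1:] + [move]
            x = line_minimize(func, x, move)
        # Reset the direction set periodically to avoid degeneracy.
        if (k + 1) % n == 0:
            dirs = [[1.0 if i == j else 0.0 for j in range(n)]
                    for i in range(n)]
        f_new = func(x)
        if abs(fx - f_new) < tol:
            break
        fx = f_new
    return x

powell_solution = powell_sketch(rosen, [-1.2, 1.0])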

#
# Nelder-Mead method
#

# Also called the downhill simplex method, the method of Nelder
# and Mead is useful for functions that are not tractable
# by other methods. For example, other methods
# may fail if the objective function is not continuous.
# On the other hand, it is much slower than the other methods.

# The method is implemented by the NelderMeadOptimizer class:
nm = NelderMeadOptimizer()
# The class has three special properties that control
# the progress of the algorithm. These parameters have
# default values and need not be set explicitly.
nm.ContractionFactor = 0.5
nm.ExpansionFactor = 2
nm.ReflectionFactor = -2

# Everything else is the same.
nm.SolutionTest.AbsoluteTolerance = 1e-15
nm.InitialGuess = initialGuess
# The method does not use derivatives:
nm.ObjectiveFunction = fRosenbrock
nm.FindExtremum()

print "Nelder-Mead Method:"
print "  Solution:", nm.Extremum
print "  Estimated error:", nm.EstimatedError
print "  # iterations:", nm.IterationsNeeded
print "  # function evaluations:", nm.EvaluationsNeeded
```
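For comparison, the reflection, expansion and contraction steps that these factors control can be sketched in pure Python. The following is a minimal, illustrative Nelder-Mead implementation using the textbook coefficients (reflection 1, expansion 2, contraction 0.5); it is not the library's algorithm:

```python
def rosen(x):
    p = 1 - x[0]
    q = x[1] - x[0]*x[0]
    return p*p + 105 * q*q

def nelder_mead(f, x0, iterations=1000, tol=1e-10):
    n = len(x0)
    # Initial simplex: x0 plus a small step along each coordinate.
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += 0.1
        simplex.append(v)
    for _ in range(iterations):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all points except the worst.
        c = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # Reflection (coefficient 1).
        xr = [c[i] + (c[i] - worst[i]) for i in range(n)]
        if f(xr) < f(best):
            # Expansion (coefficient 2).
            xe = [c[i] + 2.0 * (c[i] - worst[i]) for i in range(n)]
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        else:
            # Contraction (coefficient 0.5).
            xc = [c[i] + 0.5 * (worst[i] - c[i]) for i in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:
                # Shrink the whole simplex toward the best point.
                simplex = [best] + [[(best[i] + p[i]) / 2 for i in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# Should end up close to the true minimum [1, 1]:
nm_solution = nelder_mead(rosen, [-1.2, 1.0])
```

Note that the sketch only needs function values, never derivatives, which is what makes the method attractive for non-smooth objective functions.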