
# Optimization In One Dimension QuickStart Sample (IronPython)

Illustrates the use of the Brent and Golden Section optimizer classes in the Extreme.Mathematics.Optimization namespace for one-dimensional optimization in IronPython.
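The sample below first brackets an extremum and then refines it. As background, the bracketing idea can be sketched in plain Python; the helper below is illustrative only and does not reflect the library's actual implementation:

```python
import math

def find_bracket(f, a, b, grow=1.618, max_iter=50):
    # Walk downhill from the initial interval, expanding the step by a
    # constant growth factor, until the function value rises again.
    # The resulting triple (a, b, c) then brackets a local minimum:
    # f(b) < f(a) and f(b) < f(c).
    fa, fb = f(a), f(b)
    if fb > fa:
        # Make sure we move in the downhill direction.
        a, b, fa, fb = b, a, fb, fa
    c = b + grow * (b - a)
    fc = f(c)
    for _ in range(max_iter):
        if fb < fc:
            # The function turned upward: a bracket has been found.
            return (a, b, c) if a < c else (c, b, a)
        a, b, fa, fb = b, c, fb, fc
        c = b + grow * (b - a)
        fc = f(c)
    raise RuntimeError("No bracketing interval found.")

# Example: bracket the minimum of the negative of the sample's
# second objective function, which lies at x = 0.35.
g = lambda x: -math.exp(-(x**2 - 0.7*x + 0.2))
lo, mid, hi = find_bracket(g, 0.0, 1.0)
```

Starting from (0, 1), the returned interval contains the minimizer at x = 0.35, with the middle point lower than both endpoints.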

```python
import numerics

from math import exp, sqrt

# The optimization classes reside in the
# Extreme.Mathematics.Optimization namespace.
from Extreme.Mathematics.Optimization import *
# Function delegates reside in the Extreme.Mathematics
# namespace.
from Extreme.Mathematics import *

# Illustrates the use of the Brent and Golden Section optimizers
# in the Extreme.Mathematics.Optimization namespace of the
# Extreme Optimization Mathematics Library for .NET.

# Several algorithms exist for optimizing functions
# in one variable. The most common one is
# Brent's algorithm.

# The function we are trying to minimize is called the
# objective function and must be provided as a Func<double, double>.
# For more information about function delegates, see the
# FunctionDelegates QuickStart sample.

#
# Brent's algorithm
#

# Now let's create the BrentOptimizer object.
optimizer = BrentOptimizer()

# Set the objective function:
optimizer.ObjectiveFunction = lambda x: x ** 3 - 2 * x - 5

# Optimizers can find either a minimum or a maximum.
# Which of the two is found is specified by the
# ExtremumType property:
optimizer.ExtremumType = ExtremumType.Minimum

# The first phase is to find an interval that contains
# a local minimum. This is done by the FindBracket method.
optimizer.FindBracket(0, 3)
# You can verify that an interval was found from the
# IsBracketValid property:
if not optimizer.IsBracketValid:
    raise Exception("An interval containing a minimum was not found.")

# Finally, we can run the optimizer by calling the FindExtremum method:
optimizer.FindExtremum()

print "Function 1: x^3 - 2x - 5"
# The Status property indicates
# the result of running the algorithm.
print "  Status:", optimizer.Status
# The result is available through the
# Result property.
print "  Minimum:", optimizer.Result
exactResult = sqrt(2/3.0)
result = optimizer.Extremum
print "  Exact minimum:", exactResult

# You can find out the estimated error of the result
# through the EstimatedError property:
print "  Estimated error:", optimizer.EstimatedError
print "  Actual error:", result - exactResult
print "  # iterations:", optimizer.IterationsNeeded

print "Function 2: 1/Exp(x*x - 0.7*x + 0.2)"
f2 = lambda x: 1/exp(x**2 - 0.7*x + 0.2)
# You can also perform these calculations more directly
# using the FindMinimum or FindMaximum methods. This implicitly
# calls the FindBracket method.
result = optimizer.FindMaximum(f2, 0)
print "  Maximum:", result
print "  Actual maximum:", 0.35
print "  Estimated error:", optimizer.EstimatedError
print "  Actual error:", result - 0.35
print "  # iterations:", optimizer.IterationsNeeded

#
# Golden section search
#

# A slower but simpler algorithm for finding an extremum
# is the golden section search. It is implemented by the
# GoldenSectionOptimizer class:
optimizer2 = GoldenSectionOptimizer()

# Maximum at x = 0.35
print "Using Golden Section optimizer:"
result = optimizer2.FindMaximum(f2, 0)
print "  Maximum:", result
print "  Actual maximum:", 0.35
print "  Estimated error:", optimizer2.EstimatedError
print "  Actual error:", result - 0.35
print "  # iterations:", optimizer2.IterationsNeeded
```
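As a cross-check outside the library, the golden section idea can be sketched in a few lines of plain Python. This is a generic textbook version, not the library's implementation:

```python
import math

def golden_section_minimize(f, a, b, tol=1e-9):
    # Shrink [a, b] by the golden ratio each step, keeping whichever
    # interior point has the smaller function value.
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:          # the minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                # the minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Maximizing f2 is the same as minimizing -f2; the maximum is at x = 0.35.
f2 = lambda x: 1.0 / math.exp(x**2 - 0.7*x + 0.2)
x_max = golden_section_minimize(lambda x: -f2(x), -1.0, 2.0)
```

On the sample's second function this recovers the maximum location x = 0.35 to well within the tolerance, consistent with the GoldenSectionOptimizer result above.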