Optimization In One Dimension QuickStart Sample (IronPython)
Illustrates the use of the Brent and Golden Section optimizer classes in the Extreme.Mathematics.Optimization namespace for one-dimensional optimization in IronPython.
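Before looking at the library calls below, the core idea of one-dimensional optimization can be illustrated with a short, self-contained golden section search in plain Python. This is a hypothetical sketch for illustration only, not the library's implementation; the function and interval match the first example in the sample:

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden section search."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while (b - a) > tol:
        # Keep the subinterval that must contain the minimum.
        if f(c) < f(d):
            b = d
        else:
            a = c
        # Recompute both interior points; a production version
        # would reuse one function evaluation per iteration.
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
    return (a + b) / 2.0

x_min = golden_section_minimize(lambda x: x**3 - 2*x - 5, 0.0, 3.0)
print(x_min)  # close to sqrt(2/3), about 0.8165
```

Each iteration shrinks the interval by the constant factor 1/phi, which is why the method is simple and robust but slower than Brent's algorithm, which adds parabolic interpolation steps.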
import numerics
from math import exp, sqrt
# The optimization classes reside in the
# Extreme.Mathematics.Optimization namespace.
from Extreme.Mathematics.Optimization import *
# Function delegates reside in the Extreme.Mathematics
# namespace.
from Extreme.Mathematics import *
# Illustrates the use of the Brent and Golden Section optimizers
# in the Extreme.Mathematics.Optimization namespace of the
# Extreme Optimization Mathematics Library for .NET.
# Several algorithms exist for optimizing functions
# in one variable. The most common one is
# Brent's algorithm.
# The function we are trying to minimize is called the
# objective function and must be provided as a Func<double, double>.
# For more information about this delegate, see the
# FunctionDelegates QuickStart sample.
#
# Brent's algorithm
#
# Now let's create the BrentOptimizer object.
optimizer = BrentOptimizer()
# Set the objective function:
optimizer.ObjectiveFunction = lambda x: x ** 3 - 2 * x - 5
# Optimizers can find either a minimum or a maximum.
# Which of the two is sought is specified by the
# ExtremumType property:
optimizer.ExtremumType = ExtremumType.Minimum
# The first phase is to find an interval that contains
# a local minimum. This is done by the FindBracket method.
optimizer.FindBracket(0, 3)
# You can verify that an interval was found from the
# IsBracketValid property:
if not optimizer.IsBracketValid:
    raise Exception("An interval containing a minimum was not found.")
# Finally, we can run the optimizer by calling the FindExtremum method:
optimizer.FindExtremum()
print "Function 1: x^3 - 2x - 5"
# The Status property indicates
# the result of running the algorithm.
print " Status:", optimizer.Status
# The result is available through the
# Result property.
print " Minimum:", optimizer.Result
exactResult = sqrt(2/3.0)
result = optimizer.Extremum
print " Exact minimum:", exactResult
# You can find out the estimated error of the result
# through the EstimatedError property:
print " Estimated error:", optimizer.EstimatedError
print " Actual error:", result - exactResult
print " # iterations:", optimizer.IterationsNeeded
print "Function 2: 1/Exp(x*x - 0.7*x + 0.2)"
f2 = lambda x: 1 / exp(x**2 - 0.7*x + 0.2)
# You can also perform these calculations more directly
# using the FindMinimum or FindMaximum methods. This implicitly
# calls the FindBracket method.
result = optimizer.FindMaximum(f2, 0)
print " Maximum:", result
print " Actual maximum:", 0.35
print " Estimated error:", optimizer.EstimatedError
print " Actual error:", result - 0.35
print " # iterations:", optimizer.IterationsNeeded
#
# Golden section search
#
# A slower but simpler algorithm for finding an extremum
# is the golden section search. It is implemented by the
# GoldenSectionOptimizer class:
optimizer2 = GoldenSectionOptimizer()
# Maximum at x = 0.35
print "Using Golden Section optimizer:"
result = optimizer2.FindMaximum(f2, 0)
print " Maximum:", result
print " Actual maximum:", 0.35
print " Estimated error:", optimizer2.EstimatedError
print " Actual error:", result - 0.35
print " # iterations:", optimizer2.IterationsNeeded
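The two-phase workflow used above, first bracket an extremum, then refine it, can also be sketched in plain Python. The helper below is a simplified, hypothetical stand-in for the bracketing phase (the library's FindBracket method may behave differently): it steps downhill with geometrically growing strides until it finds a triple a < b < c with f(b) below both endpoints.

```python
def find_bracket(f, a, b, grow=1.618, max_iter=50):
    """Search downhill from (a, b) for a triple bracketing a minimum.

    Returns (lo, mid, hi) with f(mid) < f(lo) and f(mid) < f(hi),
    or None if no bracket was found within max_iter steps.
    """
    fa, fb = f(a), f(b)
    if fb > fa:
        # Make sure the step from a to b goes downhill.
        a, b = b, a
        fa, fb = fb, fa
    c = b + grow * (b - a)
    fc = f(c)
    for _ in range(max_iter):
        if fc > fb:
            # f decreased then increased: a minimum lies between a and c.
            return (a, b, c) if a < c else (c, b, a)
        # Still going downhill: shift the triple and grow the step.
        a, fa = b, fb
        b, fb = c, fc
        c = b + grow * (b - a)
        fc = f(c)
    return None  # e.g. the function is monotonic on the search path

# The quadratic x^2 - 0.7x + 0.2 has its minimum at x = 0.35.
bracket = find_bracket(lambda x: x*x - 0.7*x + 0.2, 0.0, 0.2)
print(bracket)  # a triple enclosing x = 0.35
```

Once a valid bracket is in hand, any interval-shrinking method (golden section, or Brent's combination of golden section and parabolic interpolation) can refine it to the desired tolerance, which is exactly the division of labor between FindBracket and FindExtremum in the sample above.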
Copyright © 2003-2021, Extreme Optimization. All rights reserved.
Extreme Optimization, Complexity made simple, M#, and M Sharp are trademarks of ExoAnalytics Inc.
Microsoft, Visual C#, Visual Basic, Visual Studio, Visual Studio.NET, and the Optimized for Visual Studio logo
are registered trademarks of Microsoft Corporation.