
# Optimization In One Dimension QuickStart Sample (F#)

Illustrates the use of the Brent and Golden Section optimizer classes in the Extreme.Mathematics.Optimization namespace for one-dimensional optimization in F#.

```fsharp
// Illustrates the use of the Brent and Golden Section optimizers
// in the Extreme.Mathematics.Optimization namespace of the
// Extreme Optimization Mathematics Library for .NET.

#light

open System

// The optimization classes reside in the
// Extreme.Mathematics.Optimization namespace.
open Extreme.Mathematics.Optimization
// Function delegates reside in the Extreme.Mathematics
// namespace.
open Extreme.Mathematics

// Several algorithms exist for optimizing functions
// in one variable. The most common one is
// Brent's algorithm.

// The function we are trying to minimize is called the
// objective function and must be provided as a Func<double, double>.

//
// Brent's algorithm
//

// Now let's create the BrentOptimizer object.
let optimizer = BrentOptimizer()

// Set the objective function:
optimizer.ObjectiveFunction <- Func<_,_> (fun x -> x * x * x - 2.0 * x - 5.0)

// Optimizers can find either a minimum or a maximum.
// Which of the two is found is specified by the
// ExtremumType property:
optimizer.ExtremumType <- ExtremumType.Minimum

// The first phase is to find an interval that contains
// a local minimum. This is done by the FindBracket method.
optimizer.FindBracket(0.0, 3.0)
// You can verify that an interval was found from the
// IsBracketValid property:
if not optimizer.IsBracketValid then
    failwith "An invalid bracket was found."

// Finally, we can run the optimizer by calling the FindExtremum method:
let extremum = optimizer.FindExtremum()

printfn "Function 1: x^3 - 2x - 5"
// The Status property indicates
// the result of running the algorithm.
printfn "  Status: %A" optimizer.Status
// The result is available through the
// Result property.
printfn "  Minimum: %A" optimizer.Result
let exactResult = sqrt(2.0/3.0)
let result = optimizer.Extremum
printfn "  Exact minimum: %A" exactResult

// You can find out the estimated error of the result
// through the EstimatedError property:
printfn "  Estimated error: %A" optimizer.EstimatedError
printfn "  Actual error: %A" (abs(result - exactResult))
printfn "  # iterations: %d" optimizer.IterationsNeeded

printfn "Function 2: 1/Exp(x*x - 0.7*x + 0.2)"
// You can also perform these calculations more directly
// using the FindMinimum or FindMaximum methods. This implicitly
// calls the FindBracket method.
let f2 = Func<_,_> (fun x -> 1.0 / exp(x*x - 0.7*x + 0.2))
let result2 = optimizer.FindMaximum(f2, 0.0)
printfn "  Maximum: %A" result2
printfn "  Actual maximum: %A" 0.35
printfn "  Estimated error: %A" optimizer.EstimatedError
printfn "  Actual error: %A" (abs(result2 - 0.35))
printfn "  # iterations: %d" optimizer.IterationsNeeded

//
// Golden section search
//

// A slower but simpler algorithm for finding an extremum
// is the golden section search. It is implemented by the
// GoldenSectionOptimizer class:
let optimizer2 = GoldenSectionOptimizer()

printfn "Using Golden Section optimizer:"
let result3 = optimizer2.FindMaximum(f2, 0.0)
printfn "  Maximum: %A" result3
printfn "  Actual maximum: %A" 0.35
printfn "  Estimated error: %A" optimizer2.EstimatedError
printfn "  Actual error: %A" (abs(result3 - 0.35))
printfn "  # iterations: %d" optimizer2.IterationsNeeded
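
// As an illustration only (not part of the library's API), the
// golden section search is simple enough to sketch by hand: it
// repeatedly shrinks a bracketing interval [a, b] by the golden
// ratio, keeping the subinterval that must contain the minimum.
let goldenSectionMinimize (f : float -> float) a b tolerance =
    let phi = (sqrt 5.0 - 1.0) / 2.0
    let rec loop a b =
        if b - a < tolerance then (a + b) / 2.0
        else
            let c = b - phi * (b - a)
            let d = a + phi * (b - a)
            // Keep the subinterval whose interior point is lower.
            if f c < f d then loop a d else loop c b
    loop a b

// Maximizing f2 is equivalent to minimizing its negation; the
// result should again be close to 0.35:
let handRolled =
    goldenSectionMinimize (fun x -> -(1.0 / exp(x*x - 0.7*x + 0.2))) 0.0 1.0 1e-6
printfn "  Hand-rolled golden section: %A" handRolled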

printf "Press Enter key to exit..."
Console.ReadLine() |> ignore
```