
## Automatic Differentiation

Many of the methods in the
**Extreme Optimization Numerical Libraries for .NET**
that use functions of one or more variables require derivatives and gradients.
Calculating these derivatives by hand can be tedious and error-prone.
They can also be approximated numerically;
this is the default behaviour if they are not supplied by the user.
However, numerical approximation has its own drawbacks in terms of performance and stability.

To address these problems, the
**Extreme Optimization Numerical Libraries for .NET**
includes the capability to compute these functions symbolically, given
a representation of the function as an expression tree.

Most mathematical functions in the .NET Framework Base Class Libraries, such as those in the Math class, as well as the functions in the Elementary and Special classes, are supported out of the box. Even though these functions are all that is needed in most applications, it may still be necessary to add more.

Symbolic derivatives are defined as lambda expressions. They have to be built using the classes in the System.Linq.Expressions namespace. Often, reflection data, manipulated using the types in the System.Reflection namespace, is also needed.

For static functions of one variable, like Sin(Double), the derivative is a lambda expression that takes one argument and returns the derivative of the function at that argument. For example, for Sin(Double), the derivative would be:

```
// The derivative of Math.Sin is Math.Cos:
MethodInfo cos = typeof(Math).GetMethod("Cos", new Type[] { typeof(double) });
ParameterExpression t = Expression.Parameter(typeof(Double));
LambdaExpression dsin = Expression.Lambda(Expression.Call(cos, t), t);
```

For static methods, there may be a derivative with respect to each of the arguments. Each such lambda expression takes the same arguments as the original function.
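As an illustration of this pattern, the partial derivatives of the two-argument Math.Pow(Double, Double) could be built as follows. This is a sketch using only the standard System.Linq.Expressions and System.Reflection types; it is not taken from the library itself:

```
// Math.Pow(x, y) has a partial derivative with respect to each argument.
MethodInfo pow = typeof(Math).GetMethod("Pow",
    new Type[] { typeof(double), typeof(double) });
MethodInfo log = typeof(Math).GetMethod("Log",
    new Type[] { typeof(double) });

ParameterExpression x = Expression.Parameter(typeof(double), "x");
ParameterExpression y = Expression.Parameter(typeof(double), "y");

// d/dx pow(x, y) = y * pow(x, y - 1); the lambda takes the same
// arguments (x, y) as the original function.
LambdaExpression dPowDx = Expression.Lambda(
    Expression.Multiply(y,
        Expression.Call(pow, x,
            Expression.Subtract(y, Expression.Constant(1.0)))),
    x, y);

// d/dy pow(x, y) = pow(x, y) * log(x), again with arguments (x, y).
LambdaExpression dPowDy = Expression.Lambda(
    Expression.Multiply(
        Expression.Call(pow, x, y),
        Expression.Call(log, x)),
    x, y);
```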

For instance methods, the same pattern is used, except that the lambda expression always takes the instance as its first argument, with the remaining arguments as before.
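For example, consider a hypothetical class Quadratic with a field Factor and an instance method double Value(double x) that returns Factor * x * x (the class is invented for illustration and is not part of the library). Its derivative with respect to x is a lambda that takes the instance first:

```
// Hypothetical type for illustration only.
class Quadratic
{
    public double Factor;
    public double Value(double x) { return Factor * x * x; }
}

// The derivative lambda takes the instance as its first argument,
// followed by the method's own arguments:
// d/dx q.Value(x) = 2 * q.Factor * x
ParameterExpression q = Expression.Parameter(typeof(Quadratic), "q");
ParameterExpression x = Expression.Parameter(typeof(double), "x");
LambdaExpression dValue = Expression.Lambda(
    Expression.Multiply(
        Expression.Multiply(Expression.Constant(2.0),
            Expression.Field(q, "Factor")),
        x),
    q, x);
```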

The SymbolicMath class has a method, DefineSymbolicDerivative, which can be used to define the derivative of a function directly. It takes three arguments. The first is the MethodInfo of the method whose derivative is being defined. The second is an integer that specifies the position of the argument with respect to which the derivative is taken. The third is a lambda expression that specifies the derivative in the form defined above.

For example, to define the derivative of Sin(Double) explicitly, we could write:

```
MethodInfo sin = typeof(Math).GetMethod("Sin", new Type[] { typeof(double) });
SymbolicMath.DefineSymbolicDerivative(sin, 0, dsin);
```

Alternatively, derivatives can be defined by the class they are declared in. This is the method used by the Elementary and Special classes.

When the symbolic derivative of a function is requested, the internal cache is queried first. If no match is found, the declaring type of the method is inspected to see if it has a GetSymbolicDerivative method. If it does, this method is called with two arguments: the first is the MethodInfo of the method whose derivative is requested; the second is an integer that specifies the position of the argument with respect to which the derivative is taken. The method should return the lambda expression for the derivative, or null if no derivative is available.
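A minimal sketch of a class that publishes its own derivatives this way might look as follows. The Square function is invented for illustration, and the exact GetSymbolicDerivative signature is assumed from the description above:

```
// Hypothetical static math class that publishes its own derivatives.
public static class MyFunctions
{
    public static double Square(double x) { return x * x; }

    // Queried when the derivative of a method declared in this class
    // is requested and is not found in the cache.
    public static LambdaExpression GetSymbolicDerivative(
        MethodInfo method, int argumentIndex)
    {
        if (method.Name == "Square" && argumentIndex == 0)
        {
            // d/dx x^2 = 2x
            ParameterExpression x = Expression.Parameter(typeof(double), "x");
            return Expression.Lambda(
                Expression.Multiply(Expression.Constant(2.0), x), x);
        }
        // No derivative available for this method/argument.
        return null;
    }
}
```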

The details of how to use automatic differentiation with specific classes are given in the documentation for those classes. Here we give some general guidelines only.

Most of the methods in the FunctionMath class have an equivalent method in the SymbolicMath class that makes use of automatic differentiation. Whereas the methods in FunctionMath are defined as extension methods, the methods in SymbolicMath are not, since doing so would create an ambiguity between the two methods that would cause a compiler error.

In addition, several methods in SymbolicMath have been overloaded to make them more convenient. For example, the FindMinimum method has overloads that work directly on functions of 2 to 15 arguments, rather than on a function that takes a vector argument. The following code snippet finds the minimum of the so-called Rosenbrock function using automatic differentiation:

```
SymbolicMath.FindMinimum(
    (x, y) => Math.Pow(1 - x, 2) + 100 * Math.Pow(y - x * x, 2),
    Vector.Create(-1.2, 1));
```

Nonlinear curves are special in that they often require the derivative with respect to each of the parameters. The NonlinearCurve class contains two methods that make this process very simple.

The FromExpression method creates a NonlinearCurve from a lambda expression. The first argument of the lambda is the x value. The remaining arguments correspond to the function parameters. The resulting curve object can be used directly for nonlinear curve fitting or other purposes.

The following code constructs a 3-parameter logistic curve:

```
NonlinearCurve curve = NonlinearCurve.FromExpression(
(x, a, b, c) => c / (1 + a * Math.Exp(b * x)));
```

The GetGeneratorFromExpression method is similar to FromExpression, but returns a function that creates instances of the curve object. This is useful in situations where more than one instance of the curve is required.
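For example, a generator for the logistic curve above might be used as follows. This is a sketch based on the description: the exact return type of GetGeneratorFromExpression is assumed here to be a parameterless delegate that produces a fresh curve on each call:

```
// Sketch: the Func<NonlinearCurve> return type is an assumption.
Func<NonlinearCurve> generator = NonlinearCurve.GetGeneratorFromExpression(
    (x, a, b, c) => c / (1 + a * Math.Exp(b * x)));

// Each call produces an independent curve instance,
// e.g. for fitting several data sets separately.
NonlinearCurve curve1 = generator();
NonlinearCurve curve2 = generator();
```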

Many algorithms for optimization and finding zeros use the derivative or gradient of the objective function and can make use of automatic differentiation. For example, optimization classes have a SymbolicObjectiveFunction property that can be used to specify the objective function as a lambda expression. Unlike the convenience methods in the SymbolicMath class, the lambda expression must be expressed as a function of a vector rather than a function of multiple variables.

In nonlinear programs, constraints can also be defined as lambda expressions using the AddSymbolicConstraint method. The gradient, which is needed during the calculations, is then computed symbolically.
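A minimal sketch combining both ideas might look as follows. The NonlinearProgram class name, its constructor, the vector indexing, and the exact AddSymbolicConstraint parameters are assumptions based on the description above, not a verbatim use of the library's API:

```
// Sketch only: member signatures other than SymbolicObjectiveFunction
// and AddSymbolicConstraint are assumed.
NonlinearProgram program = new NonlinearProgram(2);

// Objective as a lambda of a vector; its gradient is derived symbolically.
program.SymbolicObjectiveFunction =
    x => Math.Pow(1 - x[0], 2) + 100 * Math.Pow(x[1] - x[0] * x[0], 2);

// Constraint x0^2 + x1^2 <= 1, also given as a lambda expression;
// the gradient needed during the calculations is computed symbolically.
program.AddSymbolicConstraint(x => x[0] * x[0] + x[1] * x[1],
    ConstraintType.LessThanOrEqual, 1.0);
```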


Copyright © 2004-2023, Extreme Optimization. All rights reserved.

*Extreme Optimization*, *Complexity made simple*, *M#*, and *M Sharp* are trademarks of ExoAnalytics Inc.

*Microsoft*, *Visual C#*, *Visual Basic*, *Visual Studio*, *Visual Studio.NET*, and the *Optimized for Visual Studio* logo are registered trademarks of Microsoft Corporation.