Choosing a Solving Algorithm
You can choose a solving algorithm for the following functions: Find, Minerr, Minimize, Maximize, Pdesolve, Odesolve, numol, genfit, and polyroots, as well as for definite integrals.
• To view the algorithm, right-click the function name or the integral operator.
• To change the solving algorithm, select a different algorithm from the list.
Algorithm Selection for Definite Integrals
The following numerical integration methods are available:
• Romberg—Applicable for most integrals. Uses trapezoidal approximations over an even number of subintervals: each estimate is the sum of the trapezoid areas, and sequential estimates are compared as the number of subintervals is increased (see the sketch at the end of this section).
• Adaptive—Applicable for functions that change rapidly over the interval of integration. Also known as the adaptive quadrature method.
• Infinite Limit—Applicable for integrals where one or both of the limits are infinite. The function being integrated must be real.
• Singular Endpoint—Applicable for integrals with a singularity or an infinity at one or both limits of integration. Also known as the open-ended Romberg method.
Additional Information:
• If the absolute value of at least one of the integration limits is greater than 10^307, or if a limit is infinite, Auto Select uses Infinite Limit as the solving algorithm. In all other cases, it uses Adaptive.
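The following is a minimal Python/SciPy sketch of two of these ideas, not Mathcad syntax or Mathcad's internal implementation: a hand-rolled Romberg routine that sums trapezoid areas and compares sequential estimates, plus SciPy's adaptive quadrature (scipy.integrate.quad) for a rapidly varying integrand and for an infinite limit. The function names, integrands, and tolerances are illustrative assumptions.

import numpy as np
from scipy.integrate import quad

def romberg(f, a, b, max_levels=12, tol=1e-10):
    # R[k][j]: trapezoid estimate on 2**k subintervals, extrapolated j times
    R = [[0.5 * (b - a) * (f(a) + f(b))]]
    for k in range(1, max_levels):
        n = 2 ** k
        h = (b - a) / n
        mids = a + h * np.arange(1, n, 2)          # new sample points at this level
        row = [0.5 * R[k - 1][0] + h * np.sum(f(mids))]
        for j in range(1, k + 1):                  # Richardson extrapolation
            row.append(row[j - 1] + (row[j - 1] - R[k - 1][j - 1]) / (4 ** j - 1))
        R.append(row)
        if abs(R[k][k] - R[k - 1][k - 1]) < tol:   # compare sequential estimates
            return R[k][k]
    return R[-1][-1]

print(romberg(np.sin, 0.0, np.pi))                 # ~2.0

# adaptive quadrature handles integrands that change rapidly over the interval
val, err = quad(lambda x: np.exp(-x**2) * np.cos(50.0 * x), 0.0, 2.0)

# an infinite upper limit, the situation the Infinite Limit algorithm targets
tail, _ = quad(lambda x: np.exp(-x**2), 0.0, np.inf)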
Algorithm Selection for Find, Minerr, Minimize, and Maximize
For Find, Minerr, Minimize, and Maximize, the default algorithm used by Auto Select is the Nonlinear: Levenberg-Marquardt method.
• Linear—Applicable when the problem has a linear structure, that is, when the objective function and all the constraints are linear. This algorithm provides fast and precise results.
• Nonlinear: Levenberg-Marquardt—Attempts to find the zeros of the errors in the constraints. If no zeros are found, the method minimizes the sum of squares of the errors in the constraints (see the sketch after this list).
• Nonlinear: Conjugate Gradient—Factorizes a projection matrix, and applies the conjugate gradient method to approximately minimize a quadratic model of the barrier problem.
• Nonlinear: SQP—This active-set method solves a sequence of quadratic programming subproblems to find the solution.
• Nonlinear: Interior Point—This method replaces the nonlinear programming problem by a series of barrier subproblems controlled by a barrier parameter.
• Nonlinear: Active Set—Active set methods solve a sequence of subproblems based on a quadratic model of the original problem.
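The following is a minimal SciPy sketch of the Levenberg-Marquardt idea behind Find and Minerr, not Mathcad's solve-block syntax: the constraint errors are written as residuals, and the solver drives them toward zero or, failing that, minimizes their sum of squares. The example system and guess values are illustrative assumptions.

from scipy.optimize import least_squares

# constraints x**2 + y**2 = 4 and x*y = 1, written as residuals (constraint errors)
def residuals(v):
    x, y = v
    return [x**2 + y**2 - 4.0, x * y - 1.0]

# Levenberg-Marquardt starts from the guess values and minimizes the sum of squared residuals
sol = least_squares(residuals, [1.0, 1.0], method='lm')
print(sol.x)   # one root of the system; a different guess can return a different root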
Troubleshooting:
• Try a different method. A particular method may work better or worse than the others on the problem you are attempting to solve.
• Try a different guess value or add an inequality constraint. Provide a complex guess value if you are solving for a complex solution.
• Use Minerr instead of Find to reach an approximate solution.
For systems with more than one solution, the returned solution depends on the guess values. You can add inequality constraints to force the solver to find a different solution.
Algorithm Selection for Pdesolve and numol
For Pdesolve and numol, the default solving algorithm is Recursive 5-Point Differences. Each of these algorithms reduces the PDE system to an ODE system in the time variable, which is then solved using the Radau method (see the sketch after the list below).
• Polynomial—Uses polynomial approximations of the first- and second-order spatial derivatives to reduce the PDE system to an ODE system in the time variable.
• Central Differences—Uses central-difference approximations of the first-order spatial derivatives, applied recursively for the second-order spatial derivatives, to reduce the PDE system to an ODE system in the time variable.
• 5-Point Differences—Uses 5-point difference approximations of the first-order spatial derivatives, with a separate approximation for the second-order spatial derivatives, to reduce the PDE system to an ODE system in the time variable.
• Recursive 5-Point Differences—Uses 5-point difference approximations of the first-order spatial derivatives, applied recursively for the second-order spatial derivatives, to reduce the PDE system to an ODE system in the time variable.
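The following is a minimal NumPy/SciPy method-of-lines sketch, not Mathcad's exact discretization: central differences in space reduce a 1-D heat equation to an ODE system in the time variable, which is then integrated with a Radau solver. The grid size, equation, and boundary conditions are illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp

# 1-D heat equation u_t = u_xx on [0, 1] with u(0, t) = u(1, t) = 0
n = 41
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

def rhs(t, u):
    du = np.zeros_like(u)
    # central-difference approximation of the second spatial derivative
    du[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return du                                  # boundary rows stay fixed at 0

u0 = np.sin(np.pi * x)                          # initial profile
sol = solve_ivp(rhs, (0.0, 0.1), u0, method='Radau')
print(sol.y[:, -1].max())                       # ~exp(-pi**2 * 0.1) ≈ 0.373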
Algorithm Selection for Odesolve
• Adams/BDF (default)—For non-stiff systems, Odesolve calls the Adams solver, which uses Adams-Bashforth methods. If Odesolve detects that the system of ODEs is stiff, it switches to the BDF (Backward Differentiation Formula) solver. See the sketch after this list.
• Fixed—Calls the rkfixed solver that uses a fixed-step Runge-Kutta method.
• Adaptive—Calls the Rkadapt solver that uses a Runge-Kutta method with adaptive step size.
• Radau—For systems that are stiff or have algebraic constraints, Odesolve calls the Radau solver.
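The following is a SciPy sketch of the same solver choices, not the Mathcad solvers themselves: LSODA plays the role of the Adams/BDF switching default, RK45 the adaptive-step Runge-Kutta, and Radau the stiff solver (SciPy has no fixed-step counterpart to rkfixed, so Fixed is not shown). The test equation is an illustrative stiff problem.

import numpy as np
from scipy.integrate import solve_ivp

# a classic stiff test problem: y' = -1000*(y - cos(t)) - sin(t)
def f(t, y):
    return -1000.0 * (y - np.cos(t)) - np.sin(t)

y0, t_span = [1.0], (0.0, 1.0)
adams_bdf = solve_ivp(f, t_span, y0, method='LSODA')   # switches between Adams and BDF
adaptive  = solve_ivp(f, t_span, y0, method='RK45')    # adaptive-step Runge-Kutta
stiff     = solve_ivp(f, t_span, y0, method='Radau')   # implicit solver for stiff systems
print(len(adaptive.t), len(stiff.t))                   # the explicit solver needs far more steps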
Algorithm Selection for genfit
• Optimized Levenberg-Marquardt (default)—This optimized version of the Levenberg-Marquardt minimization method is frequently faster and less sensitive to inaccurate guess values, but it is more sensitive to errors in the supplied algebraic derivatives.
• Levenberg-Marquardt—This minimization method is used to solve nonlinear least-squares problems. Use it with genfit for well-behaved functions and accurate guess values (a sketch follows this list).
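The following is a SciPy analogue of a genfit call, not Mathcad syntax: a Levenberg-Marquardt least-squares fit of a nonlinear model to data, starting from guess values. The model, data, and noise level are illustrative assumptions.

import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):                      # nonlinear model with unknown parameters a and b
    return a * np.exp(-b * x)

xdata = np.linspace(0.0, 4.0, 30)
rng = np.random.default_rng(0)
ydata = model(xdata, 2.5, 1.3) + 0.02 * rng.standard_normal(xdata.size)

# Levenberg-Marquardt least-squares fit, starting from the guess values (1, 1)
params, cov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0], method='lm')
print(params)                            # close to (2.5, 1.3)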
Algorithm Selection for polyroots
• LaGuerre (default)—This method is iterative and searches for solutions in the complex plane.
• Companion Matrix—This method converts the polynomial into an eigenvalue problem for its companion matrix (see the sketch below).
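The following is a NumPy sketch of the companion-matrix approach, not Mathcad syntax: the polynomial's coefficients define a companion matrix whose eigenvalues are the roots, and numpy.roots uses the same idea. The example polynomial is an illustrative assumption; the LaGuerre iteration is not shown.

import numpy as np

# coefficients of p(x) = x**3 - 6x**2 + 11x - 6, highest power first
c = np.array([1.0, -6.0, 11.0, -6.0])

# build the companion matrix of the monic polynomial; its eigenvalues are the roots
n = len(c) - 1
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)          # ones on the subdiagonal
C[:, -1] = -c[:0:-1] / c[0]         # last column: -[a0, a1, ..., a_{n-1}] of the monic polynomial
print(np.linalg.eigvals(C))         # roots 1, 2, 3, in some order
print(np.roots(c))                  # numpy.roots solves the same eigenvalue problem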