Version 1.1.7097.23301
Functions
Additional components that add new mathematical functions to SMath Studio, needed for solving problems from various fields.
-
BDQRF("1:function", "2:condition", "3:condition")
Bisected Direct Quadratic Regula Falsi root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of function precision. -
BDQRF("1:function", "2:condition", "3:condition", "4:condition")
Bisected Direct Quadratic Regula Falsi root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision. -
BDQRF("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Bisected Direct Quadratic Regula Falsi root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. -
BDQRF("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Bisected Direct Quadratic Regula Falsi root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. A nonzero "6:number" sets a custom maximum number of iterations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
Bisection("1:function", "2:condition", "3:condition")
Bisection root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of function precision. -
Bisection("1:function", "2:condition", "3:condition", "4:condition")
Bisection root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision. -
Bisection("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Bisection root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. -
Bisection("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:variable", "7:variable", "8:variable")
Bisection root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. A nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
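Example (a hypothetical usage sketch, not taken from the plugin documentation; the exact argument syntax should be verified on a worksheet): Bisection(x^2-2, 1, 2) would be expected to return the root of x^2-2 on [1; 2], approximately 1.4142; BDQRF above and Brent below take the same bracketing arguments. -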
Brent("1:function", "2:condition", "3:condition")
Brent's root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of function precision. -
Brent("1:function", "2:condition", "3:condition", "4:condition")
Brent's root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision. -
Brent("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Brent's root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. -
Brent("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Brent's root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. A nonzero "6:number" sets a custom maximum number of iterations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
Broyden("1:function", "2:condition")
Broyden's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of function(s) precision. -
Broyden("1:function", "2:condition", "3:condition")
Broyden's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision. -
Broyden("1:function", "2:condition", "3:condition", "4:condition")
Broyden's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
Broyden("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Broyden's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
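Example (hypothetical sketch; the vector and initial-guess syntax in particular should be verified): for the system x^2+y^2-4=0 and x-y=0, a call of the form Broyden(mat(x^2+y^2-4, x-y, 2, 1), mat(x=1, y=1, 2, 1)) would be expected to converge to x=y≈1.4142. -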
FindRoot("1:function", "2:condition")
Finds the root(s) of the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of function(s) precision. -
FindRoot("1:function", "2:condition", "3:condition")
Finds the root(s) of the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision. -
FindRoot("1:function", "2:condition", "3:condition", "4:condition")
Finds the root(s) of the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
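Example (hypothetical sketch; argument syntax may differ on a real worksheet): FindRoot(x^3-x-2, x=1.5) would be expected to return the real root near 1.5214; the same initial-guess form extends to systems of equations. -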
GaussNewton.CD("1:function", "2:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
GaussNewton.CD("1:function", "2:condition", "3:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
GaussNewton.CD("1:function", "2:condition", "3:condition", "4:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. The algorithm uses a constant step length. -
GaussNewton.CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
GaussNewton.GSS;CD("1:function", "2:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS;CD("1:function", "2:condition", "3:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS;CD("1:function", "2:condition", "3:condition", "4:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS;CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Gauss-Newton optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS("1:function", "2:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS("1:function", "2:condition", "3:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton.GSS("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GaussNewton("1:function", "2:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
GaussNewton("1:function", "2:condition", "3:condition")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
GaussNewton("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gauss-Newton optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
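Example (hypothetical sketch; the residual-vector and guess syntax should be verified): for the residuals x-3 and y+1, a call of the form GaussNewton(mat(x-3, y+1, 2, 1), mat(x=0, y=0, 2, 1)) would be expected to converge to x≈3, y≈-1 (the least-squares minimum); the .CD variants differ only in approximating derivatives by central differences, and the .GSS variants in choosing the step length by Golden Section Search. -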
GoldenSectionSearch.max("1:function", "2:condition", "3:condition")
Golden Section Search extremum (maximum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of variable precision. -
GoldenSectionSearch.max("1:function", "2:condition", "3:condition", "4:condition")
Golden Section Search extremum (maximum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" variable precision. -
GoldenSectionSearch.max("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Golden Section Search extremum (maximum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" variable precision. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
GoldenSectionSearch.min("1:function", "2:condition", "3:condition")
Golden Section Search extremum (minimum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of variable precision. -
GoldenSectionSearch.min("1:function", "2:condition", "3:condition", "4:condition")
Golden Section Search extremum (minimum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" variable precision. -
GoldenSectionSearch.min("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Golden Section Search extremum (minimum) finding for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" variable precision. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
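Example (hypothetical sketch): GoldenSectionSearch.min((x-2)^2, 0, 5) would be expected to locate the minimum near x=2, and GoldenSectionSearch.max(4-(x-2)^2, 0, 5) the maximum at the same abscissa. -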
Gradient.CD("1:function", "2:variable")
Numerical first-order central differences of "1:function" evaluated at "2:variable"; returns gradients (1st-order derivatives). -
Gradient.CD("1:function", "2:variable", "3:variable")
Numerical first-order central differences of "1:function" evaluated at "2:variable" using a "3:variable" perturbation; returns gradients (1st-order derivatives). -
Gradient("1:function", "2:variable")
First-order derivatives of "1:function" evaluated at "2:variable"; returns gradients (1st-order derivatives). -
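Example (hypothetical sketch; how the evaluation point is passed should be verified): Gradient(x^2*y, mat(2, 3, 2, 1)) would be expected to return the gradient of x^2*y at (x, y)=(2, 3), i.e. values close to (12; 4); Gradient.CD approximates the same vector by central differences. -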
GradientAscent.GSS("1:function", "2:condition")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientAscent.GSS("1:function", "2:condition", "3:condition")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientAscent.GSS("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientAscent("1:function", "2:condition")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
GradientAscent("1:function", "2:condition", "3:condition")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
GradientAscent("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gradient ascent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
GradientDescent.GSS("1:function", "2:condition")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientDescent.GSS("1:function", "2:condition", "3:condition")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientDescent.GSS("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
GradientDescent("1:function", "2:condition")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
GradientDescent("1:function", "2:condition", "3:condition")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
GradientDescent("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Gradient descent optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
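Example (hypothetical sketch; guess syntax to be verified): GradientDescent(x^2+y^2, mat(x=1, y=1, 2, 1)) would be expected to converge to the minimum at the origin; GradientAscent works identically toward a maximum, and the .GSS variants replace the constant step with a Golden Section Search line search. -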
Hessian.CD("1:function", "2:variable")
Numerical second-order central differences of "1:function" evaluated at "2:variable"; returns Hessians (2nd-order derivatives). -
Hessian.CD("1:function", "2:variable", "3:variable")
Numerical second-order central differences of "1:function" evaluated at "2:variable" using a "3:variable" perturbation; returns Hessians (2nd-order derivatives). -
Hessian("1:function", "2:variable")
Second-order derivatives of "1:function" evaluated at "2:variable"; returns Hessians (2nd-order derivatives). -
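Example (hypothetical sketch; how the evaluation point is passed should be verified): Hessian(x^2*y, mat(2, 3, 2, 1)) would be expected to return the 2x2 matrix of second derivatives at (2, 3), with entries close to (6, 4; 4, 0); Hessian.CD approximates the same matrix by central differences. -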
HRE.B("1:function", "2:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Broyden's algorithm; the calculation has at least 4 decimal places of function(s) precision. -
HRE.B("1:function", "2:condition", "3:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Broyden's algorithm; the calculation has at least "3:condition" function(s) precision. -
HRE.B("1:function", "2:condition", "3:condition", "4:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Broyden's algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
HRE.B("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Broyden's algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:number" sets a custom number of homotopy transformations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
HRE.NR;CD("1:function", "2:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm with central differences; the calculation has at least 4 decimal places of function(s) precision. -
HRE.NR;CD("1:function", "2:condition", "3:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm with central differences; the calculation has at least "3:condition" function(s) precision. -
HRE.NR;CD("1:function", "2:condition", "3:condition", "4:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
HRE.NR;CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. -
HRE.NR;CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. A nonzero "6:number" sets a custom number of homotopy transformations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
HRE.NR("1:function", "2:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm; the calculation has at least 4 decimal places of function(s) precision. -
HRE.NR("1:function", "2:condition", "3:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm; the calculation has at least "3:condition" function(s) precision. -
HRE.NR("1:function", "2:condition", "3:condition", "4:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
HRE.NR("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using Newton's algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:number" sets a custom number of homotopy transformations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
HRE.RK;CD("1:function", "2:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm with central differences; the calculation has at least 4 decimal places of function(s) precision. -
HRE.RK;CD("1:function", "2:condition", "3:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm with central differences; the calculation has at least "3:condition" function(s) precision. -
HRE.RK;CD("1:function", "2:condition", "3:condition", "4:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
HRE.RK;CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. -
HRE.RK;CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm with central differences; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. A nonzero "6:number" sets a custom number of homotopy transformations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
HRE.RK("1:function", "2:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm; the calculation has at least 4 decimal places of function(s) precision. -
HRE.RK("1:function", "2:condition", "3:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm; the calculation has at least "3:condition" function(s) precision. -
HRE.RK("1:function", "2:condition", "3:condition", "4:condition")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
HRE.RK("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Homotopy root-estimation method for the function(s) "1:function", given an initial guess "2:condition" for each variable, using the 4th-order Runge-Kutta algorithm; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:number" sets a custom number of homotopy transformations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
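Example (hypothetical sketch): HRE.NR(x^3-x-2, x=10) would be expected to reach the root near 1.5214 even from a distant starting point, by deforming an easy auxiliary problem into the target one over a sequence of homotopy transformations; the .B, .RK and ;CD suffixes only change the algorithm used along the way (Broyden, 4th-order Runge-Kutta, or central differences). -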
Jacobian.CD("1:function", "2:variable")
Numerical first-order central differences of "1:function" evaluated at "2:variable"; returns Jacobians (1st-order derivatives). -
Jacobian.CD("1:function", "2:variable", "3:variable")
Numerical first-order central differences of "1:function" evaluated at "2:variable" using a "3:variable" perturbation; returns Jacobians (1st-order derivatives). -
Jacobian("1:function", "2:variable")
First-order derivatives of "1:function" evaluated at "2:variable"; returns Jacobians (1st-order derivatives). -
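Example (hypothetical sketch; the vector syntax should be verified): Jacobian(mat(x^2*y, x+y, 2, 1), mat(2, 3, 2, 1)) would be expected to return the 2x2 Jacobian of the two functions at (2, 3), with rows close to (12, 4) and (1, 1); Jacobian.CD uses central differences instead of symbolic derivatives. -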
LevenbergMarquardt.CD("1:function", "2:condition")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
LevenbergMarquardt.CD("1:function", "2:condition", "3:condition")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
LevenbergMarquardt.CD("1:function", "2:condition", "3:condition", "4:condition")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. The algorithm uses a constant step length. -
LevenbergMarquardt.CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
LevenbergMarquardt("1:function", "2:condition")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a constant step length. -
LevenbergMarquardt("1:function", "2:condition", "3:condition")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a constant step length. -
LevenbergMarquardt("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Levenberg-Marquardt optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a constant step length. -
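Example (hypothetical sketch; syntax as in the GaussNewton example above): LevenbergMarquardt(mat(x-3, y+1, 2, 1), mat(x=0, y=0, 2, 1)) would be expected to converge to x≈3, y≈-1, behaving like Gauss-Newton with an added damping term that generally improves robustness far from the solution. -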
mapUnknowns("1:function", "2:condition")
Symbolic variable mapping; returns a vector of the unassigned variables/elements contained in "1:function", according to the "2:condition" pattern. -
mapUnknowns("1:function", "2:condition", "3:name")
Symbolic variable mapping; returns a vector of the unassigned elements contained in "1:function", according to the "2:condition" pattern, using "3:name" as the unknown name. -
NCGM.CD("1:function", "2:condition")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM.CD("1:function", "2:condition", "3:condition")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM.CD("1:function", "2:condition", "3:condition", "4:condition")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM.CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM("1:function", "2:condition")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM("1:function", "2:condition", "3:condition")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NCGM("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Nonlinear Conjugate Gradient Method optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
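Example (hypothetical sketch; guess syntax to be verified): NCGM(x^2+10*y^2, mat(x=5, y=5, 2, 1)) would be expected to reach the minimum at the origin, typically in fewer iterations than plain gradient descent on such an elongated quadratic. -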
NelderMead("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:condition", "7:number", "8:variable", "9:variable", "10:variable")
Nelder-Mead optimization algorithm for the function(s) "1:function", given an initial simplex or an initial guess "2:condition"; the calculation has "3:condition" standard-deviation precision for the function(s) on the simplex. A nonzero "4:condition" sets a custom reflection coefficient, a nonzero "5:condition" sets a custom contraction coefficient, and a nonzero "6:condition" sets a custom expansion coefficient. A nonzero "7:number" sets a custom maximum number of iterations, a nonzero "8:variable" displays the number of iterations performed, a nonzero "9:variable" displays a step-by-step summary, and a nonzero "10:variable" saves a CSV summary to the current working directory. -
NelderMead("1:function", "2:condition")
Nelder-Mead optimization algorithm for the function(s) "1:function", given an initial simplex or an initial guess "2:condition"; the calculation has at least 4 decimal places of standard-deviation precision for the function(s) on the simplex. -
NelderMead("1:function", "2:condition", "3:condition")
Nelder-Mead optimization algorithm for the function(s) "1:function", given an initial simplex or an initial guess "2:condition"; the calculation has "3:condition" standard-deviation precision for the function(s) on the simplex. -
NelderMead("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:condition")
Nelder-Mead optimization algorithm for the function(s) "1:function", given an initial simplex or an initial guess "2:condition"; the calculation has "3:condition" standard-deviation precision for the function(s) on the simplex. A nonzero "4:condition" sets a custom reflection coefficient, a nonzero "5:condition" sets a custom contraction coefficient, and a nonzero "6:condition" sets a custom expansion coefficient. -
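Example (hypothetical sketch; the initial-simplex syntax in particular should be verified): NelderMead((x-1)^2+(y-2)^2, mat(x=0, y=0, 2, 1)) would be expected to shrink its simplex around the minimum at (1, 2) without using any derivatives; the optional coefficients override the default reflection, contraction and expansion factors. -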
NewtonMethod.CD("1:function", "2:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. -
NewtonMethod.CD("1:function", "2:condition", "3:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. -
NewtonMethod.CD("1:function", "2:condition", "3:condition", "4:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. -
NewtonMethod.CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
NewtonMethod.GSS;CD("1:function", "2:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS;CD("1:function", "2:condition", "3:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS;CD("1:function", "2:condition", "3:condition", "4:condition")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS;CD("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Newton's optimization algorithm for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:condition" sets a custom perturbation. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS("1:function", "2:condition")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS("1:function", "2:condition", "3:condition")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod.GSS("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. The algorithm uses a step length based on a Golden Section Search line-search strategy. -
NewtonMethod("1:function", "2:condition")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of variable(s) precision. -
NewtonMethod("1:function", "2:condition", "3:condition")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. -
NewtonMethod("1:function", "2:condition", "3:condition", "4:number", "5:variable", "6:variable", "7:variable")
Newton's optimization algorithm for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has "3:condition" variable(s) precision. A nonzero "4:number" sets a custom maximum number of iterations, a nonzero "5:variable" displays the number of iterations performed, a nonzero "6:variable" displays a step-by-step summary, and a nonzero "7:variable" saves a CSV summary to the current working directory. -
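Example (hypothetical sketch): NewtonMethod(x^4-3*x^2+x, x=1) would be expected to converge to the stationary point near x≈1.131, where the derivative 4*x^3-6*x+1 vanishes; the .CD variants approximate the required derivatives by central differences and the .GSS variants add a Golden Section Search line search. -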
NewtonRaphson.CD("1:function", "2:condition")
Newton's root-finding method for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of function(s) precision. -
NewtonRaphson.CD("1:function", "2:condition", "3:condition")
Newton's root-finding method for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision. -
NewtonRaphson.CD("1:function", "2:condition", "3:condition", "4:condition")
Newton's root-finding method for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
NewtonRaphson.CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Newton's root-finding method for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. -
NewtonRaphson.CD("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Newton's root-finding method for the function(s) "1:function" using central differences, given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:condition" sets a custom perturbation. A nonzero "6:number" sets a custom maximum number of iterations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
NewtonRaphson("1:function", "2:condition")
Newton's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least 4 decimal places of function(s) precision. -
NewtonRaphson("1:function", "2:condition", "3:condition")
Newton's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision. -
NewtonRaphson("1:function", "2:condition", "3:condition", "4:condition")
Newton's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. -
NewtonRaphson("1:function", "2:condition", "3:condition", "4:condition", "5:number", "6:variable", "7:variable", "8:variable")
Newton's root-finding method for the function(s) "1:function", given an initial guess "2:condition" for each variable; the calculation has at least "3:condition" function(s) precision or "4:condition" variable(s) precision. A nonzero "5:number" sets a custom maximum number of iterations, a nonzero "6:variable" displays the number of iterations performed, a nonzero "7:variable" displays a step-by-step summary, and a nonzero "8:variable" saves a CSV summary to the current working directory. -
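Example (hypothetical sketch): NewtonRaphson(cos(x)-x, x=1) would be expected to return the solution of cos(x)=x near 0.7391; NewtonRaphson.CD gives the same result with derivatives approximated by central differences. -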
Ridder("1:function", "2:condition", "3:condition")
Ridders' root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of function precision. -
Ridder("1:function", "2:condition", "3:condition", "4:condition")
Ridders' root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision. -
Ridder("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Ridders' root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. -
Ridder("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Ridders' root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. A nonzero "6:number" sets a custom maximum number of iterations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
Secant("1:function", "2:condition", "3:condition")
Secant root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least 4 decimal places of function precision. -
Secant("1:function", "2:condition", "3:condition", "4:condition")
Secant root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision. -
Secant("1:function", "2:condition", "3:condition", "4:condition", "5:condition")
Secant root-finding method for the function "1:function", given a pair of delimiters "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. -
Secant("1:function", "2:condition", "3:condition", "4:condition", "5:condition", "6:number", "7:variable", "8:variable", "9:variable")
Secant root-finding method for the function "1:function", given a pair of initial guesses "2:condition" and "3:condition"; the calculation has at least "4:condition" function precision or "5:condition" variable precision. A nonzero "6:number" sets a custom maximum number of iterations, a nonzero "7:variable" displays the number of iterations performed, a nonzero "8:variable" displays a step-by-step summary, and a nonzero "9:variable" saves a CSV summary to the current working directory. -
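Example (hypothetical sketch): Secant(exp(-x)-x, 0, 1) would be expected to return the root near 0.5671, using the two supplied points to build the first secant; Ridder above uses the same bracketing call pattern. -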
Taylor("1:function", "2:variable", "3:number")
Taylor series expansion of "1:function" about the point "2:variable", up to the "3:number"-th order. -
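Example (hypothetical sketch; how the expansion point is specified should be verified): expanding sin(x) about 0 up to order 5 with Taylor(sin(x), 0, 5) would be expected to give x-x^3/6+x^5/120. -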
Unknowns("1:variable")
Variable detection; returns a vector of the unassigned variables contained in "1:variable".
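Example (hypothetical sketch): with a, b and x not assigned on the worksheet, Unknowns(a*x^2+b) would be expected to return a vector containing a, b and x.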