API — Objective Functions
Retro supports three levels of user control over derivatives.
Abstract type
Retro.AbstractObjectiveFunction — Type
`AbstractObjectiveFunction`

Supertype for all Retro objective functions.
Concrete subtypes: ADObjectiveFunction, GradientObjectiveFunction, AnalyticObjectiveFunction.
AD-only objective
Retro.ADObjectiveFunction — Type
`ADObjectiveFunction{F,ADT,PG,PH} <: AbstractObjectiveFunction`

Objective function with automatic differentiation via DifferentiationInterface. Precomputes gradient and Hessian preparation objects at construction time so that repeated evaluations are allocation-free.
Works with any AD backend that DifferentiationInterface supports (ForwardDiff, Enzyme, Zygote, …) as long as the objective is compatible.
Fields
- `func::F`: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
- `adtype::ADT`: AD backend (e.g. `AutoForwardDiff()`)
- `prep_g::PG`: Prepared gradient operator (from `prepare_gradient`)
- `prep_h::PH`: Prepared Hessian operator (from `prepare_hessian`)
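A minimal sketch of the preparation pattern this type builds on, using DifferentiationInterface directly. The `prepare_gradient`/`prepare_hessian` calls are real DifferentiationInterface operators; the `ADObjectiveFunction(...)` construction at the end is an assumed signature inferred from the field order above, not verified Retro API.

```julia
using DifferentiationInterface
import ForwardDiff  # makes the AutoForwardDiff() backend usable

f(x) = sum(abs2, x)           # simple quadratic test objective
backend = AutoForwardDiff()
x0 = zeros(3)                 # representative input; fixes the dimension

# One-time preparation, done at construction in ADObjectiveFunction:
prep_g = prepare_gradient(f, backend, x0)
prep_h = prepare_hessian(f, backend, x0)

# Repeated evaluations reuse the prep objects:
g = similar(x0)
gradient!(f, g, prep_g, backend, [1.0, 2.0, 3.0])  # fills g with 2x

# Hypothetical construction, following the documented field order:
# obj = ADObjectiveFunction(f, backend, prep_g, prep_h)
```

Note that DifferentiationInterface's own `gradient!` takes the function, backend, and prep object directly; Retro's `gradient!` methods documented below instead dispatch on a `cache` and an objective wrapper.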
User gradient + AD Hessian
Retro.GradientObjectiveFunction — Type
`GradientObjectiveFunction{F,G,ADT,PH} <: AbstractObjectiveFunction`

Objective with a user-supplied gradient and an AD-computed Hessian.
Fields
- `func::F`: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
- `grad!::G`: In-place gradient `grad!(g, x)`
- `adtype::ADT`: AD backend used only for Hessian computation
- `prep_h::PH`: Prepared Hessian operator
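A short sketch of the in-place gradient convention this type expects: `grad!(g, x)` writes the gradient into its first argument. The commented-out `GradientObjectiveFunction(...)` call is an assumed constructor signature following the field order above; `prep_h` would come from `prepare_hessian` as in the AD-only case.

```julia
f(x) = sum(abs2, x)
grad!(g, x) = (g .= 2 .* x; g)   # analytic gradient, in-place convention grad!(g, x)

g = zeros(3)
grad!(g, [1.0, 2.0, 3.0])        # g now holds [2.0, 4.0, 6.0]

# Hypothetical construction; the AD backend is used for the Hessian only:
# obj = GradientObjectiveFunction(f, grad!, AutoForwardDiff(), prep_h)
```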
Fully analytic
Retro.AnalyticObjectiveFunction — Type
`AnalyticObjectiveFunction{F,G,H} <: AbstractObjectiveFunction`

Fully user-supplied objective, gradient, and Hessian. No AD dependency.
Fields
- `func::F`: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
- `grad!::G`: In-place gradient `grad!(g, x)`
- `hess!::H`: In-place Hessian `hess!(H, x)`
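A sketch of a fully analytic triple for the quadratic $f(x) = \tfrac{1}{2} x^\top A x$, which needs no AD packages at all. The `AnalyticObjectiveFunction(...)` call at the end is an assumed constructor signature matching the field order above.

```julia
A = [2.0 1.0; 1.0 3.0]                  # fixed symmetric matrix
f(x) = 0.5 * x' * A * x
grad!(g, x) = (g .= A * x; g)           # ∇f(x) = A x
hess!(H, x) = (H .= A; H)               # ∇²f(x) = A, constant for a quadratic

g = zeros(2); H = zeros(2, 2)
grad!(g, [1.0, 1.0])                    # g == [3.0, 4.0]
hess!(H, [1.0, 1.0])

# Hypothetical construction:
# obj = AnalyticObjectiveFunction(f, grad!, hess!)
```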
Evaluation interface
These functions are used internally by the optimizer. You normally do not need to call them yourself.
Retro.objfunc! — Function
`objfunc!(cache, obj, x) -> f(x)`

Evaluate the objective function at `x`, incrementing the call counter in `cache`.
DifferentiationInterface.gradient! — Function
`gradient!(g, cache, obj, x)`

Compute the gradient of `obj` at `x` and store it in `g` (in-place).
DifferentiationInterface.hessian! — Function
`hessian!(H, cache, obj, x)`

Compute the Hessian of `obj` at `x` and store it in the matrix `H` (in-place).
DifferentiationInterface.value_and_gradient! — Function
`value_and_gradient!(g, cache, obj, x) -> f(x)`

Compute objective value and gradient simultaneously. Returns the scalar objective value; the gradient is written to `g`.
DifferentiationInterface.value_gradient_and_hessian! — Function
`value_gradient_and_hessian!(g, H, cache, obj, x) -> f(x)`

Compute objective value, gradient, and Hessian simultaneously. Returns the scalar objective value; `g` and `H` are written in-place.
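To illustrate how an optimizer consumes this interface, here is a self-contained damped-free Newton step written against the same in-place calling pattern. The `value_gradient_and_hessian_demo!` helper below stands in for Retro's `cache`/`obj` machinery and is not part of the package API.

```julia
using LinearAlgebra

# Stand-in for value_gradient_and_hessian!: writes g and H in-place,
# returns the scalar objective value, here for f(x) = sum(abs2, x).
function value_gradient_and_hessian_demo!(g, H, x)
    g .= 2 .* x
    H .= 2 .* Matrix(I, length(x), length(x))
    return sum(abs2, x)      # matches the documented `-> f(x)` convention
end

x = [1.0, 2.0]
g = similar(x); H = zeros(2, 2)
fx = value_gradient_and_hessian_demo!(g, H, x)
x_new = x - H \ g            # one Newton step; exact minimizer for a quadratic
# x_new == [0.0, 0.0]
```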