API — Objective Functions

Retro supports three levels of user control over derivatives.

Abstract type

All objective wrappers below subtype AbstractObjectiveFunction.

AD-only objective

Retro.ADObjectiveFunction (type)
ADObjectiveFunction{F,ADT,PG,PH} <: AbstractObjectiveFunction

Objective function with automatic differentiation via DifferentiationInterface. Precomputes gradient and Hessian preparation objects at construction time so that repeated evaluations are allocation-free.

Works with any AD backend that DifferentiationInterface supports (ForwardDiff, Enzyme, Zygote, …) as long as the objective is compatible.

Fields

  • func::F: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
  • adtype::ADT: AD backend (e.g. AutoForwardDiff())
  • prep_g::PG: Prepared gradient operator (from prepare_gradient)
  • prep_h::PH: Prepared Hessian operator (from prepare_hessian)
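The preparation workflow the docstring describes comes from DifferentiationInterface. The sketch below shows that mechanism directly, using only DifferentiationInterface and ForwardDiff; the ADObjectiveFunction constructor itself is not shown, since its exact signature does not appear in this docstring:

```julia
using DifferentiationInterface
import ForwardDiff  # the backend package must be loaded for AutoForwardDiff

f(x) = sum(abs2, x) / 2          # gradient is x, Hessian is the identity
backend = AutoForwardDiff()
x = [1.0, 2.0, 3.0]

# One-time preparation, as ADObjectiveFunction does at construction
prep_g = prepare_gradient(f, backend, x)
prep_h = prepare_hessian(f, backend, x)

# Repeated in-place evaluations reuse the preparation objects
g = similar(x)
gradient!(f, g, prep_g, backend, x)
H = similar(x, 3, 3)
hessian!(f, H, prep_h, backend, x)
```

Reusing prep_g and prep_h across calls is what makes the repeated evaluations allocation-free.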

User gradient + AD Hessian

Retro.GradientObjectiveFunction (type)
GradientObjectiveFunction{F,G,ADT,PH} <: AbstractObjectiveFunction

Objective with a user-supplied gradient and AD-computed Hessian.

Fields

  • func::F: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
  • grad!::G: In-place gradient grad!(g, x)
  • adtype::ADT: AD backend used only for Hessian computation
  • prep_h::PH: Prepared Hessian operator
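A minimal sketch of the grad!(g, x) contract, using the Rosenbrock function. The construction call at the end is hypothetical (the constructor's exact signature is not shown in this docstring) and is left as a comment:

```julia
# Hand-written in-place gradient for the Rosenbrock function
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

function rosen_grad!(g, x)
    g[1] = -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2)
    g[2] = 200 * (x[2] - x[1]^2)
    return g
end

g = zeros(2)
rosen_grad!(g, [1.0, 1.0])   # gradient vanishes at the minimizer [1, 1]

# Hypothetical construction; AutoForwardDiff() would be used only to
# prepare the AD Hessian, per the adtype field above:
# obj = GradientObjectiveFunction(f, rosen_grad!, AutoForwardDiff())
```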

Fully analytic

Retro.AnalyticObjectiveFunction (type)
AnalyticObjectiveFunction{F,G,H} <: AbstractObjectiveFunction

Fully user-supplied objective, gradient, and Hessian. No AD dependency.

Fields

  • func::F: Scalar objective $f : \mathbb{R}^n \to \mathbb{R}$
  • grad!::G: In-place gradient grad!(g, x)
  • hess!::H: In-place Hessian hess!(H, x)
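A sketch with all three pieces supplied analytically, for a quadratic model where the derivatives are known in closed form. The construction call is hypothetical (the constructor's exact signature is not shown in this docstring) and is left as a comment:

```julia
using LinearAlgebra

# Quadratic model f(x) = ½xᵀAx − bᵀx with closed-form derivatives
A = [4.0 1.0; 1.0 3.0]
b = [1.0, 2.0]

quad(x)          = 0.5 * dot(x, A, x) - dot(b, x)
quad_grad!(g, x) = (mul!(g, A, x); g .-= b; g)   # ∇f(x) = Ax − b
quad_hess!(H, x) = (H .= A; H)                   # ∇²f(x) = A (constant)

x = A \ b            # stationary point: Ax = b
g = similar(b)
quad_grad!(g, x)     # gradient is (numerically) zero here

# Hypothetical construction; no AD backend is involved:
# obj = AnalyticObjectiveFunction(quad, quad_grad!, quad_hess!)
```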

Evaluation interface

These functions are used internally by the optimizer. You normally do not need to call them yourself.

Retro.objfunc! (function)
objfunc!(cache, obj, x) -> f(x)

Evaluate the objective function at x, incrementing the call counter in cache.

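The call-counting contract can be sketched in plain Julia. The struct and function names below are illustrative only, not Retro's internals:

```julia
# Minimal call-counted evaluation, mirroring the objfunc!(cache, obj, x)
# contract: evaluate the objective at x and bump a counter in the cache.
mutable struct EvalCache
    f_calls::Int
end

my_objfunc!(cache::EvalCache, f, x) = (cache.f_calls += 1; f(x))

cache = EvalCache(0)
fx = my_objfunc!(cache, sum, [1.0, 2.0, 3.0])   # fx == 6.0, one call recorded
```

Keeping the counter in the cache rather than the objective lets the same objective be shared across independent solver runs.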