Hi,

----

TLDR:

Optimizer and integration functions require separate functions for the value, gradient, and Hessian. This is too expensive when computing the value is costly. How can one comply with the optimizer/integrator signature without the extra cost of multiple evaluations?

----

Many functions in Octave (e.g. optimizers, like sqp) that optionally accept gradients and Hessians do so by accepting a different function for each input argument: e.g. a cell argument in which the first element is the objective function, the second element the gradient, and the third element the Hessian.

For many years I liked this separation, but having gained more experience with other optimizers, I now realize that accepting a single function with multiple output arguments tends to be more numerically friendly. This is clear when the computation of the function is costly (e.g. likelihood functions of Gaussian processes) and many of the intermediate computations can be reused in the gradient and the Hessian (in likelihood functions this is the inverse of the covariance matrix, which is very expensive!). That is, one can compute value, gradient, and Hessian in a single call to the function.

So far I have not been able to use Octave optimizers with gradients and Hessians because of this problem (running time). That is, I have not yet found a way to pass three different functions (for the value, the gradient, and the Hessian) while internally computing the expensive part only once.

Note that the problem is induced by the signature of the methods, not by any property of the function.

Do you have any solution to this problem?
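One workaround I can sketch (my own names throughout, and assuming sqp's cell-of-functions interface as described above): memoize the expensive computation inside a single helper keyed on x, using a persistent cache, and pass three thin accessors to the optimizer. Each accessor calls the helper, but the costly work runs only once per distinct x. A minimal sketch with a placeholder objective:

```octave
## Minimal memoization sketch (not an official Octave feature).
## All three callbacks share one cached computation keyed on x.
function out = shared_eval (x)
  persistent x_cache val grad hess
  if (isempty (x_cache) || ! isequal (x, x_cache))
    x_cache = x;
    ## -- the expensive shared work would happen here, only once per x --
    ## Placeholder objective: in a real GP likelihood the costly
    ## intermediate (e.g. the covariance inverse) would be computed
    ## here and reused by all three outputs.
    val  = sum (x .^ 2);
    grad = 2 * x;
    hess = 2 * eye (numel (x));
  endif
  out.val  = val;
  out.grad = grad;
  out.hess = hess;
endfunction
```

Then the cell argument expected by sqp can be built from thin wrappers:

```octave
phi = {@(x) getfield (shared_eval (x), "val"), ...
       @(x) getfield (shared_eval (x), "grad"), ...
       @(x) getfield (shared_eval (x), "hess")};
## [x, obj] = sqp (x0, phi)
```

Since optimizers typically request the gradient and Hessian at the same x where they just evaluated the objective, the cache hits on almost every call; the main caveat is the isequal comparison cost and the single-entry cache being invalidated if the optimizer interleaves evaluations at different points.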

--

JuanPi Carbajal

https://goo.gl/ayiJzi
-----

“An article about computational result is advertising, not scholarship. The actual scholarship is the full software environment, code and data, that produced the result.”

- Buckheit and Donoho