

Hi,
I am interested in the GSoC project for implementing Nonlinear and constrained least squares optimization problem in Octave [1].
I am studying the Levenberg-Marquardt algorithm from [2]. I have taken a look at the levmar.c library [3]. It provides wrapper mex files to call it from MATLAB. The FAQ indicates the author heard that the wrappers are already Octave compatible, but he isn't sure.
I have quite a bit of experience with MATLAB but not mex/Octave.
My next step is to learn mex, call levmar.c from MATLAB, run a few examples with lsqcurvefit/lsqnonlin and with levmar.c in MATLAB, and compare the results. Then call the mex files from Octave and do the same.
At this point, I'd like a little feedback if I'm proceeding in the right direction. And any pointers would be appreciated :).
Other notes:
lsqnonlin in MATLAB also supports the trust-region-reflective algorithm for optimization, but I'll initially focus on levmar support.
Thank you.
Kind Regards
Asma Afzal
PhD Student (Electrical Engineering) University of Leeds, UK.
References:
[1] http://wiki.octave.org/Summer_of_Code_Project_Ideas#Nonlinear_and_constrained_least_squares
[2] http://www.imm.dtu.dk/pubdb/views/edoc_download.php/3215/pdf/imm3215.pdf
[3] http://users.ics.forth.gr/~lourakis/levmar/index.html


Hi AsmaA,
On Mon, Feb 23, 2015 at 02:10:29PM -0800, AsmaA wrote:
> Hi,
>
> I am interested in the GSoC project for implementing Nonlinear and
> constrained least squares optimization problem in Octave [1].
I wasn't aware of the cited ([1]) paragraph on the wiki. I reproduce
it here for convenience:
--- start quote ---
Nonlinear and constrained least squares
The Optimization package is missing the functions lsqcurvefit,
lsqlin, lsqnonlin to conveniently solve least-squares problems that
are nonlinear and/or constrained. There are free implementations of
the needed algorithms in other languages, such as minpack in Fortran
and levmar in C. This project would link to or port these
implementations and develop Matlab-compatible Octave wrappers.
Mentor: Nir Krakauer
--- end quote ---
Although it is true that we haven't functions called 'lsqcurvefit' or
'lsqnonlin' in the optim package, the rest of the paragraph (when was
it written?) does not adequately describe the situation. The wrapper
functions 'nonlin_residmin' and 'nonlin_curvefit' already provide
access to backends for least-squares nonlinear constrained
optimization. The wrappers provide a convenient interface similar to
'lsqnonlin' and 'lsqcurvefit', but are differently named because there
are some differences (not by chance, but by decision). Additionally,
they provide some functionality which 'lsqnonlin' and 'lsqcurvefit'
don't provide.
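For illustration, a minimal m-code sketch of a 'nonlin_curvefit' call
(the data and model here are invented for the example; see
'help nonlin_curvefit' for the actual details):

```matlab
pkg load optim;

% invented example data: a noisy exponential decay
x = linspace (0, 10, 50)';
y = 2.5 * exp (-0.3 * x) + 0.05 * randn (50, 1);

model = @(p, x) p(1) * exp (-p(2) * x);  % model function f (p, x)
pin = [1; 0.1];                          % initial parameter guess

% returns optimized parameters p and the final model values fy
[p, fy, cvg, outp] = nonlin_curvefit (model, pin, x, y);
```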
> I am studying the Levenberg-Marquardt algorithm from [2].
This seems to be a general introduction to LM. An actual algorithm
must also, among others, take measures to be numerically stable. It is
usually best to start from one of the several already existing
algorithms (I know this is your intention). In the optim package we
have, among others, an SVD-based algorithm.
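As a sketch of the numerical point (a simplified illustration, not the
actual code of our backend): one damped Levenberg-Marquardt step can be
computed via the SVD as

```matlab
% One damped LM step computed via SVD; J is the Jacobian of the
% residuals, r the residual vector, lambda the damping parameter.
% Simplified illustration, not the optim package's actual backend code.
function dp = lm_step_svd (J, r, lambda)
  [U, S, V] = svd (J, "econ");
  s = diag (S);
  % damped pseudo-inverse: small singular values are regularized by
  % lambda instead of being inverted directly
  dp = -V * ((s ./ (s.^2 + lambda)) .* (U' * r));
endfunction
```

which minimizes norm (J*dp + r)^2 + lambda * norm (dp)^2 without ever
forming the possibly ill-conditioned normal equations.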
> I have taken a
> look at the levmar.c library [3].
On this webpage no nonlinear constraints seem to be mentioned, which
would make the new algorithm inferior to the existing algorithm with
respect to this. But it may still be worth having the new one.
> It provides wrapper mex files to call it
> from MATLAB. The FAQ indicates the author heard that the wrappers are
> already Octave compatible. But he isn't sure.
>
> I have quite a bit of experience with MATLAB but not mex/Octave.
>
> My next step is to learn mex, call levmar.c from MATLAB, run a few
> examples with lsqcurvefit/lsqnonlin and with levmar.c in MATLAB, and
> compare the results. Then call the mex files from Octave and do the same.
>
> At this point, I'd like a little feedback if I'm proceeding in the right
> direction. And any pointers would be appreciated :).
A new, alternative, optimization algorithm can be useful. There might
be problems better solvable by one algorithm than by the other. An
example problem for such a case in favour of the new algorithm would
be good, however.
Such a new, alternative, algorithm should be integrated as a backend
of 'nonlin_residmin' and 'nonlin_curvefit' (the latter is actually
only a convenience wrapper to the former).
'nonlin_residmin' provides configurable access to functions computing
the Jacobian, so any new Jacobian function should be treated
separately from the new algorithm.
There is a somewhat outdated description of the backend interface of
'nonlin_residmin' in the optimization package. More can be learned
from studying an existing backend. It is probably also advisable to
coordinate the integration of a new backend with me.
If anyone wants to exactly mimic the interface of 'lsqnonlin' and
'lsqcurvefit' (and personally I'd think this would not be the right
way ... sorry) this can either be done by writing them as separate
wrappers for the existing backends, or by writing them as wrappers to
'nonlin_residmin' or 'nonlin_curvefit'. It would probably be wrong to
write a new algorithm only for them.
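A rough m-code sketch of what such a wrapper could look like (the
option mapping shown is invented and far from complete; the real
settings names should be checked against the optim documentation):

```matlab
% Sketch of an 'lsqnonlin'-style wrapper around 'nonlin_residmin'.
% The handling of bounds via "lbound"/"ubound" settings is illustrative.
function [x, resnorm] = lsqnonlin_sketch (fun, x0, lb, ub)
  settings = optimset ();
  if (nargin >= 3 && ! isempty (lb))
    settings = optimset (settings, "lbound", lb(:));
  endif
  if (nargin >= 4 && ! isempty (ub))
    settings = optimset (settings, "ubound", ub(:));
  endif
  [x, resid] = nonlin_residmin (fun, x0, settings);
  resnorm = sumsq (resid);  % sum of squared residuals, as lsqnonlin returns
endfunction
```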
I'd think a project 'officially' supported by the Octave community
should not be based on mex files (use m-code and/or oct-files).
I'm not sure, but it could be worth rewriting an algorithm of a
C library in m-code (or as an oct-file) so that Octave's native
interfaces to e.g. decompositions and matrix operations are used, and
modifications are more easily possible. And: in my experience, the
objective or model function of the optimization problem is often so
expensive that writing least-squares optimizing backends completely in
C would barely give a speed advantage. For expensive operations like
decompositions, m-code calls into Octave anyway, which interfaces with
compiled code to perform them.
Olaf
> References:
>
> [1]
> http://wiki.octave.org/Summer_of_Code_Project_Ideas#Nonlinear_and_constrained_least_squares
> [2] http://www.imm.dtu.dk/pubdb/views/edoc_download.php/3215/pdf/imm3215.pdf
> [3] http://users.ics.forth.gr/~lourakis/levmar/index.html
public key id EAFE0591, e.g. on hkp://pool.sks-keyservers.net


Hi Olaf,
Would you be interested in mentoring this project, since you're clearly more up to date on the Optimization package? If so, just update the description in the Wiki. I still think that having Matlab-compatible solver functions is worth doing, even if only as wrappers to the existing functions.
Nir


Hi Olaf,
Thank you for the detailed response.
On 2/24/2015 8:42 AM, Olaf Till wrote:
> Hi AsmaA,
>
> On Mon, Feb 23, 2015 at 02:10:29PM -0800, AsmaA wrote:
>> Hi,
>>
...
>
> Although it is true that we haven't functions called 'lsqcurvefit' or
> 'lsqnonlin' in the optim package, the rest of the paragraph (when was
> it written?) does not adequately describe the situation. The wrapper
> functions 'nonlin_residmin' and 'nonlin_curvefit' already provide
> access to backends for least-squares nonlinear constrained
> optimization. The wrappers provide a convenient interface similar to
> 'lsqnonlin' and 'lsqcurvefit', but are differently named because there
> are some differences (not by chance, but by decision). Additionally,
> they provide some functionality which 'lsqnonlin' and 'lsqcurvefit'
> don't provide.
I was unaware of the existence of these functions and thought the wiki
described an up-to-date situation, as the last edit was quite recent.
>
>> I am studying the Levenberg-Marquardt algorithm from [2].
>
> This seems to be a general introduction to LM. An actual algorithm
> must also, among others, take measures to be numerically stable. It is
> usually best to start from one of the several already existing
> algorithms (I know this is your intention). In the optim package we
> have, among others, an SVD-based algorithm.
>
>> I have taken a look at the levmar.c library [3].
>
> On this webpage no nonlinear constraints seem to be mentioned, which
> would make the new algorithm inferior to the existing algorithm with
> respect to this. But it may still be worth having the new one.
>
[2] just explains the unconstrained case (which I was referring to for
the basic understanding of how the LM algorithm works), but the
nonlinear constraints are considered in the levmar.c library [3].
--- start quote ---
To deal with linear equation constraints, levmar employs variable
elimination based on QR factorization, as described in ch. 15 of the
book Numerical Optimization by Nocedal and Wright. For the
box-constrained case, levmar implements the algorithm proposed by C.
Kanzow, N. Yamashita and M. Fukushima, Levenberg-Marquardt methods for
constrained nonlinear equations with strong local convergence
properties, Journal of Computational and Applied Mathematics 172, 2004,
pp. 375-397
--- end quote ---
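As I understand the variable-elimination idea, it can be sketched in
m-code as follows (my own illustration of the null-space method from
Nocedal & Wright, not code taken from levmar):

```matlab
% Null-space method for linear equality constraints A * p = b:
% compute a particular solution p0 and a basis Z of null (A), so that
% every feasible p can be written p = p0 + Z * q with q unconstrained.
% Illustrative sketch, not code from levmar.
function [p0, Z] = eliminate_linear_eq (A, b)
  [m, n] = size (A);             % m constraints on n parameters, m < n
  [Q, R] = qr (A');              % full QR factorization of A'
  Z  = Q(:, m+1:n);              % orthonormal null-space basis: A * Z == 0
  p0 = Q(:, 1:m) * (R(1:m, 1:m)' \ b);  % particular solution: A * p0 == b
endfunction
```

The reduced, now unconstrained, problem is then solved in q.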
>>
...
>>
>
> A new, alternative, optimization algorithm can be useful. There might
> be problems better solvable by one algorithm than by the other. An
> example problem for such a case in favour of the new algorithm would
> be good, however.
>
> Such a new, alternative, algorithm should be integrated as a backend
> of 'nonlin_residmin' and 'nonlin_curvefit' (the latter is actually
> only a convenience wrapper to the former).
>
> 'nonlin_residmin' provides configurable access to functions computing
> the Jacobian, so any new Jacobian function should be treated
> separately from the new algorithm.
1) Adding a new algorithm as a backend of the optim package
>
> There is a somewhat outdated description of the backend interface of
> 'nonlin_residmin' in the optimization package. More can be learned
> from studying an existing backend. It is probably also advisable to
> coordinate the integration of a new backend with me.
>
> If anyone wants to exactly mimic the interface of 'lsqnonlin' and
> 'lsqcurvefit' (and personally I'd think this would not be the right
> way ... sorry) this can either be done by writing them as separate
> wrappers for the existing backends, or by writing them as wrappers to
> 'nonlin_residmin' or 'nonlin_curvefit'. It would probably be wrong to
> write a new algorithm only for them.
2) Mimicking 'lsqnonlin' and 'lsqcurvefit' using the same backend
> I'd think a project 'officially' supported by the Octave community
> should not be based on mex files (use m-code and/or oct-files).
I agree that mex is not an ideal way for core functionality.
> I'm not sure, but it could be worth rewriting an algorithm of a
> C library in m-code (or as an oct-file) so that Octave's native
> interfaces to e.g. decompositions and matrix operations are used, and
> modifications are more easily possible.
3) Rewriting the levmar C library into m-code for Octave? Or is there a
C library used by the optimization package that would benefit from
porting into m-code?
What would you suggest as the most suitable contribution as part of GSoC?
I don't have a particular inclination. Whichever helps Octave best.
Regards,
Asma


On Tue, Feb 24, 2015 at 08:37:24PM +0000, Nir Krakauer wrote:
> Hi Olaf,
>
> Would you be interested in mentoring this project, since you're clearly
> more up to date on the Optimization package? If so, just update the
> description in the Wiki. I still think that having Matlab-compatible solver
> functions is worth doing, even if only as wrappers to the existing
> functions.
Hi Nir,
thanks for the suggestion, but what are your reasons to think we
should have the same optimization interfaces as Matlab? I know you're
not alone with this opinion, but as yet I've not seen a reason that
convinces me.
Olaf



Olaf,
Mostly on general principles -- Octave aims to run code written for Matlab, and optimization is a capability used across many application areas. If the underlying routines are already there, it could be just a matter of providing wrappers.
Nir


On Tue, Feb 24, 2015 at 10:28:57PM +0000, Nir Krakauer wrote:
> Olaf,
>
> Mostly on general principles -- Octave aims to run code written for Matlab,
> and optimization is a capability used across many application areas. If the
> underlying routines are already there, it could be just a matter of
> providing wrappers.
In optimization (and maybe also in other fields, e.g. numeric solving
of differential equations) it is practically impossible to always
produce the same results as Matlab. Even if Matlab provides references
for its applied optimization algorithms, the details of the
algorithm implementations have an impact on the results and are too
individual to be reproduced without using their code.
So making the interface identical to Matlab's interface would only
yield a superficial similarity. In particular there will probably
always be certain optimization problems which Matlab's optimization
algorithms can solve well enough, but the "respective" Octave
algorithms can't (and vice versa, of course).
In fact I'd think that people earnestly involved with optimization are
aware of this. They probably also are aware of the potential problems
of optimization and are prepared to spend some effort coping with
them. Compared to this effort, the effort of getting familiar with an
interface slightly different from Matlab's is minor. Correspondingly,
AFAICR we never had a complaint on the mailing lists that the
optimization interface is not identical to Matlab's from people seeking
help with an actual optimization problem.
OTOH, people unaware of the potential difficulties in optimization may
be tempted to expect that an interface identical to Matlab's should
produce exactly the same result as in Matlab.
Taken together, I'd think that in optimization, since we have a
slightly different interface which (hopefully) offers some advantages,
it's reasonable to expect users to use it instead of Matlab's.
To address the issue of visibility by former Matlab users, we could
have e.g. an lsqcurvefit function which just throws an error with a
text informing the user about the existence of our interfaces.
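For instance (just a sketch of the idea, with invented wording):

```matlab
% Sketch of such a stub: an 'lsqcurvefit' that only points the user at
% the existing optim interfaces (wording invented for illustration).
function varargout = lsqcurvefit (varargin)
  error (["lsqcurvefit: not implemented; for nonlinear curve fitting ", ...
          "please use 'nonlin_curvefit' from the optim package ", ...
          "(see 'help nonlin_curvefit')"]);
endfunction
```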
Olaf



On 25/02/2015 10:19, Olaf Till wrote:
> On Tue, Feb 24, 2015 at 10:28:57PM +0000, Nir Krakauer wrote:
>> Olaf,
>>
>> Mostly on general principles -- Octave aims to run code written for Matlab,
>> and optimization is a capability used across many application areas. If the
>> underlying routines are already there, it could be just a matter of
>> providing wrappers.
> In optimization (and maybe also in other fields, e.g. numeric solving
> of differential equations) it is practically impossible to always
> produce the same results as Matlab. Even if Matlab provides references
> for its applied optimization algorithms, the details of the
> algorithm implementations have an impact on the results and are too
> individual to be reproduced without using their code.
>
> So making the interface identical to Matlab's interface would only
> yield a superficial similarity. In particular there will probably
> always be certain optimization problems which Matlab's optimization
> algorithms can solve well enough, but the "respective" Octave
> algorithms can't (and vice versa, of course).
>
> In fact I'd think that people earnestly involved with optimization are
> aware of this. They probably also are aware of the potential problems
> of optimization and are prepared to spend some effort coping with
> them. Compared to this effort, the effort of getting familiar with an
> interface slightly different from Matlab's is minor. Correspondingly,
> AFAICR we never had a complaint on the mailing lists that the
> optimization interface is not identical to Matlab's from people seeking
> help with an actual optimization problem.
>
> OTOH, people unaware of the potential difficulties in optimization may
> be tempted to expect that an interface identical to Matlab's should
> produce exactly the same result as in Matlab.
>
> Taken together, I'd think that in optimization, since we have a
> slightly different interface which (hopefully) offers some advantages,
> it's reasonable to expect users to use it instead of Matlab's.
>
> To address the issue of visibility by former Matlab users, we could
> have e.g. an lsqcurvefit function which just throws an error with a
> text informing the user about the existence of our interfaces.
>
> Olaf
Olaf,
I mostly agree with everything you said, and yet I believe that even some
superficial similarity would be a valuable step forward.
I say this both as a developer of a toolbox/package [1, 2] that strives
to remain compatible with both Octave (>= 3.2.2, currently) and Matlab
(>= R2007a, currently) and as someone who has seen several times how
details of this kind (for instance, Octave not having a function named
fmincon compatible with Matlab's), when they accumulate (and they
do), are able to prevent people from adopting Octave.
Take for instance fmincon. I am perfectly aware that it is not possible
to get a 100%-similar implementation of this function, which is actually
an umbrella over several types of complicated algorithms (sqp,
active-set, interior-point), even with Matlab's documentation and all
the textbooks and research papers in the world. Yet, I think that having
a similar umbrella in Octave, with the same name, a compatible syntax
and set of options, would be very useful. For instance, when fmincon is
called with "Algorithm" set to "sqp", perhaps we should fall back on the
sqp function as an optimizer. In other cases, if no solver of the same
type is available, a warning could be issued that Octave's fmincon will
switch to a different type of optimizer.
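Something along these lines (a rough sketch with invented option
handling; the real mapping of Matlab's options would need much more
care):

```matlab
% Sketch of an 'fmincon'-style front end that falls back on Octave's
% built-in 'sqp' solver; the option handling here is invented and far
% from a complete Matlab-compatible interface.
function x = fmincon_sketch (fun, x0, lb, ub, opts)
  if (nargin < 3 || isempty (lb))
    lb = -realmax * ones (size (x0));
  endif
  if (nargin < 4 || isempty (ub))
    ub = realmax * ones (size (x0));
  endif
  algo = "interior-point";                 % Matlab's default
  if (nargin >= 5 && isfield (opts, "Algorithm"))
    algo = opts.Algorithm;
  endif
  if (! strcmp (algo, "sqp"))
    warning ("fmincon_sketch: no '%s' solver, falling back to sqp", algo);
  endif
  % sqp (x0, phi, g, h, lb, ub): g (x) == 0 and h (x) >= 0 constraints
  x = sqp (x0, fun, [], [], lb, ub);
endfunction
```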
I don't think that we would get that many complaints about Octave's
fmincon returning results different from Matlab's fmincon. A simple
disclaimer in the description of the "Algorithm" option would probably
be enough to answer these anyway.
Hope this helps,
Julien.
[1] http://octave.sourceforge.net/stk/index.html
[2] http://sourceforge.net/projects/kriging/


On Tue, Feb 24, 2015 at 10:03:15PM +0000, Asma Afzal wrote:
> <snip>
> [2] just explains the unconstrained case (which I was referring to
> for the basic understanding of how the LM algorithm works), but the
> nonlinear constraints are considered in the levmar.c library [3].
>
> --- start quote ---
>
> To deal with linear equation constraints, levmar employs variable
> elimination based on QR factorization, as described in ch. 15 of the
> book Numerical Optimization by Nocedal and Wright. For the
> box-constrained case, levmar implements the algorithm proposed by C.
> Kanzow, N. Yamashita and M. Fukushima, Levenberg-Marquardt methods
> for constrained nonlinear equations with strong local convergence
> properties, Journal of Computational and Applied Mathematics 172,
> 2004, pp. 375-397
>
> --- end quote ---
Variable elimination should not be applicable if general nonlinear
constraints are also present (one doesn't even know which variables
are referenced by the user function for general constraints). BTW if
variable elimination is applied (if applicable), it should be done
already in the frontend (the function providing the user interface),
so that it is available for all backend algorithms; it should be done
in the same way in the frontends for scalar optimization, not only in
the curve-fitting frontends. OTOH I'd think that the available
projection algorithms can trace linear equality constraints quite
efficiently, limiting the usefulness of variable elimination.
And box constraints are linear, or is something different meant here?
So no nonlinear constraints ... (?)
> <snip>
> 3) Rewriting the levmar C library into m-code for Octave? Or is there
> a C library used by the optimization package that would benefit from
> porting into m-code?
I meant the former. But as I said, I'm not sure.
> What would you suggest as the most suitable contribution as part of GSoC?
I'm not sure at present.
- For my (current) opinion on providing 'lsqnonlin' et al. see my
answer to Nir.
- Having a further algorithm for residual minimization (i.e. curve
fitting) would be good, but actually I'd prefer an algorithm
featuring also nonlinear constraints, preferably an algorithm which
honours the constraints only in the result (since the current
algorithm honours them throughout optimization, which might not
always be the best). (I haven't searched for such an algorithm as
yet.) The potential advantage of the levmar.c algorithm is probably
very limited.
Olaf



On Wed, Feb 25, 2015 at 10:58:55AM +0100, Julien Bect wrote:
> <snip>
> I say this both as a developer of a toolbox/package [1, 2] that
> strives to remain compatible with both Octave (>=3.2.2, currently)
> and Matlab (>= R2007a, currently) and as someone who has seen
> several times how details of this kind (for instance, Octave not
> having a function named fmincon, compatible with Matlab's one), when
> they accumulate (and they do), are able to prevent people from
> adopting Octave.
I'm curious, what relation do these people have to Matlab -- do they
have to buy Matlab privately, are they students whose University can't
give them licenses, ... ? What do they do after rejecting Octave, pay
for Matlab? My experience is that people with free access to Matlab
don't think of adopting Octave anyway, unless they are personally
interested in Free Software.
That's no rhetorical question only to object to you, I'm really
curious about the above.
> Take for instance fmincon. I am perfectly aware that it is not
> possible to get a 100%similar implementation of this function,
> which is actually an umbrella over several types of complicated
> algorithms (sqp, activeset, interiorpoint), even with Matlab's
> documentation and all the textbooks and research papers in the
> world. Yet, I think that having a similar umbrella in Octave, with
> the same name, a compatible syntax and set of options, would be very
> useful. For instance, when fmincon is called with "Algorithm" set to
> "sqp", perhaps should we fall back on the sqp function as an
> optimizer. In other cases, if no solver of the same type is
> available, a warning could be issued that Octave's fmincon will
> switch to a different type of optimizer.
That's already 2:1 against me ... maybe there will be further votes,
but I have the feeling that the direction of the odds won't change.
If that'll be so, the project could indeed be to implement these
functions as wrappers:
1. 'lsqnonlin' wraps 'nonlin_residmin',
2. 'lsqcurvefit' wraps either 'nonlin_curvefit', 'nonlin_residmin', or
even 'lsqnonlin',
3. 'fmincon' wraps 'nonlin_min',
and maybe
4. 'lsqlin' wraps ... (?).
One could state in the application that all the optimization
functionality is already there (I don't know about 'lsqlin'), but
careful mapping of Matlab's interface, in particular of the
optimization options, is necessary. (I've no notion of how
labour-intensive a GSoC project should be.)
To answer Nir's question: if it comes to it, I could help with 1., 2.,
and 3. I'm currently not competent for 4. But I wouldn't like to be
named at Google as a mentor.
Olaf



On 25/02/2015 15:21, Olaf Till wrote:
> I'm curious, what relation do these people have to Matlab -- do they
> have to buy Matlab privately, are they students whose University can't
> give them licenses, ... ? What do they do after rejecting Octave, pay
> for Matlab? My experience is that people with free access to Matlab
> don't think of adopting Octave anyway, unless they are personally
> interested in Free Software. That's no rhetorical question only to
> object to you, I'm really curious about the above.
Lots of different situations, difficult to answer. But yes, I agree with
you about people having "free" (or perceived as such, because someone
else is paying) access to Matlab.
Typically, think of a small lab or a team in a company that historically
has a few Matlab licences (and a few toolboxes), a more or less
important Matlab code base, and wants to spend less on software licences.
They should naturally switch to Octave, since it's advertised to be
"Matlab-compatible" and they have several people already familiar with
the Matlab language. But if they get the impression that their programs
need to be modified to run under Octave, or that some of their favorite
functions (say, fmincon) are actually missing from Octave, they will
probably hesitate.
What do they do after that? Well, they might still decide to give
Octave a try (hurrah!), or decide to keep using their Matlab licences
(so much for the savings...), or perhaps decide to (try to) switch to
another language (e.g., Python + Numpy, since they hear so much about it
these days).
Just for the record: I have also recently helped a PhD student, with
prior exposure to Matlab but no Matlab licence in his lab, to get
started with Octave (under Linux). He was very happy with Octave and its
new GUI, and found all the functions that he needed (mostly linear
algebra) right there and working. I really think the new GUI, together
with the Windows binaries, will attract a lot of people to Octave.


Olaf,

I agree that we do not expect to reproduce the Matlab results in all
cases, especially in more 'difficult' or ill-posed problems -- but this
is also true for functions in core Octave that aim to have the same
syntax as Matlab; for example, inverting an almost singular matrix may
well give different numbers in Octave than in Matlab. Numerical
approximations will unavoidably have some implementation dependence.

In terms of algorithms, it looks like the Optimization package has
__lm_svd__, which offers a Levenberg-Marquardt algorithm for least
squares (with arbitrary constraints allowed, if I understand it
correctly?). The other algorithm Matlab offers in lsqcurvefit and
lsqnonlin is a trust-region one [1] which supposedly can be more
efficient for high-dimensional problems -- is anything like that
implemented in Octave?

Nir


On Wed, Feb 25, 2015 at 07:56:15PM +0000, Nir Krakauer wrote:
> Olaf, I agree that we do not expect to reproduce the Matlab results in all
> cases, especially in more 'difficult' or ill-posed problems -- but this is
> also true for functions in core Octave that aim to have the same syntax as
> Matlab, for example inverting an almost singular matrix may well give
> different numbers in Octave than in Matlab. Numerical approximations will
> unavoidably have some implementation dependence.
>
> In terms of algorithms, it looks like the Optimization package has
> __lm_svd__, which offers a LevenbergMarquardt algorithm for least squares
> (with arbitrary constraints allowed, if I understand it correctly?). The
> other algorithm Matlab offers in lsqcurvefit and lsqnonlin is a
> trust-region one [1] which supposedly can be more efficient for
> high-dimensional problems -- is anything like that implemented in Octave?
>
> Nir
>
> [1] http://www.mathworks.com/help/optim/ug/least-squares-model-fitting-algorithms.html#broz0i4
We have no trust region algorithm. But to be answerable for trust
region algorithms I'd need to do some studying first.
Olaf



On Wed, Feb 25, 2015 at 09:12:59PM +0100, Olaf Till wrote:
After a quick glance at a description of trust region algorithms it
seems to me that an LM algorithm actually could also be called a trust
region algorithm, since it follows the same concept. Since the LM
algorithm reputedly works quite well, I wouldn't expect a large
improvement by using an explicitly so-called trust region algorithm.
After a similarly short glance at:
Coleman, T.F. and Li, Y. 1994, Mathematical Programming 67, 189-224,
referenced by Matlab with its title slightly wrong ("large scale"
seemingly introduced by Matlab) and the reference contained therein:
Coleman, T.F. and Li, Y., internal research paper TR 92-1315, 1992,
the notion of suitability for large scale problems seems to be based
on numerical experiments with a "reflective" Newton method, seeing no
substantial increase in the number of iterations with an increase of
dimension (up to 2744). But I've seen no comparison with different
methods in these papers, so I'd have no reason to assume the method is
better than our LM with respect to this.
For high dimensionality, there is a speed problem due to the
decomposition. But this problem is not specific to either LM or "trust
region" algorithm. I'd guess it's possible to improve speed for high
dimensionality in __lm_svd__ (currently no alternative to svd() is
used there, and svd (rand (3000, 2744)) needs 7 min for me).
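One candidate alternative (just a sketch of the idea, not tested
against __lm_svd__): the damped step can equivalently be obtained from
an augmented least-squares system solved with a QR-based backslash,
which is cheaper than a full SVD for large Jacobians:

```matlab
% Equivalent damped LM step without an SVD: minimize
% norm (J*dp + r)^2 + lambda * norm (dp)^2 as an ordinary
% least-squares problem; backslash uses a QR factorization here.
% Sketch only, not the optim package's actual code.
function dp = lm_step_qr (J, r, lambda)
  n = columns (J);
  dp = [J; sqrt (lambda) * eye(n)] \ [-r; zeros(n, 1)];
endfunction
```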
BTW the second reference states the respective algorithm was written
in Matlab.
The "trust region" algorithm is now Matlab's default, I'd guess because
it supports bounds, while their LM algorithm doesn't. BTW our
__lm_svd__ supports arbitrary constraints.
I'd conclude it's probably not worth providing Matlab's "trust region"
algorithm in Octave, but maybe worth providing alternatives to svd()
for large scale in our existing algorithm (but the latter I'd probably
try myself, it isn't worth a project).
Olaf



Thanks for looking into this, Olaf. So it sounds like the priority for the project should be to implement the missing lsq* functions (and possibly also others such as fmincon) as wrappers primarily to existing functions in the Optimization package. I will update the project description accordingly.
Nir


Also, what do you think about adding, as a possible component of the project, benchmarking the existing and new optimization functions with a suite of standard problems such as those from CUTEst [1], to identify any performance or interface limitations or bugs? This could enable better prioritizing of what algorithms to add/improve.


On Fri, Feb 27, 2015 at 03:01:14PM +0000, Nir Krakauer wrote:
> Also, what do you think about adding as a possible component of the project
> to benchmark the existing and new optimization functions
What do you mean by "existing and new"? I thought the project is now
decided not to create new algorithms?
> with a suite of
> standard problems such as those from CUTEst[1] to identify any performance
> or interface limitations or bugs, which could enable better prioritizing
> what algorithms to add/improve?
Have you looked at "optim_problems.m" in the optim package?
How should tests for interface limitations be performed?
Olaf
> [1] http://ccpforge.cse.rl.ac.uk/gf/project/cutest/wiki/ -- LGPL, Fortran
> with Matlab interface
>



On Fri, Feb 27, 2015 at 10:30:48PM +0000, Nir Krakauer wrote:
> On Fri, Feb 27, 2015 at 8:25 PM, Olaf Till < [hidden email]> wrote:
>
> > On Fri, Feb 27, 2015 at 03:01:14PM +0000, Nir Krakauer wrote:
> > > Also, what do you think about adding as a possible component of the
> > project
> > > to benchmark the existing and new optimization functions
> >
> > What do you mean by "existing and new"? I thought the project is now
> > decided not to create new algorithms?
> >
>
> There would be new functions, like lsq*  not necessarily new algorithms
>
> Have you looked at "optim_problems.m" in the optim package?
> >
>
> It might be a good starting point for benchmarking. Looks like it has a
> dozen or so problems.
>
> > How should tests for interface limitations be performed?
>
> Basically I was just thinking of a case where the existing interface
> doesn't allow the form of a given test-suite problem to be posed in Octave,
> which is something that would be discovered by going through the suite.
I'd say each test-suite problem can be applied to each optimization
interface with a suitable "adapter" (wrapper) (of course the optimizer
must be applicable at all and be able to deal with the posed
constraints).
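For instance (an invented illustration of such an adapter, using the
Rosenbrock function, which is not from the test suite under
discussion):

```matlab
% A benchmark problem in residual form can be adapted to a scalar
% objective interface (and vice versa) with a thin wrapper.
rosen_resid  = @(p) [10 * (p(2) - p(1)^2); 1 - p(1)];  % residual form
rosen_scalar = @(p) sumsq (rosen_resid (p));           % scalar adapter
```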
And if an optimization interface does not directly match a test-suite
interface, I wouldn't say this is necessarily a shortcoming of the
optimization interface.
So I still couldn't say how it should be possible to apply a test
suite to the properties of interfaces. Have I misunderstood something?
Olaf


