Memory limits


Memory limits

John Logsdon-4
Dear octave-ists

Probably because I am greedy (!) and Octave is so robust and excellent, I
push it at the edges.  Don't we all.

Recently, I have occasionally run out of memory with the 2.0.9 Linux
version (RH 4.1) on a 64 MB machine with 70 MB of swap space.  I am using
a lot of space - top generally indicates 37 MB, and up to about 45 MB.
When I load the data up, it takes about 10-12 MB, which is about right.
I have a number of vectors about 250,000 elements long!
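(As a rough check, assuming standard double-precision storage at 8 bytes
per element: one vector of 250,000 doubles takes 250,000 x 8 bytes = 2 MB,
so five or six such vectors account for the 10-12 MB reported at load
time.)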

I realise, of course, that calculations generally take up temporary
storage, but even so I have worked out that when nothing is being done,
there is sometimes more space allocated than necessary, particularly in
the memory and swap allocation reported by top.

i  Is this a known feature?

ii Can I garbage collect from within Octave (rather than saving, quitting
and restarting)?  Once, when the program dumped itself to octave-core, I
found that only the data local to the function being run was dumped, and
none of the main data.  I have trawled the info files and can find nothing
documented.

iii Could it be a feature of Linux RH 4.1 rather than Octave?  If it is,
can I garbage collect as root in another shell?

My interim solution is to up my swap space to 250 MB or 500 MB, which I
will do shortly by putting in a 2.5 GB drive, but I don't think this is
the right thing to do.  There are no other processes running (well, if you
discount the usual stuff: X and a few remote logins to track my email).
I can't increase the physical memory at the moment.

BTW, are there any plans to implement either sparse matrices or
byte/word/integer (1-, 2-, or 4-byte) storage in Octave for data which are
indices or take restricted values?  Eight bytes to store small numbers is
rather wasteful - and yes, I have thought about using sets.

John

John Logsdon                               "Try to make things as simple
[hidden email]                   as possible, but not simpler"
Centre for Applied Statistics               [hidden email]



Memory limits

John W. Eaton-6
On 23-Oct-1997, John Logsdon <[hidden email]> wrote:

| Recently, I have occasionally run out of memory with the 2.0.9 Linux
| version (RH 4.1) on a 64 MB machine with 70 MB of swap space.  I am using
| a lot of space - top generally indicates 37 MB, and up to about 45 MB.
| When I load the data up, it takes about 10-12 MB, which is about right.
| I have a number of vectors about 250,000 elements long!
|
| I realise, of course, that calculations generally take up temporary
| storage, but even so I have worked out that when nothing is being done,
| there is sometimes more space allocated than necessary, particularly in
| the memory and swap allocation reported by top.

Perhaps there is a bug.  I can't tell without a complete example that
demonstrates the problem.  Can you submit a complete bug report to
[hidden email] so that it might be possible for someone
else to debug what is happening?  A small example is helpful, but not
absolutely necessary.

Thanks,

jwe


Memory limits

John W. Eaton-6
In reply to this post by John Logsdon-4
On 23-Oct-1997, John Logsdon <[hidden email]> wrote:

| ii Can I garbage collect from within Octave (rather than saving, quitting
| and restarting)?

You can clear all the variables that you don't need.  Other than that,
Octave should be making unused memory available for reuse on its own.
If you know of an example where this is not true, please submit a
complete bug report to [hidden email].
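A minimal sketch of that approach (the variable names here are just
placeholders):

  x = rand (250000, 1);  # large temporary vector, roughly 2 MB of doubles
  y = cumsum (x);        # result we actually want to keep
  clear x                # release the temporary so its memory can be reused
  who                    # list remaining variables; only y should be left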

| Once, when the program dumped itself to octave-core, I found that only
| the data local to the function being run was dumped, and none of the
| main data.  I have trawled the info files and can find nothing
| documented.

I suppose it is a misfeature that it only saves data from the
`current' symbol table.  Perhaps it should only save the data from the
top-level symbol table, or maybe it should go all out and save the
data from the global and top-level symbol tables as well as all the
symbol tables in the active function call stack.  Ugh.  Would people
really like to have that feature?

Thanks,

jwe


sparse matrices and integer data types (was: Memory limits)

John W. Eaton-6
In reply to this post by John Logsdon-4
On 23-Oct-1997, John Logsdon <[hidden email]> wrote:

| BTW, are there any plans to implement either sparse matrices or
| byte/word/integer (1-, 2-, or 4-byte) storage in Octave for data which
| are indices or take restricted values?  Eight bytes to store small
| numbers is rather wasteful - and yes, I have thought about using sets.

Yes, the current PROJECTS file includes lots of things like this that
would be nice to have, but for which there hasn't been sufficient
time, funding, or contributed code.

jwe


Re: Memory limits

Al Goldstein-2
In reply to this post by John W. Eaton-6
Here is a different memory complaint.

I want the QR decomposition of an m x n matrix, as computed by dgeqrf in
LAPACK.  As an example, using dgeqrf in a Fortran code I can solve a
problem with m = 850,000 and n = 5.  With Octave I'm limited to m = 2000
before memory runs out (64 MB).  One problem here is that Q is returned
in Octave as an m x m matrix rather than an m x n matrix; Matlab does this
too.  But the function orth does no better in Octave: orth returns m x n
and still chokes at m = 2000, while Matlab does not.  For qr, a hook is
needed so that one can request the return of an m x n matrix.
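For what it's worth, later versions of Octave and Matlab do expose such a
hook through the economy-size factorization (I haven't checked whether
2.0.9 supports it).  A minimal sketch, with illustrative dimensions:

  A = rand (10000, 5);   # tall, narrow matrix
  [Q, R] = qr (A, 0);    # economy-size QR: Q is 10000x5, R is 5x5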


Re: Memory limits

Al Goldstein-2
In reply to this post by John W. Eaton-6
A cure for orth is to use U from [U,S,V] = svd(A,0).  That works for
large matrices.
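A minimal sketch of that workaround, with illustrative dimensions (it
assumes A has full column rank):

  A = rand (10000, 5);
  [U, S, V] = svd (A, 0);  # economy-size SVD: U is 10000x5, not 10000x10000
  Q = U;                   # columns of U give an orthonormal basis for span(A)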