I need to calculate the autocorrelation coefficients for a vector of 744
elements, but `xcorr` fails with this error message:

    error: out of memory or dimension too large for Octave's index type
    error: called from:
    error: /home/$user/octave/signal-1.3.2/xcorr.m at line 203, column 7

The behaviour can be reproduced with:

    var = stdnormal_rnd(744);
    [r, lg] = xcorr(var, length(var)/2 - 1);

Can anybody help to fix this problem?

--
Sent from: http://octave.1599824.n4.nabble.com/Octave-General-f1599825.html

-----------------------------------------
Join us March 12-15 at CERN near Geneva
Switzerland for OctConf 2018. More info:
https://wiki.octave.org/OctConf_2018
-----------------------------------------
On Wed, Mar 7, 2018 at 8:22 PM, Fabcap77 <[hidden email]> wrote:
> I need to calculate the autocorrelation coefficients for a vector of 744

It works for me, but the answer is very big:

    size(r)
    ans =
           743   553536

    numel(r)
    ans = 411277248

that is 411,277,248 doubles stored in memory.
Maybe my system does not have enough memory for that monster? I have 2 GB of
shared DDR on my computer.

I checked the xcorr.m file; this is where the error occurs (line 203 is the
preallocation of R):

    P = columns(X);
    M = 2^nextpow2(N + maxlag);
    if !isvector(X)
      ## For matrix X, correlate each column "i" with all other "j" columns
      R = zeros(2*maxlag+1, P^2);    ## line 203
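That preallocation alone explains the failure on a 2 GB machine. A quick
back-of-the-envelope check in Octave (variable names are illustrative;
maxlag and P match the failing call above):

```octave
## Size of the R = zeros(2*maxlag+1, P^2) preallocation for this input.
maxlag = 371;                     ## 744/2 - 1, as passed to xcorr
P = 744;                          ## columns(X) when X is a 744x744 matrix
nelem = (2*maxlag + 1) * P^2;     ## 743 * 553536 = 411277248 elements
bytes = nelem * 8;                ## each double takes 8 bytes
printf ("%d elements, about %.2f GiB\n", nelem, bytes / 2^30);
## roughly 3 GiB for R alone, before any FFT workspace
```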
On Wed, Mar 7, 2018 at 10:26 PM, Fabcap77 <[hidden email]> wrote:
> Maybe my system does not have enough memory for that monster? I have 2 GB of
> shared DDR on my computer.

I have 16 GB and it was all used before it went back down at the end.
I do think I need a workaround that doesn't eat up so much memory, then - if
there's any.
In reply to this post by Fabcap77
Fabcap77 wrote
> I need to calculate the autocorrelation coefficients for a vector of 744
> elements, but `xcorr` fails returning this error message:
>
>     error: out of memory or dimension too large for Octave's index type
>     error: called from:
>     error: /home/$user/octave/signal-1.3.2/xcorr.m at line 203, column 7
>
> The behaviour can be reproduced with:
>
>     var = stdnormal_rnd(744);
>     [r, lg] = xcorr(var, length(var)/2-1);

Hi,

You say you need the autocorrelation coefficients of a vector of 744
elements, but you are feeding xcorr a 744x744 matrix. Is this intentional?
If you really want a vector, something like "var = stdnormal_rnd (1, 744)"
would probably do.

What you currently do is ask for the cross-correlation between the columns
of your matrix (see "help xcorr"): 744 columns, each correlated with every
other column, i.e. 744*744*743 = 411277248 elements in the returned matrix.

If that is really what you want, then you can do it in a loop, but it will
be slow.

Pantxo
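A minimal sketch of both options described above (the loop is an
illustration of the idea, not the exact xcorr internals):

```octave
pkg load signal

## Option 1: autocorrelation of a single 744-element vector
v = stdnormal_rnd (1, 744);              ## a true vector, not a square matrix
[r, lg] = xcorr (v, length (v)/2 - 1);   ## 743 lags -- fits easily in memory

## Option 2: all column-pair cross-correlations of a matrix, one pair at a
## time, trading speed for memory (process each rij instead of storing all)
X = stdnormal_rnd (744);
maxlag = 371;
for i = 1:columns (X)
  for j = 1:columns (X)
    rij = xcorr (X(:,i), X(:,j), maxlag);
    ## ... use rij here ...
  endfor
endfor
```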
I checked my code and it turns out that the variable I fed to `xcorr` is not
a vector but a 744x744 matrix of NaN. So the problem is not with `xcorr` but
with my code. I developed it in Scilab and then translated it to Octave
because Scilab is disabled by a fatal bug, and it seems my code needs to be
optimized.

Thanks Pantxo!
In reply to this post by Fabcap77
On 03/07/2018 08:21 PM, Fabcap77 wrote:
> I do think I need a workaround that doesn't eat up so much memory, then -
> if there's any.

For a vector of 744 elements, try

    var = stdnormal_rnd(744,1)

The documentation for this function reads: "When called with a single size
argument, return a square matrix with the dimension specified."
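The difference is easy to check interactively:

```octave
size (stdnormal_rnd (744))      ## 744 x 744  (square matrix)
size (stdnormal_rnd (744, 1))   ## 744 x 1    (column vector)
size (stdnormal_rnd (1, 744))   ## 1 x 744    (row vector)
```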
In reply to this post by Fabcap77
I thank all who contributed. I was able to trace the source of the error in
my code. I used the function "stdnormal_rnd" only to create an example
vector of 744 values; in the actual code, the vector fed to "xcorr" comes
from elaboration of raw data. It turns out that I made the same error
pointed out in this thread when defining other quantities used in that
elaboration, so instead of a 744 x 1 vector I had a 744 x 744 matrix, and
this caused the out-of-memory error. Once that was fixed, the code works as
it should. The case is closed.