> -----Original Message-----
> From: Help-octave <[hidden email]> On Behalf Of llrjt100
>

> The processed position data is the input for a motor controller, with a
> control loop frequency of 20 Hz, so lag is undesirable. My first attempts at
> this were with a simple moving average filter, from which I very quickly
> learnt that the lag is proportional to how far back in time the filter
> looks.

Oops, I got this after I sent my last suggestion -- so a moving average won't
work (too much delay). So you want to filter out a signal with a period of
around 25 seconds, while preserving information that happens in 25 msec
(20 Hz loop rolloff), and without unduly delaying the signal. If you can
accurately predict the "noise" signal in advance, then you can do this --
just subtract the known value of the noise from the measured signal to
recover the desired one. If the noise is unpredictable, you won't be able to
get good cancellation. If you have a good enough model, it will predict the
noise for you.
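As a rough sketch of that feed-forward idea (in Python/NumPy rather than
Octave, and with made-up amplitudes and periods): if the periodic error can be
predicted from a model, subtracting the prediction cancels it without
introducing any filter delay, because no past samples are needed.

```python
import numpy as np

fs = 20.0                       # control-loop rate (Hz), from the thread
t = np.arange(0, 100, 1 / fs)   # 100 s of samples

# Hypothetical "true" position: a slow ramp (constant motor velocity).
true_pos = 0.5 * t

# Hypothetical periodic error with a ~24 s period, amplitude invented.
noise = 0.02 * np.sin(2 * np.pi * t / 24.0)
measured = true_pos + noise

# Feed-forward cancellation: subtract the *predicted* noise.  No lag is
# introduced, since the correction uses only the model, not past samples.
predicted_noise = 0.02 * np.sin(2 * np.pi * t / 24.0)
corrected = measured - predicted_noise
```

If the model is exact, `corrected` matches `true_pos` to rounding error; the
cancellation degrades gracefully as the model's phase or amplitude drifts.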

> In terms of modelling the system, there's a stepper motor driving a gearbox,
> and the encoder is mounted on the output shaft. The fast ~2 s error in the
> graph is microstep error, the ~24 s error in the graph is encoder
> interpolation error, and most of the rest will be mechanical gearbox errors
> (the ratios of each stage are known). The period of all the errors is
> related to the motor velocity, so hopefully this constitutes a good model?

Can you formulate an equation or algorithm to describe it? And what is
"encoder interpolation error"? Is it predictable?

Why do you need an encoder? A stepper motor always knows its own position
unless you overload it and it misses steps. Of course there will be some
slop in the gearing.
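One hypothetical way to turn that description into an algorithm (a Python/NumPy
sketch, with invented amplitudes, phases, and noise levels): since the error
periods are fixed by the gear ratios, fit sines and cosines at those known
periods to the measured error by linear least squares. The fitted part is the
predictable component that could then be subtracted feed-forward.

```python
import numpy as np

fs = 20.0
t = np.arange(0, 120, 1 / fs)
periods = [2.0, 24.0]           # microstep and interpolation periods (s),
                                # assumed constant at constant motor speed

# Synthetic measured error: two known-period components plus sensor noise.
rng = np.random.default_rng(0)
true_error = (0.05 * np.sin(2 * np.pi * t / 2.0 + 0.3)
              + 0.02 * np.sin(2 * np.pi * t / 24.0 - 1.0))
measured_error = true_error + 0.005 * rng.standard_normal(t.size)

# Design matrix: DC term plus sine/cosine at each known period.
cols = [np.ones_like(t)]
for p in periods:
    cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
A = np.column_stack(cols)

# Linear least-squares fit; A @ coef is the predictable error component.
coef, *_ = np.linalg.lstsq(A, measured_error, rcond=None)
model_error = A @ coef
residual = measured_error - model_error
```

After the fit, `residual` should be down at the sensor-noise level, and
`model_error` evaluated at future motor positions gives the feed-forward
correction.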

Regards,

Allen