Hey guys,
I have learned quite a lot in this awesome forum, thank you so much!
I am currently planning to implement a servo motor driver, and during my research on encoders I ran into some questions I couldn't find answers for. My plan so far is to use an MT6835, because it is reasonably cheap and the specs look really attractive. My use case is mostly slow, precise position control, with occasional fast movements. Torque requirements are low; my setup will stay well below a couple of amps.
Is it fair to say that AB is in general the best interface because it has the lowest latency and the lowest load on the driver CPU, with the only downsides being that it has a lower resolution than, say, SPI and that it is, obviously, not absolute? And that at high RPM the low latency matters more and more?
With that in mind, my plan so far is to:
- use AB to read the sensor position, ideally with an interrupt on both the rising and falling edges of A, and in the ISR only read B for direction and update the internal position. That would allow up to 32,768 steps/revolution, which is far more than I need
- never do a homing sequence to get the absolute position
- completely ignore Z, since I shouldn't need it
- on startup, do a single SPI read to get the absolute position and initialize the internal position
- configure the sensor to a resolution high enough for my precision needs but low enough to support my maximum required RPM (still to be determined…)
Does this make sense so far, or am I completely off track?
Have any of you already combined absolute (init + maybe occasional syncs) and relative (main workload) readings of magnetic encoders, or is that a stupid idea?
Is it possible, or rather, does it make sense, to dynamically change the ABZ resolution depending on the current speed? For example, use a pretty high resolution at standstill and for fine positioning, and switch to a lower resolution above some RPM threshold so as not to lose interrupts?
Does ABZ work well enough at high RPM, or would it still be better to switch to a sensorless mode there? I have read in many places that sensored mode degrades more and more as speed increases, but I don't know whether that is an artifact of the slow SPI protocol or whether it also affects ABZ… I wanted to keep things as simple as possible in the beginning and focus more on the software than the hardware side, and sensorless control sounds like a nightmare to me.
[Edit]
And one more question: does the ABZ output on the MT6835 also show the jitter of high-bit SPI readings, or is it more stable? Did they add some form of hysteresis there, so it doesn't fire interrupts at complete standstill?