Ok. Some people will think this is boring, but sensorless motors are easier to get, and the community is really having a hard time making sensorless drive work. There are a number of approaches put forth by forum members and on the Discord channel: flux observers of various kinds, which try to use a mathematical model of the motor to determine the rotor position and from there the actual motor timing (the angle between the magnetic field produced by the stator and the actual position of the rotor, within one cycle of the electrical wave/drive cycle; i.e. for a motor with 7 pole pairs this goes to 2π and back to zero 7 times per revolution). They are recursive: the output from each iteration is used as an input to the next.
Ok, so basically we are having a very hard time finding a function that, given say 20 measurements of current and voltage along the waveform (probably only 1 complete wave is really needed, depending on signal-to-noise ratio), returns the motor timing. The position of the rotor is not the thing; it's the timing.
If I hop on a graphing calculator and plug in various equations that represent the operation of the motor, I see no easy way to extract even some proxy for motor timing.
However, we can know from current and voltage measurements when the motor is operating at the optimal point at a given RPM: there is a global minimum in the current vs. voltage curve. I previously suggested an algorithm that simply searched for this dip and kept the motor operating there, but it was way too slow.
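The dip search is simple enough to sketch. This is a minimal sketch, not the actual drive code: `measure_current()` is a hypothetical stand-in for the board's current sensing (here a made-up convex curve so the example runs), and the sweep bounds and step count are assumptions.

```python
def measure_current(voltage, v_opt=7.5):
    # Placeholder plant model: current grows as we move away from the
    # optimal voltage, giving the global minimum described in the text.
    return 0.4 + 0.02 * (voltage - v_opt) ** 2

def find_optimal_voltage(v_min, v_max, steps=200):
    # Brute-force sweep: sample the voltage range at a fixed RPM and
    # return the voltage where measured current is lowest.
    best_v, best_i = v_min, float("inf")
    for k in range(steps + 1):
        v = v_min + (v_max - v_min) * k / steps
        i = measure_current(v)
        if i < best_i:
            best_v, best_i = v, i
    return best_v, best_i
```

A real version would dwell at each voltage long enough for the current reading to settle, which is exactly why the search is too slow for run-time use and only makes sense for offline data collection.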
Ok, so neural networks are sometimes described as "universal function approximators". I could make a system that, given a load and RPM anywhere in a range, plus 20 data points along the current and voltage waveform, determines what the optimal run-time voltage (and current) is. Take a good spread of data across the possible operating region of the motor. You could just use polynomials to approximate everything; there are various algorithms that take some settings and the data and spit out a bunch of polynomials that try to fit it.
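For the one-dimensional case the polynomial route is a one-liner with NumPy. The data below is synthetic (a made-up dip around 7 V), just to show the shape of the approach:

```python
import numpy as np

# Synthetic (voltage, current) samples with a dip around 7 V.
voltages = np.linspace(2.0, 12.0, 21)
currents = 0.5 + 0.03 * (voltages - 7.0) ** 2

# Fit a quadratic to the samples.
coeffs = np.polyfit(voltages, currents, deg=2)

# The vertex of the fitted parabola approximates the optimal voltage.
v_opt = -coeffs[1] / (2 * coeffs[0])
```

This works fine with one or two inputs, but with 20 waveform samples as inputs the number of cross-terms explodes, which is the complication mentioned below.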
This is also considered machine learning; however, such algorithms don't tend to fit the data as well as neural networks do. That's all. The tools also aren't as handy for the purpose, and it might get complicated when you have something like 20 inputs.
Hence the (artificial) neural network.
The system trains just by connecting to the motor and running the search to collect data (then you have to run the training algorithms on the data on a PC and produce the neural-net function), and it has to be able to change the load on the motor. I would do this with a Raspberry Pi using MicroPython; that would make it fairly easy to produce data I can then use with scikit-learn.
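The PC-side training step could look roughly like this. It's a sketch under the assumptions in the text: each sample is 20 current plus 20 voltage readings along one cycle (40 features), the target is the optimal voltage found by the dip search at that load/RPM, and the data here is random stand-in data rather than anything from a real rig. The layer sizes are arbitrary guesses.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))   # 500 fake waveform snapshots, 40 features each
y = X[:, :5].sum(axis=1)         # fake "optimal voltage" target for the demo

# Small multilayer perceptron; scikit-learn handles the training loop.
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X, y)
```

The trained `net` is what later gets exported to plain C for the microcontroller.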
In my case I can have a second fan blowing against the first, so it's not too hard. It doesn't actually need to know what the load on the motor is; it just has to cover a range. By looking at the voltage and current at the optimal point we can tell what the load on the motor is, so you could use a fairly crude apparatus. A second motor with MOSFETs to short-circuit it so it can act as a brake could also work, but then you get torque ripple and it could get complicated. A large heavy disk and/or a spring on the shaft of the brake motor would help.
It’s kinda lazy and kinda dumb, but I think it could work. As long as it’s fast enough I think it will work, and it takes everything into account, whereas when you try to model things there are properties like saliency which are a real pain to measure for any given motor and which mess up the model. You need to measure stuff either way.
That’s what just happened: I ported the flux observer from ODrive into an independent module and tried it out, and it got completely confused by the motor’s saliency. Short of getting a degree in advanced math, or finding someone who has one, I can’t really get it working.
A legitimate approach is, yes, to simply take another observer and try to get that working, but it’s a long road in either case, and this seems more likely to work to at least a basic degree. I built a system that used a (calibrated) magnetic angle sensor to determine motor timing and regulate the voltage, and it worked well but was too complicated and expensive. This would do basically the same thing, but using the current sensors on the drive board and a neural network to measure motor timing; then I plug that into the rest of the system.
It’s a bit nuts to bake a thing like this into a product that is expected to work, but I can’t really find a more promising option. As long as the machine-learning system is fast enough I think it will work. My fan already works OK having done this manually: I plotted voltage vs. current, made a polynomial, and ramped the voltage. It’s operating in open loop and it’s been fine, but obviously such an approach is fragile, inefficient, and kind of ridiculous.
This may sound kind of ridiculous, but in a way we are just doing the same thing: take a bunch of data from the waveform, put it through a function to get the motor timing, then regulate the motor timing. The function can be a mathematical model of the system, which is a great idea, or it can be a neural network.
I would probably regulate voltage rather than RPM to keep the motor timing in a good range, because I want to be able to control RPM precisely. This is a bit slower, but I think it should be OK.
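The regulation loop itself can be very simple. A minimal sketch, assuming a proportional correction: `timing_of()` is a hypothetical plant model standing in for the neural-net timing estimate (in reality that comes from the waveform measurements), and the target, gain, and linear response are all made up for illustration.

```python
def timing_of(voltage):
    # Fake plant: more voltage pulls the timing angle down, roughly linearly.
    return 1.2 - 0.08 * voltage

def regulate(voltage, target=0.4, gain=2.0, steps=50):
    # Proportional loop: nudge the drive voltage until the measured
    # timing settles on the target value.
    for _ in range(steps):
        error = timing_of(voltage) - target
        voltage += gain * error   # timing too large -> raise voltage
    return voltage
```

A real loop would also rate-limit the voltage changes so the commanded RPM stays undisturbed, which is the point of regulating voltage rather than RPM.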
I would use m2cgen, which gives you native code with no dependencies that you can run under Arduino. Then I would interleave motor.move() commands to keep the voltage being fed to the motor glitch-free while the neural network runs, because inference would probably take a while. Or, ideally, put it in a timer-triggered interrupt if I can get away with that (knowing that running such long bits of code in an interrupt is usually ill-advised).