Since the zero offset is already calculated by the algorithm anyway, the only thing required would be the maximum current the sensor is rated for; the analog voltage could then be scaled to current based on whether the sensor is bidirectional or unidirectional (for low-side sensing).
I agree, and even though it isn’t a hard calculation, I always manage to confuse myself and get it wrong…
I don’t think that’s sufficient… the rated current just defines the upper bound of what the sensor can handle.
The output is ratiometric around half the supply voltage, and as you say we find the zero point in the code. But it would be an assumption to say that 30A is the exact range of the sensor, and to determine the sensitivity from that we would additionally need to pass the sensor’s supply voltage…
To use the ACS712 as an example, with a 30A range and 5V VCC we would get a sensitivity of 0.083V/A = 83mV/A. But the sensor’s datasheet lists the sensitivity of the 30A model as 66mV/A… apparently they allow a bit of headroom, since VCC could be 4.5–5.5V.
I would propose we create a new constructor which takes the sensitivity in mV/A directly as the parameter. This is a value that should be in the datasheet of most Hall-effect current sensors, I would guess.
Your proposal is better, as it allows using the value from the documentation and matching the parameter to it. With the 30A version, the rated 30A bidirectional swing at 66mV/A corresponds to exactly a 4V supply voltage, but to about 37A at 5V, so the advantage of passing the sensitivity rather than the range is obvious.
Hi guys. Is the 66mV/A figure for microcontrollers with 5V logic? If I use an STM32 and a transistor voltage divider, should this value be less?
I have used three of these sensors, but at rest the rotor of the motor oscillates from side to side. The fluctuations are not strong, but they are there. What are possible solutions to eliminate this?
Hi Yuriy,
The value depends on the sensor or shunt values and the current-sense amps used… Some sensors might have a response of 66mV/A, for others it could be 100mV/A, or really any value. It depends on the hardware used.
If you make an amplification circuit with transistors, you’ll have to calculate the gain according to that circuit. If you share your circuit schematic we can try to estimate it, but it’s probably best to check it empirically anyway, using an oscilloscope. The MCU you’re using normally doesn’t have an impact on the current-sensing gain, but of course you have to choose the input range to match what the MCU can handle.
Do you think the oscillations are caused by the current sensing, or by the shaft angle sensing? I have a feeling they might be caused by the angle sensing? Which kind of sensor are you using, is it Hall sensors in the motor?
Sorry, I made a mistake: I use a resistor voltage divider, since the ACS712 sensor has a 5V output and the STM32 requires 3.3V.
I am using the MT6835 sensor in ABZ mode. I also carry out the calibration procedure. But without encoder calibration, the behavior remains just as unstable.
I’m thinking of a way to test the sensor, but I don’t have any ideas. In the library there is an example for checking the encoder, but there the angle is displayed in radians, 6.28 for a full revolution, which is apparently very compressed and does not allow you to see the vibrations of the sensor…
Ah, I see. For a simple voltage divider, you assume the response is linear, and the division applies to the gain in the same way.
So if the sensor is 66mV/A and your voltage divider is (for example) 1kΩ/2kΩ, then 5V × 2/3 = 3.33V and 66mV/A × 2/3 = 44mV/A.
Is this in position control mode? So the motor is trying to hold the position at rest?
One solution can be to tune it out by tuning the PID and LPF values until it does not happen any more. But in some situations this can be very difficult.
If you don’t need the motor to hold position, then one easy solution can just be to turn it off when you don’t need it to move (motor.disable()).
Another solution, but this will require modifications to the code, is to artificially reduce sensor accuracy to eliminate noise.
Another solution, also needing code changes, might be to make the system unreactive to small changes (e.g. introduce hysteresis).
Another solution might be to use mechanical dampening - a loaded motor, or one with larger inertia will be less susceptible to vibrations and oscillations…
Perhaps the people here also have some other ideas - I think it’s not an uncommon problem.
Thanks for the reply. I use it in torque mode. These fluctuations occur at a target of 0. Perhaps I will try to move the sensor 15mm away from the motor. But this will unfortunately not happen soon. I’ll let you know if that helps.
You can solve this fluctuation by choosing the proper phase A and phase B pins for the inline current sensor, and also by checking the gain; in some cases you need to invert the gain depending on your sensor placement.
I am using these sensors in inline current mode and they are working perfectly.
I moved the position sensor 20mm away from the motor and its behavior has not changed; it continues to float at a target torque of 0. By chance, with the help of a magnet, I found that if I bring the magnet close to one of the ACS712 sensors, the motor starts to turn in one direction or the other. This trick works with all three sensors. Now I’m confused about what to do in this case.
That shows the current sensors (which work based on the Hall effect) react to the magnet, outputting a false current signal that causes the control loop to adjust the motor.
In torque current mode it is presumably trying to regulate the current to zero.
What to do?
If the motion is kind of random, like it looks in your video, then my first assumption would be that there is some noise on the ADC causing this…
So you could check for noise in the software, or with an oscilloscope… you can work against the noise using the LPF settings in software, or by adding a low-pass filter to the ADC input in hardware.