Sunday, January 20, 2013

V3 Accuracy

Thanks to Hackaday readers, Ray Maker at dcrainmaker.com, the various forums, Twitter, Reddit, and more. Again, feel free to ask questions or leave comments. I’ll be sure to reply, and I’d love to clarify any deficiencies people have pointed out – like this one.

I’ve seen some comments around the web wondering about the accuracy of my power meter design. The short answer is that, due to the simplicity of the design, the left-side measurement should be better than average, while the right should be in line with the Stages Stage One design.

One of the things I’ve never seen is how SRM, Quarq, Powertap, Power2Max, etc. come up with their accuracy claims. This is slightly concerning, as it might be like response times on monitors: practically made up and skewed depending on how they’re measured. In this interview Jim Meyers, founder of Quarq, explained how you could have 50 strain gauges and still have terrible accuracy. He also claimed that the last unit he calibrated was within 0.25%, but the company claims 2% to give themselves a margin – sensible!

Based on those explanations, it seems they base this accuracy on the accuracy of the torque measurement. So what would that mean? It means the rotational measurement is presumably accurate enough that it isn’t the significant contributor. Since the microcontrollers in these power meters run at high clock frequencies, the time-sample measurement can be accurate down to the microsecond, so it effectively doesn’t matter. Back to the torque: the claim is likely based on static torque measurements with various calibration weights, looking for the worst-case deviation. Note that the calibration weights themselves need to be highly accurate.
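To make that concrete, here’s a rough sketch in Python (the torque and cadence numbers are just illustrative, not measurements) comparing how much a microsecond of timing error matters next to a 2% torque error. Power over one crank revolution is just torque times angular velocity:

```python
import math

def crank_power(torque_nm, rev_time_s):
    """Average power over one crank revolution: P = T * omega, omega = 2*pi / rev_time."""
    return torque_nm * 2 * math.pi / rev_time_s

# Illustrative numbers only (assumptions, not measured values):
torque = 30.0           # N*m average over a revolution
rev_time = 60.0 / 90    # seconds per revolution at 90 rpm

p_true = crank_power(torque, rev_time)

# A 1 microsecond error in the revolution timing...
p_timing_err = crank_power(torque, rev_time + 1e-6)

# ...versus a 2% error in the torque measurement.
p_torque_err = crank_power(torque * 1.02, rev_time)

print(f"baseline:           {p_true:.3f} W")
print(f"1 us timing error:  {abs(p_timing_err - p_true) / p_true * 100:.6f} %")
print(f"2% torque error:    {abs(p_torque_err - p_true) / p_true * 100:.3f} %")
```

The timing contribution comes out around a millionth of a percent, which is why the torque measurement dominates the error budget.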

The V3 right arm (and possibly the Stages Stage One) should have increased measurement error, since the shear gauge used can be affected by pedal offset, and that should be accounted for – however, I’m not even sure of the methodology to do so. I’d be curious whether Stages has this figured out for the Stage One. Quarq and SRM, on the other hand, both require a decoupling algorithm and have 4–5 sensor pickups, which means you have several analog-to-digital stages that could each contribute to the error.
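As a rough way of thinking about those stacked stages: if you treat each stage as an independent error source, the textbook way to combine them is root-sum-square. The numbers below are made up purely to show the mechanics, not an actual error budget for any of these meters:

```python
import math

def combined_error(*relative_errors):
    """Root-sum-square of independent relative error contributions."""
    return math.sqrt(sum(e ** 2 for e in relative_errors))

# Hypothetical error budget for one torque channel (invented values):
gauge_err = 0.010   # strain gauge / bridge nonlinearity, 1%
adc_err   = 0.005   # ADC quantization and reference drift, 0.5%
temp_err  = 0.008   # uncompensated temperature effect, 0.8%

print(f"combined: {combined_error(gauge_err, adc_err, temp_err) * 100:.2f} %")
```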

The long and short of it is that the 2% often quoted is a function of the strain gauges and the electronics used to read them. So, empirically, I can say that mine should be about 2%. To confirm that, however, I would need some high-accuracy calibration weights. Once actually calibrated across the full range, it should easily be within 2%.

One thing to note on calibration: technically it’s bad practice to calibrate with a load well below the loads you expect to measure. For instance, if you expect 200 N·m of torque, calibrating at 20 N·m is not a good idea. I’ve been involved in QA programs and testing: a 100 lb load cell is usually calibrated with about five test points, including 100 lb. You don’t test to 20 lb and assume it will be linear, even if you’re experienced enough to say so. That’s just bad engineering.
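A toy sketch of what that looks like in practice (the load cell counts below are invented): fit a straight line through test points that span the range, then flag anything outside that span as an extrapolation rather than trusting it blindly:

```python
def fit_calibration(loads, readings):
    """Least-squares straight line: reading = a * load + b, from calibration points."""
    n = len(loads)
    mean_x = sum(loads) / n
    mean_y = sum(readings) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(loads, readings)) / \
        sum((x - mean_x) ** 2 for x in loads)
    b = mean_y - a * mean_x
    return a, b

# Hypothetical 100 lb load cell calibrated at five points spanning its range:
loads    = [0, 25, 50, 75, 100]            # lb
readings = [2, 8030, 16010, 24070, 32050]  # raw ADC counts (made-up values)

a, b = fit_calibration(loads, readings)

def load_from_reading(counts):
    """Convert a raw reading back to load, warning if we're extrapolating."""
    load = (counts - b) / a
    if load > max(loads):
        print("warning: reading above calibrated range, result is an extrapolation")
    return load

print(f"{load_from_reading(20000):.1f} lb")   # inside the calibrated span
print(f"{load_from_reading(40000):.1f} lb")   # outside it -> warning
```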

So my calibration is currently in the extrapolation range, and is therefore a poor calibration. I’ll eventually rectify this; in the past I’ve tested several different weights and taken the mean, and I’ll do so again in the future.
