I don't actually see an accuracy number, only the claim of "millimeter precision", which is actually pretty bad for calipers. Looks like a fun project though. Basically a linear resolver sensor I guess. From how much effort the author has put into the project I'd estimate the accuracy is much better than +/- 0.5mm.
Which is quite low. My manual caliper is precise to 1/10 of a mm, my electronic one to 1/100 (though I would say 0.02 is more realistic). The manual one is not good, Harbor Freight quality; the electronic one is a Mitutoyo (not entry level). Still - good ROI on both; I got the electronic one because my eyesight is not that good anymore.
With a decent set of vernier calipers (I have Brown and Sharpe ones) they're accurate to 0.001" (0.02mm) every time. But what's nice about analog measuring tools is you can actually reliably achieve better accuracy--like 0.0005" +/- 0.00025"--by "reading between the lines". I can reliably take finishing cuts accurately to a few ten thousandths of an inch using vernier calipers (confirmed by checking with a micrometer accurate to 0.0001").
The only application I've encountered where digital tools work better for me is a DRO on a mill, which is extremely convenient.
You really want to use a mike (micrometer) for this kind of precision. Calipers can be repeatable within a certain range, but even then a readout from the vernier gives too much error. Measuring a tenth of a mm is acceptable (though I'd never trust a vernier caliper measurement beyond 0.2mm). A hundredth is, IMO, wishful thinking.
Looking at my calipers now I noticed that the imperial side is twice as precise as the metric side. Graduations of 0.05mm vs 0.001". I wonder why that is.
Precision, maybe, but is it accurate to that degree?
In my experience, yes. I've checked with micrometers that are an order of magnitude more precise. You have to be fairly experienced reading it, and it often helps to use a magnifying glass. Also you have to be very careful not to drag the jaws open when removing from the object you're measuring. Taking multiple measurements and averaging helps.
To be clear, for a measurement where accuracy to less than 0.001" actually matters use a micrometer! Otherwise you're likely to screw up the part. But the advertised precision of 0.001" is totally repeatable within 0.0005".
Tenths and hundredths of an inch "don't mean anything" because we don't divide inches that way in common use, but in subtractive manufacturing and the like they do use "thou" - and 0.001" is a thou.
Personally, I use microns instead of 0.001mm, too, when measuring that small. I forget the accuracy of my good calipers, but I could detect errors of around 2 microns if memory serves. It's been a long time since I cared about anything that accurate, so I have two pairs of cheap plastic ones - scale and digital.
A typical metric micrometer is accurate to 0.01mm (though you can find more precise ones at a premium). It's really unlikely you'll get micron precision from any calipers. Even an angry glance warms up the instrument enough to make this meaningless.
Microns are the domain of grinding and lapping, you rarely ever need to go there with cutting.
Yeah, I only use calipers and micrometers for machining--I haven't found any use for them in additive manufacturing--and never in metric units because all my tools are imperial. It's just strange that the calipers punish metric users by giving them only half the precision.
If you buy a tool in a country that mainly uses imperial, the markings on it might be more exact for the imperial measurement. It might be the opposite in a country with metric. Just guessing, but that is often how these things work out.
Interesting that you say that. My current backburner project is a display (TFT or PC) for Mitutoyo Digimatic. I can read the bright VFD display, but it struck me that others might find it difficult to read from across a workbench.
The proper term for calipers, for me, is Mitutoyo. I really want one of the solar models.
Article states +/- 0.6mm
Interesting, I always naively assumed that those cheap calipers measure distance mechanically, with something like a wheel and an encoder. The actual method is much cleverer.
Also, nitpick: "I'm stuck in the local optima of …" should be "optimum".
> Have you ever wished for a 500 Hz, millimeter-precise linear position sensing system
Kind of, but I'd like 0.01mm precision, please. It can be just a few Hz; I don't need 500 Hz.
Great project though!
I'd be super curious to see how the accuracy changes with averaging out/low-pass filtering the measurements. Random noise usually drops by a factor of sqrt(N) when you average N samples, so the higher precision you're after might just be a bit of code to write (quick sketch below).
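A minimal sketch of that averaging, assuming nothing about the project's actual firmware (read_position_raw() below is just a simulated noisy readout standing in for the sensor):

    #include <stdio.h>
    #include <stdlib.h>

    /* Simulated readout: true position plus uniform noise, standing in
     * for the raw capacitive position estimate (hypothetical, not the
     * project's code). */
    static float read_position_raw(void)
    {
        float noise = ((float)rand() / RAND_MAX - 0.5f) * 0.6f; /* ~ +/-0.3 mm */
        return 42.000f + noise;
    }

    /* Boxcar average: N samples -> random noise shrinks roughly by sqrt(N). */
    static float position_averaged(int n)
    {
        float sum = 0.0f;
        for (int i = 0; i < n; i++)
            sum += read_position_raw();
        return sum / (float)n;
    }

    int main(void)
    {
        /* Averaging 64 samples of a 500 Hz stream still leaves ~8 Hz of
         * output while cutting random noise by roughly a factor of 8. */
        printf("single:  %.3f mm\n", read_position_raw());
        printf("avg(64): %.3f mm\n", position_averaged(64));
        return 0;
    }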
The other side of it though is that you're starting to get down into the "everything needs to be temperature controlled" region as you squeeze that precision number. FR-4 and copper have thermal expansion coefficients around 15-20ppm/C. If I'm doing this mental math correctly, a 5 deg temperature rise would make a 1m long piece of FR-4 expand by 0.1mm, or a 10cm piece of FR-4 expand by 0.01mm.
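For anyone checking that arithmetic, it's just delta-L = alpha * L * delta-T; a quick sketch with an assumed alpha of 17 ppm/C (roughly mid-range for FR-4 in-plane):

    #include <stdio.h>

    int main(void)
    {
        const double alpha = 17e-6; /* assumed CTE, within the 15-20 ppm/C range */
        const double dT    = 5.0;   /* temperature rise in C */
        const double lengths_mm[] = { 1000.0, 100.0 }; /* 1 m and 10 cm of FR-4 */

        for (int i = 0; i < 2; i++) {
            double dL = alpha * lengths_mm[i] * dT; /* delta-L = alpha * L * delta-T */
            printf("%6.0f mm grows by about %.3f mm over %.0f C\n",
                   lengths_mm[i], dL, dT);
        }
        return 0;
    }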
One time I wanted to demonstrate thermal expansion to my kid (1st grade or so) so I made some marks with a steel ruler and put it in the freezer. Imagine my surprise when we took it out and there was no perceptible difference :-D
Did you measure the freezer after? It expanded :)
With some microcontrollers you can do this "averaging out" just by changing ADC parameters, you don't even have to write the low-pass-filter code.
That is specifically sigma-delta type ADCs: https://www.analog.com/en/resources/technical-articles/sigma...
That at 500Hz is called "optical mouse on absolute positioning patterned surface".
On that note: I'm looking for a mouse style camera sensor unit that can export full frame rate raw to a system where I can actually decode such an absolute positioning code.
Anyone got something in the sub-$100 range?
TLDR: <0.02mm should be possible w/open source using cheap fabs.
Interesting project. The hardware guy earlier built a rotary encoder and a vape pen. I am no metrologist (though by chance I once worked for the UK guy who brought Hexagon to China and made bank), but this looks overall like quite a complex scheme that was probably referenced from an existing implementation. These days you can get 0.10mm pitch tracks and offsets ("4 mil") or 0.09mm ("3.5 mil") from JLC on 2 layer/4+ layer. With flex PCBs you can get still smaller pitch ("3 mil"). Combining a few rows of these with basic multi-track rotary encoder theory should get you down to fractions of that pitch, ~0.01-0.02mm (rough interpolation sketch below).
This back-of-the-envelope calculation aligns well with my Mitutoyo's test report, which states a maximum permissible error of 0.04mm @ 5mm diameter, 0.02mm @ 0-200mm, and 0.03mm @ 300mm. Indicated errors on the test report are all in the range of 0mm-0.02mm, except inside radius, which is 0.03mm. This is a standard high-grade caliper's level of precision.
In practice, achieving these levels is going to require machining high grade steels and mounting them at high levels of parallelism, not simply working out the electronics.
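Rough sketch of the multi-track interpolation idea mentioned above, under the usual assumptions (a fine track read out as sine/cosine plus a coarse track that only has to say which fine period you're in; the pitch and names here are illustrative, not the linked project's):

    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Fine track pitch from the PCB, e.g. 0.1 mm ("4 mil" class rules). */
    #define FINE_PITCH_MM 0.10

    /* Interpolate within one fine period from quadrature (sine/cosine)
     * readings; result is in [0, FINE_PITCH_MM). */
    static double fine_fraction_mm(double sin_v, double cos_v)
    {
        double phase = atan2(sin_v, cos_v);      /* -pi .. +pi */
        if (phase < 0.0)
            phase += 2.0 * M_PI;                 /* 0 .. 2*pi  */
        return phase / (2.0 * M_PI) * FINE_PITCH_MM;
    }

    /* Absolute position: the coarse track picks the fine period,
     * the fine track supplies the sub-pitch fraction. */
    static double absolute_position_mm(int coarse_period_index,
                                       double sin_v, double cos_v)
    {
        return coarse_period_index * FINE_PITCH_MM
             + fine_fraction_mm(sin_v, cos_v);
    }

    int main(void)
    {
        /* Example: 123rd fine period, ~36 degrees of phase into it.
         * Even 1/10 of the 0.1 mm pitch already resolves 0.01 mm. */
        double s = sin(36.0 * M_PI / 180.0), c = cos(36.0 * M_PI / 180.0);
        printf("position: %.4f mm\n", absolute_position_mm(123, s, c));
        return 0;
    }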
See also: https://www.eevblog.com/forum/projects/absolute-capacitive-r... (see animation, GC7626C datasheet) https://github.com/littleboot/ACRE
> TLDR: <0.02mm should be possible w/open source using cheap fabs.
What does that have to do with a fab? Isn't a fab the thing that creates ICs? PCBs aren't made in a fab.