THR myth confirmation/busting - digital scale accuracy

Status
Not open for further replies.

LiveLife

Member
Joined
Jan 10, 2010
Messages
32,936
Location
Northwest Coast
How about another THR myth confirmation/busting? To benefit THR members and guests, I have come up with a digital scale accuracy challenge.

For years, there have been discussions about the accuracy and reliability of digital scales. Over the years, I have used 1/4"x1/4" pieces of paper cut from copy paper to test the sensitivity of scales. Both of my Ohaus 10-10 beam scales will detect one piece of paper, yet the digital scales I have tested won't always show detection until 2-3 pieces of paper are placed, at which point they read .1 gr. (Depending on the paper thickness, each piece should weigh around .03-.05 gr, so it takes 2-3 pieces to show a .1 gr reading.)
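As a rough sanity check on those numbers, the expected weight of a piece can be worked out from the paper's grammage. This is only a sketch; the 75 g/m² figure for standard 20 lb copy paper is my assumption, not something measured in the thread:

```python
# Estimate the weight of a 1/4" x 1/4" piece of copy paper.
# Assumption (mine, not from the thread): standard 20 lb bond
# copy paper runs about 75 g/m^2.
GRAMS_PER_SQ_M = 75.0
GRAINS_PER_GRAM = 15.4324

side_m = 0.25 * 0.0254              # 1/4 inch in meters
piece_g = GRAMS_PER_SQ_M * side_m ** 2
piece_gr = piece_g * GRAINS_PER_GRAM

print(f"one piece ~ {piece_gr:.3f} gr")            # about 0.047 gr
print(f"pieces per 0.1 gr: {0.1 / piece_gr:.1f}")  # a bit over 2
```

That lands right in the .03-.05 gr range quoted above, and it takes 2-3 pieces to reach a 0.1 gr reading; thinner or heavier stock will shift the number.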

So this is the challenge: use thinner/cheaper plain copy paper or newspaper, cut 1/4"x1/4" pieces, and see what your digital scale readings are. If you can, it would help to post:

- Brand/model of digital scale and price paid
- Paper used: copy paper or newspaper
- Ambient temperature
- Resolution of scale in grains "gr" not grams "g"
- If you have check weights, readings after calibration of scale and check weights below 5 grains
- Number of paper pieces when scale showed detection/reading
- 5/10 sample readings to show consistency using 2-3 pieces of paper (whichever number your digital scale is able to detect)

Thank you and looking forward to the results.

Link to Older Thread
 
There is one key item missing from this evaluation. :confused:

Each digital scale is calibrated to a different standard, meaning it takes a certain minimum change in weight for the reading to register. It is possible that a single sheet of paper may not change the reading at all. Also, how long are you willing to wait for the reading to settle and update? That settling time is a calibrated factor as well.

You may need to set testing parameters, or the results will vary and won't support an accurate conclusion. :D
 
Granted, you get a response from the beam scale, but what unit of measure are you actually detecting?
I am a fan of the Ohaus scales and use a couple on my bench, but I also have a cheap MTM digital that is fast and does the job, plus it gives a direct readout of the weight rather than making me adjust the weight on the beam to get the actual total.
If your demands are sub-tenth-grain, then you are loading for much greater precision than I am.
 
Each digital scale is calibrated to a different standard, meaning that it takes a certain change in order for it to register.
That is the precision of the scale, and I agree it is important to know. That specification is not often supplied, though, so it must be observed. It really does not have anything to do with the scale's calibration; calibration affects the scale's accuracy.

I am also eagerly awaiting replies so I can learn more about buying my next scale. Another important specification is the range of the scale. Usually, the higher the range, the lower the precision; for example, a 0 - 2 lb scale probably will not have 0.1 gn precision.

Lou
 
Yes, I agree with previous posts as I overlooked those aspects (my doctor made me switch to Decaf and it shows :D).

So if you could, add the resolution of the scale in grains "gr" (0.1 gr, 0.02 gr, etc.), not grams "g", along with your readings compared to check weights below 5 grains (if you have them).

< updated OP >
 
The goal is to verify claims that digital scales with 0.02 gr resolution can demonstrate greater detection/consistency than 0.1 gr resolution beam scales.

I am also curious what the $100+ digital scales with 0.1 gr resolution demonstrate.

This is THR and since I read differing opinions, I think it's high time we find out once and for all what the facts are.

My two Ohaus 10-10 beam scales can detect and consistently read 1 piece of 1/4"x1/4" copy paper. Can your digital scale do that?
 
What you posted makes sense to me. While I don't understand the myth thing, if we assume a 1/4" x 1/4" piece of paper (or anything else, for that matter) weighs .03 to .05 grain, then 2 to 3 pieces will weigh about .1 grain.

So if a digital scale has a resolution of .1 grain, and we define resolution as the ability to read the scale, or of the scale to be read, then the scale will treat the applied weight this way: any weight below .05 grain will display as 0.0, and anything from .05 up through .14 grain will display as 0.1. So .00, .01, .02, .03, and .04 grain display as 0.0, while .05 through .14 grain display as 0.1. Likewise, 0.95 through 1.04 grain all display as 1.0 grain.

Looking at just the resolution, here is a more realistic example. If I am loading some 308 Winchester using a 168 grain HPBT and H4895 powder, and I want a load of 39.1 grains, I start trickling powder onto my scale. When my scale reads 39.1 grains, in reality I have somewhere between 39.05 grains and 39.14 grains of powder. Anyone who has trickled powder, especially on a digital scale, notices that the moment the scale indicates the desired weight, they can trickle a few more kernels of powder before the scale toggles or advances to the next higher number. How many kernels or flakes is a function of the density of the powder, or in the case of pieces of paper, the density of the paper. It becomes a question of what the scale can "resolve", i.e. the resolution of the scale. None of this has anything to do with the accuracy of the scale. Accuracy is a different characteristic, a different animal entirely.
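That rounding behavior can be sketched in a few lines. This is a toy model of a round-to-nearest display, not any particular scale's firmware; real scales also apply damping and auto-zero logic:

```python
def displayed(weight_gr, resolution_gr=0.1):
    """Round a true weight to the nearest display increment (toy model)."""
    # Work in whole counts of the increment to sidestep float artifacts.
    return round(weight_gr / resolution_gr) * resolution_gr

def true_range(reading_gr, resolution_gr=0.1):
    """Band of true weights consistent with a given displayed reading."""
    half = resolution_gr / 2.0
    return (reading_gr - half, reading_gr + half)

# A displayed 39.1 gr on a 0.1 gr scale means the charge really sits
# somewhere in roughly the 39.05-39.15 gr band.
lo, hi = true_range(39.1)
print(f"reads 39.1 gr -> true weight between {lo:.2f} and {hi:.2f} gr")
```

The same two functions describe a 0.02 gr scale by passing `resolution_gr=0.02`; the band just gets five times narrower.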

Accuracy, I guess we could say, is a measure of unbiased precision. Resolution is the ability to read an instrument, or of the instrument to be read. A good way to relate to accuracy is this: I shoot a 5-shot group with the 308 loads I made earlier with the powder charges I weighed. I shoot at 100 yards and place five shots in a sub-1" group (damn, I am good) about 3" below and 4" right of the dead-center X on my target. That is precision, a high measure of repeatability. However, it is not unbiased precision; I was aiming dead center on the X. I adjust my sights and shoot 5 more shots. This time I place 5 shots dead center on the X. That is unbiased precision, or accuracy.

When it comes to scales, it's the same thing. Calibrating a digital scale is a matter of placing a check weight (a weight of known value) on the scale. The scale sees that amount of weight and provides a reading, and during calibration we then tell the scale what that weight was. Then we do another weight, and in the case of very accurate scales several weights, each time telling the scale what the weight was. If we just want to check the scale, we toss a few check weights on it, and if it reads correctly there is no need to calibrate the scale.

That is about it in a very, very condensed nutshell. Most digital scales used for hand loading ammunition have a resolution of 0.1 grain, which is fine for their intended use, and makers seem to like to specify the accuracy as +/- 0.1 grain as well. Again, that should be good enough. You select a scale based on its intended use or application.

<EDIT> Wow, when I started to post there was only a single post. I really need to get faster with my typing. :) </EDIT>

My two Ohaus 10-10 beam scales can detect and consistently read 1 piece of 1/4"x1/4" copy paper. Can your digital scale do that?

No, but how accurately is it doing it?

Just My Take
Ron
 
Movement is between 0.1 gr lines but consistent.

As to the rest of your post, yes, my point exactly. A digital scale with 0.02 gr resolution should detect and consistently read a .03-.05 gr piece of paper, and a digital scale with 0.1 gr resolution should read 2-3 pieces of paper. If they don't, then there's something off about the claims.

FWIW, even the "cheap" Lee Safety Scale is capable of detecting one piece of paper.
 
My scale is merely there to confirm what my Uniflow is throwing. Accuracy and resolution matters why?
Well, what do you mean by "confirm?" Usually something is confirmed to be correct if it is accurate. You want to know how accurately the Uniflow is throwing charges. In your case, the digital scale must be as accurate as you want your confirmation to be.

If you just want to confirm the Uniflow dumped a charge in, you could just look in the case, see a normal looking amount of powder and call that good, and therefore "confirmed". Your accuracy there is "powder or no powder." If you want to know if it's 5 gn or 6 gn, you need a scale that is accurate (and I would argue precise) to better than 1 gn; if you wanted to know if it's 5.3 gn or 5.4 gn, you need a scale accurate to better than 0.1 gn, etc.

Lou
 
Calibration and the unit of measure of the digital unit are going to be the key.
Consider the measurement of electricity: a digital unit can only operate and read out within the parameters built into it, while an analog dial will show you a position between the units of measurement, and if the dial is large enough you should be able to interpolate the exact reading.
The question is, how exact does one need to be?
If the digital consistently shows the same readout for a test weight, it should be safe to assume that you are within half a grain one way or the other.
 
And that's precisely the point I was trying to make in the other digital scale thread.

Even for Bullseye or 1000 yard Palma match reloading, I think 0.1 gr resolution of beam scale is good enough. But to claim higher 0.02 gr resolution digital scale will allow you to better conduct your load development, I would like to see proof.

So, anyone willing to post some data from their digital scales?
 
If we just want to check the scale we toss a few check weights on the scale and if it reads correctly there is no need to calibrate the scale.
Ron, excellent explanation of accuracy and precision! I would add to your statement above that there is a difference between calibrating a scale and checking accuracy of a scale. Calibration is a procedure of (possibly) adjusting the readout of a scale to represent a known mass being weighed; confirming accuracy is as you describe above.

With a digital scale, confirming accuracy is likely (and hopefully) done much more often than calibration. My RCBS balance beam scale needs calibrating (to zero) every time I use it since being off-zero will create an accuracy problem. However, I will admit there still could be range accuracy problems with that scale that simple zeroing will not address.

Lou
 
Everyone's observations make a lot of sense.

Accuracy is getting the correct weight reliably and repeatedly, which I think even the inexpensive scales are good at now. When I set up my cheapy $15 digital, I always get out the check weights and compare the digital against the Lyman beam. It comes up spot on every time at 6.0, 25.0, and 40.0 grains.

Problem is the level of precision. Precision is how finely the weight is measured. If a 54.5035 grain load reads 55 grains on a 1 grain scale, the scale is accurate, but the precision stinks for our usage. Most of the inexpensive digitals are calibrated with 0.1 grain accuracy but read out in 0.2 grain increments, while the higher-cost 0.05 grain accuracy scales read out in 0.1 grain increments.

With a small beam scale you can see needle movement any time the pan weight is altered in tiny fractions, even if the counterweights don't have precision finer than 0.1 grain. I suspect the strain gauges in even the cheapy digital also sense the smallest changes, but since the display increment is fixed, none of that information is relayed to me as the user. The headache is that the true weight could be anywhere between the incremental values, so a reading of 54.5 grains could in fact be 54.4501 or 54.5499 on a scale displaying 0.1 grain increments and you wouldn't know it.

Technically the same issue applies to the beam scale when making multiple separate measurements by moving the counterweights. This is why the recommendation with a beam scale is to set the counterweights once and trickle subsequent charges up until the needle reaches the same point, which yields more consistent weights. With the digital we chase that built-in margin of error, while with the beam it can be fixed.
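To put numbers on that built-in margin of error, the worst-case rounding error is simply half the display increment. A quick sketch, using the increments mentioned above:

```python
# Worst-case rounding error of a display is half its increment,
# no matter how sensitive the load cell behind it is.
for increment_gr in (0.2, 0.1, 0.05, 0.02):
    print(f"{increment_gr:.2f} gr increments -> reading off by up to "
          f"+/- {increment_gr / 2:.3f} gr")

# The 54.5035 gr example from the text, shown on a 0.1 gr display:
shown = round(54.5035 / 0.1) * 0.1
print(f"54.5035 gr displays as {shown:.1f} gr")  # 54.5
```

So a 0.2 gr display can hide up to 0.1 gr of true charge variation, which is the margin being chased.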

I suppose the question is how much precision is needed. For mid-range hunting rifle loads, I don't see a significant performance difference with 0.2 grains of error margin using the digital. For that matter, there really isn't much discernible variation loading by volume with a meter. I do, however, measure and trickle with the beam for H110 loads in 44 Mag, and I would absolutely do the same for any small-capacity pistol rounds.
 
Hate to be rude but the scale accuracy and repeatability have been discussed to death in other threads.

How about some actual data from the OP challenge specifications?

Anyone?
 
Even for Bullseye or 1000 yard Palma match reloading, I think 0.1 gr resolution of beam scale is good enough. But to claim higher 0.02 gr resolution digital scale will allow you to better conduct your load development, I would like to see proof.
I totally agree. And while I agree that 0.2 gn variation in powder weight can make a difference, I think there are dozens of other parameters whose variation can and do make much more of a difference.

Humans often take the easy way out on things. Measuring powder to 0.01 gn is far easier and lots more fun than measuring case volumes, primer ignition material mass (can you even do that?), bullet concentricity ... the list goes on.

Lou
 
I can't see what weighing a piece of paper accomplishes... I prefer check weights for accuracy verification, but here you go.
The first photo is one piece of 20 lb. copy paper.
The second is a ½ gn RCBS check weight (the lightest in the deluxe set).
The scale is a Gemini 20 that sells for twenty bucks.
Myth or not, as long as the scale is true to the RCBS check weights, I'm satisfied.

IMG_1260_zps86490adc.jpg

IMG_1261_zps06e5afbe.jpg
 
Now, can you repeat the readings 5-10 times to verify repeatability?

Already did that... I'm guessing you want me to post 10 more photos?
 
Ron, excellent explanation of accuracy and precision! I would add to your statement above that there is a difference between calibrating a scale and checking accuracy of a scale. Calibration is a procedure of (possibly) adjusting the readout of a scale to represent a known mass being weighed; confirming accuracy is as you describe above.

With a digital scale, confirming accuracy is likely (and hopefully) done much more often than calibration. My RCBS balance beam scale needs calibrating (to zero) every time I use it since being off-zero will create an accuracy problem. However, I will admit there still could be range accuracy problems with that scale that simple zeroing will not address.

Lou
Thank you. Calibration in a nutshell, in its simplest form, is comparing a known to an unknown. In the unabridged version we could beat the subject to death and then some.

In my opinion, and it is just my opinion, digital scales have their good and bad points, as does anything digital. As long as the scale meets or exceeds its intended application, there really isn't much to debate or debunk.

If you need a scale that will accurately measure and resolve .02 grain, then by all means buy one. When hand loading, the question is what accuracy and resolution are really needed. The consensus seems to be that a resolution and an accuracy of +/- 0.1 grain are adequate. So what do we really gain with a more accurate scale for our intended application?

Beam scales, or call them analog scales, also have resolution and accuracy specifications. Like any analog indicating instrument, they need to be read, so how accurately can we resolve the graduations of the scale? If I add .03 grain to an analog beam scale, I see a change. How well can I resolve that change, and then, how much does it matter?

Shooting 500 yards using 168 grain bullets, how much will 0.1 grain matter? Did we weigh every case? Did we pin gauge every primer flash hole? Did we weigh every bullet? Hell, we have an unending list of other variables. Then, if we test fire our loads over a chronograph, how accurate is the chronograph? Is the sky screen spacing perfect? We can see that for the truly OCD, this can lead to insanity.

We get to a point in hand loading where good enough is just that, it's good enough. Anything better than good enough will only begin to produce diminishing returns for the investment.

So if I toss a feather on a beam scale and the indicator moves, that is well and fine. If I can't accurately quantify what the feather actually weighs, I don't really know any more than before I tossed it on. If I toss enough feathers on a digital scale, sooner or later the scale will increment up one count. I still haven't a clue as to the actual weight of the feathers, though.

My opinion? Beware the hype surrounding the sale and marketing of any scale aimed at the hand loader, be it digital or analog.

Ron
 