Accuracy and Precision - yet again

Status
Not open for further replies.
"If you adjust the sights so that POA or a known position relative to POA is the center of the circle within which bullets fall, and you want the bullet to go where you aim the gun, then statistically the difference between where you wanted the bullet to go and where it actually goes will be (ignoring human error) the same as the precision of the shooting system."
This is just saying that once "accuracy" is eliminated as an error factor then the only remaining error factor is precision. That's obviously true no matter how one defines accuracy or precision as long as one agrees they're not the same thing and that they are the two factors that determine where a bullet hits on the target.

It's worth pointing out that the statement is clearly still talking about accuracy as a measure made from a group of shots (center of the circle within which bullets fall = center of the GROUP).

There is no center of a group if you want to talk about the error measured after a single shot is made. There is no "center of the circle within which bullets fall" because there's only one bullet to measure from.

And that's my point. If you want to talk about accuracy as a group measurement (error between the center of the group and the POA) then you can't also try to talk about it as a single shot error. It leads to ridiculous conclusions.
They don't intertwine and they only overlap if a human adjusted the sights to make them overlap.
Ok, let's take an example. Let's say a person takes a single shot and exactly hits the POA. Is it because the gun is very accurate and precise, and therefore the bullet hit exactly at the POA? Or is it because the gun is very inaccurate and imprecise, and although the imprecision sent the bullet far from the normal POI, the normal POI is also far from the POA in the opposite direction, so the two errors canceled?

In the case of a single shot, accuracy and precision cannot be separated. It takes a number of shots (groups) to be able to quantify and separate them.
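The group-based separation described above can be sketched in a few lines of Python; the shot coordinates here are invented purely for illustration:

```python
import math

# Hypothetical (x, y) impact points in inches, with the POA at the origin.
shots = [(1.2, 0.8), (0.9, 1.1), (1.4, 0.6), (1.0, 0.9), (1.1, 1.2)]

n = len(shots)
# Accuracy in the group sense: how far the group's center sits from the POA.
cx = sum(x for x, _ in shots) / n
cy = sum(y for _, y in shots) / n
accuracy_error = math.hypot(cx, cy)

# Precision: how the shots scatter about their own center, ignoring the POA.
precision = max(math.hypot(x - cx, y - cy) for x, y in shots)

print(f"group center offset from POA: {accuracy_error:.2f} in")
print(f"max scatter about group center: {precision:.2f} in")
```

With a single shot, `cx, cy` would just be that shot's coordinates and the scatter term would be zero, so the two numbers collapse into one, which is the point being made.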
 
There is a reasonable assumption that most people want their sights adjusted so that POI has a close, known relationship to POA. Because of that common desire shared by almost all shooters, precision becomes the primary source of accuracy once human error is excluded.
Ok, this was your first objection to my initial comment that it was fine to talk about accuracy in terms of group size.

Like your last quoted comment, it is basically saying that if you eliminate one of the error factors, the only remaining error factor is the remaining error factor. A true but somewhat unenlightening statement. The implication being that accuracy and precision are interchangeable in the context you call a "reasonable assumption". But then you state that they are not the same. I'll go along with the implication that in the normal context they are interchangeable, and point out that if that's true, then the statement that they aren't the same is moot.

Then you tried to support your contention that they were separate but not separate by introducing new definitions relating to measuring the error of single shots instead of continuing with the definitions relating to measuring groups and the location of the center of groups. I'll go along with the idea that if you reduce groups to single shots you can interchange precision and accuracy even if they are different concepts. But I will point out that if they really are different concepts and they can't be separated in the single shot case, then it's pretty pointless to try to define/explain/illuminate them in terms of the single shot case.
 
This is just saying that once "accuracy" is eliminated as an error factor then the only remaining error factor is precision. That's obviously true no matter how one defines accuracy or precision as long as one agrees they're not the same thing and that they are the two factors that determine where a bullet hits on the target.

I don't think you can really call these factors. E.g. precision is not a factor, it is a measure of one type of error: what broadly can be called noise in my world. Variations in ammo, in atmospheric conditions, and so on, that obscure the theoretical Ideal Signal (which in the case of a gun is every bullet traveling through the same hole). Noise is a factor.

That may be the source of our disagreement. I am talking about two measurements. If you treat them as factors everything goes a bit squirrely.

Accuracy is a measure of one thing.
Precision is a measure of something else.
Neither is a factor.

let's take an example. Let's say a person takes a single shot and exactly hits the POA. Is it because the gun is very accurate and precise and therefore the bullet hit exactly at the POA? Or is it because the gun is very inaccurate and imprecise and therefore although the imprecision resulted in the bullet hitting far from the POA the normal POI is also far from the POA in the opposite direction?

The problem is that statistical results based on a single sample are proverbially inaccurate. I made a joke about that before but it is true. With a single sample your margin of error is so high as to make it impossible to derive meaningful results...but that doesn't mean it isn't statistics. Increasing the sample size is not a fundamental change in what you are doing. You are arguing that you need more than one sample to derive meaningful results, and I agree.

Add a second sample. What has happened? Your margin of error has decreased. Your ability to draw conclusions increases. It is still poor, but better. Ten, twenty... It just keeps getting better but it doesn't fundamentally change.
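The shrinking margin of error can be illustrated with a quick simulation; the offset, noise level, and sample sizes below are arbitrary numbers chosen for illustration:

```python
import random
import statistics

random.seed(1)
TRUE_CENTER = 2.0   # hypothetical true POI offset from POA, inches
NOISE_SD = 1.0      # hypothetical shot-to-shot scatter, inches

def estimate_center(n):
    """Mean POI from n simulated shots scattered around the true center."""
    shots = [random.gauss(TRUE_CENTER, NOISE_SD) for _ in range(n)]
    return statistics.mean(shots)

for n in (2, 10, 100):
    se = NOISE_SD / n ** 0.5   # standard error shrinks like 1/sqrt(n)
    print(f"n={n:3d}: estimated center {estimate_center(n):+.2f}  (margin ~±{2*se:.2f})")
```

Two shots give a rough estimate, ten a better one, a hundred better still, exactly as described: nothing fundamental changes, the margin just narrows.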

Sorry, you are just wrong on your "new definition" hypothesis. I was not trying to introduce a new definition, and I don't accept that it is a new definition.
 
precision is group size. the two are identical (glad i got into this discussion).

everybody knows group size. everybody knows how to measure group size. everybody brags about group size.

nobody knows a thing about precision, if this thread is any indication. so, group size and accuracy still rule.

murf
 
I don't think you can really call these factors. E.g. precision is not a factor, it is a measure of one type of error: what broadly can be called noise in my world.
It doesn't change the meaning if I leave out the word "factor" or refer to "error measurements" instead of "error factors".

This is just saying that once "accuracy" is eliminated as an error then the only remaining error is precision. That's obviously true no matter how one defines accuracy or precision as long as one agrees they're not the same thing and that they are the two errors that determine where a bullet hits on the target.

This is just saying that once "accuracy" is eliminated as an error measurement then the only remaining error measurement is precision. That's obviously true no matter how one defines accuracy or precision as long as one agrees they're not the same thing and that they are the two error measurements that determine where a bullet hits on the target.
With a single sample your margin of error is so high as to make it impossible to derive meaningful results...
I agree. That's part of why I object to your trying to use a single shot example as a way to explain what's going on. Trying to explain a concept using results that are not meaningful is not very meaningful.
I was not trying to introduce a new definition...
My comment about what you "tried to do" was not intended to ascribe motive. I was merely using that turn of phrase to retrace the steps of the discussion.

I'm not trying to say what you intended to do, only pointing out what you actually did and how that approach is flawed.

My initial point was (and still is) that we all (even those who are opposed to doing so on this thread) talk about accuracy and group size interchangeably. I gave some examples and I'm sure I could find many more with little effort.

If we make "reasonable assumptions" (to use your terms) as opposed to unreasonable ones, and if we restrict ourselves to normal context (as opposed to the context of a discussion like this one) then we can (and we all have) used the terms accuracy and group size interchangeably. We've done so without any misunderstanding, without being misleading and without even being inaccurate.

Anyone who thinks differently should check in on this thread and set this poor misguided soul straight.

http://www.thehighroad.org/showthread.php?t=761708

He thinks he's having an accuracy problem. Maybe someone here should tell him if that's really the case all he needs to do is adjust his sights. :D
 
So, according to the consensus, what makes a gun accurate has nothing to do with the gun or ammunition, it is solely a matter of whether the sights are/can be adjusted properly or not...


I think you're referring to something else, and your concern expressed in the rest of your post is valid. When I entered this thread, I was originally thinking of "shooting precision/accuracy", which includes the shooter, the gun, the ammunition, the target, and other factors. But someone can still consider the "precision/accuracy" of each of those factors separately.

I've read lots of discussions here about what might make a revolver or pistol shoot better. Some I understand, some I don't. Just the same, having read something Mr. Borland posted, I now know that the accuracy of the gun very much does matter for most of us, not just the "experts".

If the gun has a certain amount of "lack of precision", which can be added to the "lack of precision" if the ammunition isn't tightly controlled, all this gets added to the shooter's lack of precision/accuracy.

Example - if the barrel in a pistol doesn't lock up in the same place for every shot, that will lead to lack of precision in its ability to shoot. I think that's an example of what's meant by handgun "accuracy". It's a lack of precision, which leads to a lack of accuracy. I think.
 
It doesn't change the meaning if I leave out the word "factor" or refer to "error measurements" instead of "error factors".

Umm...well...to me a factor is explicitly something that causes a change. It absolutely changes the meaning.

This is just saying that once "accuracy" is eliminated as an error then the only remaining error is precision.

But that is nonsensical to me.

I'll try different words. I am not changing any definitions as far as I am concerned, just trying to rephrase in a way that may find common ground.

Precision is a measure of the accuracy of a measurement. It is a way of saying "when I say X, I mean X +/- Y". So in the context of shooting it is many things, one of which is a way of saying "when I say the gun is positioned to send a bullet through the center of the target, I mean it will send a bullet through a zone defined as a circle with a radius of P where P is the precision."

Here's the trick: it is a statistically derived measurement. Which means before you pull the trigger for the first time you have a precision of zero. You don't know if the bullet will even leave the barrel. When you fire that first shot and your bullet rips a hole in the paper, you now have infinitely more precision, but it is still only enough precision to say that the bullet will hit somewhere in the half of the universe that is down-range of the muzzle. Each additional sample increases the precision of the measurement...until you reach a point where noise physically prevents additional precision. You may reach that point after 10 shots or 500, but you will reach that point of diminishing returns.

Accuracy of a firearm is simply where you wanted to hit vs. where you hit. It doesn't actually take statistics, and can be measured in a single shot... AS LONG AS the accuracy being measured is in units larger than the precision. If your precision is that a shot at X,Y coordinates means anywhere in a 1" circle around that point, and your accuracy units are 4" squares, you can do fine with single shots. If, on the other hand, you are trying to control where you are hitting to within 1/2 precision, you will need to shoot groups to calculate the logical center of the POI so as to adjust the sights.
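That units argument can be sketched as a simple check. The precision radius and zone sizes below are invented for illustration, and `single_shot_resolves` is a hypothetical helper, not anything defined earlier in the thread:

```python
# Assumed: every shot lands within 1" of the true POI (the precision radius).
PRECISION_RADIUS = 1.0

def single_shot_resolves(offset_from_poa, zone_size):
    """True if one shot landing `offset_from_poa` inches out can still
    confirm accuracy at the scale of `zone_size`-inch squares."""
    # The zone must be coarser than the scatter, and the hit inside the zone.
    return zone_size > 2 * PRECISION_RADIUS and offset_from_poa <= zone_size / 2

# 4" squares: a hit 1.5" out still lands in the intended square.
print(single_shot_resolves(1.5, 4.0))   # True
# Resolving to 1/2" (finer than the precision radius) needs a group.
print(single_shot_resolves(0.3, 0.5))   # False
```

The second case is where shooting groups to find the logical center of the POI becomes necessary.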


The irreducible noise portion of precision when it comes to guns is human scale. I own a .22 that spreads bullets over a 10" circle at 25 yd. I can see with my unaided eyes that the bullets are landing inches apart even though I am aiming at a single point, while with another rifle I can punch a single ragged hole. The accuracy of that gun is awful, with bullets regularly landing 5" from where I was aiming. As a result, the precision is awful. A shot fired dead center in the target can hit anywhere in a zone defined as a circle with a radius of 5" centered over the target. However, precision is not the same concept as accuracy. They are different abstract ideas with totally different meanings.

ETA: I suppose that does mean they are related. Not in the sense of being the same concept, but firearm accuracy limits the precision of any firearm accuracy measurement. Whether the accuracy problem is that the sights are loose or the shooter is hepped up on goofballs, the accuracy of the gun affects the precision with which you can specify where it will hit.
 
I am not changing any definitions as far as I am concerned

Oh, but you are, as accurate and precise are synonymous.
Because "you" want to differentiate you are making up definitions.
Next thing you know you'll be trying to say close only refers to groups and near refers to a single shot.
 
words

There are three pistol shooting events in the modern Olympics. One is air pistol.....a precision event. Another is rapid fire - requiring great timing and accuracy but not as precise. The third is 50 Meter Pistol....a precision event.
The 50 Meter event comprises 60 shots taken over a maximum time of two hours. The 10 ring on the target is 50mm in diameter. The world record is 583/600. Iron sights only. One hand unsupported.
Pete
 
Oh but you are as accurate and precise are synonymous.
Because "you" want to differentiate you are making up definitions.
Next thing you know you'll be trying to say close only refers to groups and near refers to a single shot.
Lol, nope. Look up the terms and you will find they are different.

E.g. " Accuracy*and*precision*are defined in terms of systematic and random errors. The more common definition associates accuracy with systematic errors and precision with random errors. Another definition, advanced by*ISO, associates*trueness*with systematic errors and precision with random errors, and defines accuracy as the combination of both trueness and precision.”

http://en.m.wikipedia.org/wiki/Accuracy_and_precision

The trick is relating those concepts to firearms in a meaningful way. I have been saying that accuracy refers to the difference between where you wanted to hit and actually hit because that is a plain language way of describing trueness or systematic error (depending on how you look at things) in the context of shooting. In other words, error that is introduced due to something like the sights being misaligned. I have been saying precision is a measure of the noise (bullet weight variations, wind gusts, etc) because those are random errors. The exact words as they relate to firearms have been mine, but the concepts I am bringing up are very standard and you can easily research them for yourself.
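The mapping above (systematic error as sight misalignment, random error as noise) can be demonstrated with a small simulation; the bias and noise figures are invented for illustration:

```python
import random
import statistics

random.seed(7)
SIGHT_BIAS = 2.0   # systematic error: a fixed sight misalignment, inches
NOISE_SD = 0.5     # random error: ammo/wind/hold variation, inches

# Horizontal POI of 50 simulated shots, measured relative to the POA.
shots = [SIGHT_BIAS + random.gauss(0.0, NOISE_SD) for _ in range(50)]

est_bias = statistics.mean(shots)    # recovers the systematic part (trueness)
est_noise = statistics.stdev(shots)  # recovers the random part (precision)

print(f"estimated systematic error: {est_bias:.2f} (true {SIGHT_BIAS})")
print(f"estimated random error:     {est_noise:.2f} (true {NOISE_SD})")
```

The group's mean offset estimates the correctable (sight) error, while the scatter estimates the noise that no sight adjustment can remove.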

You will find a consensus that there is a useful distinction between accuracy and precision. Denying that is...well, no skin off my nose of course, but kinda strange.
 
You will find a consensus that there is a useful distinction between accuracy and precision. Denying that is...well, no skin off my nose of course, but kinda strange.
There you go again with incorrect usage of words.
consensus: an idea or opinion that is shared by all the people in a group
Several people here agree with me, therefore there is no consensus.

Being precisely inaccurate or wrong may be of importance to you; it's not to me.
 
Interesting thread. My take on accuracy and precision, and how I originally learned the terms is what I posted in post #15 and ran with ilbill's post #7 illustrating the targets. My definitions were as the terms applied to Metrology but Metrology circa 1968. This thread has been enlightening to say the least and the fact that ISO ( International Organization for Standardization) has modified and changed the definitions over the years is also interesting.

Terminology of ISO 5725:
A shift in the meaning of these terms appeared with the publication of the ISO 5725 series of standards, which is also reflected in the 2008 issue of the "BIPM International Vocabulary of Metrology" (VIM), items 2.13 and 2.14. [1]

According to ISO 5725-1,[3] the terms trueness and precision are both used to describe the accuracy of a measurement. Trueness refers to the closeness of the mean of the measurement results to the actual (true) value and precision refers to the closeness of agreement within individual results. Therefore, according to the ISO standard, the term "accuracy" refers to both trueness and precision.

ISO 5725-1 and VIM also avoid the use of the term "bias", previously specified in BS 5497-1,[4] because it has different connotations outside the fields of science and engineering, as in medicine and law.

So my definition of accuracy being unbiased precision isn't really correct anymore. From everything I have read it also appears the two terms accuracy and precision can take on different meanings depending on how and to what they are applied.

Ed Ames:
The trick is relating those concepts to firearms in a meaningful way. I have been saying that accuracy refers to the difference between where you wanted to hit and actually hit because that is a plain language way of describing trueness or systematic error (depending on how you look at things) in the context of shooting.

I can buy that since we are looking at how the terms apply to shooting, or, more on target, getting the bullets to go where we would like them to go.

How would the terms accuracy and precision stack up pertaining to my loading scale? Let's say I use a 5gr check weight and consider that to be the legal truth. My standard of sorts. Each time I place that weight on my scale my scale reads 5.2gr. I can do this over and over multiple times and the scale always reads 5.2gr. To my thinking the scale repeats every time so I have a high level of precision. However, that 0.2gr high reading represents a 4% error. My scale accuracy is stated as +/- 0.1gr. So while the accuracy is poor in this case, the precision is very good if I view precision as a high measure of repeatability. I go ahead and calibrate my scale and now each time my 5.0gr weight is placed on my scale I read a perfect 5.0gr. I now have the same precision (repeatability) but I also have accuracy. Would that be true?

Ron
 
There you go again with incorrect usage of words.
consensus: an idea or opinion that is shared by all the people in a group
Several people here agree with me, therefore there is no consensus.

Being precisely inaccurate or wrong may be of importance to you; it's not to me.

Lol...yeah, yeah, you are right, there is no consensus here in this thread. This thread is cool because it has a combination of people who come from a STEM background and people who obviously don't, so it immerses us in the chaos of completely divergent thought modes. That's neat, and something we should all be happy about.

However.

You are also wrong about consensus. It doesn't require complete agreement. Straight from a dictionary: "the judgment arrived at by most of those concerned".

In the real world there is consensus that accuracy and precision are meaningfully distinct.
 
How would the terms accuracy and precision stack up pertaining to my loading scale? Let's say I use a 5gr check weight and consider that to be the legal truth. My standard orf sorts. Each time I place that weight on my scale my scale reads 5.2gr. I can do this over and over multiple times and the scale always reads 5.2gr. To my thinking the scale repeats every time so I have a high level of precision. However, that 0.2gr high reading represents a 4% error. My scale accuracy is stated as +/- 0.1 gr. So while the accuracy is poor in this case the precision is very good if I view precision as a high measure of repeatability. I go ahead and calibrate my scale and now each time my 5.0gr weight is placed on my scale I read a perfect 5.0gr. I now have the same precision (repeatability) but I also have accuracy. Would that be true?

Ron

Yes.
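That "yes" can be checked numerically. A short sketch mirroring the scale readings described above (5.2gr repeated before calibration, 5.0gr after), with bias playing the role of accuracy error and scatter the role of precision:

```python
import statistics

CHECK_WEIGHT = 5.0                       # grains, treated as the "legal truth"
readings = [5.2, 5.2, 5.2, 5.2, 5.2]     # before calibration: repeatable but high

bias = statistics.mean(readings) - CHECK_WEIGHT   # accuracy error (trueness)
spread = statistics.pstdev(readings)              # precision (repeatability)
print(f"before calibration: bias {bias:+.1f} gr, spread {spread:.2f} gr")

calibrated = [5.0, 5.0, 5.0, 5.0, 5.0]   # after calibration: same spread, no bias
print(f"after calibration:  bias {statistics.mean(calibrated) - CHECK_WEIGHT:+.1f} gr, "
      f"spread {statistics.pstdev(calibrated):.2f} gr")
```

Before calibration the spread is zero but the bias is +0.2gr (precise, not accurate); after calibration both are zero (precise and accurate).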
 
In the real world there is consensus that accuracy and precision are meaningfully distinct.
Even if I accept your distinction, that may or may not be a consensus depending on which dictionary you use (my definition was straight from Merriam-Webster). I would still contend that the distinction you are trying to make is far less than meaningful.
Precision is meaningless if it's inaccurate and accuracy doesn't really exist without precision.

So if you want to say you're precisely wrong instead of just saying you're wrong, either way you're still wrong ;)
 
Precision is meaningless if it's inaccurate and accuracy doesn't really exist without precision.

Precision and accuracy are distinct. You can have precision without accuracy and accuracy without precision.

But, in the real world, there is a level of accuracy that is deemed acceptable and a level of precision that is also deemed acceptable. So, it appears that they are interdependent.
 
The terms are defined for measurements in science and engineering:

Accuracy = closeness to the target

Precision = consistency in measurement
 
I would still contend that the distinction you are trying to make is far less than meaningful.
Precision is meaningless if it's inaccurate and accuracy doesn't really exist without precision.

So if you want to say you're precisely wrong instead of just saying you're wrong, either way you're still wrong ;)

Imagine a friend of yours comes up to you and says, "Hey Racer! I just got a gun and it sprays bullets all over the target! Help me!"

Assuming you want to help, what do you do? A reasonable starting point is to try shooting the gun yourself. Why? Because you are trying to determine how much of the error is induced by the shooter and sight misalignment (systematic) vs induced by various uncontrolled factors like gun, ammo, etc. (random).

In more precise language that means you want to see if the issue is one of accuracy or precision.

To help your friend you will use the concepts even if you reject the language.
 
reloadron,

the .2 grains is tare weight (kentucky windage in the shooting world) before you adjust (sight alignment in the shooting world) the scale.

ed ames,

so, per iso 5725: accuracy = group size (precision) + sight alignment (trueness)

just trying to interpret, codify and simplify all the info in this thread to come up with a usable way to evaluate shooting performance. are we gettin close?

mikemeyers,

good thing i didn't try and derail your other thread. would have been a mess.

murf
 
ed ames,

so, per iso 5725: accuracy = group size (precision) + sight alignment (trueness)

just trying to interpret, codify and simplify all the info in this thread to come up with a usable way to evaluate shooting performance. are we gettin close?

murf

Close, but I don't think that's the dividing line. But it's one of those commutative things. Meaning there is room for argument about where each part goes without changing the result.

Personally I think trueness includes more than sight zero, specifically where the user is choosing to pull the trigger knowing the sights aren't pointed at the same point they were pointed at for the previous shot. That increases group size but it isn't properly precision IMO. However, shifting that error to precision gives the same end result.
 
The original question was:

....You are asked to analyze the accuracy and precision of a target that has already been shot at. It has the concentric rings for scoring, and a dozen or so bullet holes in it. You don't know anything about the conditions under which it was shot, what kind of gun was used, slow-fire or rapid-fire, or anything else - all you have to go on is the target.

Question:
  • Can you say anything about the "accuracy" and the "precision" with nothing more to go by, than the target? If so, how would you figure out how the positions of the holes might indicate better/worse accuracy and precision?
  • If the above information is not enough to allow you to comment on the accuracy and precision, what else might you need to know?
......


Would anyone here argue with the following, (which is based on what most people here have posted):

"The precision of that particular target (based on "a dozen or so shots") is equal to the maximum distance between any two shots.

"The accuracy of that particular target, based on "a dozen or so" shots" is equal to the sum of the [distance between the bullseye and the mathematical center of those dozen or so" shots], plus [the precision as defined above]."



This would generate a real number, in inches or centimeters, for any number of shots into any target, and anyone who did the calculation would come up with the same answer.

This definition (if you guys accept it) says nothing about how good or bad the numbers might be. To know that, you'd need to know the distance to the target, and any other relevant information. For example, you might give those two numbers, and also something like "hand held, at a distance of 25 yards, slow-fire".
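The proposed calculation can be written out directly from hole coordinates; the coordinates below are invented for illustration, with the bullseye at the origin:

```python
import math
from itertools import combinations

# Hypothetical bullet-hole coordinates in inches, bullseye at the origin.
holes = [(2.0, 1.0), (2.5, 0.5), (1.5, 1.5), (2.2, 0.8), (1.8, 1.2)]

# Precision per the proposed definition: max distance between any two shots.
precision = max(math.dist(a, b) for a, b in combinations(holes, 2))

# Accuracy per the proposed definition: bullseye-to-group-center distance
# plus the precision figure above.
cx = sum(x for x, _ in holes) / len(holes)
cy = sum(y for _, y in holes) / len(holes)
accuracy = math.hypot(cx, cy) + precision

print(f"precision: {precision:.2f} in, accuracy: {accuracy:.2f} in")
```

As stated, anyone running the same calculation on the same target gets the same two numbers, though whether those numbers are "good" still depends on distance and conditions.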
 
Mike,

This isn't a mathematical answer, but:

If you look at a typical commercial target, it has spaces to make specific notes, e.g. wind, temperature, distance, etc. The info called out depends on what the target is for, but in general that's what the target designer thought you would need to judge a target shot in the way the designer intended.

I have seen targets that only had room for distance, gun, and ammo. Others had spaces for humidity, primer, powder charge, and so on. It depends on the type of shooting you are doing.
 
ed ames,

that is why i used "alignment", instead of "adjustment", in my equation. i wanted to include all shooter-related adjustments (e.g., wind, mirage, angle of attack, elevation, etc.). i see the precision part of the equation as containing the inherent variations related to the gun, ammo and shooter (so we might consider them to be constant).

murf
 
mikemeyers,

precision is easy to quantify since it is group size. i agree with your assessment as that is how group size has been measured for as long as i can remember (at least 45 years ago).

i think your assessment of accuracy is a bit complicated. i think accuracy is easy to quantify. it is simply the score of your target.

murf
 