mcb
Member
+/- 0.5% accuracy is, IMHO, pretty good for a lot of the sensors in use out there, and as an engineer I have used a fair number of instruments for measuring a variety of things. For a chrono that cost a few hundred dollars at most, +/- 0.5% seems pretty good in my estimation.
If you were chronographing a load and your average velocity for a small sample measured 3000 fps, and you worked up a ballistic table based on that measurement, but the actual average velocity was 3015 fps (a 0.5% error, assuming all of the error was in the positive direction), that would move your point of impact up by roughly 3-4 inches at 1000 yards for a decent bullet (say ~0.5 BC (G1), give or take a little).
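To put a rough number on that sensitivity, here is a minimal sketch (my own toy model, not a real G1 solver): a flat-fire 2D point-mass integration where drag is approximated as dv/dx = -k*v, with k ≈ 1.1e-4/BC per foot as my crude fit to supersonic G1-class behavior. All the constants are illustrative assumptions, but it lands in the same ballpark as the 3-4 inch figure above:

```python
# Crude flat-fire point-mass sketch of how a 0.5% muzzle velocity error
# shifts drop at 1000 yards. NOT a real G1 solver: drag is modeled as
# dv/dx = -K*v, with K very roughly fitted to supersonic G1 behavior.
import math

G = 32.174          # gravity, ft/s^2
BC = 0.5            # assumed G1 ballistic coefficient, per the post
K = 1.1e-4 / BC     # per-foot velocity decay constant (rough assumption)
RANGE_FT = 3000.0   # 1000 yards
DT = 0.0005         # integration step, seconds

def drop_inches(muzzle_velocity_fps: float) -> float:
    """Fire level and integrate until RANGE_FT; return drop in inches."""
    x, y = 0.0, 0.0
    vx, vy = muzzle_velocity_fps, 0.0
    while x < RANGE_FT:
        speed = math.hypot(vx, vy)
        # dv/dx = -K*v  implies  dv/dt = -K*v^2 along the velocity vector
        ax = -K * speed * vx
        ay = -G - K * speed * vy
        vx += ax * DT
        vy += ay * DT
        x += vx * DT
        y += vy * DT
    return -y * 12.0

low = drop_inches(3000.0)   # drop predicted from the chrono reading
high = drop_inches(3015.0)  # drop at the true (0.5% higher) velocity
print(f"drop @ 3000 fps: {low:.0f} in")
print(f"drop @ 3015 fps: {high:.0f} in")
print(f"shift from a 15 fps (0.5%) error: {low - high:.1f} in")
```

A real solver varies the drag coefficient with Mach number, so treat the exact outputs loosely; the point is just to show how a 15 fps input error propagates into a few inches of vertical at 1000 yards.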
You may or may not agree that +/- 0.5% error is good enough, but the reality is that a lot of long-range shooters use these instruments, along with some other tricks, and manage to create ballistic tables that get them hits well beyond 1000 yards.