At the risk of offending everyone, a standard deviation (SD) of 25 is nothing to be worried about unless you're doing long-range shooting with a 1/2 MOA or better rifle.
It's not at all unusual for commercial ammunition to have an SD around 35. I've tested "high energy" 30-06 ammunition that does indeed deliver another 100 FPS but also has an SD around 60. In their calculations, SAAMI assumes that the standard deviation of MV is 4% of MV. So their "norm" is that a 3000 FPS cartridge has a standard deviation of 120 FPS.
My 223 bolt action will do 5/8" five-shot groups at 100 yards all day long with ammo that has a 25 FPS standard deviation.
You can indeed get your SD down into single digits without a huge effort. Sort your bullets by weight, routinely anneal necks after about 4-6 reloadings, etc. But this is not a productive pursuit.
Variation does not add linearly. Independent sources of variation combine as the square root of the sum of their squares, so when many sources are at work, as there are when you shoot, the largest one almost entirely determines the total. The full math probably isn't something we want to do on this forum, but trust me on this. I do this stuff for a living. If your MV variation is the largest source of error in your marksmanship, you have my admiration. If it's not, there is no point in worrying about it. The one and only way to improve rifle accuracy is to find the single largest source of variation (usually you!) and work on that. Working on the second or third most important source will generally produce almost no improvement in the overall picture.
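To make the point concrete, here's a quick sketch of how independent error sources combine in quadrature (root sum of squares). The numbers are purely illustrative, not measured data; the labels are hypothetical stand-ins for a real error budget:

```python
import math

# Hypothetical error budget for group size at 100 yards, expressed as
# standard deviations in inches. These figures are made up to show the
# arithmetic, not taken from any actual test.
sources = {
    "shooter wobble": 0.50,
    "muzzle velocity variation": 0.10,
    "bullet weight variation": 0.05,
}

def combined_sd(values):
    # Independent sources add in quadrature, not linearly.
    return math.sqrt(sum(v ** 2 for v in values))

print(f"baseline: {combined_sd(sources.values()):.3f}")  # ~0.512

# Halving the smallest source barely moves the total...
sources["bullet weight variation"] /= 2
print(f"smallest halved: {combined_sd(sources.values()):.3f}")  # ~0.511

# ...while halving the largest source cuts it almost in half.
sources["shooter wobble"] /= 2
print(f"largest halved: {combined_sd(sources.values()):.3f}")
```

Note that eliminating the 0.05-inch source entirely would improve the total by well under one percent, which is why chasing a second- or third-place source of variation buys you almost nothing.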
A few years ago I did an article comparing various tools for measuring powder. To my utter astonishment, with ball powder the Lee Perfect Powder Measure has less repeatability error than a good balance scale, which in turn has less repeatability error than a digital scale. Individually hand-weighing your charges will not get you more consistency than that little $25 powder measure. You could have knocked me over with a feather when that result came up.