Well, that's the kicker, isn't it? None of them are that accurate, not even Doppler. And - even if they were - how would you be certain?
Optical ones like the ProDigital are pretty reliable for what they cost (they've gotten very cheap lately!). They claim 1% accuracy (+/- 30 fps @ 3000 fps), but when used in pairs with good conditions and a careful setup, I've seen first-hand that they're capable of better than 0.5%.
You have to make absolutely sure the muzzle blast doesn't screw up the readings; most people put chronographs WAY too close to the discharging weapon, then scratch their heads trying to figure out why their numbers are all over the place. I usually put mine out at 20' and 25' for rifles. Fresh batteries are a must. Line the shooting plane up with the chronograph as squarely as possible; bubble levels fixed to the body and a good, solid, adjustable tripod help there. Clouds and snow? Forget about getting reliable results and leave the chronographs at home; just go shoot.
Given proper conditions and setup, I'll usually see around a +/- 10 fps difference reported between my pair. Your mileage may vary. I feed all the data out via USB. Averaging the readings from the two chronographs for each shot, before computing the standard deviation, helps iron out equipment irregularities.
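To illustrate the pairing trick, here's a minimal sketch in Python. The velocity numbers are made up for the example; the point is just that each shot's two readings get averaged before the SD is taken.

```python
import statistics

# Hypothetical per-shot readings (fps) from two chronographs in tandem.
chrono_a = [2815, 2822, 2808, 2819, 2811]
chrono_b = [2825, 2812, 2818, 2809, 2821]

# Average the pair for each shot first, to smooth per-unit measurement error.
per_shot_avg = [(a + b) / 2 for a, b in zip(chrono_a, chrono_b)]

mean_v = statistics.mean(per_shot_avg)
sd_v = statistics.stdev(per_shot_avg)  # sample standard deviation

print(f"mean: {mean_v:.1f} fps, SD: {sd_v:.1f} fps")
```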
I should, perhaps, have been more careful with my phrasing. Rather than naming a specific fps, or some arbitrary number that will vary from caliber to caliber, what you're really after is a very low standard deviation about the mean. Measurement errors and fliers get ironed out in the statistical analysis; you want a tall, narrow "curve." If your shots instead fall across a short, broad curve, the SD is higher, velocity is less consistent, and so is the exterior trajectory.
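The tall-curve vs. broad-curve idea can be shown with two made-up 10-shot strings that happen to share the same mean velocity but differ in spread:

```python
import statistics

# Two hypothetical 10-shot strings with the SAME mean velocity.
# "tight" clusters closely (tall, narrow curve); "broad" does not.
tight = [2810, 2812, 2809, 2811, 2810, 2813, 2808, 2811, 2810, 2812]
broad = [2790, 2835, 2802, 2826, 2810, 2795, 2830, 2808, 2818, 2792]

sd_tight = statistics.stdev(tight)
sd_broad = statistics.stdev(broad)

print(f"same mean ({statistics.mean(tight)} fps), "
      f"SDs: {sd_tight:.1f} vs {sd_broad:.1f} fps")
```

Identical means, very different SDs: only the second number tells you which load will hold its trajectory downrange.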
Any statistical analysis is meaningless without comparison sets to build trends from, which is why conditions are so important when working up a load - not just for the weapon, but also for the measuring equipment. I can't get reliable results about half the year in Illinois: there's either snow on the ground, it's cloudy, or the sun goes down before I get off work and can get to the range.
Anyway, absolute min and max don't matter nearly as much as how the data populates the curve. Comparing loads side by side in this fashion gives you good grounds for an opinion on whether one load is "more consistent" than another. It exposes inconsistencies that would NOT be visible from simply measuring maximum group size.
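Here's why min and max alone can fool you - a sketch with two invented velocity strings that share the exact same extreme spread (max minus min) while one is clearly more consistent:

```python
import statistics

# Two hypothetical loads: both span max - min = 30 fps,
# but load_b's shots cluster far more tightly.
load_a = [2800, 2808, 2815, 2822, 2830, 2804, 2826, 2812, 2818, 2810]
load_b = [2800, 2814, 2815, 2816, 2830, 2815, 2814, 2816, 2815, 2815]

for name, load in (("load A", load_a), ("load B", load_b)):
    es = max(load) - min(load)          # extreme spread
    sd = statistics.stdev(load)         # sample standard deviation
    print(f"{name}: ES = {es} fps, SD = {sd:.1f} fps")
```

Judged by extreme spread they look identical; the SD shows load B is mostly one tight cluster with a couple of fliers.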
In short, I don't care whether I can shoot a group of a given size at a given distance once; no matter HOW good or bad a load or rifle is, there's always a chance all the planets will align once in a lifetime for a perfect group. Rather, I want to know (with some certainty) how often I can put down a particular size group at a given range. That's why statistics matter - they answer the question we're actually after: how likely am I to put all of my shots in a given area, in these conditions, with this rifle, with this load, at this distance?
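That "how often" question can be estimated with a quick Monte Carlo sketch. Everything here is an assumption for illustration: shots are modeled as a circular normal distribution, and the sigma and group-size limit are made-up numbers, not data from any real rifle.

```python
import math
import random

random.seed(42)

SIGMA = 0.35        # assumed per-shot dispersion at the target, inches
GROUP_LIMIT = 1.0   # group size we care about, inches
SHOTS = 5
TRIALS = 20_000

def group_size(points):
    """Extreme spread: largest center-to-center distance in the group."""
    return max(math.dist(p, q)
               for i, p in enumerate(points)
               for q in points[i + 1:])

# Simulate many 5-shot groups and count how many stay under the limit.
hits = 0
for _ in range(TRIALS):
    group = [(random.gauss(0, SIGMA), random.gauss(0, SIGMA))
             for _ in range(SHOTS)]
    if group_size(group) <= GROUP_LIMIT:
        hits += 1

print(f"~{hits / TRIALS:.0%} of 5-shot groups under {GROUP_LIMIT} inches")
```

One lucky group tells you nothing; the fraction over many trials is the number that actually answers the question.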