I continually see comments on this forum treating sectional density as some kind of predictor of terminal bullet performance - such as a 140 gr 6.5mm bullet providing better penetration than a 140 gr .277 bullet, and therefore being superior on big game. Since SD is nothing more than bullet weight (in pounds) divided by the square of the diameter (in inches), I can grasp how a non-deforming solid with a slightly higher SD would out-penetrate a bullet with a lesser SD...in theory.

But the vast majority of hunting-appropriate bullets expand in a manner determined far more by what they hit than by what they looked like getting there. So how can small differences in SD even be quantified in terms of actual penetration? Seems to me the unpredictable differences in expanded diameter on contact with hide/meat/bone, especially once that diameter is squared, instantly erase any theoretical advantage the higher-SD bullet might have had.

Sure, I can see how a doubling of SD would predict better penetration, all other things being equal, but a few ticks three places to the right of the decimal point? Has anyone demonstrated - in a repeatable manner - that slight increases in SD, with otherwise equal expanding bullets, equate to more penetration IN something? At a given velocity and bullet construction, I'd guess bullet weight matters more to penetration than SD.
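For anyone who wants to see the arithmetic, here's a minimal sketch in Python. The SD formula is the standard one (weight in pounds over diameter in inches, squared); the .50" and .55" expanded diameters are hypothetical values I picked just to illustrate the squaring effect, not measured recoveries.

```python
GRAINS_PER_POUND = 7000

def sectional_density(weight_gr: float, diameter_in: float) -> float:
    """SD = bullet weight (lb) / bullet diameter (in) squared."""
    return (weight_gr / GRAINS_PER_POUND) / diameter_in ** 2

# Unfired: the 6.5mm (.264") holds a modest SD edge over the .277
print(sectional_density(140, 0.264))  # ~0.287
print(sectional_density(140, 0.277))  # ~0.261

# After expansion, the "effective SD" rides on whatever frontal
# diameter the bullet happens to reach in tissue (hypothetical here):
print(sectional_density(140, 0.50))   # ~0.080  (expands to .50")
print(sectional_density(140, 0.55))   # ~0.066  (expands to .55")
```

Both expanded figures land at a fraction of either unfired SD, which is the whole point: whatever edge the .264 carried in flight, the first contact with hide and bone rewrites the number entirely.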