Let's say my handy online reference tells me that a gunshot is 170 dB... but 170 dB at what distance? The standard equation drops 6 dB with every doubling of distance from the initial reference point. If, for example, the pistol were 170 dB measured at 1/1000 of an inch from the gun, doubling that tiny initial distance over and over would leave only a faint sound a few feet away. Were that 170 dB measured at 1000 ft, we could easily expect the gunshot to be heard clearly many, many miles away.
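To show the arithmetic behind this: the 6-dB-per-doubling rule is just the inverse-square law, which for sound pressure level works out to a drop of 20·log10(r/r_ref). A quick Python sketch (the function name is my own) makes the two scenarios above concrete:

```python
import math

def spl_at_distance(spl_ref, r_ref, r):
    """Free-field sound pressure level at distance r, given a level
    measured at reference distance r_ref. Inverse-square law:
    -6 dB per doubling of distance, i.e. -20*log10(r / r_ref)."""
    return spl_ref - 20 * math.log10(r / r_ref)

# Sanity check: doubling the distance costs ~6 dB
print(round(spl_at_distance(100, 1, 2), 1))          # → 94.0

# 170 dB referenced at 1/1000 inch: only ~79 dB left at 3 ft (36 in)
print(round(spl_at_distance(170, 0.001, 36), 1))     # → 78.9

# 170 dB referenced at 1000 ft: still ~155 dB a mile (5280 ft) away
print(round(spl_at_distance(170, 1000, 5280), 1))    # → 155.5
```

So the reference distance swings the answer by almost 80 dB, which is exactly why the question matters.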

Is there a standard assumption about how far away most decibel measurements are taken?