I have been looking for a definition of this term and can't seem to come across anything. I have a general idea, but I really want to know how I can predict accuracy at various ranges given the moment of angle. Thanks in advance, Rick
The term typically used to describe the inherent accuracy of a firearm is "minute of angle," usually abbreviated MOA. A minute, of course, is a measure of angle equal to 1/60th of a degree.
If you drew two lines that started at the same point, were 100 yards long, and were separated by an angle of one minute, the ends of the lines would be approximately one inch apart (the actual figure is about 1.047 inches).
So if a rifle can group all of its shots into an inch at 100 yards, it's said to have "one MOA" accuracy.
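Since the spread grows linearly with distance, you can predict group size at any range from a rifle's MOA rating with a little trigonometry. Here's a minimal sketch in Python (the function name `moa_to_inches` is my own, not a standard one):

```python
import math

def moa_to_inches(moa, range_yards):
    """Linear spread, in inches, subtended by `moa` minutes of angle
    at `range_yards` (1 yard = 36 inches; 1 minute = 1/60 degree)."""
    range_inches = range_yards * 36
    return range_inches * math.tan(math.radians(moa / 60))

# Expected group sizes for a 1-MOA rifle at various ranges:
for yards in (100, 200, 300, 500):
    print(f"{yards:>3} yd: {moa_to_inches(1, yards):.2f} in")
# 100 yd: 1.05 in, 200 yd: 2.09 in, 300 yd: 3.14 in, 500 yd: 5.24 in
```

Note that at 100 yards this gives the 1.047 inches mentioned above, which is why shooters round 1 MOA off to "one inch per hundred yards."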
An iron-sighted AR-15 is good for 2.5-3 MOA. A good deer rifle should do 1.5-2 MOA. Varmint and sniper rifles typically fall between .5 and 1 MOA, and benchrest rifles tend to hover around .25 MOA.
Ross Seyfried, whom some have called the best shot in the world, once shot one-inch groups with a revolver at 100 yards just to prove that it really could be done. However, the handguns that most of us deal with aren't that accurate--especially with iron sights. Two inches at 25 yards is pretty good, and that translates into 8 MOA.
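The conversion in the other direction works the same way. A quick sketch using the shooter's rule of thumb (1 MOA ≈ 1 inch per 100 yards; the exact trig value of 1.047 in/100 yd would make these figures about 4.5% smaller):

```python
def group_to_moa(group_inches, range_yards):
    """Approximate angular size of a group, using the shooter's
    rule of thumb: 1 MOA = 1 inch per 100 yards."""
    return group_inches / (range_yards / 100)

print(group_to_moa(2, 25))  # 8.0 -- a decent iron-sighted handgun
print(group_to_moa(1, 25))  # 4.0 -- an exceptional 1911
```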
If you ever find a 1911 that consistently groups into one inch at 25 yards (4 MOA), never let go of it.