Attrition: A Footnote In History


July 18, 2009: So far, U.S. forces in Iraq have suffered 36,000 combat casualties. Some 12 percent of those were fatal, and another 20 percent left the soldier with some degree of permanent disability. Over 30,000 troops suffered injuries from disease and accidents. The combat casualty rate was about a third of what it was in Vietnam and World War II. Disease losses were down as well, but not by as much. That's because the Persian Gulf has always been a disease-ridden place, especially debilitating for anyone who didn't grow up in the area. This was discovered during World War I, when large numbers of European troops served in the region and sickness put many of them out of action. Meanwhile, eight years of warfare in Afghanistan have produced only about 30 percent as many U.S. and NATO casualties, mainly because there were fewer foreign troops there, and fewer enemy fighters to deal with.

While the Iraq war is winding down, the Afghanistan operations have a few more years left. The two wars will probably end up causing about 45,000 American combat casualties. For comparison, in the last century, World War I produced 320,000 American casualties, World War II 1.07 million, the Korean War 129,000, and Vietnam 212,000.

What makes the Iraq and Afghanistan wars different is the lower proportion of dead (12 percent) compared to World War II, Korea and Vietnam (all 29 percent). The casualty rate (the percentage of troops involved who became casualties) in Iraq was a third of the World War II, Korea and Vietnam rate. The rate was even lower in Afghanistan.

There is no single reason for the lower casualties and lower casualty rate, but two differences account for most of the gap. First, there is technology. Better body armor and armored vehicles, plus much improved medical treatment, prevented many casualties and ensured that many wounds did not turn into fatalities.

But the most important factor has been the quality of the troops and their leaders. Iraq and Afghanistan are the only ones of these wars in which the U.S. fielded a professional force. Conscription was used in all the others, meaning lower physical and mental standards for the troops, and a much higher percentage of inexperienced men. In contrast, the force that went into Afghanistan and Iraq had benefited from the end of conscription in 1973, and the subsequent growth of a much more capable professional force. Entrance standards for enlisted troops and officers were raised, and terms of service were longer. That the resulting force should be more effective was really no surprise. The history of warfare, going back thousands of years, provides numerous similar examples. Conscription will get you numbers, and maybe even some enthusiasm, but professionals always have a big edge. This is particularly true when the pros are not overwhelmed by much larger forces or outmatched by a foe with a technological edge. Professionals are more expensive, and often rely on more technology. The U.S. military of the late 20th century was also very quick to adopt new technology. Thus it went to war as the best trained, best led, and best equipped force. While this is not much discussed now, it will be in the future, at least among military historians.