by Peter Custers
The war was defended on the grounds that the Libyan people needed protection from their dictator via a “no-fly” zone, and the Western public was led to believe that the intervention aimed exclusively at defending the humanitarian interests of Libya’s population. Now that public concern over Libyan events has thinned, the need to camouflage war aims has decreased accordingly.
Now it’s time to highlight some of the long-term implications of the Western intervention. A sound but difficult test case is the West’s use of depleted uranium weapons. Though U.S. and British officials have so far denied employing them in the campaign to overthrow Qaddafi, speculation has been rife that ammunition used by the U.S. and NATO (North Atlantic Treaty Organization) contains “depleted” uranium. What to make of these stories?
First, the record on previous uses of so-called depleted uranium weapons is unequivocal. While the word “depleted,” or impoverished, appears to suggest that arms containing this type of uranium are not very dangerous, depleted uranium well exemplifies the intractable nature of nuclear production. The radioactivity spread by these weapons is not just long-lasting; it is perennial in a literal sense, said to persist for nearly as long as planet earth exists: some 4.5 billion years.
Yet for two reasons the U.S. and European states have historically opted to build weapons with everlasting radiating effects. Depleted uranium, consisting largely of uranium-238, is a very hard, dense metal, so it can be used to reinforce military vehicles and armaments. Moreover, arms containing “depleted” uranium can easily pierce the armor of any less powerfully equipped enemy.
But how damaging is the use of depleted uranium in war, really? It emerged as a by-product of the uranium enrichment process: massive quantities of depleted uranium originally had to be set aside as waste. Its new destination might therefore appear an appropriate answer to the problem of waste. Yet the deleterious impact of materials containing a relatively “low” dose of radioactivity, as uranium-238 does, has been documented for decades, well before they started being channeled into Western weaponry.
The best documented case has been Iraq, where depleted uranium weapons figured in U.S. tank shells and bombs fired in the 1991 Gulf War, and again in the occupation war begun in 2003. Two French journalistic accounts published in 2001 give detailed descriptions of the effects suffered by Iraq’s civilian population after the Gulf War.
The extensive field investigation carried out by the priest Jean-Marie Benjamin revealed a 350 percent increase in the rate of malformations in Iraqi babies at birth, such as brains formed outside the skull and eyes set unusually far apart. There have also been reports that the number of blood cancers, leukemias, in Iraqi children has not just increased but multiplied.
Academic reports, for example by the conservative American Rand Corp., have similarly spoken of indiscriminate risks to the lungs and digestive systems of civilians and combatants alike. Radioactive dust may be inhaled after explosions of depleted uranium shells, or people may be irradiated through contact with unexploded shells in war zones. The toxic effects of depleted uranium weapons, including their potential to cause genetic mutations, have been recorded too.
Third, not only has the danger posed by Western powers’ use of depleted uranium weapons been put on record by a variety of sources; that use has also been delegitimized, thanks notably to sustained campaigning by anti-war coalitions over the past decade. Western analysts studying U.S. and NATO war strategies long ago admitted that depleted uranium weapons, when spreading their radioactivity, do not differentiate between military and civilian targets.
Significantly, the General Assembly of the United Nations has three times adopted resolutions expressing concern over these weapons. In the third resolution, adopted toward the end of 2010, no fewer than 148 U.N. member states demanded that states employing depleted uranium weapons frankly “reveal their use” whenever asked to do so by affected countries. Perhaps unsurprisingly, four U.N. members voted against: the U.S., Britain, France and Israel. The three countries now waging war against Libya, plus Israel, stood opposed to an overwhelming majority of states expressing humanity’s growing anxiety.
Since the start of the war against Qaddafi, critics’ speculation on the likelihood that Western powers are using the discredited weaponry in Libya has focused primarily on two possible carriers: cruise missiles, in which depleted uranium may serve as warhead or armor-enhancing material, and the shells fired by A-10 military planes. Given the past record, its inclusion in the shells fired by the A-10 Thunderbolt is more than likely.
Although Western officials routinely deny that they have used depleted uranium in the war on Libya, they have not ruled out the possibility either. There is ample reason to suspect that the denials are a war tactic, just like the initial denial that Western powers sought to bring down Qaddafi’s government. The fear is justified that Libya’s civilian population will face long-lasting radiation effects from depleted uranium weapons used over their territory.
Dr. Peter Custers is author of a theoretical study on nuclear production, “Questioning Globalized Militarism” (Tulika/Merlin Press, 2007). He can be reached at email@example.com and through his website, petercusters.nl. This story first appeared on Inter Press Service. Copyright © 2011 IPS-Inter Press Service.