Failsafe Revisited…Psychology and Robotic Delivery of the Bomb
It’s not too soon to ask whether aircraft drones equipped with small nuclear weapons are in our military future. The answer is yes, but the psychology and limits of using such technology are far less clear. As the United States and Russia embark on a new era of nuclear arms control in their effort to replace the 1991 Strategic Arms Reduction Treaty, a new pact prohibiting unmanned, nuclear-armed drones seems a survival imperative.
Tactical drones armed with conventional weapons are currently used with precision against terrorist operations in Pakistan and other conflict points. They spare pilots the extreme danger of being shot down, can circle target areas for hours at a time, perform exacting reconnaissance, have a long record of success and can be remotely controlled thousands of miles from the battlefield. They are now clearly an established instrument of American foreign policy. Despite issues of “collateral damage,” such drones are highly effective.
Remote warriors simulating lethal drone technology…is delivery of nuclear weapons using such robotics next?
As nations assess future military capabilities, it is not surprising that the strategic use of drones, including drones armed with tactical nuclear weapons, is on mankind’s doorstep. But the tactical/strategic nuclear boundary in robotic air warfare is a threshold we dare not cross. Before it is too late, this technology should be arrested, contained and outlawed on a planetary scale.
Recent open discussion in the military press has centered on whether strategic bombers should be replaced by nuclear-armed drones. In the June 2009 issue of Armed Forces Journal, Air Force Research Institute Professor Adam Lowther pondered “whether it’s time to pursue a long-range, unmanned and nuclear armed bomber.” ArmedForcesJournal.com published a November 2009 article by Col. James Jinnette warning that the “defense establishment has become seduced by the idea of unmanned airpower,” some of which may be controlled by artificial intelligence. He points out that judgment and “creative capacity” may be pushed aside by such technology. With these voices, future militarization becomes the subject of a most serious debate, as the world enters uncharted intellectual killing territory.
According to P.W. Singer in his TED talk of February 2009, robotic war “changes the experience of the warrior, and even the identity of the warrior.” (See video.) The easier and faster it is to initiate a tactical nuclear attack without endangering crew lives, the more we hide behind robotics to satisfy our human instinct to kill. According to Singer, “Another way of putting this is that mankind’s 5,000 year old monopoly on the fighting of war is breaking down in our lifetime.” The more we rely on machines, computer programs and remote-control technology, the closer we approach the point of no return by (ironically) further dehumanizing war. Tactical military robotics with conventional weapons can save lives, but nuclear-equipped robotics can help end all life.
Much of 20th-century nuclear policy was based on the psychology of “mutual assured destruction.” Human emotions controlled the threats, and it is that mindset that has helped us reach 2010. Another reason we have survived is that humans have instincts and, at the personal level, the desire to survive. Those qualities helped avoid an accidental nuclear exchange in 1995, when Russian Rocket Forces mistook a scientific missile launch for an ICBM attack. It was the exercise of reason and intuition that spared America during the 13 days of the Cuban Missile Crisis. The more we encumber the exercise of human judgment (despite its frailties) by relying on highly complex but remote technology in nuclear delivery systems, the more inhumane, mechanical and likely nuclear war actually becomes. Machines lack consciousness, and if programmed improperly, they can be subverted into faulty logic.
Scrutinizing psychology and technology, consider five practical questions posed by nuclear-armed drone capabilities.
- If pre-positioned drones with tactical or strategic nuclear weapons are employed, there will be less time to recall them in the event of human miscalculation. True, once existing (and ready) intercontinental ballistic missiles are launched, there are precious few minutes to avert nuclear destruction. But the current time buffer to detect and kill an incoming threat is significantly reduced by drones already at the target area, waiting for the command to destroy, and missile defenses would be of no value given the extreme maneuverability of drone aircraft. If war is the result of human failings, we exponentially enhance mutual destruction by allowing robotic nuclear delivery systems that are far more flexible and timely than modern ICBMs.
- If nuclear-armed drones are deployed as instruments of national policy, we risk international isolation and condemnation from angered and threatened populations in harm’s way. (The Japanese have been outraged by the forward positioning of nuclear forces for decades.) Nuclear drones may actually increase the specter of war itself by provoking threatened international actors, nations and organizations alike, that have the ability to embrace, and use, identical technology.
- Since U.S. Predator drones have already been hacked during the Bosnian war, and reportedly by Iraqi and possibly Afghan insurgents using off-the-shelf $26.00 software, what is to prevent enemy high-tech warriors from taking control of future unmanned aerial vehicles (UAVs) and re-directing them? (See the December 17, 2009 CNN report.)
- Given the potential for literally thousands of these lethal UAVs pre-positioned across the globe, does it make sense to create new nuclear delivery vehicles that could replace or supplement existing missile technology? The Obama Administration publicly seeks the reduction and eventual elimination of ICBMs, but if all we are doing is substituting one class of vehicle for another, arms control efforts would be merely a shell game. Furthermore, if stealth technology is employed in shielding UAVs, national technical means of verification (a key issue holding up a new treaty between the United States and Russia) would be next to impossible.
- Can failsafe controls be employed effectively in nuclear UAVs in an era of shrinking budgets across the globe? Rational military experts need double redundancy and recall controls up to the last seconds before pushing the button. We must not let technology get ahead of common sense.
There should be absolutely no debate that completely automated doomsday drone machines should be abolished in the arms treaty currently under review in Moscow and Washington. The likelihood of such a prohibition is, of course, fraught with many human complexities. Just as with global warming and climate change, the world needs to wake up to the next great challenge of arms control and avoid what happened with “the bomb”: we tried to control it, but only after it was too late to contain.
Let’s promote a multilateral treaty banning nuclear drone warfare.