- Hamlet's ghost daddy appears and tells Hamlet he's dead (duh, dude you're a ghost...) and that his brother, Claudius, is the guy who done the deed.
- Hamlet and ghost daddy swear revenge upon Claudius.
- Everyone dies.
Pretty much all of Shakespeare's tragedies have a lot of dying and stuff in them. What's cool, though, is that EVERYONE in a Shakespeare tragedy is fair game. The dude was killing off title characters centuries before Game of Thrones made it a thing. Take that, George R.R. Martin (but please don't die before you finish the books; the HBO writers aren't that good without your source material).
Anyway, it might seem like a stretch to connect a play about a ye olden king of Denmark killing people to a modern-day flying machine with lots of cool tech. In some ways you'd be right. Hamlet's version of 'death from above' was hiding behind a curtain and stabbing in a downward motion. Drones are better at it than Hamlet; they fly and have rockets. It's not a very fair comparison.
The insight Hamlet provides is in the present-day debate surrounding the use and ethics of drones. The use of drones to kill insurgents is hotly debated and evokes concerns in some about autonomous Terminator-like robots roaming the battlefield. Detractors argue that using drones to kill is a slippery slope and could normalise the idea that autonomous weapons are acceptable despite the unknown risks they may pose (e.g. the fear that a machine with advanced A.I. will become a danger to humanity). The counter-argument is that drones are merely another tool of war, one that allows us to keep soldiers out of harm's way while still striking targets that would be difficult or impossible to reach otherwise. To some degree both arguments are valid, and each side raises important points that need to be addressed in any comprehensive discussion moving forward.* Luckily, drones and robots are still rather lame compared to our fantasies, so we can try to figure things out before we have to welcome our new robot overlords.
[Images: Expectations vs. Reality]
This brings me back to Hamlet. In Act 2, Hamlet verbalises one of the play's central conceits: the lack of certainty in the face of a complex world. This uncertainty is especially difficult when you're faced with a morally complex situation (your ghost daddy asking you to kill your uncle who married your mom qualifies as 'complex'). A brooding Hamlet broods broodingly and utters one of the most succinct lines on moral complexity ever written: "...for there is nothing either good or bad, but thinking makes it so".
In other words, things aren't good or bad in and of themselves. It's what we do with things, or how we think of them, that affixes notions of 'good', 'bad', 'right', and 'wrong' to them.
"For drones are neither good or bad, but using them in a manner devoid of a broader contextual understanding of their implications makes them so"
As Hamlet wisely pointed out, we shouldn't think of drones as either 'good' or 'bad' in and of themselves. We design, build, and use drones; we are responsible for them. Whether they turn out to be good, bad, or both is up to us. The current US drone program alone shows how morally complex this technology is. Few would say that using a drone (as opposed to a missile or bomb) to kill a known terrorist whose capture is impossible is bad, but the use of drones for so-called 'signature strikes' is questionable, to say the least. Here we have two similar uses for a drone (killing terrorists), but the context under which they are carried out is different. In a signature strike, people are targeted because their behaviour is believed to match an agreed-upon profile of terrorist or insurgent M.O.s. Signature strikes are morally questionable because the person being targeted may not be a terrorist at all; they are merely suspected of being one based on the profile they are compared against. This has led to multiple incidents where a drone strike was ordered and civilians were killed as a result (Homeland, anyone...). In turn, this has raised questions about whether signature strikes are counterproductive, since they may actually radicalise more individuals and breed new resentments within the populations where they are carried out.
However, drones also hold substantial promise to improve lives. Drone delivery services capture the headlines these days, with Google announcing plans for a 2017 program to get off the ground (pun!). While thinking of our Amazon Prime orders being flown in is amusing, these types of programs could also help us fly needed medical supplies or food aid to communities more quickly and efficiently, or respond to disasters much faster. As the Fukushima incident was unfolding, drones were used to observe and record data in areas inhospitable to humans. UN peacekeepers are also using non-weaponised drones in their missions in the Democratic Republic of Congo and Northern Mali to enhance their Intelligence, Surveillance, and Reconnaissance (ISR) capabilities and help them operate more effectively. Even in cases where drones are used to kill, there is evidence to suggest that they are better than the other alternatives available (though it must be noted that these studies don't necessarily look at instances where a drone was used when no action would otherwise have been taken; that is, have drones also led to an overall increase in strikes and therefore to a greater total risk of civilian casualties?).
Drones are sometimes seen as unique since they introduce not only the issue of how a war is fought but also of what is doing the fighting. But this is a bit of a straw man. Drones are developed by us, and they are not operating offensively in an autonomous manner; there is always someone 'in the loop'. Until a truly autonomous weapon is introduced and deployed, drones are not that different from virtually every other weapons system introduced with the goal of increasing lethality while adding more distance or protection for the soldier using it. A gun can be used to kill in anger, or it can be used by a cop to protect the innocent. In either case it is how the gun is used that makes the difference, not the gun itself. Understanding the limitations and potential outcomes of how a piece of technology is used is essential before we can think of it as good or bad.
Besides, even the Terminator became a good guy after the first movie...
* The formulation of these arguments has been simplified here for brevity's sake. The debate has many levels of nuance and complication, encompassing an array of fields from A.I. to philosophy. I hope to address some of these issues in future posts.



