The Essence of Warfare: Politics and Social Construction
While the debate over the relative weight of technology and politics is often framed as a dichotomy, politics and social construction, not technology, will always be the central idea behind warfare. Technology in warfare is simply a tool, built by man and controlled by man. To put it simply, any technology we have ever developed is at the very least guided, if not completely controlled, by man. Consider autopilot on an airplane. While the plane may be able to “fly itself,” it still requires a pilot to input its destination. The pilot also decides when to turn the autopilot on, or whether to use it at all.
An even more basic example is an alarm clock. If you know you need to be up at a certain time, you can set an alarm. You can (for the most part) count on the alarm clock to go off once you have set it; however, the alarm clock will never set itself. If you need to be awake at 9am, you must set the alarm to 9am. The time you need to be up is something you decide, something you construct socially. You decide when you need to be up, not the alarm clock.
These same basic rules are the reason why social relations and politics will always come before technology. In philosophy there is a famous saying: “Perception is reality.” Applied to warfare, if we perceive another nation to be a threat, they are a threat. It is through meaning that we make sense of the world, just as the example from lecture 3 describes a mountain as being socially constructed. This aspect is purely political. It is our political stances that persuade us to adopt different types of technology.
The importance of politics in warfare is illustrated in Becker and Shane’s article, “Secret ‘Kill List’ Proves a Test of Obama’s Principles and Will.” The article summarizes many of the methods the US uses to “capture or kill terrorists.” However, after discussing drone strikes, the authors say “it is the president himself who reserves the final moral calculation” (1). By definition, politics arises when two or more people have interests that cannot all be fulfilled under current conditions. Becker and Shane’s article talks specifically about drone strikes. The political question here is whether or not to allow drone strikes to kill wanted terrorists. Drones are a recent technological advancement developed particularly for the wars in Iraq and Afghanistan, and many people would argue that drones are the central part of these wars. But just as the quote in Becker and Shane’s article says, it is the president who reserves the ultimate moral calculation, not the drones. Just as your alarm clock does not set itself, drones do not decide where and when to launch. The president decides when and where, and even whether he wants to use this type of technology at all. It is his political views that control the technology.
Becker and Shane’s article goes on to say that “war…appears to be ingrained in human nature” (3). The importance of this quote is the connection it draws between war and human nature. The intention is to make the reader understand that war is, at its core, between people. Regardless of the types of technology countries develop to attack each other, the root of conflict lies within the politics of the human race.
In the wars in Iraq and Afghanistan, the United States felt drones were effective technological tools given the tough, mountainous terrain, and therefore it developed this technology. Technology, in other words, is tailored to political views. If you have no use for a technology, you don’t develop it. Political views and social construction decide what technology is useful. Regardless of technology, the ultimate goal of war remains the same. A quote from Arkin’s article, “The Case for Ethical Autonomy in Unmanned Systems,” says “the tendency to destroy the adversary which lies at the bottom of the conceptualization of war is in no way changed” (1). Technology may change, but the goals of warfare do not.
While I agree with most of what you're saying, I think that the Arkin article should give you some pause. He is essentially making a case for robots being able to decide whether or not to use lethal force during military operations, with humans removed from the loop entirely. Wouldn't such a system seem to place a much heavier emphasis on technology than on social construction? Of course, one could argue that in this futuristic type of situation, the nature of warfare again collapses to the social/political, because those forces will shape the design of the robot being used.
Nathan,
I think you hit on the social constructionist argument there. Even if robots end up making the decisions, we made the robots, programmed them to make certain types of decisions, and use them in certain situations.
Sean,
While you argue that it is social construction that determines political outcomes, could we be doing what we are doing without these technological developments?
Thanks for the interesting feedback, everyone. First, to address your point, Nathan, I agree that you provided a good point for the social construction of warfare at the end of your post. Social and political forces shape the design of technology. If we don't need it, we don't develop it; and if we don't develop it, it is because we don't need it. Thus, the social/political is the root.
@ Professor S.: I do not feel that we could be doing all of the things we are doing without the technology we've CREATED to do it (I emphasize created to support my social/political argument). However, targeted drone strikes, for example, keep ground troops from having to go in and eliminate a target. The same job could still be done without the drone, but at the risk of soldiers' lives and perhaps more civilian lives. Thus, the technology is designed to increase efficiency, not to change our social/political views.
@ Matt: It is true that no technology is completely reliable, but neither are humans. Sending in ground troops to eliminate a target isn't always effective either. Personally, I don't feel that a technological error is any more or less devastating than a human error.