
Analysis Paper on Autonomous Weapons Systems




Autonomous weapons, also known as lethal autonomous robots, are systems designed to identify and attack targets, which may include human beings, with lethal force in the absence of a human operator. Unlike contemporary automated defensive systems designed to shoot down incoming missiles, autonomous weapons are programmed to operate without tightly controlled spatial and temporal limits. To date, such weapons have not been used on battlefields. Their potential use in war has been debated by the majority of states in the world, spurring rigorous discussion of the logistics involved and the effectiveness of such interventions, and mounting pressure on international law regarding the issue. Armed drones of this kind would not be steered remotely by human beings in bunkers, but would be programmed in advance to select and fire at predetermined targets under pre-planned circumstances. Though it may sound like science fiction, ongoing research may soon make the deployment of autonomous weapons on battlefields and other legitimate fronts a reality.


Legitimate Autonomous Weapon Systems

Most countries in the world, including the US, have been deploying near-autonomous weapon systems against other machines, including the defensive systems aboard certain naval vessels. In such a scenario, ships would be helpless against missile attacks advancing at speeds greater than human reaction times. Other autonomous weapon decisions concern possible implications for human targets, an issue likely to enter the global domain soon. A prominent shift from human-controlled applications to automated machine control is likely to be adopted; such an evolution is deemed inevitable given contemporary technological advancements and the increasing demand for greater protection of military personnel and civilians. Moreover, technological inquiry reveals that similar technologies can also be used in driving cars and in robotic surgery, among other activities (United States 1922).

Proponents of an international treaty ban on autonomous weapons argue that such automated weapon technologies pose high risks to human welfare and lead to the erosion of ethical standards. They also argue that artificial intelligence is likely to fail consistently in complying with the laws of war (international humanitarian law), particularly in distinguishing between combatants and non-combatants, thus causing extensive collateral damage. Proponents further assert that automated weapons may have unpredictable implications for human lives by leaving decisions to kill in the hands of machines; moreover, such systems may be irresponsibly manipulated. However, a treaty ban on automated weapons is likely to fail, particularly in constraining nations or other actors that are perceived to value the use of such weapons. A ban imposed through a global treaty also poses a significant challenge: it would hand persistent users the advantage of extensive use while other states remained bound by the ban. Furthermore, enforcement of such a ban is hampered by the fact that automation is likely to take place gradually and that the ban itself is complicated to design (Alston & Philip 2011).

Nevertheless, autonomous weapon systems cannot be deemed inherently unethical or illegal, as they can be very effective in serving the law on the battlefield. However, the world's nations must collectively address the challenges that automated weapons pose, especially to human rights. Indeed, as new military technologies cut across existing normative frameworks, there may be better resolutions than a global treaty.

Is the existing international law enough, or should the international community negotiate a new treaty?

Debates over the legitimacy of automated weapon systems have been raging. In 2009, the International Committee for Robot Arms Control (ICRAC) was established by a group of professionals headed by Noel Sharkey, with a mission of launching a debate on an international framework for prohibiting autonomous weapon systems. Indeed, the relatively rapid development of military robotics poses dangers to regional and global peace, international security, and civilians in war. This move created a discrete framework on which a crackdown on the use of autonomous weapon systems could be grounded. The major interventions have rested on certain pillars, such as prohibitions on the development, deployment, and use of armed autonomous systems, and the assertion that the decision to kill must not be left to the discretion of machines. Essentially, present autonomous weapon systems do not comply with the legal requirements of international humanitarian law (IHL). The possibility that future technology could comply with these legal provisions should be explored to ensure that international human rights are not violated. The main concern, however, is the ability of autonomous weapon systems to comply with the principles of proportionality and distinction required by the Geneva Conventions, and the related possibility of holding someone accountable for damage caused by the systems (Doswald-Beck & Vite 1993).

The baseline of legal and ethical guidelines controlling the introduction and use of new weapons rests on discrimination and proportionality. Discrimination requires that any lawful weapon be aimed at lawful targets; it must distinguish between civilian and military targets and their associated objects. The principle of proportionality in international law requires that a weapon's validity entail a careful evaluation of the expected military benefits of its use against the harm it causes to civilians; the harm to civilians must not exceed the military gains. This calculus is very difficult, yet it forms the basis on which international law evaluates the deployment and use of automated weapons. International law provides no discrete formula for computing collateral damage against military gains; it merely requires that any military action undertake the calculation to establish the legitimacy of the intervention (ICRC 2008).

In terms of policy development, international humanitarian law was never intended to prevent armed conflicts or to confine them within territorial borders; instead, it aims to regulate armed conflicts when they occur. From a legal perspective, the responsibility to prevent armed conflict and to confine it territorially lies outside humanitarian law, residing instead in the United Nations Charter and the law of neutrality. The law of neutrality, however, does not limit the application of humanitarian law with regard to the time and place at which armed hostilities occur (Chesney & Robert 2010).


Indeed, once the objective existence and goals of an armed conflict are established, the applicability of humanitarian law is not restricted territorially: it governs the affairs between belligerents without regard to the geographical location of the entities involved, and it applies wherever belligerent attacks occur, as in the case of the law of belligerent occupation. Consequently, international law provides that any drone attacks and robotic weapons employed in armed conflict fall under the legal administration of humanitarian law regardless of territory. However, the law lacks a single provision offering a direct and straightforward evaluation of the effect and legality of an intervention involving the detonation of an automated weapon system. For instance, the extra-territorial application of robotic weapons is legally bounded by humanitarian law but also depends on other international provisions, such as the UN Charter, which prohibits hostile interventions among belligerent parties even where the act would be lawful under humanitarian law (Doswald-Beck & Vite 1993).

Furthermore, numerous conflicting provisions with respect to interstate forces hamper the legal effect of international law on automated weapon systems and other robotic instruments. As a result, contemporary international law does not sufficiently address the challenge of automated weapon systems; the issue ought to be addressed from the perspective of inter-state application, with a framework for swift application across the globe. A new treaty alone, however, may not sufficiently resolve the issue (Crawford & Olleson Simon et al. 2010). Consequently, the international community should establish processes through which automated weapon ban treaties can be negotiated.

Is a ban on AWS desirable?

Contemporary armed conflicts, in which states fight over particular discords, have led to increased usage of automated technologies. One of the most prominent examples is the remotely piloted drones used in US military attacks in various states. These combat aircraft perform a number of complex automated take-offs and landings, use GPS waypoint-finding mechanisms, and maintain orbits around a GPS location at a defined altitude. Automated weapon systems lead to widespread collateral damage, contradicting the provisions of international human rights law. As a result of these concerns, the international community has moved toward establishing a ban on the deployment and use of automated weapon systems in order to reduce the loss of human life to drone attacks and other automated invasions (Living under Drones: Death, Injury, and Trauma to Civilians from US Drone Practices in Pakistan 2012).

Essentially, a ban on autonomous weapons is likely to have a negative effect on international relations. For instance, the ban contravenes provisions of international humanitarian law with respect to armed conflicts: the requirements implicit in the principles of proportionality, distinction, and military necessity, which are contained in international treaties, including the Geneva Conventions of 1949, and entrenched in customary international law. Similarly, the ban contravenes the provisions that legalize the use of automated weapons by neglecting the importance of military interventions via automated weapons for protective purposes, which may yield widespread benefits to civilians. Indeed, a broad range of automated weapons and technologies can be essential in enhancing the human right to life. Every country has an absolute duty to its military organizations and combatants in armed conflict, and cannot simply delegate that duty to automated machines or processes. However, in addressing armed conflict and the contested use of automated force, it may be impractical to avoid automated systems altogether, even though they are bound to contravene basic human rights (Zenko & Micah 2013).

The ban can be grounded in the fact that such detonations are often carried out by unsupervised non-human systems. An international ban can rest firmly on the principle that the decision to initiate lethal force cannot legitimately be delegated to automated processes, but must remain tied to human supervision so that the loss of human life is limited to legitimate situations only. Conversely, an international ban on automated weapon systems can be perceived as contravening other legal provisions that sustain security and the protection of military entities, besides improving the benefits to civilians from any sort of detonation (Doswald-Beck & Vite 1993).


World treaties related to automated weapons have been beset by challenges with respect to attacks and counter-attacks, both internally and internationally. Unanimous treaties that harmonize the provisions of humanitarian law across the globe should form the benchmark on which autonomous weapons are crafted and deployed, with a view to maximizing protection and enhancing global peace. Making a series of independent treaties on automated weapons, however, may not resolve the problem of the negative implications of armed conflicts. Indeed, autonomous weapons have been misused more often than properly utilized.
