Is assurance the Achilles heel of military artificial intelligence?
Date: 2018-07-24 | Author: Ben Barry | Source: International Institute for Strategic Studies (United Kingdom)

Artificial intelligence has the potential to offer considerable military advantage to armed forces. But dealing with the issue of assurance represents a significant challenge.
Machine learning algorithms – self-taught software – are at the heart of emerging artificial intelligence (AI) applications. Such applications often use what are called deep neural networks, where the software re-writes elements of its own code as a result of 'experience', to better carry out its task. Establishing a clear relationship between input and output in this environment, therefore, can be difficult. While this may be acceptable in many civil applications, in the military realm it poses considerable difficulties. Dealing with the issue of assurance represents a significant challenge for armed forces and the defence sector.

Verification of the software and the reliability of AI are not the only challenges; there are also vulnerability problems. For example, by taking advantage of limitations in the software, it has proved possible to 'spoof' image-recognition software with pictures that look nothing like the objects the software has been trained to recognise.
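To make the 'spoofing' point concrete, the sketch below shows one well-documented way such fooling images can be produced: starting from random noise and using gradient ascent to push a classifier's confidence in an arbitrary target class. This is an illustrative assumption rather than anything described in the original analysis; it uses PyTorch and a tiny untrained stand-in model, not a real image-recognition system.

```python
# Illustrative sketch only: how an image classifier can be 'spoofed' by an input
# that looks nothing like the target object. Assumes PyTorch is installed and
# uses a toy untrained model as a stand-in for a real image-recognition system.
import torch
import torch.nn as nn

# Toy classifier: flattens a 28x28 'image' and maps it to 10 classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
model.eval()

target_class = 3                                        # label we want the model to report
noise = torch.rand(1, 1, 28, 28, requires_grad=True)   # starts as pure random noise
optimiser = torch.optim.Adam([noise], lr=0.05)

# Gradient ascent on the model's confidence in the target class: the image never
# needs to resemble the object, it only needs to excite the right outputs.
for _ in range(200):
    optimiser.zero_grad()
    logits = model(noise.clamp(0.0, 1.0))
    loss = -torch.log_softmax(logits, dim=1)[0, target_class]
    loss.backward()
    optimiser.step()

spoof = noise.clamp(0.0, 1.0).detach()
confidence = torch.softmax(model(spoof), dim=1)[0, target_class].item()
print(f"Classifier confidence that pure noise is class {target_class}: {confidence:.1%}")
```

The same basic idea – searching for inputs that exploit the model's blind spots rather than resembling real objects – is what makes this class of vulnerability so difficult to rule out through conventional testing, and hence so relevant to assurance.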
Software that learns by rewriting parts of its own code, and military systems that display unanticipated behaviour, are antithetical to traditional methods of assuring weapons and defence equipment. If AI applications are to be exploited fully by the military, then these problems will need to be overcome. This will likely require a revised approach to systems assurance.

In many armed forces, including those of China, Russia, the United Kingdom and the United States, there is much anticipation that autonomous systems and AI have the potential to offer considerable military advantage. For example, increasing autonomy would improve resistance to communications degradation and allow single human operators to control or supervise multiple unmanned platforms. The convergence of AI and autonomous capabilities may create new technical and tactical possibilities. Many advocates suggest that in the medium and long term, autonomy and AI have considerable 'leap ahead' potential. This, of course, requires that the leap be in the right direction.

Civilian investment in robotics, automation, autonomy and AI is being driven by businesses seeking market opportunities and cost reductions. In many areas, commercial spending greatly exceeds that by governments.

Armed forces and the defence-industrial sector are no doubt keen to exploit this investment, but the extent to which this will occur, in the West at least, depends on the ability to address specific military requirements concerning reliability and system behaviour. The military potential and limitations of robotics, autonomy and AI are explored in 'Human Machine Teaming' (https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/709359/20180517-concepts_uk_human_machine_teaming_jcn_1_18.pdf), published recently by the UK Ministry of Defence's Development, Concepts and Doctrine Centre.

The assurance issue

Because fighting equipment, weapons and ammunition must be safe for their operators to use in both peace and war, most major armed forces are governed by more stringent procurement regulations than those that apply to commercial industry and business. For example, the Military Aviation Authority must assure the UK's military aircraft and helicopters (https://www.gov.uk/government/publications/jsp-520-uk-mods-ordnance-munitions-and-explosives-safety-management-system). It is 'required to regulate UK military registered aircraft, assure the safety of the delivery of military aviation capability and enforce adherence to the regulations'. Indeed, the procurement of weapons, ammunition, missiles and explosives for the UK armed forces is subject to considerable efforts to assure safety.

To realise the military benefits of autonomy, AI and machine learning, it will be essential to develop new approaches to assurance. Armed forces, defence ministries, procurement agencies and defence industries all need to be working on ways of tackling this issue.

As this affects both the civilian and defence applications of autonomy and AI, there are potential opportunities for collaboration. For example, the measures needed to approve civilian driverless cars will be relevant to military efforts to employ driverless logistic vehicles. Similarly, swarming techniques applied to military uninhabited aerial vehicles may be applicable to non-military applications such as the deployment of micro-drones for environmental sensing and research.

Not tackling these issues up front carries risk, and that risk would increase over time. If, because of assurance difficulties, NATO, UK or US forces choose not to employ civilian autonomy and AI capabilities, their publics, politicians and media might find it difficult to understand why, particularly if this resulted in avoidable setbacks or casualties, including from unconstrained use of these technologies by opponents. For example, if Amazon successfully fields delivery drones, the public might find it difficult to understand why their army's logistics systems did not have that capability.

This analysis originally featured on the IISS Military Balance+ (http://go.iiss.org/2gnVMys), the online database that provides indispensable information and analysis for users in government, the armed forces, the private sector, academia, the media and more. Customise, view, compare and download data instantly, anywhere, anytime.
