The Patriot missile began development in the 1960s, when the U.S. Army sought a means to reliably shoot down enemy airplanes. Later, the missile gained the ability to intercept other missiles as well, and as the roles assigned to it expanded, its autonomous capabilities increased.

Patriot missile batteries use a phased array radar to detect and identify targets. This information is then fed into a computer control station to manage how the missiles are launched in response. Once fired, the missiles fly toward an intercept point calculated before firing; that course can be altered by sending updated sensor readings over a radio signal to the missile in flight. As it approaches impact, the missile's own radar tracks the target. Raytheon, which manufactures the Patriot, has described the system as having "automated operations" with "man-in-the-loop (human) override" capabilities, technology that allows the weapon to engage targets with the speed its missile defense mission requires.

While it is tempting to focus on the automated features of the Patriot system when examining the shootdown, or on autonomous and semi-autonomous systems more broadly, it is important to consider such weapons as part of broader systems. As policymakers consider how to evaluate the deployment of increasingly autonomous weapons and military systems, the complexity of such systems, the ways in which they might fail, and how human operators oversee them are key issues to consider. Failures in communication, identification, and fire control can occur at different points in a chain of events, and it can be difficult to predict how failures will interact with one another to produce a potentially lethal outcome.

The Defense Science Board's examination of the Patriot concluded that future conflicts will likely be "more stressing" and involve "simultaneous missile and air defense engagements." In such a scenario, "a protocol that allows more operator oversight and control of major system actions will be needed," the task force argued.

The difficulty Patriot missile batteries face in correctly identifying potential targets illustrates one of the most serious challenges facing autonomous weapons: getting accurate training data. As militaries move toward greater autonomy in a wide range of systems, they are increasingly reliant on machine learning technology that uses large data sets to make predictions about how a machine should operate. The challenge of acquiring accurate data sets autonomous systems up for inevitable failure. "Conflict environments are harsh, dynamic and adversarial, and there will always be more variability in the real-world data of the battlefield than in the limited sample of data on which autonomous systems are built and verified," as Arthur Holland Michel, an associate researcher in the Security and Technology Programme at the UN Institute for Disarmament Research, wrote in a report last year addressing data issues in military autonomous weapons. A lack of reliable data, or an inability to produce datasets that replicate combat conditions, will make it more likely that autonomous weapons fail to make accurate identifications.

The Pentagon has taken some steps to address these risks. In February 2020, the Department of Defense released a set of AI ethics principles drafted by the Defense Innovation Board. One of these principles is "traceability," emphasizing that relevant personnel will "possess an appropriate understanding of the technology," including transparent and auditable data methodologies.
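The "intercept point calculated before firing" idea can be illustrated with a toy geometry problem: given a target moving at constant velocity and an interceptor of fixed speed, solve a quadratic for the earliest meeting time. This is a deliberately simplified sketch with made-up numbers; it does not reflect the guidance logic of the Patriot or any real fire-control system.

```python
import math

def intercept_point(launcher, target_pos, target_vel, interceptor_speed):
    """Return (time, (x, y)) of the earliest intercept, or None if unreachable."""
    # Target position relative to the launcher.
    rx = target_pos[0] - launcher[0]
    ry = target_pos[1] - launcher[1]
    vx, vy = target_vel
    # Require |r + v*t| = speed * t, which gives a*t^2 + b*t + c = 0.
    a = vx * vx + vy * vy - interceptor_speed ** 2
    b = 2.0 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry
    if abs(a) < 1e-9:                 # target and interceptor speeds equal
        if abs(b) < 1e-9:
            return None
        t = -c / b
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0.0:                # target can outrun the interceptor
            return None
        roots = [(-b - math.sqrt(disc)) / (2.0 * a),
                 (-b + math.sqrt(disc)) / (2.0 * a)]
        valid = [r for r in roots if r > 0.0]
        if not valid:
            return None
        t = min(valid)                # earliest feasible intercept
    if t <= 0.0:
        return None
    point = (target_pos[0] + vx * t, target_pos[1] + vy * t)
    return t, point

# Illustrative scenario: target 10 km east, flying north at 300 m/s;
# interceptor launched from the origin at 1,000 m/s.
result = intercept_point((0.0, 0.0), (10_000.0, 0.0), (0.0, 300.0), 1000.0)
```

In flight, a real system would repeatedly re-solve with updated sensor readings, which is the role the radio-uplinked corrections described above play.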
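Michel's point that real-world data has more variability than the curated data a system is trained on can be shown with a minimal, entirely synthetic classifier: a decision threshold fit on narrow "test range" data degrades when applied to data drawn from a wider distribution. All names and numbers are illustrative assumptions, not measurements from any military system.

```python
import random

random.seed(0)

def make_samples(n, spread):
    """Synthetic 1-D feature: class 0 centered at 1.0, class 1 at 2.0."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        value = (1.0 if label == 0 else 2.0) + random.gauss(0.0, spread)
        data.append((value, label))
    return data

def fit_threshold(samples):
    """Midpoint of the two class means, a minimal 'training' step."""
    by_label = {0: [], 1: []}
    for value, label in samples:
        by_label[label].append(value)
    m0 = sum(by_label[0]) / len(by_label[0])
    m1 = sum(by_label[1]) / len(by_label[1])
    return (m0 + m1) / 2.0

def accuracy(samples, threshold):
    correct = sum(1 for v, label in samples if (v > threshold) == (label == 1))
    return correct / len(samples)

train = make_samples(1000, spread=0.2)   # narrow, curated training data
field = make_samples(1000, spread=0.8)   # wider real-world variability

thresh = fit_threshold(train)
print(f"accuracy on curated data: {accuracy(train, thresh):.2f}")
print(f"accuracy on field data:   {accuracy(field, thresh):.2f}")
```

The classifier is not broken; the data it meets in the "field" simply varies more than the sample it was built and verified on, which is the failure mode the article describes.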