The U.S. and UK are undermining attempts by the United Nations to negotiate over the future of autonomous weapons—or "killer robots"—talks which, if delayed further, could come too late to prevent so-called "robot wars."

Technology and human rights experts have been pushing for the UN to preemptively ban machines that can kill on the battlefield without human operators, citing a greater risk to civilian life and a broader lack of accountability for military officials. But Christof Heyns, UN special rapporteur on extrajudicial, summary, or arbitrary executions, said Tuesday that the negotiation process is in danger of getting "stuck."

"A lot of money is going into development and people will want a return on their investment," Heyns told the Guardian. "If there is not a pre-emptive ban on the high-level autonomous weapons then once the genie is out of the bottle it will be extremely difficult to get it back in."

As the UN General Assembly negotiates an agreement between nations on autonomous weapons, U.S. and UK representatives are reportedly pushing for weaker rules that would prohibit only future technology, not killer robots developed during the protracted negotiating period. Such delays would also mean that existing semi-autonomous prototypes—like the Phalanx close-in weapons system (CIWS) in the U.S., the Iron Dome in Israel, and the SGR-1 sentry robot in South Korea—would not be subject to the ban.

Proponents of killer robots say they will help reduce military casualties in war. But as a report published earlier this year by Human Rights Watch and Harvard Law School's International Human Rights Clinic argues, such tools bring too many moral and legal risks to justify their continued development.
Those risks include a higher potential for violations of international law and a lack of accountability for war crimes committed by robots.

What's more, the proliferation of autonomous weapons would make a global arms race "inevitable," experts—including physicist Stephen Hawking, Apple co-founder Steve Wozniak, and Tesla CEO Elon Musk—said in July.

Noel Sharkey, a professor of artificial intelligence and co-founder of the International Committee for Robot Arms Control, is very concerned about where things are headed.

"Governments," he explained to the Guardian, "are continuing to test autonomous weapons systems, for example with the X-47B, which is a fighter jet that can fly on its own, and there are contracts already out for swarms of autonomous gunships. So if we are tied up [discussing a ban] for a long time then the word 'emerging' is worrying."

"The concern that exercises me most is that people like the U.S. government keep talking about gaining a military edge," Sharkey said. "So the talk is of using large numbers—swarms—of robots."

If the UN is unable to close a deal on the future of autonomous weapons, countries would still have the option of crafting their own agreements, which is how the Convention on Cluster Munitions came about. But experts say it's unlikely that major weapon-producing nations would agree to such a treaty.

As of now, only five states—Cuba, Pakistan, Egypt, Ecuador, and the Vatican—have backed a ban on killer robots.