

Korea Advanced Institute of Science and Technology has partnered with a weapons manufacturer with the stated intent of developing defense technology, leading to a boycott by AI experts who fear the partnership will result in the creation of "killer robots." (Photo: Ys [waiz]/Flickr/cc)
More than 50 artificial intelligence researchers and experts from nearly 30 countries around the world announced a boycott on Thursday of a South Korean university whose new defense partnership, they fear, could lead to the development of "killer robots."
"At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST [Korea Advanced Institute of Science and Technology] looks to accelerate the arms race to develop such weapons," wrote the researchers in an open letter.
The letter adds:
If developed, autonomous weapons will be the third revolution in warfare. They will permit war to be fought faster and at a scale greater than ever before. They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints. This Pandora's box will be hard to close if it is opened. As with other technologies banned in the past like blinding lasers, we can simply decide not to develop them. We urge KAIST to follow this path, and work instead on uses of A.I. to improve and not harm human lives.
KAIST opened a new lab in February in partnership with Hanwha Systems, one of South Korea's largest weapons manufacturers, prompting a report in the Korea Times that the two would join "the global competition to develop autonomous arms."
Hanwha Systems manufactures highly destructive cluster munitions, which are prohibited by a U.N. convention signed by 120 countries.
The experts plan to "boycott all collaborations with any part of KAIST until such time as the President of KAIST provides assurances, which we have sought but not received, that the Center will not develop autonomous weapons lacking meaningful human control," according to the letter. "We will, for example, not visit KAIST, host visitors from KAIST, or contribute to any research project involving KAIST."
According to the Guardian, in KAIST's since-deleted announcement of its partnership with Hanwha, the university's president said the new lab would "provide a strong foundation for developing national defense technology," focusing on A.I.-based command and decision systems, algorithms for unmanned underwater vehicles, and A.I.-based recognition technology.
The university's activities come as more than 20 countries have urged a ban on all autonomous weapons and as the U.N. plans to convene in Geneva next week to discuss the militarization of artificial intelligence.
"There are plenty of great things you can do with A.I. that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern," Toby Walsh, who organized the boycott, told the Guardian. "This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms."