Mar 14, 2019
Artificial Intelligence is one thing. Artificial morality is another. It may sound something like this:
"First, we believe in the strong defense of the United States and we want the people who defend it to have access to the nation's best technology, including from Microsoft."
The words are those of Microsoft president Brad Smith, writing on the company's corporate blog last fall in defense of its new contract with the U.S. Army, worth $479 million, to make augmented reality headsets for use in combat. The headsets, known as the Integrated Visual Augmentation System, or IVAS, are a way to "increase lethality" when the military engages the enemy, according to a Defense Department official. Microsoft's involvement in this program set off a wave of outrage among the company's employees, with more than a hundred of them signing a letter to the company's top executives demanding that the contract be canceled.
"We are a global coalition of Microsoft workers, and we refuse to create technology for warfare and oppression. We are alarmed that Microsoft is working to provide weapons technology to the U.S. Military, helping one country's government 'increase lethality' using tools we built. We did not sign up to develop weapons, and we demand a say in how our work is used."
Wow, words of conscience and hope. The deeper story in all this is ordinary people exercising their power to shape the future and refusing to increase its lethality.
With this contract, the letter goes on, Microsoft has "crossed the line into weapons development. . . . The application of HoloLens within the IVAS system is designed to help people kill. It will be deployed on the battlefield, and works by turning warfare into a simulated 'video game,' further distancing soldiers from the grim stakes of war and the reality of bloodshed."
This revolt was what Smith was responding to when he said he believed in a "strong defense," implying that moral cliches rather than money are what drive the decisions of large corporations, or at least this particular large corporation. Somehow his words, which he presented as reflective and deeply considered, are not convincing -- not when juxtaposed with a defense contract worth nearly half a billion dollars.
Smith goes on, acknowledging that no institution, including the military, is perfect, but pointing out that "one thing is clear. Millions of Americans have served and fought in important and just wars," cherry-picking such lauded oldies as the Civil War and World War II, where America's enhanced lethality freed slaves and liberated Europe.
Fascinatingly, the tone of his blog post is not arrogant toward the employees -- do what you're told or you're fired -- but, rather, softly placating, seeming to indicate that the power here isn't concentrated at the upper levels of management. Microsoft is flexible: "As is always the case, if our employees want to work on a different project or team -- for whatever reason -- we want them to know we support talent mobility."
The employees who signed the letter demanded cancellation of the Defense contract. Smith offered their personal consciences an out: Come on, join another team if you don't want to cross the line and work on weapons development. Microsoft honors employees of multiple moral persuasions!
Artificial Intelligence is a high-tech phenomenon that requires highly complex thinking. Artificial morality hides behind the nearest cliche in servitude to money.
What I see here is moral awakening scrambling for sociopolitical traction: Employees are standing for something larger than sheerly personal interests, in the process pushing the Big Tech brass to think beyond their need for an endless flow of capital, consequences be damned.
This is happening across the country. A movement is percolating: Tech won't build it!
"Across the technology industry," the New York Times reported in October, "rank-and-file employees are demanding greater insight into how their companies are deploying the technology that they built. At Google, Amazon, Microsoft and Salesforce, as well as at tech start-ups, engineers and technologists are increasingly asking whether the products they are working on are being used for surveillance in places like China or for military projects in the United States or elsewhere.
"That's a change from the past, when Silicon Valley workers typically developed products with little questioning about the social costs."
What if moral thinking -- not in books and philosophical tracts, but in the real world, both corporate and political -- were as large and complex as technical thinking? It could no longer hide behind the cliche of the just war (and surely the next one we're preparing for will be just), but would have to evaluate war itself -- all wars, including the ones of the past 70 years or so, in the fullness of their costs and consequences -- as well as look ahead to the kind of future we could create, depending on what decisions we make today. Complex moral thinking doesn't ignore the need to survive, financially and otherwise, in the present moment, but it stays calm in the face of that need and sees survival as a collective, not a competitive, enterprise.
Moral complexity is called peace. There is no such thing as simplistic peace.
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
Robert C. Koehler
Robert Koehler is an award-winning, Chicago-based journalist and nationally syndicated writer. Koehler has been the recipient of multiple awards for writing and journalism from organizations including the National Newspaper Association, Suburban Newspapers of America, and the Chicago Headline Club. He's a regular contributor to such high-profile websites as Common Dreams and the Huffington Post. Eschewing political labels, Koehler considers himself a "peace journalist." He has been an editor at Tribune Media Services and a reporter, columnist and copy desk chief at Lerner Newspapers, a chain of neighborhood and suburban newspapers in the Chicago area. Koehler launched his column in 1999. Born in Detroit and raised in suburban Dearborn, Koehler has lived in Chicago since 1976. He earned a master's degree in creative writing from Columbia College and has taught writing at both the college and high school levels. Koehler is a widower and single parent, and he explores both conditions at great depth in his writing. His book, "Courage Grows Strong at the Wound," was published in 2016. Contact him or visit his website at commonwonders.com.