Killer Robots (again?)

In July, the International Joint Conference on Artificial Intelligence announced an open letter, "Autonomous Weapons: An Open Letter from AI & Robotics Researchers," which has probably broken the 20,000-signature mark by now. (Wouldn't you like your name on a letter signed by Stephen Hawking and Elon Musk, among other impressive figures?) This touches on the cover topic of SSIT's Technology and Society Magazine from Spring 2009, whose cover image just about says it all.

The topic of that issue was Lethal Robots. The letter argues that letting AI software decide when to initiate lethal actions is not a good idea. Specifically: "Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity."

Unfortunately, I cannot think of any way to actually prevent the development of such systems by organizations that would like to pursue the activities listed above, for which killer robots are ideally suited. Perhaps you have some thoughts? How can we make these not just "not beneficial" but actually discourage their development? Or is that even possible?

SSIT is a sponsor of a new IEEE Collabratec community on CyberEthics and CyberPeace. I encourage you to join this community (which is not limited to IEEE members) and contribute to the discussion there.

Who is Driving Your Car?

A recent CBS Sixty Minutes segment interviewed folks at DARPA and included a demonstration of how a modern, computer-laden car can be hacked and controlled.

Computers in cars are not a new thing, even the dozens that we see in new models, and they have been interconnected for some time as well. Connecting your car to the network is a more recent advance; "OnStar" is one variation that has been on board for a while. The ads for this have suggested the range of capabilities: unlock your car for you, turn on your ignition, detect that you may have been in an accident (an air bag deployed, or perhaps other monitoring), and of course they know where your car is, so if it is stolen they can disable it. Presumably a hacker can do all of these as well, and the DARPA demonstration shows some of the implications: stopping the car, accelerating it, and so on. Criminals have already acquired armies of zombie computers to use for attacking their targets, blackmail, and the like. Imagine having a few hundred zombie cars in a major city like LA, enabling terror, blackmail, or both.

An additional sequence on Sixty Minutes shows the hacking of a drone, and, perhaps equally important, a re-programmed drone that is not (as easily) accessed or hacked. Behind this is an issue of software engineering and awareness. The folks making drones, cars, and other Internet of Things (IoT) objects are not "building security in." What is needed, for each IoT-enabled device, is an awareness of the security risks involved: not just abuse of that particular device, but also how it might impact other devices on the network, or the health and safety of the user and the public.
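
To make "building security in" a bit more concrete, here is a minimal sketch in Python (purely illustrative: the device key, the message format, and the verify_command function are my own assumptions, not any vendor's actual API) of an IoT device that acts on a remote command only if it carries a valid authentication tag and a fresh counter, rejecting both forged and replayed messages:

    import hmac
    import hashlib
    import json

    # Hypothetical per-device secret, provisioned at manufacture (assumption).
    DEVICE_KEY = b"per-device secret"
    last_counter = 0  # highest counter value accepted so far


    def verify_command(message: bytes, tag_hex: str):
        """Return the decoded command if it is authentic and fresh, else None."""
        global last_counter
        expected = hmac.new(DEVICE_KEY, message, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, tag_hex):
            return None  # forged or corrupted message
        cmd = json.loads(message)
        if cmd.get("counter", 0) <= last_counter:
            return None  # replayed (old) message
        last_counter = cmd["counter"]
        return cmd


    # A legitimate "unlock" command carries a valid tag and a new counter value.
    msg = json.dumps({"counter": 1, "action": "unlock"}).encode()
    tag = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    print(verify_command(msg, tag))   # accepted
    print(verify_command(msg, tag))   # rejected: replay of an already-used counter
    print(verify_command(b'{"counter": 2, "action": "unlock"}', tag))  # rejected: tag no longer matches

A real device would also need secure key provisioning and storage, key rotation, and protection of its internal network, but even this level of checking illustrates the kind of design discipline that "building security in" implies.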

A recent dialog with some IEEE-USA colleagues surfaced the question of where software engineering licensing (professional engineers) might be required … and we used video games as an example of a case where it did not seem appropriate … of course, that all breaks down if your video game can take over your car or your pacemaker.

Cyberwar and Social Impact

War tends to have significant social impact. Even back in the days of "civilized" warfare (civilians from Washington DC went out to view the first battle of Bull Run, aka Manassas, and were caught in the retreat of the Union forces) there were significant impacts on society. In a recent issue of Technology and Society Magazine, authors Flowers and Zeadally outline the challenges posed by cyberwarfare.

When is a cyber abuse an act of war? The abusers range from script kiddies, criminals, corporate and national espionage, and civil protest up to nation-state attacks, sometimes accompanied by "kinetic" battles. Events may go undetected for extended periods, or may take out significant military or economic targets (such as the power grid). And identifying the source of an attack can be difficult, particularly if the attackers choose to make it difficult.

The paper outlines nation-state attacks ranging back to 1982, when a Soviet pipeline was destroyed, up to fairly recent events. It also provides a country-of-origin count for attacks in 2013, with Russia leading (1.15 million), then the U.S. (0.86 million); in case you were wondering, China comes in at #8 (0.25 million), after Germany, Taiwan, Bulgaria, Hungary, and Poland. Of course, the country of origin does not mean an attack is state sponsored, nor that it is directed at military objectives, nor that it damaged persons or property.

The NATO Cyber Defence Centre of Excellence has sought to define cyber warfare in the recently published Tallinn Manual on the International Law Applicable to Cyber Warfare. But many of the potential "perps" are not likely to pay much attention to international law, and of course the response to a given attack becomes problematic if the source or responsible parties cannot be identified "beyond a reasonable doubt."

The paper concludes that cyber attacks are increasing, which leads to the question of what might be done: by technologists, by citizens, or by nation states. What evils are creeping across your part of the web? What might we do about them?

Is Hacking Ethical?

And if not, should/can IEEE do something?

IEEE, the world's largest technical professional society, has a Code of Ethics, which clearly states two relevant points:

to accept responsibility in making decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment;

and

to avoid injuring others, their property, reputation, or employment by false or malicious action

IEEE also has an Ethics and Member Conduct Committee (EMCC) that deals with ethics violations, with sanctions (according to IEEE Bylaw I-110) including expulsion, suspension, or censure. It is interesting to note that institutions (corporations, universities, government agencies) that subscribe to IEEE's Xplore products are not subject to complaints or review by IEEE related to IEEE's Code of Ethics. While it is not clear how such a process might proceed, it is clear that these entities benefit from access to the intellectual property curated by IEEE.

So, let’s try out some examples:

  1. A software package called "Blackshades" appears to allow hackers to spy on users via their webcams and to acquire financial information; it has resulted in the arrest of both the alleged creators of the software and persons suspected of using it, across 18 countries. The creation or use of software like this would appear to be a violation of the IEEE Code of Ethics. Presumably, the individuals involved, if they are IEEE members, would be subject to action by the EMCC.
    OK, that one seems pretty straightforward.
  2. The U.S. Justice Department has charged five Chinese military officers (all reportedly associated with Unit 61398 of the People's Liberation Army) with economic espionage/theft of commercial data/secrets. If these charges have merit, and some of the accused are IEEE members, presumably they could be subject to action by the IEEE EMCC.
    Here things get tricky, since this is also a politically charged situation in which IEEE could alienate the government of China and/or members in China, and lose some or all memberships, subscriptions, and/or rights to do business in China (conferences, etc.). This creates a conflict of interest within IEEE: applying its Code of Ethics, but at the risk of significant economic impact on the organization. (Note: while IEEE Code of Ethics actions are maintained in confidence by IEEE, that does not prevent third parties such as employers or governments who become aware of the actions from responding in various ways.)
  3. So what about Stuxnet? "I think it's pretty clear that the United States government did the Stuxnet attack," according to Richard Clarke, who has served as US counterterrorism czar under three US Presidents, quoted in Smithsonian Magazine, April 2012. It is clear that Stuxnet damaged property, and that those responsible were not acting in a way consistent with the IEEE Code of Ethics, unless of course you decide that ethics should be interpreted based on "whose side you are on."

A variation on the Blackshades situation might involve a corporate entity, perhaps one subscribing to IEEE publications such as IEEE Security & Privacy magazine. If a corporation were formed to pursue activities inconsistent with IEEE's Code of Ethics, should IEEE have some channel for ethical review, or perhaps decline to accept a subscription from it? Is the answer different before it is convicted than after?

Situations 2 and 3 raise issues at the entity level, as opposed to the individual level. They also raise conflict-of-interest issues at the entity level. If a Chinese or U.S. entity appears to be using IEEE content to pursue actions inconsistent with IEEE's Code of Ethics, should IEEE have some action to take in response? Does it make a difference if that entity is a major customer? What if IEEE might lose its non-profit tax status by taking such an action? (Presumably the U.S. Government does not take retribution against persons (including corporations) for exercising their constitutional rights … presumably.)

Does the ethics of an action depend on individual versus entity responsibility? Does it depend on who is taking the action and who is affected by it? Or does it just come down to power: is the ethical version of "too big to fail" something like "too big to fault"?

IEEE's first Ethics Conference will be held this week in Chicago. There does not appear to be any discussion in this particular area among the quite interesting selection of papers and panels. Ethics is becoming a more important, and more challenging, consideration both for the individual professional and for IEEE as an institution.