The Dangers of Distributed Intelligence

By Peter Zilahy Ingerman on August 4th, 2017 in Editorial & Opinion, Magazine Articles, Societal Impact

There is increasing interest in, and implementation of, the Internet of Things. As the number and types of interconnections grow, it is important to be aware of the dangers of distributing the required intelligence among the participants, and of the consequences of failing to foresee those dangers.

In order to address these dangers, it is necessary to understand the notion of a “partial pattern match.” As two examples of this kind of thinking, consider computer programmers (who look at similar pieces of code and, by tweaking them, make a single subroutine out of them) and punsters (who make partial pattern matches among words; if Miss Piggy says “moi aussi,” it does NOT imply that she is Australian!). The generalization is that any time intelligence is distributed, it should be examined in as many disparate contexts as is reasonably possible, to see whether there is a troublesome possibility of misinterpretation, ideally by an entity that is part of neither provider nor user (see definitions below). If the circumstances warrant, the examiner should be at least familiar with other languages, if not a polyglot or polymath.
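The programmer’s case can be made concrete with a small sketch (the functions and field names below are hypothetical, chosen only for illustration): two routines that differ in a single detail are collapsed into one subroutine, with the differing detail made a parameter.

    # A small, hypothetical illustration of the programmer's partial pattern
    # match: two near-duplicate routines differ only in the field they sum.
    def total_weight(items):
        total = 0
        for item in items:
            total += item["weight"]
        return total

    def total_price(items):
        total = 0
        for item in items:
            total += item["price"]
        return total

    # After noticing the partial match, the difference becomes a parameter:
    def total_of(items, key):
        return sum(item[key] for item in items)

    cargo = [{"weight": 12, "price": 40}, {"weight": 3, "price": 15}]
    assert total_of(cargo, "weight") == total_weight(cargo) == 15
    assert total_of(cargo, "price") == total_price(cargo) == 55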

Initial example: The rules for Daylight Saving Time in the United States changed in 2007, by which year there was a great deal of distributed intelligence in the form of computers and microprocessors concerned with monitoring time. Computers per se caused little trouble, because their operating systems could be (and were) easily updated over the Internet. However, there were issues with traffic lights and elevators (both involving the direction of traffic during rush hours, either horizontally or vertically), where the individual controllers could not be automatically updated.
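A short sketch makes the failure mode concrete (the function names are hypothetical; the two rule sets are the documented pre- and post-2007 U.S. rules). A controller whose DST rule is frozen in firmware disagrees with the updated rule for several weeks each year:

    # Pre-2007 U.S. rule: first Sunday in April through last Sunday in October.
    # Post-2007 rule: second Sunday in March through first Sunday in November.
    # Date-level granularity only; real transitions occur at 2 a.m. local time.
    from datetime import date, timedelta

    def nth_weekday(year, month, weekday, n):
        """Date of the n-th given weekday (Monday=0 ... Sunday=6) in a month."""
        first = date(year, month, 1)
        offset = (weekday - first.weekday()) % 7
        return first + timedelta(days=offset + 7 * (n - 1))

    def last_weekday(year, month, weekday):
        """Date of the last given weekday in a month."""
        d = nth_weekday(year, month, weekday, 4)
        if (d + timedelta(days=7)).month == month:
            d += timedelta(days=7)
        return d

    def in_dst_old(d):   # the rule frozen into the controller's firmware
        return nth_weekday(d.year, 4, 6, 1) <= d < last_weekday(d.year, 10, 6)

    def in_dst_new(d):   # the rule in effect since 2007
        return nth_weekday(d.year, 3, 6, 2) <= d < nth_weekday(d.year, 11, 6, 1)

    probe = date(2007, 3, 20)                    # between the two spring transitions
    print(in_dst_old(probe), in_dst_new(probe))  # False True: the device is an hour off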

Definitions

  • Information: The construction that is placed on data or the absence of data, by either the provider or the user of information.
  • Provider: The intelligence that serves as the source of the information provided.
  • User: The intelligence that responds to the information provided.
  • Enthymeme: An axiom that is a required part of a system but is left unstated, being assumed or implied.
  • Distributed intelligence: Intelligence is distributed when the provider of information is not an intrinsic part of the user of information (or vice versa).

Taxonomy

Following are a number of situations where distributed intelligence can lead to problems.

  • The provider is not directly connected to the user. (See initial example.)
  • The data required is subsumed in an enthymeme, and the provider, being unaware of this, provides incorrect or incomplete data, or wrongly assumes that the data provided will be correct and unchanging for the duration of its use.
  • The provider’s authority has not been revoked or limited in a timely manner.
  • The provider is illicit.
    • Hacking
    • Conflating
  • The provider is not sufficiently aware of the context of the user.
  • The user fails to properly instruct the provider.
  • The provider misunderstands the user (possibly deliberately).

Conclusions

Whenever the user of information is not permanently connected to the provider of information, there exists the possibility of one or more of the following:

  • Misunderstood information,
  • Incomplete information,
  • Inaccurate information,
  • Inappropriate information,
  • Information that subsequently changes independently of initial expectations.

And it becomes incumbent upon the provider and the user to examine each of these possibilities as part of a joint effort to minimize or avoid errors.

Examples Based on Taxonomy

Enthymemes frequently hide behind “Oh, everyone knows that….” Enthymemes cause problems when a provider or user becomes involved who does not share them: frequently, a person from another culture or background.

My home telephone number is 877-5678. Billy Graham’s organization has a toll-free number, which they display on TV as 877-567-8989. The enthymeme here is that “everyone knows you have to dial a ‘1’ before a toll-free number.” The consequence, of course, is that if someone in an area code that includes an 877 exchange (as mine obviously does) doesn’t know this and calls Billy Graham, the call will ring a local number in that area code instead. There is a similar situation with one of the numbers for Disney World, and I have disappointed a number of small children who didn’t know about dialing the “1.”
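A minimal sketch of the routing logic (entirely hypothetical; real switches are far more elaborate) shows how the enthymeme plays out: without the leading “1,” a switch that recognizes 877 as a local exchange commits after seven digits and never consults the rest of the toll-free number.

    def route(dialed, local_exchanges):
        digits = "".join(c for c in dialed if c.isdigit())
        if digits.startswith("1") and len(digits) == 11:
            return "toll-free/long-distance call to " + digits[1:]
        if digits[:3] in local_exchanges:
            # The switch commits after seven digits; trailing digits are ignored.
            return "local call to " + digits[:3] + "-" + digits[3:7]
        return "cannot route"

    print(route("877-567-8989", {"877"}))    # -> local call to 877-5678 (the author's line!)
    print(route("1-877-567-8989", {"877"}))  # -> toll-free/long-distance call to 8775678989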

I am an ESOL (English for Speakers of Other Languages) tutor. One day one of my students asked me why new seat belts were not legal in New Jersey. After a bit of probing, I discovered that the student had just come over a bridge into New Jersey and had seen a sign that said “Seat Belts Must Be Worn.” The enthymeme here is on the part of the sign-writer, who did not anticipate the multiple meanings of the word “worn.” (A better sign, I think, would be “Seat Belts Must Be In Use”; but, clearly, not “… Used”!) And try to explain why “I am delighted” is not the same as “I am turned off”!

Failure to Revoke

A man went through an unpleasant divorce that ended with his ex-wife keeping what had been their house and living in it with her new lover. The house had an advanced thermostat, controllable by an app that had remained on the man’s smartphone. When he discovered this, his revenge took the form of “tinkering” with the thermostat: in the wintertime he dropped the house temperature to 40° at night when the couple were in bed, while in the summertime he raised it to 80°; when the couple were away, he kept the house at 80° in the winter and 40° in the summer, thus running up a prodigious electric bill!
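A minimal sketch, with hypothetical names, of the missing step: remote commands should be checked against the current list of authorized users, and a transfer of ownership should revoke stale credentials.

    class Thermostat:
        def __init__(self, authorized):
            self.setpoint_f = 68
            self.authorized = set(authorized)

        def revoke(self, user):
            self.authorized.discard(user)

        def set_temperature(self, user, setpoint_f):
            if user not in self.authorized:      # the check the real device lacked
                raise PermissionError(user + " is no longer authorized")
            self.setpoint_f = setpoint_f

    home = Thermostat(authorized={"wife", "ex-husband"})
    home.revoke("ex-husband")                    # should happen when the house changes hands
    home.set_temperature("wife", 72)             # fine
    try:
        home.set_temperature("ex-husband", 40)   # rejected, not a 40-degree bedroom
    except PermissionError as e:
        print(e)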

Illicit Provider – Hacking

Instances of computers being hacked, either for amusement, or to steal data, are legion. However, the computers that are embedded in other devices have also been hacked.

Automobiles

Over time, control of an automobile has been taken away from the driver and moved into the automobile itself. My father’s Model A had a “choke” lever on the steering column, but the “choke” control vanished as carburetors became more sophisticated. Brakes now have an anti-skid feature that prevents controlled skidding. In some newer cars all of the automobile’s functions are processed through an on-board computer, and there are instances of “hackers” being able to take over the control of another vehicle, thus (e.g.) preventing it from braking! Consider, then, cars that proceed with no human driver and have their controls hacked. Even if a passenger were able to take control, it is not clear that there would be sufficient time (1).

There was recently a demonstration that a 2014 Jeep Cherokee could be completely taken over by an outside agency, such that the driver had no control whatsoever (2). For now, owners of vehicles with the Uconnect feature should install the update as soon as possible. The patch must be manually installed via USB stick or by a dealership mechanic. The flaw is said to affect several 2013–2014 models of Dodge Ram; the 2013–2014 Dodge Viper; the 2014 Jeep Cherokee, Jeep Grand Cherokee, and Dodge Durango; the 2015 Jeep Cherokee and Jeep Grand Cherokee; and 2015 Chrysler 200s.

Airplanes

In one case the FBI claimed that a security researcher “exploited/gained access to, or ‘hacked’ the (in-flight entertainment) system” and then overwrote code on the airplane’s Thrust Management Computer while aboard a flight (4). The FBI stated that the researcher successfully commanded the system he had accessed to issue the “climb” command, thereby causing one of the airplane’s engines to climb, resulting in a lateral or sideways movement of the plane during one of these flights.

Robotics

The June 2015 issue of IEEE Spectrum has a discussion of robotic replacement parts for human beings. In a sidebar (3) there is a brief discussion of the possibility of such replacement parts being “hacked” by an unintended provider. This is related to an old science fiction story in which a murder is committed by the murderer’s taking control of the victim’s cardiac pacemaker.

Illicit Provider – Conflating

Modern freight trains are equipped with a “vigilance device,” which is the modern replacement for the traditional dead-man control. It is supposed to sound a warning if none of the controls in the locomotive cab are operated within a certain length of time. If the warning is not promptly acknowledged, the brakes are applied automatically.

On August 17, 2014, two Union Pacific freight trains collided head-on at Hoxie, AR, U.S.A., killing the engine crew of one train and causing considerable damage. One of the locomotives was equipped with a “horn sequencer,” such that a single press of a foot pedal would repeatedly sound the standard level-crossing warning (long-long-short-long) until the pedal was pressed a second time. Apparently the engineer became disabled after pressing this foot pedal, and the horn went on sounding for four minutes. However, as far as the vigilance device was concerned, each blast of the horn meant that a control had been operated, so it reset its timer and thus never applied the brakes.
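A minimal sketch of the conflation (the names and the 60-second window are assumptions, not the actual device’s parameters): the vigilance timer resets on any control event, so an automated horn sequencer masquerades as an alert engineer.

    class VigilanceDevice:
        def __init__(self, window_s=60.0):
            self.window_s = window_s
            self.last_activity_s = 0.0

        def on_control_event(self, now_s):
            # Buggy behavior: resets unconditionally, as at Hoxie. A fix
            # would reset only for events known to be human-initiated.
            self.last_activity_s = now_s

        def should_apply_brakes(self, now_s):
            return now_s - self.last_activity_s > self.window_s

    vd = VigilanceDevice()
    for t in range(0, 240, 5):         # the sequencer blasts the horn for four minutes
        vd.on_control_event(float(t))  # each blast registers as a control operation
    print(vd.should_apply_brakes(240.0))   # False: the brakes are never applied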

Insufficient Awareness of Context

Many supermarkets provide motorized carts for the physically disabled, but that doesn’t help when the shopper needs something on a high shelf, or if an aisle display blocks access. Yes, people are usually quite helpful in lending a hand, but the principle is still there: perhaps provide a call button on the carts to summon help? (I have suggested this to our local supermarket, which is looking into it!)

There was a recent science fiction story about a robot that killed a technician who attempted to repair it: the robot had not been made aware of the technician’s assignment, and therefore interpreted the attempted repair as a malicious attempt to disable it, and so defended itself from attack. For those with longer memories, there is the famous Stanley Kubrick movie “2001: A Space Odyssey,” written by Kubrick and Arthur C. Clarke, in which a spaceship’s computer, called “HAL,” murdered several astronauts, apparently (at least according to Clarke’s novelization of the movie) to protect its perception of the success of the mission.

Some years ago I was tutoring a 14-year-old girl. She showed me her class roster: she was enrolled in a course called “Thematic Studies.” I called the Board of Education, suggesting that it was rather inappropriate to have abbreviated this as “Them Studs.” (The person to whom I spoke would not let me talk with the IT people … she wanted to do that herself!)

User Fails to Instruct Provider

A manager who had specified a program, written by a third party, for maintaining an organization’s large database realized after some years that changes were necessary. However, he had assumed that “source code” and “database specification” were technical things that he, as a user, didn’t have to be concerned with; he had not asked for them when the program was delivered, and after eight years they were no longer available from the original third-party developer. As a result, the program had to be rewritten from scratch.

The Provider Misunderstands the User (Possibly Deliberately)

There is an (unsubstantiated) anecdote that illustrates this very nicely. At the height of the Cold War, a condom manufacturer in the United States received a very large order for condoms from the USSR. The order specified that the condoms were to be 30 cm (12″) long and 8 cm (3″) in diameter. After careful consideration, the manufacturer filled the order, but each of the condoms was stamped “SMALL.” The result, if the story be true, is that any potential political advantage to the USSR was lost as a consequence of the “SMALL” stamp.

Author

Peter Zilahy Ingerman is an Independent Consultant in Willingboro, NJ; www.ingerman.org. He can be reached at syscon@ingerman.org.