Toys, Terrorism and Technology

Recent attacks on citizens in all too many countries have raised the question of creating back doors in encrypted communications technology. A November 22 NY Times article by Zeynep Tufekci, “The WhatsApp Theory of Terrorism”, does a good job of explaining the flaws in simplistic, government-mandated back doors. The short take: bad guys have access to tools that do not need to follow any government regulations, and bad guys who want to hack your systems can use any back door that governments do mandate. No win for security, and a big loss of protection for everyone else.

Toys? The Dec. 1 Wall Street Journal covered “Toy Maker Says Hack Accessed Customer Information”. While apparently no Social Security or credit card data was obtained, there is value in having names, birthdates, and similar details for creating false credentials. How does this relate to the terrorist threat? Two ways, actually:

  1. There are few, if any, systems that hackers won’t target, so a good working assumption is that someone will try to ‘crack’ yours.
  2. Technologists, in particular software developers, need to be aware of, consider, and incorporate appropriate security requirements into EVERY online system design.

We are entering the era of the Internet of Things (IoT), with many objects now participating in a globally connected environment. There are no doubt some advantages (at least for marketing spin) with each such object, and real advantages for some. New insight may be discovered through the massive amount of data available: for example, can we track global warming via the use of IoT-connected heating and cooking devices? However, there will be potential abuses of both individual objects (the toys above) and aggregations of data. Software developers and their management need to apply worst-case threat analysis to determine the risks and requirements for EVERY connected object.

Can terrorists, or other bad guys, use toys? Of course! There are indications that Xbox and/or PlayStation consoles were among the networked devices used to coordinate some of the recent attacks. Any online environment that allows users to share data or objects can be used as a covert communications channel. Combine steganography with Shutterfly, Instagram, Minecraft, or any other site where you can upload or manipulate a shareable image, and you have such a channel. Pretending we can protect them all is a dangerous delusion.
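To illustrate just how little machinery such a covert channel needs, here is a minimal sketch of least-significant-bit (LSB) steganography over raw pixel bytes. The function names are my own, and a real tool would work on actual image formats (PNG, JPEG) via an imaging library; this is only the core idea.

```python
# Minimal LSB steganography sketch: hide a message in the lowest bit
# of each byte of some "cover" data (e.g., raw image pixels).
# Illustrative only; real image formats need a library and more care.

def hide(pixels: bytes, message: bytes) -> bytes:
    """Embed message bits (MSB first) into the LSBs of the cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover data too small for message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return bytes(out)

def reveal(pixels: bytes, length: int) -> bytes:
    """Recover `length` bytes by reading the LSBs back out."""
    bits = [b & 1 for b in pixels[: length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n : n + 8]))
        for n in range(0, length * 8, 8)
    )
```

Because only the lowest bit of each byte changes, the carrier image looks essentially unchanged to a casual viewer, which is exactly what makes image-sharing sites attractive as channels.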

Is your employer considering IoT security?  Is your school teaching about these issues?


Technologists who Give a Damn?

I’ve been using Zen and the Art of Motorcycle Maintenance in classes for a while now. One key message of the book is that professionals (well, everybody) need to care about their work; in Zen terms, to be mindful while they work. The author asserts that one reason technology is so alienating nowadays is that the lack of care is evident in the workmanship, robustness, and so on.

I’ve also been working on an update of the SSIT Strategic Plan, and one element of that discussion has been what catchphrase we should use, say, on business cards. IEEE’s is “Advancing technology for humanity”, which is a good one. Currently we are using “Where Technology and Society Talk”, but it is tempting to use “Technologists who Give a Damn”. It is a bit demeaning to imply that some (many?) don’t, but unfortunately this is at least occasionally true.

There are at least two levels of caring. The obvious one for SSIT is paying attention to the social impact of inventions and products (the “should we make it” as opposed to the “how we make it”). There is a lower level that is also critical; in software we might ask, “is this code elegant?” Oddly, there seems to be a relationship between underlying elegance and quality. A clean, simple design often works better than a ‘hack’, and it takes both a level of mastery and a level of mindfulness to accomplish. Some cybersecurity holes are the result of code where folks didn’t care enough to do it right. No doubt many “blue screen of death” displays and other failures and frustrations emerge from this same source. Often management, under pressure or lacking awareness, is satisfied with shipping the product rather than making sure it is done well. I’m not aware of any equivalent in most development facilities of the Japanese “line stop buttons” that make quality a ubiquitous responsibility.
The reality is we need technologists who invent and produce products that are right socially and done right technically: technologists who embrace “care” at all levels. A retired career counselor from an Ivy League engineering school, who was in my Zen class, observed that we are more focused on ‘career skills’ than ‘quality’ in our education, and may be suppressing students’ sense of care. We then observed that this apparent lack of care, evidenced in so many consumer products, might be a factor in why girls choose not to enter STEM education and careers. I suppose the question that remains is “do we care?”

Who is Driving Your Car?

A recent CBS Sixty Minutes program interviewed folks at DARPA, including a demonstration of how a recent computer-laden car could be hacked and controlled.

Computers in cars are not a new thing, even the dozens that we see in new models, and they have been interconnected for some time as well. Connecting your car to the network is a more recent advance; OnStar is one variation that has been on board for a while. The ads for this have suggested the range of capabilities: unlock your car for you, turn on your ignition, detect that you may have been in an accident (air-bag deployment, and perhaps other monitoring), and of course they know where your car is; if it is stolen they can disable it. Presumably a hacker can do all of these as well, and the DARPA demonstration shows some of the implications: stopping the car, acceleration, etc. Criminals have already acquired armies of zombie computers to use in attacks on their targets, blackmail, and more. Imagine having a few hundred zombie cars in a major city like LA, enabling either terror or blackmail.

An additional Sixty Minutes sequence shows the hacking of a drone, and, perhaps equally important, a re-programmed drone that is not (as easily) accessed or hacked. Behind this is an issue of software engineering and awareness. The folks making drones, cars, and other Internet of Things (IoT) objects are not ‘building security in’. What is needed, for each IoT-enabled device, is an awareness of the security risks involved: not just abuse of that particular device, but also how it might affect other devices in the network, or the health and safety of the user and the public.

A recent dialog with some IEEE-USA colleagues surfaced the question of where software engineering licensing (professional engineers) might be required. We used video games as an example of a point where it did not seem appropriate; of course, that all breaks down if your video game can take over your car or your pacemaker.



Robotics Commission

Ryan Calo, UW School of Law, published a recent Brookings Institution report, “The Case for a National Robotics Commission”. He argues that robotics is sufficiently complex that policy makers (legislative and/or federal commissions) cannot be expected to have the expertise to make informed policy recommendations, laws, and determinations. He cites various examples, from driverless cars to Stephen Colbert’s Twitter bot @RealHumanPraise.

While I agree with Ryan’s observation about the challenge governments face in trying to make informed decisions on technology issues, I fear “robotics” is too narrow a scope. Similar issues emerge with medical devices, baggage-sorting systems, and automated phone systems.

The field of software engineering is moving toward licensed (Professional Engineer) status in various US states at this time, and that distinction will help establish criteria for some of the related applications. Essentially any health- or safety-related application (cars, medical devices, etc.) should have review and endorsement by a licensed software engineer, as is the case in civil, mechanical, and electrical engineering. That folks might be writing software for critical systems without being familiar with the concepts surfaced in the Software Engineering Body of Knowledge (the basis for state licensing exams, the IEEE CS certification program, and a number of IEEE/ISO standards) is a disturbing reality.

Similar considerations exist in the closely related areas of robotics, sensors, and intelligent vehicles, and no doubt will emerge with artificial intelligence over time. Technology domains are moving rapidly on all fronts. Processes to track best practices, standards, university curricula, vendor-independent certifications, licensing, and so forth lag behind at best, and often get little or no industry or academic support. Identifying knowledgeable experts is difficult even in more static fields, and many of the issues facing policy makers span fields: is it software, hardware, mechanical, etc.?

So while the concept of a robotics commission may help get the discussion going, in reality we need a rich community of experts spanning a range of technology fields who are prepared to join in the discussion and analysis as complex issues arise. Drawing these from pools of corporate lobbyists, or other agenda-laden sources, is problematic. Partnerships between agencies and professional societies may provide such a pool of experts. Even here the agenda risks are real, but at least there can be some balance between deep-pocketed established interests and emerging small companies, so that disruptive innovation can be taken into account.

What forums exist in your country/culture/environment to help inform policy and regulatory action in technology areas?  How can government draw on informed experts to help?

It takes People, Patience and Some Understanding

To get technology done right, the folks ‘commissioning’ it need to understand a little bit about what it takes to do it right. Case in point: the U.S. Affordable Care Act (ACA) website. This has generated a lot of heat, and little light, due to the political stakes and opportunities associated with this particular issue. So let’s ignore the politics.

Information Week points out that 40% of major IT projects fail, and I’ve heard higher percentages. Many of these projects fail totally, not just “can’t handle the traffic”. In short, the expectation that this site would work was optimistic.

CNBC reports that the security for the website may not be sufficient, and MIT Technology Review points out that the complexity of the project (interoperability with thousands of insurance company sites, real-time integration with the IRS, and ‘last minute’ requirements changes), with no phased-in testing process, was a complicating factor as well.

Software projects, particularly large-scale ones with highly visible deadlines and significant social impact, require extra consideration on the part of those commissioning the production. There is an emerging awareness of the need for software engineering as a recognized (and licensed) professional skill. (See the IEEE Institute article.) The ACA project is just one of many where this skill set, well established and documented by the IEEE Computer Society, is essential.

So we know how to do this, but it requires an essential understanding on the part of the people involved. Software engineering training, certification, licensing, and capability maturity models can only take you so far. You need people who understand these things, as well as how and when to apply them. And these people need to be on the “commissioning” side of the activity as well as the execution side. Corporate or governmental leaders who think “oh, that’s a simple matter of programming” don’t get it. Clearly defined and comprehensive requirements are critical to project success; systems like the FBI Case File point this out clearly (a contributing factor in that hundred-million-dollar failure was a continuously fluctuating set of requirements).

Given the ACA challenges, it is a non-trivial accomplishment that the site has become even partially functional.

If we can look beyond the political muck-raking, and consider the lessons to be learned from this situation we just might be able to find our way to a more satisfactory approach to applying technology to meet social objectives.

Your examples are solicited as comments below!

Mr. Jim Isaak
Bedford, NH, United States
SSIT Volunteer since: 2003
SSIT Roles: Vice President, 2015; Blog master, 2014-present
IEEE Roles: SSIT Board (elected or appointed), IEEE Director, Member IEEE Technical Activities Board, IEEE Standards Association, IEEE Section/Chapter
SSIT 5 Pillars Interest: Sustainability, Ethics, Impact of Emerging Technology; privacy, predictive (science) fiction, policy; Web site & Social Media
IEEE Senior Member in New Hampshire Section of Region 1

Last updated: 29/01/2017