Online physical attack

It should be noted that an early, if not the first, instance of a physical attack on a person has been carried out by online means, in particular social media used to trigger an epileptic seizure.  This concept has surfaced in science fiction, notably in Neal Stephenson's Snow Crash (which also inspired the creation of Google Earth).  In that novel, persons exposed to an attack while in virtual reality become comatose.

With the Internet of Things, and the potential for projecting "force" (or at least damage-causing light or sound) over the network, a new level of abuse, and of need for protection, is emerging.  One key in this particular case, and into the future, might be to have true identity disclosed as a criterion for accepting content over the net.

Privacy and Security

Guest Post from: Marvi Islam

Let me start with privacy and link it to security. Well, all of us know about the privacy settings on Facebook, and we like them so much because we can hide the things we do and the people we're with from our family members. But wait, what about security? How is privacy linked to security?

Let's leave the digital platform and move our focus towards our daily lives. We need security in our banks, schools, public places, and even in our homes and parks. But have you ever wondered what price we pay for this non-existent blanket of security? Privacy.  Let me reiterate –  security at the price of privacy. Those cute little things we see on the ceilings of our school corridors; we call them "CCTV" –  they are installed for our security. But security from what? No one bothers to ask. Maybe they (the authorities) want to tape everything in case something bad happens, so that they can go through the tapes and catch perps red-handed. But they are taping every single thing, and we don't take this as them breaching our privacy?

These tapes have been misused a number of times, causing niggling unpleasantries, and yet it's ok. There's a famous proverb in Hindi that translates to this: "You have to sacrifice one thing to get another." Here we sacrifice our privacy to get security. With self-driving cars grabbing all the attention, there goes more data to stay connected and, apparently, "secure."

Similarly, some companies check what their employees are up to and what they are doing on their computers while at work. This, from the company's perspective, is to avoid a possible breach of sensitive data, but is such constant monitoring even ethical? So, does it really have to be a tradeoff? Security for privacy, and vice versa?

Marvi Islam is from Islamabad, Pakistan and studies at Capital University of Science and Technology, Islamabad.

What does it mean to be human?

Guest Blog from: John Benedict

“… I’d like to share a revelation that I’ve had during my time here. It came to me when I tried to classify your species, and I realized that you’re not actually mammals. Every mammal on this planet instinctively develops a natural equilibrium with the surrounding environment; but you humans do not. You move to an area and you multiply, and multiply until every natural resource is consumed and the only way you can survive is to spread to another area. There is another organism on this planet that follows the same pattern. Do you know what it is? A virus. Human beings are a disease, cancer on this planet, you are a plague, and we…are the cure…”

Let’s hope it doesn’t come to that.

Eighteen years pass between the birth of a blind child and his graduation from high school. Eighteen years ago, there were no iPods, the USSR was a superpower, Japan looked to the United States for economic leadership and support, smoking was permitted on airplanes, there were no companies researching biotechnology, and only a handful of mobility and medical specialists taught in the nation's public schools.

In eighteen more years, today’s blind infants will graduate from a strikingly different world. What we teach these kids today will determine how well they survive in their future. We have to make educated guesses about that future (and keep guessing) to prepare them for success.

When a much earlier world changed from a hunting-and-gathering culture to an agricultural age, human relationships were redefined and concepts about space and time changed. The speed of life accelerated. Leadership shifted; old power structures were replaced by the newly empowered. Old definitions and institutions collapsed and new ones took their place.

The hunting-to-survive stage lasted a million years, the agricultural age another six thousand, and the industrial age three hundred. Some futurists defined an information age, and then declared it dead after forty years. The concept of a "job" was also invented by the industrial age. It pulled children off the farms and into the cities, where they had to adjust to new spatial and temporal rules. A job required an employee to be at a certain place for a set amount of time, doing repetitive tasks – to "work" at producing things that were not immediately relevant to the individual's life. In exchange for the loss of an agricultural lifestyle, employers gave steady wages (not affected by the weather or natural rhythms).

The industrial age saw the creation of vacations, health insurance, and sick days, all resulting from the invention of the job (a new way to work). This change was traumatic for a farm-based agricultural culture, and many resisted. Human beings were no longer "ruled" by their natural rhythms or by the seasons. Respect for the wisdom of the elders of the society declined as their power was bypassed; they no longer controlled the source of wealth, and their knowledge was irrelevant to the new age.

The rules are ever changing in this age of communication. The life cycle of a business is only seven years now. The cycle in technology is down to six months, and in the software business, if a company is to survive, it must bring new products to market within two or three months. There is hardly time to plan; certainly the present is of little help.

The amount of information in the world is doubling every eight years. One-half of everything a college student learned in his or her freshman year is obsolete by the time he or she graduates. The amount of knowledge we are asking a typical high school senior to learn is more information than his or her grandparents absorbed in a lifetime. Our decision load is growing. We are running too fast, making too many decisions too quickly about things we know too little about. How can all these grand ideas about individual web pages, global consciousness, and the coming of massively capable workstations ever be implemented when we hardly have time to eat? This is the major social question facing the beneficiaries of the communications age.

The question remains – with advancements in technology, do we have too little time for what is important and much more for what might not be? Are we missing out on morals and courtesies, relying too much on an online presence? We may be called social beings, but are we stepping away from human interaction? The answers to all of these are terrifying even to think about! It's time that we reclaim what we lost.

I finish this essay as I started – with a quote from the Matrix films.

“…Illusions, Mr. Anderson. Vagaries of perception. Temporary constructs of a feeble human intellect trying desperately to justify an existence that is without meaning or purpose. And all of them as artificial as the Matrix itself, although… only a human mind could invent something as insipid as love…”

The machines may be right, but our entire purpose is built on something as insipid as love.

John Benedict is from Hyderabad, India and works with Amazon, India.

Predictive Analytics – Rhinos, Elephants, Donkeys and Minority Report

The IEEE Computer Society published "Saving Rhinos with Predictive Analytics" in both IEEE Intelligent Systems and in the more widely distributed Computing Edge (a compendium of interesting papers taken from 13 of the CS publications, provided to members and technologists at no cost).  The article describes how data-driven analysis of both rhino and poacher activity, in concert with AI algorithms, can focus enforcement activities in terms of timing and location, and hopefully save rhinos.

For those outside of the U.S.: the largest populations of elephants (Republicans) and donkeys (Democrats) are in the U.S., these animals being symbols for the respective political parties. Now, on the brink of the 2016 presidential primaries, these critters are being aggressively hunted — ok, actually sought after for their votes.  Not surprisingly, the same tools are used to locate, identify, and predict the behaviour of these persons.  When I was young (1964) I read a book called The 480, which described the capabilities of that timeframe for computer-based political analysis and targeting of the "groups" required to win an election. (480 was the number of groupings of the 68 million voters in 1960, used to identify which groups a candidate needed to attract to win.)  21st-century analytics are a bit more sophisticated — with as many as 235 million groups, or one per potential voter (and over 130 million voters likely to vote). A recent kerfuffle between the Sanders and Clinton campaigns over "ownership/access" to voter records stored on a computer system operated by the Democratic National Committee reflects the importance of this data.  By cross-connecting (data mining) registered voter information with external sources such as web searches, credit card purchases, etc., the candidates can mine this data for cash (donations) and later votes.  A few-percentage-point change in delivering voters to the polls (both figuratively, and by providing rides where needed) in key states can affect the outcome. So knowing each individual is a significant benefit.
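The kind of per-voter scoring described above can be sketched in a few lines. Everything here is illustrative: the feature names, the weights, and the voters are invented, and a real campaign model would be fit from data rather than hand-set.

```python
import math

# Hypothetical per-voter records merged from registration rolls and
# external sources (purchases, web activity). Invented for illustration.
voters = [
    {"id": "v1", "age": 67, "voted_last_3": 3, "donated": 1},
    {"id": "v2", "age": 24, "voted_last_3": 0, "donated": 0},
    {"id": "v3", "age": 45, "voted_last_3": 2, "donated": 0},
]

def turnout_probability(v):
    """Toy logistic model: prior votes, donations, and age raise the score."""
    z = -2.0 + 1.2 * v["voted_last_3"] + 1.5 * v["donated"] + 0.02 * v["age"]
    return 1.0 / (1.0 + math.exp(-z))

# Rank voters so a campaign can target "get out the vote" resources.
ranked = sorted(voters, key=turnout_probability, reverse=True)
for v in ranked:
    print(v["id"], round(turnout_probability(v), 2))
```

With per-individual scores like these, "groups" collapse to groups of one, which is essentially the shift from The 480's 480 groupings to today's one-model-per-voter analytics.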

Predictive analytics is saving rhinos and affecting the leadership of superpowers. But wait, there's more.  Remember the movie "Minority Report" (2002). On the surface, the movie presents computer technology able to predict future crimes by specific individuals, who are then arrested to prevent the crimes.  (Spoiler alert) the movie actually reveals that a group of psychics is the real source of insight.  This is consistent with the original Philip K. Dick story (1956), written prior to The 480 and the emergence of the computer as a key predictive device.  Here's the catch: we don't need the psychics, just the data and the computers.  Just as a specific individual voting for a specific candidate, or a specific rhino getting poached in a specific territory, can be assigned a specific probability, we are reaching the point where aspects of the "Minority Report" predictions can be realized.

Oddly, in the U.S., governmental collection and use of this level of Big Data is difficult due to privacy illusions, and probably bureaucratic stovepipes and fiefdoms.  These problems do not exist in the private sector.  Widespread data collection on everybody at every opportunity is the norm, and the only limitation on sharing is determining the price.  The result is that your bank or insurance company is more likely than the government to be able to predict your likelihood of being a criminal, a terrorist, or even a victim of a crime.  Big Data superpowers like Google, Amazon, Facebook, and Acxiom have even more at their virtual fingertips.

Let's assume that sufficient data can be obtained, and robust AI techniques applied, to identify a specific individual with a high probability of a problematic event — initiating, or being the victim of, a crime in the next week.  And assume this data rests, implicitly or even explicitly, in the hands of some corporate entity.  Now what?  What actions should said corporation take? What probability is needed to trigger such actions? What liability exists (or should exist) for failure to take such actions?
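The policy question above, namely what probability should trigger action, can be made concrete with a toy example. The names, risk scores, and thresholds here are all invented; the point is only that the flagged set, and thus the corporation's exposure, depends entirely on where the threshold is drawn.

```python
# Invented predicted probabilities of a "problematic event" per person.
events = {"alice": 0.91, "bob": 0.55, "carol": 0.12}

def flagged(predictions, threshold):
    """Return individuals whose predicted risk meets the threshold."""
    return sorted(name for name, p in predictions.items() if p >= threshold)

# A stricter threshold flags fewer people but misses more true events;
# a looser one flags more people and produces more false positives.
print(flagged(events, 0.9))   # flags only the highest-risk individual
print(flagged(events, 0.5))   # flags two of the three
```

Every choice of threshold is simultaneously a technical parameter and a liability decision, which is exactly why the questions in the paragraph above have no purely technical answer.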

These are issues that the elephants and donkeys will need to consider over the next few years — we can't expect the rhinos to do the work for us.  We technologists may also have a significant part to play.

“Reality” Covers it Well

The cover for the Summer 2014 issue of Technology and Society (image by Eran Fowler) demonstrates that a picture can be worth at least a thousand words.  So, in effect, this is a guest blog entry implicitly from Eran Fowler, the creative artist involved. The piece is titled "Reality."

SSIT often touches on the issues associated with virtual reality, and the potential isolation of online connectivity compared with human connectivity. There is an irony in that the editorial for this issue is on lifelogging — folks who record their every activity, and in some cases post it online in real time. One can envision the "life log" of the individual in the cover image.  It is possible that he/she is living someone else's life-log.  I also note that there is no evident form of input device — our subject here is a passive receiver.  A letter to the editor in the issue from Jim Fifth, a prospective game developer and father (accompanied by a larger copy of this image), observes that, given the limited lifetime any individual has, what he "would be taking from these people isn't their money, but their time, their participation in reality, their relationships, hopes and dreams."

This image, like many in art, is a commentary.  If it were presented as an editorial, or as a technically-researched, peer-reviewed paper, there would be a dialog on the percentage of individuals in this category, or even outright refutation.  Art can lie.  That is something propagandists have known for centuries (I know St. George killed the dragon; I saw the picture). Images can have significant social impact, which is why governments censor some images and block photography or recording in various situations. If a simple photograph or image can have that impact, consider the potential of motion pictures, or virtual reality.  The issue of how video games or movies affect behaviour is a recurrent topic in academic and public discourse.

So look at this cover again. Is it a painful truth?  A good lie? Both? What action does it suggest? Is Jim Fifth's observation that we pay good money to toss away hours, days, or even years of our lives a social concern?  Or does it placate the masses, keeping them from questioning authority, and help them deal with unemployment and tolerate a declining quality of life?  Or is it an individual choice? Is the subject in this image "living for the moment," immersed in the "now," expending the only real currency they have, their time, in the way that seems best to them?

Culture vs Technology

Freedom of Speech vs the Right to be Forgotten…. Technology and Society, but whose society?  A recent European Court ruling held that Google (also Bing, Yahoo, etc.) might be required to remove links that are "accurate" but objectionable to the affected individual(s).  It is easy, in a world with a dominating culture (the U.S.A.), and particularly for technologists working in that culture (Google, et al.), to adopt and apply the values of that culture (free speech) without being aware of alternative cultural norms.

Apparently Europe, particularly Germany and France, has precedents suggesting that prior offences, actions, and public knowledge should become inaccessible in the present and future.  This is being considered as part of new E.U. privacy legislation, and not just a court finding.

It is easy (particularly for those of us in the USA) to hold up the sacred right of free speech (as written in the book of Constitution, verse 1:1) and ignore the concerns and abuses associated with it.  Some folks online are surprised that Facebook (or other) postings of their pictures/activities may result in their being expelled from college, fired, or failing to get a job. This "long tail" left by all of us in the exponentially growing web may contain many issues of concern.  For example, if I mention diabetes in this posting, might I lose health insurance? Or if someone with a very similar name is leading a quite different lifestyle, might I suffer some of the consequences?  And of course, if I advocate an issue, candidate, or religious affiliation, could I find that I am persecuted in the media, or worse, by police showing up at my door (consider the recent transitions in Egypt… oops, there I go)?

Consider one example: the widespread "sex offender" registration required by many US states.  This has been a topic of non-academic discussion (Dear Abby) recently, and it presents an interesting reference point. Note that an individual found guilty of molesting children many times and an eighteen-year-old after an indiscretion with a seventeen-year-old can be indistinguishable in this context.  The public "right to know" would seem to apply in one case, while the chance of recurrence seems unlikely in the other — yet both may lead to loss of job opportunities, shunning by neighbors, etc.

Facilitating the oppression of religious groups, political dissidents, or even the uninformed misuse of the failings of youth seems a good rationale for a "Right to be Forgotten."  At the same time, and almost in the same breath, we can hear the need to know a political candidate's racist remarks, the series of lawsuits brought against a used car dealer (the U.S. stereotype for a shady business), or perhaps that my fiancée has had three divorces in the last five years. (This is hypothetical!)  The "Right to be Forgotten" may also be countered with the "Right to Never Be Forgotten."  The Internet has created a global village — with all of the gossip and "everyone knows" implications of the spotlight of a small town.

This challenge is just beginning.  With face recognition, public web-cams, and many other sources of personal data being captured explicitly or implicitly, how we deal with the diversity of cultural norms is non-trivial.

What are the issues you see?

Kicking the Online Habit

The spring issue of Technology and Society (T&S) starts with an editorial addressing Internet addiction.  Perhaps the most disturbing example is the Sundance premiere of Love Child.  This documentary covers the death of a child in South Korea attributed to her parents' addiction to online gaming. They pled guilty, claiming addiction as part of their defense, which is an interesting situation if not a precedent.  In South Korea, drunkenness is a form of addiction that mitigates legal liability, which provided a basis for the couple's plea approach.  Apparently they also had little or no education on taking care of their premature baby. (One might wonder whether a more realistic video game environment, in which they were raising a virtual child, might have led to a different outcome.)

This captures the issue in a nutshell.  Video gaming can be an educational tool, but it may also result in problematic, or apparently even fatal, failures of responsibility.  The T&S editorial goes on to outline other countries and situations that reflect "Internet Addiction Disorder."  When you combine gaming with texting, email, web searches, smart-phone connectedness, and the increasing need to be online and/or have remote access for your job, our screen times are rapidly expanding.  Since 2009, the average screen time for U.S. adults has doubled.  Of course, some of this is folks using their cell phones while watching TV and using their PC, but it is still a significant change in the way we use our time.

How much time is being consumed by these "brain suckers"? (Curious that zombies have become a major horror show topic… perhaps there is more to this than we realize.)  To what extent are we losing essential aspects of society such as relationships, mindfulness, personal growth, productivity, etc.?

And significantly, what can we do about it?  Your thoughts?  (as you read this online….)

What you post may be used against you

The Jan. 9 Wall Street Journal points out that credit analysts are starting to use your Facebook, LinkedIn, and eBay activities to evaluate you.  For example, does your job history and status on these sites correspond with the one you submitted in an application?  What are buyers saying about you on eBay (assuming you are selling stuff there)?  In short, your "rep" (as in reputation) is being tracked as it spans social media.

Add to this the "75% of employers check your social media presence before pursuing an interview" (feedback from an HR friend of mine), universities that use your presence as part of their acceptance process (are you really sure you want those party pictures online?), and even schools that have expelled students for violations admitted on their social media sites.

Scott McNealy asserted, "You have zero privacy anyway. Get over it," and it appears the NSA may concur.  However, it is not clear this is a situation we should take lying down… anyone want to stand up?


T&S Magazine, Winter 2013

4 EDITORIAL For Now We See Through a Glass, Darkly Katina Michael

6 BOOK REVIEW Digital Whoness: Identity, Privacy and Freedom in the Cyberworld


9 LEADING EDGE Automatic Quality Management in Crowdsourcing Daniel Schall

14 LEADING EDGE Transparency-Driven Business Process Management in Healthcare Settings Mathias Kirchmer, Sigifredo Laengle, and Victor Masías

17 OPINION The Future of Outdoor Media Systems Ron Harwood

19 OPINION Promoting Psychological Wellbeing: Loftier Goals for New Technologies Rafael A. Calvo and Dorian Peters

22 OPINION Open Prosperity: Breaking Down Financial and Educational Barriers to Creating Physical Goods Stephen Fox

25 COMMENTARY Understanding the Human Machine* Deborah Lupton

SPECIAL SECTION ON SENSORS Katherine Albrecht and Katina Michael

31 GUEST EDITORIAL Connected: To Everyone and Everything Katherine Albrecht and Katina Michael  

SPECIAL SECTION FEATURES

35 Asynchronous Adaptations to Complex Social Interactions* Sally Applin and Michael Fischer

45 Comparing British and Japanese Perceptions of a Wearable Ubiquitous Monitoring Device* Stuart Moran, Toyoaki Nishida, and Keiichi Nakata

50 Public Open Sensor Data: Revolutionizing Smart Cities* Albert Domingo, Boris Bellalta, Manuel Palacin, Miquel Oliver, and Esteve Almirall

57 Public Domain Treaty Compliance Verification in the Digital Age* Christopher W. Stubbs and Sidney D. Drell


*Refereed articles

Cover Image: Fabio Lima.

NSA and the tip of the Iceberg

So how can we touch on the social implications of technology and not address the news related to the U.S. NSA?  First, let me welcome my colleagues at the NSA to our blog.  This follows a tradition we started when I was in a multinational group at Digital Equipment, when we welcomed our NSA colleagues to our discussions.  Those discussions were international; SSIT is international; this blog is public and, I hope, international as well. So at least one country has folks "monitoring" our discussion.

Monitoring is a curious word, and one where technology plays a big role. I have a brother-in-law who flew a parallel path to U-2 planes that were picking up signals from inside "enemy" areas, passing them to his plane "outside" enemy territory, where he did real-time translation to see if anything interesting was happening.  This is an expensive proposition. While technology has massively expanded the amount of communications, it has also provided tools to capture information about them, store it, and analyse it.  In today's world it is possible to scan discussions in multiple languages, text or voice, and watch for key words.  These can then be used for deeper analysis and, if warranted, to engage humans in even more detailed evaluation.
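The keyword-triage step described above can be sketched very simply. The watch-list terms and messages here are invented; a real system would handle multiple languages, speech-to-text, and far subtler matching before anything reached a human analyst.

```python
# Invented watch-list terms; a real list would be classified and vast.
WATCHLIST = {"blueprint", "detonator", "rendezvous"}

def needs_review(message):
    """Flag a message for deeper analysis if any watch-list term appears."""
    words = set(message.lower().split())
    return bool(words & WATCHLIST)

# A toy stream of intercepted messages (also invented).
stream = [
    "picking up groceries at noon",
    "the detonator blueprint is ready",
]
flagged_msgs = [m for m in stream if needs_review(m)]
print(flagged_msgs)
```

The cheap automated pass does the expensive human translator's old job of deciding what is worth a closer look, which is exactly the economic shift the paragraph above describes.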

A second form of analysis is based on "traffic."  A classic example is trying to decode the "Coke" formula by logging the delivery of various components to the Coke factory.  In theory, if you could log the amounts of each ingredient going in, you would eventually be able to figure out the recipe.  A similar approach can be used to track, say, phone calls.  Without knowing the actual "content," if I know the calling number, the number called, the time, and the duration, I can start to build traffic models.  Ultimately this depends on knowing something about some numbers, such as that a given source is suspect.  Then building a model of the network connecting to and from that number can reveal patterns.  These can be confirmed by playing back history records now that you have a confirmed "interesting" number.  A concept similar to the way IBM taught Watson and Deep Blue to play human games can be used to evaluate bad-guy games.  Once you sort out the patterns that differentiate a pot-luck dinner, a pot dealer, and a terrorist event, you can automate the search and focus on the interactions of interest.
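The traffic-analysis idea can be illustrated with a minimal sketch: given only call metadata (who called whom, no content at all), build a contact graph and walk outward from a number already flagged as "interesting." All phone numbers here are invented.

```python
from collections import defaultdict

# Call metadata only: (caller, callee) pairs. No content is needed.
calls = [
    ("555-0001", "555-0002"),  # suspect number calls a contact
    ("555-0002", "555-0003"),
    ("555-0004", "555-0005"),  # an unrelated cluster
]

# Build an undirected contact graph from the metadata.
contacts = defaultdict(set)
for src, dst in calls:
    contacts[src].add(dst)
    contacts[dst].add(src)

def network_of(number, hops):
    """Return every number reachable from `number` within `hops` calls."""
    seen, frontier = {number}, {number}
    for _ in range(hops):
        frontier = {n for f in frontier for n in contacts[f]} - seen
        seen |= frontier
    return seen - {number}

print(sorted(network_of("555-0001", 2)))
```

Note that the unrelated cluster never appears in the suspect's network no matter how many hops are taken, which is the whole point: metadata alone partitions the population into communities of interest.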

What technology does is significantly expand the ability to analyze and correlate information. Storing information about billions of phone calls, emails, tweets, blogs (hi again), social media posts, etc. has become possible.  Similarly, it is now possible to cross-connect those streams — your tweets to your phone calls, and so on — with more sophisticated models emerging from more complex data.  AI techniques, such as genetic algorithms, should be able to surface results with methods that humans might not even understand. (Patents for genetic-algorithm creations have been filed.)

So we should not be surprised at the increased level of tracking, correlation, and potential intervention resulting from our on-line footprints. The same tools that make Google search work, or NetFlix recommendations relevant can be, and probably are being applied to monitoring our web trails for government “security” purposes; and not just by NSA.

There is an assumption that no weapon has been created that has not (eventually) been used in warfare.  Similarly, it is quite possible that technology developed for good purposes eventually gets used for bad ones.  If the NSA (and its peers) are currently using the information they collect only for improved security, might they start using it tomorrow to select the targets for the next pogrom?

There is a real dilemma of technology here. An algorithm or device created for improved search, or for generating bitcoins, may find applications by organizations like the NSA.  And the direction given such an organization in today's world may change tomorrow as the winds of politics blow.  There will be unintended, and probably undesirable, consequences.

What concerns do you see from the broader picture?