Culture vs Technology

Freedom of Speech vs the Right to be Forgotten …. Technology and Society, but whose society?  A recent European Court ruling held that Google (also Bing, Yahoo, etc.) may be required to remove links to information that is “accurate” but objectionable to the affected individual(s).  It is easy in a world with a dominating culture (U.S.A.), and particularly for technologists working in that culture (Google, et al.), to adopt and apply the values of that culture (free speech) without being aware of alternative cultural norms.

Apparently some European countries, particularly Germany and France, have precedents suggesting that prior offences, actions, and public knowledge should become inaccessible in the present and future.  This is being considered as part of new E.U. privacy legislation, not just a court finding.

It is easy (particularly for those of us in the U.S.A.) to hold up the sacred right of free speech (as written in the book of Constitution, verse 1:1) and ignore the concerns and abuses associated with it.  Some folks online are surprised that Facebook (or other) postings of their pictures and activities may result in their being expelled from college, fired, or failing to get a job. This “long tail” left by all of us in the exponentially growing web may contain many issues of concern.  For example, if I mention diabetes in this posting, might I lose health insurance? Or if someone with a very similar name is leading a quite different lifestyle, might I suffer some of the consequences?  And of course, if I advocate an issue, a candidate, or a religious affiliation, could I find that I am persecuted in the media, or worse, by police showing up at my door (consider the recent transitions in Egypt… oops, there I go)?

Consider one example: the widespread “sex offender” registration required by many U.S. states.  This has been a topic of non-academic discussion (Dear Abby) recently, and it presents an interesting reference point. Note that an individual found guilty of molesting children many times and an eighteen-year-old guilty of an indiscretion with a seventeen-year-old can be indistinguishable in this context.  The public “right to know” would seem to apply in one case, while the chance of recurrence seems unlikely in the other, yet both may lead to loss of job opportunities, shunning by neighbors, etc.

Facilitating the oppression of religious groups or political dissidents, or even the uninformed misuse of the failings of youth, seems a good rationale for a “Right to be Forgotten”.  At the same time, and almost in the same breath, we can hear the need to know about a political candidate’s racist remarks, the series of lawsuits brought against a used car dealer (the U.S. stereotype for a shady business), or perhaps that my fiancée has had three divorces in the last five years. (This is hypothetical!)  The “Right to be Forgotten” may also be countered with the “Right to Never Be Forgotten”.  The Internet has created a global village, with all of the gossip and “everyone knows” implications of the spotlight of a small town.

This challenge is just beginning.  With face recognition, public web-cams, and many other sources of personal data being captured explicitly or implicitly, how we deal with the diversity of cultural norms is non-trivial.

What are the issues you see?

Kicking the Online Habit

The spring issue of Technology and Society (T&S) opens with an editorial addressing Internet addiction.  Perhaps the most disturbing example is the Sundance premiere of Love Child.  This documentary covers the death of a child in South Korea attributed to her parents’ addiction to online gaming. They pled guilty, claiming addiction as part of their defense, which is an interesting situation if not a precedent.  In South Korea, drunkenness is a form of addiction that mitigates legal liability, which provided a basis for the couple’s plea approach.  Apparently they also had little or no education in caring for their premature baby. (One might wonder whether a more realistic video game environment, given that they were raising a virtual child in the game, might have led to a different outcome.)

This captures the issue in a nutshell.  Video gaming can be an educational tool, but it may also result in problematic, or apparently even fatal, failures of responsibility.  The T&S editorial goes on to outline other countries and situations that reflect “Internet Addiction Disorder.”  When you combine gaming with texting, email, web searches, smart-phone connectedness, and the increasing need to be online and/or have remote access for your job, our screen time is rapidly expanding.  Since 2009, the average screen time for U.S. adults has doubled.  Of course some of this is folks using their cell phones while watching TV and using their PC, but it is still a significant change in the way we use our time.

How much time is being consumed by these “brain suckers”? (Curious that zombies have become a major horror show topic … perhaps there is more to this than we realize.)  To what extent are we losing essential aspects of society such as relationships, mindfulness, personal growth, and productivity?

And significantly, what can we do about it?  Your thoughts?  (as you read this online….)

T&S Magazine, Winter 2013

4 EDITORIAL For Now We See Through a Glass, Darkly Katina Michael

6 BOOK REVIEW Digital Whoness: Identity, Privacy and Freedom in the Cyberworld


9 LEADING EDGE Automatic Quality Management in Crowdsourcing Daniel Schall

14 LEADING EDGE Transparency-Driven Business Process Management in Healthcare Settings Mathias Kirchmer, Sigifredo Laengle, and Victor Masías

17 OPINION The Future of Outdoor Media Systems Ron Harwood

19 OPINION Promoting Psychological Wellbeing: Loftier Goals for New Technologies Rafael A. Calvo and Dorian Peters

22 OPINION Open Prosperity: Breaking Down Financial and Educational Barriers to Creating Physical Goods Stephen Fox

25 COMMENTARY Understanding the Human Machine* Deborah Lupton

SPECIAL SECTION ON SENSORS Katherine Albrecht and Katina Michael

31 GUEST EDITORIAL Connected: To Everyone and Everything Katherine Albrecht and Katina Michael  

SPECIAL SECTION FEATURES

35 Asynchronous Adaptations to Complex Social Interactions* Sally Applin and Michael Fischer

45 Comparing British and Japanese Perceptions of a Wearable Ubiquitous Monitoring Device* Stuart Moran, Toyoaki Nishida, and Keiichi Nakata

50 Public Open Sensor Data: Revolutionizing Smart Cities* Albert Domingo, Boris Bellalta, Manuel Palacin, Miquel Oliver, and Esteve Almirall

57 Public Domain Treaty Compliance Verification in the Digital Age* Christopher W. Stubbs and Sidney D. Drell


*Refereed articles. Cover image: Fabio Lima.

Meme Propagation

Deb Roy, in his TED presentation “The Birth of a Word,” gives us a glimpse of a technology with potentially high impact.  His primary talk discusses how, over five years, with 18/7 audio/video recording from every room in his house, his MIT team was able to trace his son’s word acquisition.  I will let you contemplate the pros and cons of having 100% of your household activities recorded for posterity.

However, his team then applied the same software, which had captured every use of specific words by his son and connected these with every word spoken by household members in his proximity, to analyse other interactions.  One source was the feed from every major television network; the second was tracking the emerging phrases from these sources through the blogosphere/twitterverse.  The result is the ability to obtain near-real-time measurement of the impact that a given source is currently having on the population at large.
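The core of such phrase tracking can be sketched in a few lines: bucket timestamped mentions of a phrase into time windows after it airs, and watch the social echo rise and decay. This is a simplified illustration, not Roy’s or Bluefin’s actual pipeline; the data and the ten-minute window size are hypothetical.

```python
from collections import Counter
from datetime import datetime, timedelta

def mentions_per_window(timestamps, start, window_minutes=10):
    """Bucket timestamped mentions of a phrase into fixed-size windows
    following a broadcast at `start`."""
    counts = Counter()
    for ts in timestamps:
        bucket = int((ts - start) / timedelta(minutes=window_minutes))
        if bucket >= 0:
            counts[bucket] += 1
    return counts

# Hypothetical data: a phrase airs on a broadcast at 21:00, then echoes
# through social media over the following half hour.
start = datetime(2014, 1, 28, 21, 0)
social_mentions = [start + timedelta(minutes=m) for m in (2, 3, 4, 11, 12, 25)]
echo = mentions_per_window(social_mentions, start)
# echo per 10-minute window: {0: 3, 1: 2, 2: 1}, a decaying echo
```

Comparing such echo curves across sources is what yields the near-real-time impact measurement described above.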

A popular TV show may trigger social media flow, with a positive feedback loop bringing more viewers into the show.  The proliferating comments may provide analysis of what works best in the show, what is not working, and where viewers want the story to go.  One can envision a program whose direction is improvised based on viewer responses measured in the Twitterverse.

A second example was President Obama’s State of the Union address. This showed much broader distribution than any single TV show, with massive response and interaction in the Twitterverse.  One can envision real-time AI analysis (Deb Roy has been working with Bluefin Labs, which does this commercially) being used to critique a political speech or event.  In the extreme, a presenter might get coaching feedback from real-time evaluation, altering the presentation spinning up on the teleprompter.

Consider a political debate where candidates receive real-time talking points based on analysis of the blogosphere, and alter their apparent positions accordingly.  The good news is that it would require some fairly smart candidates to pull that off, or perhaps ones with nothing in their heads except the words being fed to them anyway.  But now the kicker: activate a zombie army of previously compromised devices, all with Twitter accounts, and you can redirect the discussion.  The conversation migrates from starvation in Iran to education in Latvia, with both candidates arguing the strategic value of Latvia and how their proposals will yield the best educational outcomes.  The only indication that the discussion has been hijacked is that neither candidate knows where Latvia is. But the good news is that they are supporting education ….. somewhere.

Don’t Like This

Hopefully most folks have seen news of the paper in the Proceedings of the National Academy of Sciences showing how analysis of what you “Like” on Facebook can be used to infer aspects of your personality.  The gist is that your “Likes” form a profile that can be associated with other aspects of your life (religious affiliation, sexual preference, drug use) with a certain probability.  Presumably other visible indicators of your preferences could be used in similar ways: who you follow on Twitter, topics in your blog (or on Twitter), etc.  Combining several of these methods is likely to increase the probability that a given “assertion” is accurate.
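As a simplified illustration of why combining signals strengthens an inference (this is not the PNAS paper’s actual model, which used dimensionality reduction and regression over Likes), independent probability estimates for the same attribute can be fused by summing their log-odds, naive-Bayes style. The three signal sources named in the comment are assumptions for the sake of the example.

```python
import math

def combine_probabilities(probs):
    """Fuse independent probability estimates for the same attribute by
    summing their log-odds (a naive-Bayes-style combination)."""
    log_odds = sum(math.log(p / (1 - p)) for p in probs)
    return 1 / (1 + math.exp(-log_odds))

# Three weak signals (say Likes, Twitter follows, blog topics), each only
# 60% confident on its own, compound into a much stronger inference:
combined = combine_probabilities([0.6, 0.6, 0.6])  # about 0.77
```

The independence assumption rarely holds in practice (your Likes and follows are correlated), so real systems would do better than this sketch, which is precisely the concern.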

Some of these things may be “don’t care” for you, but others could be problematic.  With the tendency of employers, schools, and others to evaluate your web presence as part of their interview and screening processes, this becomes another subtle channel for discrimination.  Of course the results can yield “false positives,” but you are unlikely to know about the uses and abuses of such evaluations, and as such you won’t have any way to counter the conclusions.

The automated evaluation of many aspects of our digital footprints is something we need to constantly revisit.  “Traffic analysis” is a concept that has been applied to determine communications patterns, but also to find the recipe for trade-secret foods (three tankers of corn syrup, one of vanilla, etc.).  Watching where you go on the web is something your employer, ISP, search engine, and others can do (it is a key aspect of Facebook’s “Like” button and other such widgets; you don’t need to actually click “Like” for Facebook to record your visit to that site).  Some aspects of this are visible only to the data collector (Facebook) and its partners, but publicly accessible indicators such as the “Likes” on your Facebook page are open to analysis by any interested party.  No doubt we will see emerging services that will do this on request, sort out groups of interest for targeted advertising, or serve other uses.

As with many privacy issues, it is the aggregation of data points that starts to reveal details we might have assumed were at least obfuscated, if not private.
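A minimal sketch of such aggregation, using invented records: neither dataset alone names an individual together with a sensitive detail, but joining them on shared quasi-identifiers (zip code and birth year here) can uniquely re-identify a person. All names, zip codes, and conditions below are fabricated for illustration.

```python
# A public record (e.g., a voter roll) with names but nothing sensitive,
# and an "anonymized" dataset with sensitive details but no names.
public_roll = [
    {"name": "A. Jones", "zip": "03060", "birth_year": 1970},
    {"name": "B. Smith", "zip": "03060", "birth_year": 1985},
]
anonymized_health = [
    {"zip": "03060", "birth_year": 1985, "condition": "diabetes"},
]

def link(public, anonymized, keys=("zip", "birth_year")):
    """Return (name, condition) pairs where the quasi-identifiers match
    exactly one anonymized record, i.e., re-identification succeeds."""
    matches = []
    for person in public:
        hits = [a for a in anonymized if all(a[k] == person[k] for k in keys)]
        if len(hits) == 1:
            matches.append((person["name"], hits[0]["condition"]))
    return matches

linked = link(public_roll, anonymized_health)  # [("B. Smith", "diabetes")]
```

Each dataset’s publisher could honestly claim to have released nothing private; the revelation only emerges from the join.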