Technology In the Classroom?

The Wall Street Journal has a pros/cons article on this question, which is at the core of the social impact of technology in education.

My son-in-law teaches a university class where students get the “lecture” portion online and come into class to work on projects and homework. My granddaughter has online assignments regularly, many key tests are done online, and they don’t get ‘snow days’: in case of inclement weather they stay home and log in. Programs like Khan Academy, and a number of universities, offer courses that are free to “audit”.

At the same time, kids need real-world collaboration and social experience: ideally free of bullying, and with sufficiently strong, positive peer groups that help them develop skills grounded in the real world.

What are the key references you find informative on the question of how we educate the next generation?

Emoti-Cons

I’m not talking about little smiley faces :^( … but about how automation can evaluate your emotions and, as is the trend of this blog, how that information may be abused.

Your image is rather public: from your Facebook page, to the pictures posted from that wedding you attended, to the myriad of cameras capturing data in every store, street corner, and ATM. And, as you (should) know, facial recognition is already there to connect your name to that face. Your image can also be used to evaluate your emotions automatically, with tools described in a recent Wall Street Journal article (The Technology That Unmasks Your Hidden Emotions). These tools can be used in real time as well as on static images.

So, wandering through the store, those cameras may not just be picking up shoplifters, but lifting shopper responses to displays, products, and other aspects of the store. Having identified you (via facial recognition, or the RFID constellation you carry), the store can correlate your personal response to specific items. The next email you get may promote something you liked when you were at the store, or reflect a near-real-time evaluation of what ‘persons like you’ seem to like.
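
To make the mechanics concrete, here is a minimal sketch (in Python) of the kind of aggregation a store could run once it has per-frame identity and emotion estimates. The `recognize_shopper` and `score_emotion` functions, the shopper IDs, and the display names are all hypothetical placeholders for illustration, not any vendor's actual API.

```python
# Illustrative sketch only: the face-matching and emotion-scoring steps are
# stand-ins for whatever commercial system a store might deploy.
from collections import defaultdict
from statistics import mean

def recognize_shopper(frame):
    """Placeholder for a facial-recognition lookup; returns a shopper ID."""
    return frame["shopper_id"]      # hypothetical: a real system would match a face database

def score_emotion(frame):
    """Placeholder for an emotion classifier; returns a -1.0..1.0 valence score."""
    return frame["valence"]         # hypothetical: a real system would analyse the image

def aggregate_responses(camera_frames):
    """Group per-shopper emotion scores by the display they were near."""
    responses = defaultdict(list)
    for frame in camera_frames:
        shopper = recognize_shopper(frame)
        responses[(shopper, frame["display"])].append(score_emotion(frame))
    # Average the scores so marketing can rank displays per identified shopper.
    return {key: mean(scores) for key, scores in responses.items()}

if __name__ == "__main__":
    frames = [
        {"shopper_id": "s-042", "display": "end-cap cereal", "valence": 0.7},
        {"shopper_id": "s-042", "display": "end-cap cereal", "valence": 0.4},
        {"shopper_id": "s-042", "display": "checkout candy", "valence": -0.2},
    ]
    print(aggregate_responses(frames))   # {('s-042', 'end-cap cereal'): 0.55, ...}
```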

The same type of analysis can be applied to your responses in a political context: candidate preferences, messages that seem to be effective, and so on. Note that this is no longer the ‘applause-meter’ model for judging how the audience responds, but is personalized to you, as a face-recognized person observing that event. With cameras mounted on political posters and billboards capturing images through front windshields, it may be possible to collect this data on a very wide basis, not just for those who choose to attend an event.

Another use of real-time emotional tracking could play out in situations such as interviews, interrogations, and sales showrooms. The person conducting the interaction may be getting feedback from automated analysis that informs the direction they lead it. The result might be a job offer, an arrest warrant, or a focused sales pitch, depending on the case.

The body language of lying is also being decoded. Presumably a next step here is automated analysis of your interactions. For those of us who never, ever lie, that may not be a problem. And of course, being a resident of New Hampshire, where the 2016 presidential season has officially opened, it would be nice to have some of these tools in the hands of the citizens as we seek to narrow down the field of candidates.

 

Privacy Matters

Alessandro Acquisti’s TED talk, Why Privacy Matters, lays out some key privacy issues and revealing research into what is possible with online data. In one project, his team was able to identify students via face recognition in the few minutes it took them to fill out a survey, and potentially locate their Facebook pages using that match. In a second project, they were able to deduce persons’ social security numbers (a key U.S. personal identifier) from their Facebook page data. This opens the possibility that any image of you can lead both to identifying you and to locating significant private information about you.
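
The re-identification step boils down to nearest-neighbour matching of face “embeddings” against a gallery scraped from public profiles. The sketch below is illustrative only; the embeddings, profile names, and threshold are invented, and a real system would generate the vectors with a trained neural network.

```python
# Minimal sketch of the re-identification step: compare a face "embedding"
# from a snapshot against embeddings scraped from public profile photos.
# All vectors and names here are made up for illustration.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, gallery, threshold=0.8):
    """Return the public profile whose embedding is closest to the probe."""
    scored = [(name, cosine_similarity(probe, vec)) for name, vec in gallery.items()]
    name, score = max(scored, key=lambda item: item[1])
    return (name, score) if score >= threshold else (None, score)

if __name__ == "__main__":
    gallery = {                        # hypothetical scraped profile photos -> embeddings
        "alice.example": np.array([0.9, 0.1, 0.3]),
        "bob.example":   np.array([0.2, 0.8, 0.5]),
    }
    probe = np.array([0.88, 0.15, 0.28])   # embedding from the survey-time snapshot
    print(best_match(probe, gallery))       # ('alice.example', ~0.998)
```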

There is a parallel discussion sponsored by the IEEE Standards Association on “The Right to be Forgotten”. This was triggered by a recent European court case in which an individual did not want information about his past to be discoverable via search engines. These two concepts collide when an individual seeking to be “forgotten” has their image captured by any of a range of sources (store cameras, friends posting photos, even just being “in the picture” that someone else is taking). If that image can be translated into diverse personal information, then even the efforts of the search engine providers to block the searches will be futile.

Alessandro identifies some tools that can help: the Electronic Frontier Foundation’s anonymous internet portal, and Pretty Good Privacy (PGP), which can deliver a level of encryption that is very expensive to crack, with variations being adopted by Google, Yahoo, and maybe even Apple to protect the content of their devices and email exchanges. There are issues with the PGP model, and perhaps some better approaches. There is also government push-back against encryption that is too strong, which is perhaps one of the best endorsements of the capability of such systems.
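
For readers curious what “expensive to crack” looks like in practice, here is a minimal sketch of the hybrid approach that PGP-style tools use: a fast symmetric key protects the message body, and the recipient’s public key protects only that small session key. It uses the third-party Python `cryptography` package and is a toy illustration of the concept, not PGP itself or any product’s actual implementation.

```python
# Sketch of the hybrid scheme PGP-style tools use (not PGP itself).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Recipient's key pair (in practice, generated once and the public half shared).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# 1. A symmetric session key encrypts the actual message body.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"Meet at the usual place.")

# 2. The public key encrypts only the small session key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# The recipient reverses the steps with the private key.
recovered_key = private_key.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext))   # b'Meet at the usual place.'
```

The hybrid split is what lets public-key systems handle messages of arbitrary size at reasonable cost, since the slow asymmetric step only ever touches a short key.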

Behind all this is the real question of how seriously we choose to protect our privacy. It is a concept given greater consideration in Europe than in the U.S., perhaps because Europe’s deeper history has shown that abuse by governments or other entities can be horrific, an experience that has not engaged the “Youth of America”, nor discouraged the advertising- and commerce-driven culture that dominates the Internet.

Alessandro observes that an informed public that understands the potential issues is a critical step towards developing policy, tools and the discipline needed to climb back up this slippery slope.

 

Too Close for Comfort? Detecting your presence.

A group of authors in the August 2014 issue of IEEE Computer outlines the pros, cons, and examples of proximity-sensing technology that initiates advertising or other actions and may report your presence to some data-collection process. The article is called The Dark Patterns of Proxemic Sensing.

There are simple examples most folks have encountered: the faucet that turns on when you put your hands near it, followed by the automated hand dryer or paper-towel dispenser. The paper identifies some current examples that many of us may not have encountered: the mirror that presents advertising, a wall of virtual “paparazzi” that flash cameras at you accompanied by cheering sounds, and urinals that incorporate video gaming. Some of these systems are networked, even connected to the internet. Some interact anonymously; others are at least capable of face or other forms of recognition.

The article identifies eight “dark” aspects of this proximity interaction:

  1. Captive Audience – a concern about unexpected or undesired interactions in places the individual must go for other reasons.
  2. Attention Grabbing – detection and interaction allow these systems to distract the target individual, which may be problematic or just annoying.
  3. Bait and Switch – initiating interaction with an attractive first impression, then switching to a quite different agenda.
  4. Making personal information public — for example, displaying or announcing your name upon recognition.
  5. We never forget – tracking an individual from one encounter to the next, even spanning locations for networked systems.
  6. Disguised data collection – providing (personalized) data back to some central aggregation.
  7. Unintended relationships – is that person next to you related in some way — oh, there she is again next to you at a different venue…
  8. Milk factor – forcing a person to go through a specific interaction (move to a location, provide information …) to obtain the promised service.

Most of these are traditional marketing/advertising concepts, now made more powerful by automation and ubiquitous networked systems. The specific emerging technologies are one potentially disturbing area of social impact. A second is the more general observation that activities we have historically considered innocuous or even desirable may become more problematic with automation and de-personalization. The store clerk might know you by name, but do you feel the same way when the cash register or the automatic door knows you?
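
As a minimal sketch of how two of the listed patterns (“we never forget” and “disguised data collection”) could be wired together once sensors are networked, consider the toy tracker below. Every identifier, venue, and class name here is invented for illustration; it is not taken from the article or any real product.

```python
# Sketch of how a networked proximity display could quietly accumulate a
# visit history across venues. All IDs and venues are hypothetical.
import time
from collections import defaultdict

class PresenceTracker:
    def __init__(self):
        self.history = defaultdict(list)   # person_id -> [(venue, timestamp), ...]

    def on_detection(self, person_id, venue):
        """Called whenever a sensor recognizes a face or an RFID tag."""
        self.history[person_id].append((venue, time.time()))

    def seen_elsewhere(self, person_id, venue):
        """Has this person been detected at any other networked venue?"""
        return any(v != venue for v, _ in self.history[person_id])

if __name__ == "__main__":
    tracker = PresenceTracker()
    tracker.on_detection("face-7731", "mall-entrance")
    tracker.on_detection("face-7731", "airport-kiosk")
    print(tracker.seen_elsewhere("face-7731", "airport-kiosk"))   # True
```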

Issues in this area are also discussed in the Summer 2014 issue of Technology and Society; Omnipresent Cameras and Personal Safety Devices are relevant articles in that issue.

The Technological Implications of Society

Sometimes you just need to look at it the other way. SSIT (our host and benefactor for this blog) is the Society on Social Implications of Technology. A recent issue of IEEE’s Computer magazine focused on Gender Diversity in Computing. In the introduction, Jane Prey (NSF) and Alf Weaver (U. Va.) assert that “a lack of diverse perspectives will inhibit innovation, productivity, and competitiveness.” This certainly matches what little research I have done, which indicates that diverse perspectives are a catalyst for innovation and for solving problems. In short, we need to consider the impact that society has on technology.

The types of people we employ to invent and develop technology, the funding sources we have for research, the mass-media fad du jour, the winds of politics, and even the dolls our children play with all set the stage. Consider how these factors change as you move from the U.S. to Germany to Russia to India to China to Brazil, and so on. Each culture, government, and set of parental and social expectations influences what happens in education, and from there in research and industry. Some of these environments highly value engineering; others lean towards sports or movie stars. Some make significant investments in military technology, others in educational systems, and others in infrastructure. Perhaps we get the technology we ask for, or, worse, the technology we deserve.

I participated in a “Congressional Visits Day” this last month. Two hundred or so scientists, engineers, and the like went off to visit our elected representatives in Washington, DC. Our request was simple: fund basic R&D or else … we were gentle with the “or else”, but think about it. Laser R&D took place in the 1960s; the applications emerged in the 1980s. The Human Genome Project ran from 1990 to 2003, and the (GM) flowers are starting to bloom all over. The long lead time between research and impact is legendary. But when a society doesn’t fund research or technology, it relinquishes the leading edge to others. No doubt the Brazilians (et al.) will be willing to share with the U.S. the products of their investments and innovations, but the culture of innovation will thrive where it is fostered. And, in this scenario, it will lack the insight that would come from having U.S. participants in the creative mix.

And the reverse of this is the value each of the diverse cultures above (and those not mentioned) brings to the table when we engage the full spectrum of human experience rather than the perspective found in the common cubicle mono-culture. What side of the road should your intelligent car use? Can your input device handle thousands of characters? Can your voice recognition detect intonations or clicks? And don’t limit this to cultural variations. Is your display readable by red-green color-blind individuals? How does a person with limited vision traverse your web site, your building, your city? Does your audio information system serve the needs of hearing-impaired individuals? Diversity comes in all shapes, sizes, colors, abilities, languages, religions, prejudices, and even genders.

Society has real impact on technology.  Diversity is one tool for helping to assure this impact is as beneficial as it can be.

Wait, 3-D Printing Organs?

Several years ago, a relative of mine needed a liver transplant. While she did ultimately receive one (she was one of the lucky ones), the wait was far too long. Sadly, she never fully recovered after the transplant and died a few months later. I thought then (and still think) that if she had been able to get a transplant when she needed it, she might have lived. There is an organ shortage worldwide, but it is particularly acute in the United States, because our medical system makes organ donation an ‘opt-in’ decision that ill and injured people’s families aren’t prepared to deal with. Social scientists and ethicists are still working on that problem, but maybe someday we won’t need donated organs; we’ll be able to 3-D print them.

This is a very, very early step towards someday being able to manufacture tissues and organs. Researchers are just figuring out how to do this. But the prospect of being able to manufacture tissue in this way is a huge leap forward. Of course, no one is going to be printing off a heart in the next year. But the potential benefits here are enormous. Skin grafts for burn victims might be one early application, but kidneys and livers are the real prize. Organ trading is illegal in nearly every country on earth, but it’s virtually impossible to prevent very wealthy people from paying people, often uneducated people in third-world countries, to ‘donate’ a kidney. I’m not saying that this will democratize access to healthcare, but I do think that this is an area where natural scarcity leads to vastly, vastly unequal access and outcomes. ANYTHING that can move the needle is welcome, and long overdue.

Don’t Like This

Hopefully most folks have seen news of the paper in the Proceedings of the National Academy of Sciences that shows how analysis of what you “Like” on Facebook can be used to infer aspects of your personality. The gist is that your “Likes” form a profile that can be associated, with a certain probability, with other aspects of your life: religious affiliation, sexual preference, drug use, and so on. Presumably other visible expressions of your preferences could be used in similar ways: who you follow on Twitter, topics in your blog (or on Twitter), etc. Combining several of these methods is likely to increase the probability that a given “assertion” is accurate.
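
Mechanically, this kind of inference is just supervised learning over a binary user-by-page “Like” matrix. The toy sketch below uses scikit-learn’s logistic regression on synthetic data to show the shape of the computation; it is not the PNAS authors’ model, and the pages, users, and trait labels are made up.

```python
# Illustrative only: a toy trait predictor over a binary "Liked / not Liked"
# matrix, in the spirit of (but much simpler than) the PNAS study.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["page_a", "page_b", "page_c", "page_d"]

# Rows: users; columns: whether they Liked each page (synthetic data).
likes = np.array([
    [1, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 0],
    [0, 1, 1, 1],
])
trait = np.array([1, 1, 0, 0, 1, 0])   # some private attribute, known for the training users

model = LogisticRegression().fit(likes, trait)

new_user = np.array([[1, 0, 1, 1]])    # only this person's public Likes are observed
print(model.predict_proba(new_user)[0, 1])   # estimated probability they have the trait
```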

Some of these things may be “don’t care” for you, but others could be problematic. With the tendency of employers, schools, and others to evaluate your web presence as part of their interview and other processes, this becomes another subtle channel for discrimination. Of course the results can yield “false positives”, but it is unlikely that you will know about the uses or abuses of such evaluations, so you will have no way to counter the conclusions.

The automated evaluation of many aspects of our digital footprints is something we need to constantly revisit. “Traffic analysis” is a concept that has been applied to determine communications patterns, and even to work out the recipes of trade-secret foods (three tankers of corn syrup, one of vanilla, etc.). Watching where you go on the web is something your employer, ISP, search engine, and others can do (it is a key aspect of the Facebook “Like” button and other such widgets, and you don’t need to actually select “Like” for Facebook to record your visit to that site). Some aspects of this are visible only to the data collector (Facebook) and their friends, but publicly accessible indicators such as the “Likes” on your Facebook page are open to analysis by any interested party. No doubt we will see emerging services that either do this on request, or sort out groups of interest for targeted advertising or other uses.
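
As a rough illustration of why an embedded “Like” button reveals your visit without a click: the browser’s request for the widget carries the address of the page you are reading (via the Referer header) plus any cookie identifying you. The sketch below fakes a handful of such request records to show what the widget provider could reconstruct; the URLs and cookie IDs are invented.

```python
# Sketch of what a widget provider can log from the embed request alone,
# before any button is clicked. The request records here are invented.
from collections import Counter

widget_requests = [
    {"cookie_id": "u-123", "referer": "https://example.com/health/condition-x"},
    {"cookie_id": "u-123", "referer": "https://example.com/jobs/quit-letter-tips"},
    {"cookie_id": "u-456", "referer": "https://example.com/health/condition-x"},
]

# The provider can rebuild each cookie-holder's browsing trail...
trails = {}
for req in widget_requests:
    trails.setdefault(req["cookie_id"], []).append(req["referer"])

# ...and see which pages are drawing which audiences.
page_popularity = Counter(req["referer"] for req in widget_requests)

print(trails["u-123"])
print(page_popularity.most_common(1))
```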

As with many possible privacy issues, it is the aggregation of data points that starts to reveal details we might have assumed were at least obfuscated if not private.

Mr. Jim Isaak
Bedford, NH, United States
SSIT volunteer since: 2003
SSIT roles: Vice President, 2015; Blog master, 2014-present
IEEE roles: SSIT Board (elected or appointed), IEEE Director, Member of the IEEE Technical Activities Board, IEEE Standards Association, IEEE Section/Chapter
SSIT 5 Pillars interests: Sustainability, Ethics, Impact of Emerging Technology
Other interests: privacy, predictive (science) fiction, policy
Web site & social media: http://JimIsaak.com | http://www.linkedin.com/in/Jim-Isaak
IEEE Senior Member, New Hampshire Section, Region 1
