Police Cameras

My daughter is attending a citizen police academy. They discussed the challenges that police cameras (body, squad car, interview room, traffic monitoring, etc.) present, challenges that relate, in part, to the objectives of having such cameras.

1) When an officer is apprehending a suspect, a video of the sequence covers a topic that is very likely to be raised in court (at least in the U.S., where fairly specific procedures must be followed during an arrest). Evidence related to this has to follow very specific rules to be admissible. An example of this concept is in the Fort Collins, Colorado police FAQ, where they provide some specifics. This process requires managed documentation trails maintained by qualified experts to assure the evidence can be used. There are real expenses here beyond just having a camera and streaming or transferring the sequences to the web. Web storage has been created that is designed to facilitate this management challenge. Note that even if the prosecution does not wish to use this material, the defense may do so, and if it has not been managed correctly, may move to have the charges dismissed. (For cultures where defendants are not innocent until proven guilty, and/or where there is no body of case or statutory defendants' rights, this may sound odd, but in the U.S. it is possible for a blatantly guilty perpetrator to have the charges against him dropped due to a failure to respect his rights.)

2) There are situations where a police officer is suspected of criminal actions. In real-time situations (like those in the news recently), the same defendants' rights need to be respected for the officer(s) involved. Again, close management of the evidence is needed.

Note that in these cases there are clear criminal activities that the police suspect at the time the video is captured, and managing the 'trail of evidence' is a well-defined activity with a cost and benefit that is not present without the cameras.

The vast majority of recorded data does not require the chain-of-evidence treatment. If a proper request for specific data not associated with an arrest results in data that is used in court, it is most likely to be used by a defendant, and the prosecutor is unlikely to challenge the validity of the data, since doing so would discredit the prosecution's own system.

Of course there are other potential uses of the data. It might contain information relevant to a divorce action (say, a couple in a car stopped for a ticket, where one party's spouse wants to know why the other person was in the car), or the images of bystanders at a site might impact the apparent privacy of such persons (although in general no right of privacy is recognized in the U.S. for persons in public).

The Seattle police are putting some video on YouTube, after applying automated redaction software to protect the privacy of individuals captured in the frame. Just the presence of video cameras can reduce both use of force and citizen complaints.

There are clearly situations where the police, the citizens involved, or both would find a video recording to be of value, even if it did not meet evidentiary rules. Of course, the concern related to such rules is the potential for inappropriate editing of the video, transforming it from an "objective" witness into one biased in one direction or another.

We have the technology; should we use it? An opinion piece by Jay Stanley in SSIT's Technology and Society journal outlines some of these issues in more detail.

Emoti-Cons

I'm not talking about little smiley faces :^( but about how automation can evaluate your emotions and, as is the trend of this blog, how that information may be abused.

Your image is rather public. From your Facebook page, to the pictures posted from that wedding you attended, to the myriad of cameras capturing data in every store, street corner, ATM, etc. And, as you (should) know, facial recognition is already there to connect your name to that face. Your image can also be used to evaluate your emotions automatically, with tools described in a recent Wall St. Journal article (The Technology That Unmasks Your Hidden Emotions). These tools can be used in real time as well as on static images.

So, wandering through the store, it may be that those cameras are not just picking up shoplifters, but lifting shopper responses to displays, products, and other aspects of the store. Having identified you (via facial recognition, or the RFID constellation you carry), the store can correlate your personal response to specific items. The next email you get may be promoting something you liked when you were at the store, or a well-researched, near-real-time evaluation of what "persons like you" seem to like.

The same type of analysis can be used to analyze and respond to your reactions in a political context: candidate preferences, messages that seem to be effective. Note that this is no longer the "applause-meter" model for deciding how the audience responds, but personalized to you, as a face-recognized person observing the event. With cameras posted on political posters and billboards capturing images through front windshields, it may be possible to collect this data on a very wide basis, not just from those who choose to attend an event.

Another use of real-time emotional tracking could play out in situations such as interviews, interrogations, sales showrooms, etc. The person conducting the session may be getting feedback from automated analysis that informs the direction they take the interaction. The result might be a job offer, an arrest warrant, or a focused sales pitch.

The body language of lying is also being decoded. Presumably a next step here is automated analysis of your interactions. For those of us who never, ever lie, that may not be a problem. And of course, as a resident of New Hampshire, where the 2016 presidential season has officially opened, it would be nice to have some of these tools in the hands of the citizens as we seek to narrow down the field of candidates.


Your DNA into Your Picture

A recent Wall St. Journal interview with J. Craig Venter indicates his company is currently working on translating DNA data into a "photo of you", or the sound of your voice. The logic, of course, is that genetics (including epigenetic elements) includes the parts list, assembly instructions, and many of the finishing details for building an individual. So it may not come as a surprise that a DNA sample can identify you as an individual (even as distinct from your identical twin, considering mutations and epigenetic variations), or perhaps even be used to create a clone. But having a sample of your DNA translated into a picture of your face (presumably at different ages) or an imitation of your voice is not something that had been in my genomic awareness.

The DNA sample from the crime scene may do more than identify the perpetrator; it may be the basis for generating a "police sketch" of her face.

The movie Gattaca projected a society where genetic evaluation was a make-or-break factor in selecting a mate, getting a job, and other social decisions. But it did not venture into the possibility of not just evaluating the genetic desirability of a mate, but perhaps projecting their picture some years into the future. "Will you still need me … when I'm sixty-four?"

The interview also considers some of the ethical issues surrounding insurance, medical treatment and extended life spans … What other non-obvious applications can you see from analyzing the genomes and data of a few million persons?

Privacy Matters

Alessandro Acquisti's TED talk, Why Privacy Matters, lays out some key privacy issues and revealing research into what is possible with online data. In one project his team was able to identify students via face recognition in the few minutes needed to fill out a survey, and potentially locate their Facebook pages using that search. In a second project they were able to deduce persons' Social Security numbers (a key U.S. personal identifier) from their Facebook page data. This opens the possibility that any image of you can lead both to identifying you and to locating significant private information about you.

There is a parallel discussion sponsored by the IEEE Standards Association on "The Right to be Forgotten". This was triggered by a recent European court case in which an individual did not want information about his past to be discoverable via search engines. These two concepts collide when an individual seeking to be "forgotten" has their image captured by any of a range of sources (store cameras, friends posting photos, even just being "in the picture" that someone else is taking). If that image can be translated into diverse personal information, then even the efforts of the search-engine providers to block the searches will be futile.

Alessandro identifies some tools that can help: the Electronic Frontier Foundation's anonymous internet portal, and Pretty Good Privacy (PGP), which can deliver a level of encryption that is very expensive to crack, with variations being adopted by Google, Yahoo, and maybe even Apple to protect the content of their devices and email exchanges. There are issues with the PGP model, and perhaps some better approaches. There is also government push-back against overly strong encryption, which is perhaps one of the best endorsements of the capability of such systems.
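"Very expensive to crack" is easy to quantify with back-of-the-envelope arithmetic. A minimal sketch, assuming a brute-force attacker and an (optimistic for the attacker) trial rate of a trillion keys per second; the numbers are illustrative, not taken from any source above:

```python
# Expected brute-force effort against a modern symmetric key, such as the
# 128-bit session keys PGP-style hybrid systems typically use.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_brute_force(key_bits: int, keys_per_second: float) -> float:
    """Expected years to search half the keyspace at the given trial rate."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / keys_per_second / SECONDS_PER_YEAR

# A 56-bit (DES-era) key falls in under an hour at this rate;
# a 128-bit key takes billions of times the age of the universe.
print(f"{years_to_brute_force(56, 1e12):.3g} years for 56-bit")
print(f"{years_to_brute_force(128, 1e12):.3g} years for 128-bit")
```

The point of the sketch is the exponential gap: each added key bit doubles the attacker's work, which is why well-implemented strong encryption draws government push-back rather than quiet decryption.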

Behind all this is the real question of how seriously we choose to protect our privacy. It is a concept given greater consideration in Europe than in the U.S., perhaps because the deeper European history has proven that abuse by governments or other entities can be horrific; that experience has not engaged the "Youth of America", nor discouraged the advertising- and commerce-driven culture that dominates the Internet.

Alessandro observes that an informed public that understands the potential issues is a critical step towards developing policy, tools and the discipline needed to climb back up this slippery slope.


Too Close for Comfort? Detecting your presence.

A group of authors in the August 2014 issue of IEEE Computer outlines some pros, cons, and examples of proximity-sensing technology that initiates advertising or other actions, and may report your presence to some data-collection process. The article is called The Dark Patterns of Proxemic Sensing.

There are simple examples most folks have encountered: the faucet that turns on when you put your hands near it, followed by the automated hand dryer or paper-towel dispenser. The paper identifies some current examples that many of us may not have encountered: a mirror that presents advertising, a wall of virtual "paparazzi" that flash cameras at you accompanied by cheering sounds, and urinals that incorporate video gaming. Some of these systems are networked, even connected to the internet. Some interact anonymously; others are at least capable of face or other forms of recognition.
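The mechanic underneath all of these systems is the same: a sensed distance crossing a threshold triggers an escalating interaction. A minimal sketch of that idea; the zone names and thresholds are my own illustrative assumptions, not taken from the article:

```python
def proxemic_state(distance_m: float) -> str:
    """Map a sensed distance to an interaction zone, escalating
    engagement as the person approaches (illustrative thresholds)."""
    if distance_m < 0.5:
        return "interact"   # e.g., start the game, personalize the ad
    elif distance_m < 2.0:
        return "attract"    # attention-grabbing mode: sound, flashing
    elif distance_m < 5.0:
        return "aware"      # log presence, perhaps begin recognition
    return "idle"

# A person walking toward the display moves through the zones in order.
for d in (6.0, 3.0, 1.0, 0.3):
    print(d, proxemic_state(d))
```

Several of the "dark patterns" below live in exactly this escalation: the "aware" stage can silently collect data long before the person realizes any interaction has begun.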

The article identifies eight “dark” aspects of this proximity interaction:

  1. Captive Audience – there is a concern of unexpected/undesired interactions in situations where the individual must go for other reasons.
  2. Attention Grabbing – detection and interaction allow these systems to distract the target individual, which may be problematic, or just annoying.
  3. Bait and Switch – initiating interaction with an attractive first impression, then switching to a quite different agenda.
  4. Making personal information public — for example, displaying or announcing your name upon recognition.
  5. We never forget – tracking an individual from one encounter to the next, even spanning locations for networked systems.
  6. Disguised data collection – providing (personalized) data back to some central aggregation.
  7. Unintended relationships – is that person next to you related in some way — oh, there she is again next to you at a different venue…
  8. Milk factor – forcing a person to go through a specific interaction (move to a location, provide information …) to obtain the promised service.

Most of these are traditional marketing/advertising concepts, now made more powerful by automation and ubiquitous networked systems. The specific emerging technologies are one potentially disturbing area of social impact. A second is the more general observation that activities we historically considered innocuous, or even desirable, may become more problematic with automation and de-personalization. The store clerk might know you by name, but do you feel the same way when the cash register or the automatic door knows you?

Issues in this area are also discussed in the Summer 2014 issue of Technology and Society, with Omnipresent Cameras and Personal Safety Devices being relevant articles in that issue.

Soft Biometrics

Karl Ricanek has an article in the September issue of Computer magazine, "Beyond Recognition: The Promise of Biometric Analytics". He points out a range of possible applications for biometric analysis beyond identifying specific individuals, many of which are grist for the social-impact mill. Karl defines biometric analytics as the discovery of potentially interesting information about a person, other than identity, using biometric signal patterns. He includes among these: emotional state, longevity, aliveness (if you are reading this, you are alive), continuous authentication, ethnicity, gender, age (demographics in general), honesty, concentration, mood, attitude, and even frustration with automated phone systems ("dial 1 if you like talking to robots, dial 2 if you would like to toss your phone out the window, …"). A few specific examples include:

  1. Audience reaction research – detecting smiles, confusion, boredom, or distress. This could help editing movies, or developing higher impact advertising.
  2. Karl's own research is on detection of age and longevity. He has a web site, FaceMyAge, that uses facial photos for this. Apparently, if you look older than you are, you are likely to die younger, an insight life insurance companies might value. Cosmetic companies might value it as well, in terms of helping you look younger (and maybe reduce your life insurance premiums?).

Karl anticipates answers to everyday questions such as: "Is the speaker on TV being honest?" (not needed for QVC, politicians, or even many news programs nowadays); "How much money will I need for retirement?" (a discreet way of asking "how much time do I have left?"); "Will using this cosmetic really make me look younger?"; and the most dangerous question of all, "Does this outfit make me look fat?" (OK, Karl does not include this one.) Engineers and autistic persons are reputedly poor at reading the emotional state of others; perhaps a Google Glass app could provide some clues. Some devices for the improved transmission of biometric signals have been developed as well. My granddaughter just obtained a set of Brainwave Cat Ears, which are supposed to indicate your state (focused, in-the-zone, relaxed) … and, "ur, ah, no, you look just great in those cat ears, not fat at all" (or at least that is what my Glass app suggested I say). What biometric analytics would you encourage? What unanticipated consequences do you envision?

Private Cameras vs State Cameras

A recent opinion piece in Technology and Society by Jay Stanley (ACLU) questions the impact of Omnipresent Cameras: every cell phone is a video device, potentially streaming to the net live. Drones, both private and governmental, have their eyes open. Streets are monitored at traffic lights (and elsewhere) by government cameras, as are many buildings via private cameras. The next article, by Steve Mann, talks about the "black bubbles" used to obscure cameras, and includes delightful images of Steve and friends with similar bubbles on their heads. Steve points to lighting devices that incorporate cameras able to recognize faces and read license plates. Jay points out that today we expect significant events in the public space to be recorded. The aftermath of the Ferguson shooting was captured by a cell phone camera, but the police car recordings (if any) have not been released.

All of this leads to cultural questions on appropriate expectations of privacy and on possible restrictions on public recording of government activities (such as police at a traffic stop, or the evolution of a demonstration in the streets of your favorite city). It does not take much to demonstrate that eyewitnesses are poor recorders of events (see Daniel Simons' research on selective attention), which makes the availability of "recorded" evidence quite useful. With more cameras on cars (backup cameras), on persons (Glass), on buildings, on planes and drones, in light bulbs, and, yes, with the increasing image quality of the cameras that turn devices on and off in the bathroom (Steve points out these are up to 1024 pixels), the expectation of privacy "in public" is diminishing, and the potential for photographic evidence is increasing. Jay suggests that both police and the folks they interact with act differently when officers are equipped with body cameras.

So is this good?  What ethical issues, or even rules of evidence apply? How does it vary from culture to culture?

Insight from tracking the Boston Marathon bombing suspects

PBS Nova recently broadcast (with amazingly short turnaround) a show on the technology used to track the Boston Marathon bombing suspects. (Since NOVA is produced in Boston by WGBH, that is not too surprising, but they also pulled off a NOVA the same night on the Oklahoma tornado.)

The show outlined the types of technology used, plus examples from other cities of technology that might have been used had it been available. It presents an interesting, if perhaps a bit disturbing, view of current and emerging police systems in the U.S.

Within minutes, Boston police were starting to get access to the many corporate video cameras recording in the area, and began an intensive evaluation. Boston does not have the fully integrated, real-time video feeds from private and public sources that New York does. The NYC system is called the "Domain Awareness System" (DAS) and is integrated with many other elements. For example, a 911 call can be triangulated to the cell phone's position and the cameras in that area highlighted, so even before the phone conversation starts, the dispatcher can be viewing the source of the call. Integration with face recognition allows for identifying suspects (although that process failed in the Boston case, in part due to the poor quality of the pictures available).
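The camera-highlighting step described for the DAS is, at its core, a geospatial filter: given an estimated caller location, select the feeds within some radius. A toy sketch of that filter; the camera names, coordinates, and radius are invented for illustration and have nothing to do with the actual DAS implementation:

```python
import math

# Hypothetical camera registry: ID -> (x, y) position in meters
# on some local grid.
CAMERAS = {
    "cam_41st_and_8th": (0.0, 0.0),
    "cam_42nd_and_8th": (0.0, 80.0),
    "cam_47th_and_bway": (350.0, 500.0),
}

def cameras_near(point, radius_m, cameras=CAMERAS):
    """Return IDs of cameras within radius_m of an estimated 911-call
    location, e.g. to bring those feeds up on the dispatcher's console."""
    px, py = point
    return sorted(
        cam for cam, (cx, cy) in cameras.items()
        if math.hypot(cx - px, cy - py) <= radius_m
    )

# Triangulated call location near 41st/8th: the two nearby feeds match.
print(cameras_near((10.0, 40.0), 100.0))
```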

The DAS also tracks the license plates of every vehicle entering or leaving Manhattan, via cameras associated with every bridge and tunnel. It appears from the show that at least 30 days of video are saved from every one of the rumored 3,000+ video feeds tracked by the system. Interestingly, there is very little information about this system on the NYC web site: a bit on guidelines and authority, some press releases, but not a lot of government transparency.

While standard face recognition technology did not yield results, two other approaches did. Marios Savvides at CMU is developing an advanced facial recognition system that can operate on lower-quality images; when a good picture of one suspect was included in a database of over a million faces, it ranked twentieth in the selected group, so we can expect new generations of this capability. There was also a surge of crowd-sourcing, both in providing the police with information and pictures and in reviewing posted pictures, with Reddit being a key community for this. While the Reddit process did not identify the actual suspects (it did surface innocent folks), it did push the Boston Police into posting their photos of the suspects. This also did not result in an identification, but it did appear to force the suspects into activities that ultimately killed a police officer, involved hijacking a car, and resulted in the death of one suspect and the capture of the other.
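Systems like the CMU one typically reduce each face to a numeric feature vector and rank database entries by similarity to the probe image; a low-quality photo yields a noisy vector and a worse rank, which is why "number twenty out of a million" is still a notable result. A self-contained sketch of that ranking step, with made-up three-dimensional "embeddings" standing in for the hundreds of dimensions real systems use:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_candidates(probe, database):
    """Return database names ordered best match first."""
    return sorted(database, key=lambda name: cosine(probe, database[name]),
                  reverse=True)

# Toy database; the vectors are invented for illustration.
db = {
    "suspect":     (0.9, 0.1, 0.3),
    "bystander_1": (0.1, 0.8, 0.2),
    "bystander_2": (0.4, 0.4, 0.8),
}
blurry_probe = (0.8, 0.2, 0.4)  # noisy vector from a low-quality frame
print(rank_candidates(blurry_probe, db))
```

Even with noise, the probe still ranks the right entry first here; with a million entries and a much blurrier frame, landing in the top twenty is the realistic outcome the show described.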

Cell phone triangulation played a role in locating the suspects: just the phone being turned on, no call required. (I wonder if an OnStar or similar system could also have been used.) The Boston Police also used plane- and helicopter-based infrared cameras to try to locate the final suspect. This proved quite successful, as the individual was hiding in a boat under a covering that was IR-transparent, providing a clear indication of his location.
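Locating a powered-on phone boils down to trilateration: each tower gives an estimated distance, and the intersection of the resulting circles gives a position. A minimal two-dimensional sketch; the tower coordinates and ranges are invented, and real systems must cope with noisy range estimates rather than this exact-intersection case:

```python
def trilaterate(t1, r1, t2, r2, t3, r3):
    """Solve for (x, y) from three tower positions and range estimates.
    Subtracting the circle equations pairwise cancels the x^2 and y^2
    terms, leaving a 2x2 linear system solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = t1, t2, t3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three towers with ranges consistent with a phone at (3, 4).
print(trilaterate((0, 0), 5.0, (10, 0), 65**0.5, (0, 10), 45**0.5))
```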

The NOVA program ends with an often-heard commentary: some of the technology really helped, some didn't deliver, but we must consider the implications for the privacy of citizens as this technology emerges.