The Epoc and Your Next Job Interview


Imagine you are waiting to be interviewed for a job that you really want.  You’d probably be nervous, fingers drumming the table, eyes darting around the room.  The door opens and a man appears; he is wearing a lab coat and holding an EEG headset in both hands.  He places the set on your head and says “Your interview starts now.”

This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul.  And thankfully a camera was present to capture this WTF moment for each candidate, so this video could be uploaded to Vimeo.

The rationale for the exercise is quite clear.  The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test where candidates watch a series of acclaimed ads and the Epoc is used to measure their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way.  Those who exhibit the greatest passion for adverts get the job (this is the narrative of the video; in reality one suspects/hopes they were interviewed as well).

I’ve seen at least one other blog post that expressed some reservations about the process.

Let’s take a deep breath because I have a whole shopping list of issues with this exercise.



Lifestreams, body blogging and sousveillance


Way back in June, I planned to write a post prompted by Kevin Kelly’s talk at the Quantified Self conference in May and by a new word I’d heard in an interview with David Brin.  Between then and now, the summer months have whipped by, so please excuse the backtracking.

Those of you who have seen the site before will have heard of our bodyblogger project, where physiological data is collected on a continuous basis and shared with others via social media sites or directly on the internet.  For instance, most of the time, the colour scheme for this website responds to heart rate changes in one of our bodybloggers (green = normal, yellow = higher than normal, red = much higher than normal – see this for full details).  This colour scheme can be mapped over several days, weeks and months to create a colour chart representation of heart rate data; the one at the top of this post shows a month’s worth of data (white spaces = missing data).
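For readers curious how such a mapping might work, here is a minimal sketch of the heart-rate-to-colour scheme described above.  The baseline and threshold values are illustrative assumptions on my part; the actual cut-offs used by the bodyblogger project are not given in this post.

```python
def heart_rate_colour(bpm, baseline=70, elevated=15, high=30):
    """Map a heart-rate sample to the site's colour scheme.

    green  = at or near the wearer's normal baseline
    yellow = higher than normal
    red    = much higher than normal
    (threshold values here are assumed, not the project's real ones)
    """
    if bpm >= baseline + high:
        return "red"
    if bpm >= baseline + elevated:
        return "yellow"
    return "green"


def colour_strip(samples):
    """Reduce a sequence of samples to a strip of colours.

    Missing samples (None) are rendered as white, matching the
    white spaces in the month-long chart described above.
    """
    return [heart_rate_colour(s) if s is not None else "white"
            for s in samples]
```

Stacking one such strip per day then yields the kind of colour chart representation of weeks or months of data that the post refers to.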



Physiological Computing: Challenges for Developers and Users

I recently received a questionnaire from the European Parliament, or rather from its STOA panel, with respect to developments in physiological computing and their implications for social policy.  The European Technology Assessment Group (ETAG) is working on a study with the title “Making Perfect Life”, which includes a section on biocybernetic adaptation as well as BCI and other kinds of “assistive” technology.  The accompanying email told me the questionnaire would take half an hour to complete (it didn’t), but it asked some interesting questions, particularly surrounding the views of the general public on this technology and issues surrounding data protection.

I’ve included a slightly-edited version of the questionnaire with my responses. Questions are in italics.


Better living through affective computing

I recently read a paper by Rosalind Picard entitled “Emotion Research for the People, by the People.”  In this article, Prof. Picard has some fun contrasting engineering and psychological perspectives on the measurement of emotion.  Perhaps I’m being defensive, but she seemed to have more fun at the expense of the psychologists than the engineers.  The central impasse she identified goes something like this: engineers have developed sensor apparatus that can deliver a whole range of objective data, whilst psychologists have decades of experience with theoretical concepts related to emotion, so why haven’t people really benefited from their union through the field of affective computing?  Prof. Picard correctly identifies a reluctance on the part of psychologists to define concepts with sufficient precision to aid the work of the engineers.  What I felt was glossed over in the paper was the other side of the problem, namely the willingness of engineers to attach emotional labels to almost any piece of psychophysiological data, usually in the context of badly designed experiments (apologies to any engineers reading this, but I wanted to add a little balance to the debate).


In the shadow of the polygraph

I was reading a short article in The Guardian today about the failure of polygraph technologies (including fMRI versions and voice analysis) to deliver data sufficiently robust to be admissible in court as evidence.  Several points made in the article prompted the thought that the development of physiological computing technologies, to some extent, lives in the shadow of the polygraph.

Think about it.  Both the polygraph and physiological computing aim to transform personal and private experience into quantifiable data that may be observed and assessed.  Both capture unconscious physiological changes that may signify hidden psychological motives and agendas, subconscious or otherwise – and of course, both involve the attachment of sensor apparatus.  A further point of convergence is that both are notoriously difficult to validate (hence the problems of polygraph evidence in court) – and that seems true whether we’re talking about the use of the P300 for “brain fingerprinting” or the use of ECG and respiration to capture a specific category of emotion.

Whenever I give a presentation about physiological computing, I can almost sense antipathy to the concept from some members of the audience, because the first thing people think of is the polygraph, and the concerns that logically follow are about privacy, misuse and spying.  To counter these fears, I point out that physiological computing – whether it’s a game, a means of adapting a software agent or a brain-computer interface – has been developed for very different purposes.  This technology is intended for personal use; it’s about control for the individual in the broadest sense, e.g. to control a cursor, to promote reflection and self-regulation, to make software reactive, personalised and smarter, and to ensure that the data protection rights of the individual are preserved – especially if they wish to share their data with others.

But everyone knows that any signal that can be measured can be hacked, so even capturing these kinds of physiological data per se opens the door for spying and other profound invasions of privacy.

Which takes us inevitably back into the shadow of the polygraph.

I’m sure attitudes will change if the right piece of technology comes along to demonstrate the upside of physiological computing.  But if early systems don’t take data privacy seriously, as in very seriously, the public could go cold on this concept before the systems have had a chance to prove themselves in the marketplace.

For musings on a similar theme, see my previous post Designing for the Gullible.
