The Epoc and Your Next Job Interview


Imagine you are waiting to be interviewed for a job that you really want.  You’d probably be nervous, fingers drumming the table, eyes darting around the room.  The door opens and a man appears; he is wearing a lab coat and holding an EEG headset in both hands.  He places the set on your head and says “Your interview starts now.”

This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul.  And thankfully a camera was present to capture this WTF moment for each candidate, so this video could be uploaded to Vimeo.

The rationale for the exercise is quite clear.  The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test where candidates watch a series of acclaimed ads while the Epoc is used to measure their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way.  Those who exhibit the greatest passion for adverts get the job (that is the narrative of the video; in reality one suspects/hopes they were interviewed as well).

I’ve seen at least one other blog post that expressed some reservations about the process.

Let’s take a deep breath because I have a whole shopping list of issues with this exercise.

In the shadow of the polygraph

I was reading this short article in The Guardian today about the failure of polygraph technologies (including fMRI versions and voice analysis) to deliver data sufficiently robust to be admissible in court as evidence.  Several points made in the article prompted the thought that physiological computing technologies live, to some extent, in the shadow of the polygraph.

Think about it.  Both the polygraph and physiological computing aim to transform personal and private experience into quantifiable data that may be observed and assessed.  Both capture unconscious physiological changes that may signify hidden psychological motives and agendas, subconscious or otherwise – and of course, both involve the attachment of sensor apparatus.  This convergence means that both are notoriously difficult to validate (hence the problems of polygraph evidence in court) – and that seems true whether we’re talking about the use of the P300 for “brain fingerprinting” or the use of ECG and respiration to capture a specific category of emotion.

Whenever I do a presentation about physiological computing, I can almost sense antipathy to the concept from some members of the audience, because the first thing people think about is the polygraph, and the second group of thoughts that logically follow are concerns about privacy, misuse and spying.  To counter these fears, I point out that physiological computing, whether it’s a game, a means of adapting a software agent or a brain-computer interface, has been developed for very different purposes.  This technology is intended for personal use; it’s about control for the individual in the broadest sense, e.g. to control a cursor, to promote reflection and self-regulation, to make software reactive, personalised and smarter, and to ensure that the data protection rights of the individual are preserved – especially if they wish to share their data with others.

But everyone knows that any signal that can be measured can be hacked, so even capturing these kinds of physiological data at all opens the door to spying and other profound invasions of privacy.

Which takes us inevitably back into the shadow of the polygraph.

I’m sure attitudes will change if the right piece of technology comes along to demonstrate the upside of physiological computing.  But if early systems don’t take data privacy seriously – as in very seriously – the public could go cold on this concept before the systems have had a chance to prove themselves in the marketplace.

For musings on a similar theme, see my previous post Designing for the Gullible.