Category Archives: News

IEEE Computer Special Issue on Physiological Computing

The October 2015 edition of IEEE Computer magazine is devoted to the topic of Physiological Computing. Giulio Jacucci, Erin Solovey and I acted as co-editors, and the introduction to the magazine is available here.

The papers included in the special issue cover a range of topics, including: the measurement of stress in VR, combining pupillometry with EEG to detect changes in operator workload, and using mobile neuroimaging to create attention-aware technologies.

There is also a podcast associated with the special issue, featuring the guest editors in conversation with Robert Jacob from Tufts University on current topics and future directions in Physiological Computing – you can hear it here.

Special Issue of Interacting with Computers

I am one of the co-editors of a special issue of Interacting with Computers, which is now available online here. The title of the special issue is Physiological Computing for Intelligent Adaptation; it contains five full research papers covering a range of topics, such as the use of VR for stress reduction, mental workload monitoring and a comparison of EEG headsets.

Book Announcement – Advances in Physiological Computing

It was way back in 2011, during our CHI workshop, that we first discussed the possibility of putting together an edited collection for Springer on the topic of physiological computing. It was clear to me at that time that many people associated physiological computing with implicit monitoring, as opposed to the active control that characterises BCI. When we had the opportunity to put together a collection, one idea was to extend the scope of physiological computing to include all technologies where signals from the brain and the body are used as a form of input. Some may interpret this all-inclusive relabelling of physiological computing as a provocative move. But we did not take this option as a conceptual ‘land-grab’; rather, it was an attempt to be as inclusive as possible and to bring together what I still perceive to be a rather disparate and fractured research community. After all, we are all using psychophysiology in one form or another and share a common interest in sensor design, interaction mechanics and real-time measurement.

The resulting book is finally close to publication (tentative date: 4th April 2014) and you can follow this link to get the full details.  We’re pleased to have a wide range of contributions on an array of technologies, from eye input to digital memories via mental workload monitoring, implicit interaction, robotics, biofeedback and cultural heritage.  Thanks to all our contributors and the staff at Springer who helped us along the way.

 

The Epoc and Your Next Job Interview

Imagine you are waiting to be interviewed for a job that you really want. You’d probably be nervous, fingers drumming the table, eyes restlessly scanning the room. The door opens and a man appears, wearing a lab coat and holding an EEG headset in both hands. He places the headset on your head and says “Your interview starts now.”

This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul. Thankfully a camera was present to capture this WTF moment for each candidate, so the video could be uploaded to Vimeo.

The rationale for the exercise is quite clear. The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test in which candidates watch a series of acclaimed ads while the Epoc measures their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way. Those who exhibit the greatest passion for adverts get the job (this is the narrative of the movie; in reality one suspects/hopes they were interviewed as well).

I’ve seen at least one other blog post that expressed some reservations about the process.

Let’s take a deep breath because I have a whole shopping list of issues with this exercise.

Continue reading

The Ultimate Relax to Win Dynamic

I came across an article in a Sunday newspaper a couple of weeks ago about an artist called xxxy who has created an installation using a BCI of sorts. I’m piecing this together from what I read in the paper and what I could see on his site, but the general idea is this: a person wears a portable EEG rig (I don’t recognise the model) and is placed in a harness with wires reaching up and up and up into the ceiling. The person closes their eyes and relaxes – presumably as they enter a state of alpha augmentation, they begin to levitate courtesy of the wires. The more they relax or the longer they sustain that state, the higher they go. It’s hard to tell from the video, but the person seems to be suspended around 25-30 feet in the air.
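
As a thought experiment, here is a minimal sketch of how that relax-to-win dynamic might work: relative alpha power from the EEG is compared against a threshold, and the harness rises while the wearer stays relaxed and eases back down when they don’t. The names, thresholds and simulated EEG feed below are my own assumptions, not details of the actual installation.

```python
# Hypothetical sketch of the relax-to-win dynamic described above.
# The EEG feed is simulated; thresholds and gains are invented for illustration.
import random

MAX_HEIGHT_M = 9.0       # roughly the 25-30 feet the video suggests
ALPHA_THRESHOLD = 0.4    # relative alpha power counted as "relaxed"
RISE_M_PER_S = 0.15      # ascent rate while relaxed
FALL_M_PER_S = 0.05      # gentle descent when relaxation lapses


def read_relative_alpha() -> float:
    """Stand-in for a real EEG pipeline returning alpha power / total band power."""
    return random.uniform(0.0, 1.0)


def update_height(height: float, alpha: float, dt: float = 1.0) -> float:
    """Raise the harness while alpha exceeds the threshold, otherwise ease it down."""
    if alpha > ALPHA_THRESHOLD:
        height += RISE_M_PER_S * dt
    else:
        height -= FALL_M_PER_S * dt
    return max(0.0, min(MAX_HEIGHT_M, height))


if __name__ == "__main__":
    height = 0.0
    for second in range(120):  # two simulated minutes
        alpha = read_relative_alpha()
        height = update_height(height, alpha)
        print(f"t={second:3d}s  alpha={alpha:.2f}  height={height:.2f} m")
```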

Continue reading

Physiological Computing meets Augmented Reality in a Museum

First of all, an apology – Kiel and I try to keep this blog ticking over, but for most of 2011, we’ve been preoccupied with a couple of large projects and getting things organised for the CHI workshop in May.  One of the “things” that led to this hiatus on the blog is a new research project funded by the EU called ARtSENSE, which is the topic of this post.

Continue reading

Studentships in Physiological Computing

Liverpool John Moores University
PhD Studentships in Applied Neuroscience/Psychophysiology
School of Natural Sciences and Psychology

EDIT: Application closed

Please quote Ref: IRC544

Applications are invited for two PhD studentships in the School of Natural Sciences and Psychology. The studentships consist of a tax-free stipend (currently £13,590 per annum for the 2010-2011 academic year) and tuition fees.

We seek candidates with a strong research background and an interest in physiological computing research (http://www.physiologicalcomputing.net/) for a new research project funded by the EU. Specifically, we are seeking to fund studentships in two areas associated with this project:
Continue reading

Revised Physiological Computing FAQ

This is a short post to inform regular readers that I’ve made some changes to the FAQ document for the site (link to the left).  Normally people alter the FAQ because the types of popular questions have changed.  In our case, it is my answers to those questions that have changed in the time since I wrote my original responses – hence the need to revise the FAQ.

The original document firmly identified physiological computing with affective computing/biocybernetic adaptation. There was even a question making a firm division between BCI technology and physiological computing. In the revised FAQ, I’ve dumped this distinction and attempted to view BCI as part of a broad continuum of computing devices that rely on real-time physiological data for input. This change has not been made to arrogantly subsume BCI within the physiological computing spectrum, but to reconcile perspectives from different research communities working on common measures and technologies across different application domains. In my opinion, the distinctions between research topics and application domains (including my own) are largely artificial, and the advancement of this technology is best served by keeping an open mind about mash-ups and hybrid systems.

I’ve also expanded the list of indicative references to include contributions from BCI, telemedicine and adaptive automation in order to highlight the breadth of applications that are united by physiological data input.

The FAQ is written to support the naive reader, who may have stumbled across our site, but as ever, I welcome any comments or additional questions from domain experts.

Valve experimenting with physiological input for games

This recent interview with Gabe Newell of Valve caught our interest because it’s so rare that a game developer talks publicly about the potential of physiological computing to enhance the experience of gamers. The idea of using live physiological data feeds to adapt computer games and enhance game play was first floated by Kiel in these papers way back in 2003 and 2005. Like Kiel, in my writings on this topic (Fairclough, 2007; 2008 – see publications here), I focused exclusively on two problems: (1) how to represent the state of the player, and (2) what the software could do with this representation of the player state. In other words, how can live physiological monitoring of the player inform real-time software adaptation? For example, to make the game harder, to change the music or to offer help (a set of strategies that Kiel summarised in three categories: challenge me, assist me, emote me) – with these adjustments made in real time in order to enhance game play.
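
To make the two problems concrete, here is a minimal sketch of such a loop, under assumed signal features and thresholds: raw physiological signals are collapsed into a simple representation of the player state, and the game then selects one of the challenge me / assist me / emote me strategies. None of this reflects Valve’s or Kiel’s actual implementations.

```python
# Hypothetical sketch of a biocybernetic loop for games: (1) represent the
# player state from live physiological features, (2) choose an adaptation.
# Feature mappings and thresholds are invented for illustration only.
from dataclasses import dataclass


@dataclass
class PlayerState:
    arousal: float       # 0-1, here derived crudely from heart rate
    frustration: float   # 0-1, here derived crudely from skin conductance


def infer_state(heart_rate_bpm: float, skin_conductance_us: float) -> PlayerState:
    """Problem (1): collapse raw signals into a representation of the player state."""
    arousal = min(1.0, max(0.0, (heart_rate_bpm - 60.0) / 60.0))
    frustration = min(1.0, max(0.0, skin_conductance_us / 20.0))
    return PlayerState(arousal=arousal, frustration=frustration)


def choose_adaptation(state: PlayerState) -> str:
    """Problem (2): decide what the software does with that representation."""
    if state.frustration > 0.7:
        return "assist me: offer a hint or ease the current obstacle"
    if state.arousal < 0.3:
        return "challenge me: raise the difficulty or spawn tougher opponents"
    return "emote me: adapt music and lighting to sustain the current mood"


if __name__ == "__main__":
    state = infer_state(heart_rate_bpm=95.0, skin_conductance_us=16.5)
    print(state, "->", choose_adaptation(state))
```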

Continue reading

Mobile Monitors and Apps for Physiological Computing

I always harboured two assumptions about the development of physiological computing systems, assumptions that have only become apparent (to me at least) as technological innovation seems to contradict them. First of all, I thought nascent forms of physiological computing would be developed for desktop systems, where the user stays in a stationary and more-or-less sedentary position, thus minimising the probability of movement artifacts. Secondly, I assumed that physiological computing devices would only ever be achieved as coordinated, holistic systems – in other words, specific sensors linked to a dedicated controller that provides input to adaptive software, all designed as a seamless chain of information flow.
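
For the sake of illustration, here is a minimal sketch of the kind of holistic chain I had in mind, with invented class and device names: a dedicated sensor feeds a controller, which in turn feeds adaptive software, as one seamless flow of information.

```python
# Hypothetical sketch of the "coordinated holistic system" assumption:
# sensor -> dedicated controller -> adaptive software, designed as one chain.
# Names, the fixed heart-rate sample and the 90 bpm cut-off are illustrative.
from typing import Protocol


class Sensor(Protocol):
    def sample(self) -> float: ...


class HeartRateSensor:
    """Placeholder for a specific sensor driver at the start of the chain."""
    def sample(self) -> float:
        return 72.0


class Controller:
    """Dedicated middle layer that turns raw samples into a user-state estimate."""
    def __init__(self, sensor: Sensor) -> None:
        self.sensor = sensor

    def user_state(self) -> str:
        return "aroused" if self.sensor.sample() > 90.0 else "calm"


class AdaptiveSoftware:
    """End of the chain: software that adapts its behaviour to the inferred state."""
    def __init__(self, controller: Controller) -> None:
        self.controller = controller

    def tick(self) -> None:
        print(f"Adapting the interface for a {self.controller.user_state()} user")


if __name__ == "__main__":
    AdaptiveSoftware(Controller(HeartRateSensor())).tick()
```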

Continue reading
