Manifest.AR: Invisible ARtaffects

First of all, apologies for our blog “sabbatical” – the important thing is that we are now back with news of our latest research collaboration involving FACT (Foundation for Art and Creative Technology) and international artists’ collective Manifest.AR.

To quickly recap, our colleagues at FACT were keen to create a new commission tapping into the use of augmented reality technology and incorporating elements of our own work on physiological computing. Our last post (almost a year ago now, to our shame) described the time we spent with Manifest.AR last summer and our show-and-tell event at FACT. Fast-forward to the present, and the Manifest.AR piece, called Invisible ARtaffects, opened last Thursday as part of the Turning FACT Inside Out show.

[Image: Manifest.AR exhibit at FACT]


Deadline extension for BCI Grand Challenge at ICMI 2012

I am one of the organisers of a workshop event at ICMI 2012 entitled “BCI Grand Challenges.” The deadline for submissions was originally this coming Friday (15th June) but has now been extended to 30th June. Full details are below.


Troubleshooting and Mind-Reading: Developing EEG-based interaction with commercial systems

With regard to the development of physiological computing systems, whether they are BCI applications or fall into the category of affective computing, there seem (to me) to be two distinct research communities at work. The first (and older) community consists of university-based academics, like myself, doing basic research on measures, methods and prototypes with the primary aim of publishing our work in various conferences and journals. For the most part, we are a mixture of psychologists, computer scientists and engineers, many of whom have an interest in human-computer interaction. The second community has formed around the availability of commercial EEG peripherals, such as the Emotiv and NeuroSky. Some members of this community are academics and others are developers; I suspect many are dedicated gamers. They are looking to build applications and hacks to embellish the interactive experience, with a strong emphasis on commercialisation.

There are many differences between the two groups. My own academic group is ‘old-school’ in many ways, motivated by research issues and defined by the usual hierarchies associated with specialisation and rank. The newer group is more inclusive (the tag-line on the NeuroSky site is “Brain Sensors for Everyone”); they basically want to build stuff and preferably sell it.


Reflections on Body Lab

Way back in February, Kiel and I did an event called Body Lab in conjunction with our LJMU colleagues at OpenLabs. The idea for this event originated in a series of conversations between ourselves and OpenLabs about our mutual interest in digital health. The brief of OpenLabs is to “support local creative technology companies to develop new products and services that capitalise upon global opportunities.” Their interest in our work on physiological computing lay in putting the idea out to their community of local creatives and digital types.

I was initially apprehensive about the wisdom of this event. I’m quite used to talking about our work with others from the research community, from both the commercial and academic sides – what makes me slightly uncomfortable is talking about possible implementations, because I feel the available sensor apparatus and other tools are not yet sufficiently advanced. I was also concerned about whether a day-long event on this topic would pull in a sufficient number of participants – what we do has always felt very “niche” in my view. Anyhow, some smooth talking from Jason Taylor (our OpenLabs contact) and a little publicity in the form of this short podcast convinced us that we should give it our best shot.


BCI, biocybernetic control and gaming

Way back in 2008, I was due to go to Florence to present at a workshop on affective BCI as part of CHI. In the event, I was ill that morning and missed both the trip and the workshop. As I’d already prepared the presentation, I made a podcast to share with the workshop attendees. I dug it out of the vaults for this post because the combination of gaming and physiological computing is such an interesting topic.

The work is dated now, but basically I’m drawing a distinction between my understanding of BCI and biocybernetic adaptation. The former is an alternative means of input control within the HCI, while the latter can be used to adapt the nature of the HCI itself. I also argue that BCI is ideally suited to certain types of game mechanics precisely because it will not work 100% of the time. I used the TV series “Heroes” to illustrate these kinds of mechanics, which I regret in hindsight, because I totally lost all enthusiasm for that show after series 1.
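To make that distinction concrete, here is a minimal sketch – my own illustration, not code from the talk or the paper. A BCI acts as an explicit input channel whose commands only register when classifier confidence is high enough, while a biocybernetic loop quietly re-tunes the interaction itself. The function names, thresholds and state estimates below are all invented for illustration.

```python
import random


def bci_command(classifier_confidence: float, threshold: float = 0.7) -> bool:
    """BCI as input control: a deliberate 'command' (e.g. imagined movement)
    only registers when the classifier is sufficiently confident, so it
    will not work 100% of the time."""
    return classifier_confidence >= threshold


def biocybernetic_adaptation(arousal: float, difficulty: float) -> float:
    """Biocybernetic adaptation: the system adjusts the nature of the HCI
    (here, game difficulty) in response to the player's inferred state."""
    if arousal < 0.3:       # player seems under-stimulated: push harder
        return min(difficulty + 0.1, 1.0)
    if arousal > 0.8:       # player seems overloaded: ease off
        return max(difficulty - 0.1, 0.0)
    return difficulty


if __name__ == "__main__":
    # A 'telekinesis'-style mechanic can tolerate imperfect recognition.
    print("Door opens!" if bci_command(random.random()) else "Nothing happens...")
    print("New difficulty:", biocybernetic_adaptation(arousal=0.9, difficulty=0.5))
```

The point of the game-mechanic argument is visible in the first function: because recognition is probabilistic, the mechanic has to be one where occasional failure feels like part of the fiction rather than a broken controller.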

The original CHI paper for this presentation is available here.

[Video: http://player.vimeo.com/video/32983880]


Mood and Music: effects of music on driver anger

[Video: http://player.vimeo.com/video/32915393]

Last month I gave a presentation at the Annual Meeting of the Human Factors and Ergonomics Society held at Leeds University in the UK. I stood on the podium and presented the work, but really the people who deserve most of the credit are Marjolein van der Zwaag (from Philips Research Laboratories) and my own PhD student at LJMU, Elena Spiridon.

You can watch a podcast of the talk above. The work was originally conducted as part of the REFLECT project at the end of 2010 and was inspired by earlier research on affective computing in which the system makes an adaptation to alleviate a negative mood state. The rationale is that any such adaptation will have beneficial effects – reducing the duration and intensity of the negative mood and, in doing so, mitigating any undesirable effects on the person’s behaviour or health.

Our study was concerned with the level of anger a person might experience on the road. We know that anger places ‘load’ on the cardiovascular system and is associated with the undesirable behaviours of aggressive driving. In our study, we subjected participants to a simulated driving task designed to make them angry – a protocol that we have developed at LJMU. Marjolein was interested in the effects of different types of music on the cardiovascular system while the person is experiencing a negative mood state; for our study, she created four categories of music that varied in terms of high/low activation and positive/negative valence.

The study does not represent an investigation into a physiological computing system per se, but is rather a validation study to explore whether an adaptation, such as selecting a certain type of music when a person is angry, can have beneficial effects.  We’re working on a journal paper version at the moment.
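By way of illustration only, the kind of adaptation the study is meant to validate could be written as a simple rule: when the driver’s state looks angry, switch to music from a low-activation/positive-valence category. The four categories below follow the activation/valence scheme used in the study, but the anger index, threshold and playlist names are hypothetical.

```python
# Hypothetical sketch of a music-selection adaptation; all values invented.
MUSIC_CATEGORIES = {
    ("high", "positive"): "energising_playlist",
    ("high", "negative"): "tense_playlist",
    ("low", "positive"): "calming_playlist",
    ("low", "negative"): "sombre_playlist",
}


def select_music(anger_index: float, current_playlist: str) -> str:
    """Switch to low-activation / positive-valence music once an anger
    threshold is crossed; otherwise leave the current playlist alone."""
    if anger_index > 0.6:
        return MUSIC_CATEGORIES[("low", "positive")]
    return current_playlist


print(select_music(anger_index=0.75, current_playlist="energising_playlist"))
```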


REFLECT Project Promo Video

[Video: http://player.vimeo.com/video/25081038]

Some months ago, I wrote this post about the REFLECT project, in which we participated for the last three years. In short, the REFLECT project was concerned with the research and development of three different kinds of biocybernetic loop: (1) detection of emotion, (2) diagnosis of mental workload, and (3) assessment of physical comfort. Psychophysiological measures were used to assess (1) and (2), whilst physical movement (fidgeting) in a seated position was used for (3). All of this was integrated into the ‘cockpit’ of a Ferrari.

The idea behind the emotional loop was to have the music change in response to emotion (to alleviate negative mood states). The cognitive loop would block incoming calls if the driver was in a state of high mental workload, and air-filled bladders in the seat would adjust to promote physical comfort. You can read all about the project here. Above you’ll find a promotional video that I’ve only just discovered – the reason for the delay in posting it is probably vanity: the filming was over before I got to the Ferrari site in Maranello. The upside of my absence is that you can watch the much more articulate and handsome Dick de Waard explain the cognitive loop in the film, which was our main involvement in the project.
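For readers who like the logic spelled out, here is a minimal sketch of the three loops as simple threshold rules. This is my own paraphrase rather than REFLECT code; the state estimates, thresholds and action names stand in for whatever the project’s real classifiers and actuators used.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DriverState:
    negative_affect: float   # emotional loop input (psychophysiology)
    mental_workload: float   # cognitive loop input (psychophysiology)
    fidgeting: float         # physical loop input (movement in the seat)


def emotional_loop(state: DriverState) -> Optional[str]:
    # Change the music to alleviate a negative mood state.
    return "switch_to_calming_music" if state.negative_affect > 0.7 else None


def cognitive_loop(state: DriverState) -> Optional[str]:
    # Block incoming calls when mental workload is high.
    return "block_incoming_calls" if state.mental_workload > 0.8 else None


def physical_loop(state: DriverState) -> Optional[str]:
    # Adjust the air-filled bladders in the seat to promote comfort.
    return "adjust_seat_bladders" if state.fidgeting > 0.5 else None


state = DriverState(negative_affect=0.9, mental_workload=0.85, fidgeting=0.2)
actions = [a for loop in (emotional_loop, cognitive_loop, physical_loop)
           if (a := loop(state)) is not None]
print(actions)   # ['switch_to_calming_music', 'block_incoming_calls']
```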


Lifestreams, body blogging and sousveillance

[Image: colour chart of one month of body-blogged heart rate data]

Way back in June, I planned to write a post prompted by Kevin Kelly’s talk at the Quantified Self conference in May and a new word I’d heard in an interview with David Brin. Between then and now, the summer months have whipped by, so please excuse the backtracking.

Those of you who have seen the site before will have heard of our bodyblogger project, where physiological data is collected on a continuous basis and shared with others via social media sites or directly on the internet. For instance, most of the time, the colour scheme for this website responds to heart rate changes of one of our bodybloggers (green = normal, yellow = higher than normal, red = much higher than normal – see this for full details). This colour scheme can be mapped over several days, weeks and months to create a colour chart representation of heart rate data – the one at the top of this post shows a month’s worth of data (white spaces = missing data).
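For the curious, the colour mapping can be thought of as simple thresholding of heart rate against a personal baseline. The sketch below is a toy version with a made-up baseline and thresholds; see the linked post for how the real scheme is defined.

```python
def heart_rate_colour(bpm: float, baseline: float = 70.0) -> str:
    """Map a heart rate sample to the site's colour scheme:
    green = normal, yellow = higher than normal, red = much higher."""
    if bpm <= baseline * 1.15:
        return "green"
    if bpm <= baseline * 1.35:
        return "yellow"
    return "red"


# One colour per sample; strung together over days and weeks these form
# the kind of colour chart shown at the top of the post.
samples = [68, 74, 91, 102, 77]
print([heart_rate_colour(s) for s in samples])   # ['green', 'green', 'yellow', 'red', 'green']
```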


Physiological Computing: Challenges for Developers and Users

I recently received a questionnaire from the European Parliament, or rather its STOA (Science and Technology Options Assessment) panel, with respect to developments in physiological computing and their implications for social policy. The European Technology Assessment Group (ETAG) is working on a study entitled “Making Perfect Life”, which includes a section on biocybernetic adaptation as well as BCI and other kinds of “assistive” technology. The accompanying email told me the questionnaire would take half an hour to complete (it didn’t), but it asked some interesting questions, particularly around the general public’s view of this technology and issues of data protection.

I’ve included a slightly-edited version of the questionnaire with my responses. Questions are in italics.

Biometrics and evaluation of gaming experience part two: a thought experiment

Recent posts on the blog have concerned the topic of psychophysiology (or biometrics) and the evaluation of player experience.  Based on those posts and the comments that followed, I decided to do a thought experiment.

Imagine that I work for a big software house that wants to sell as many games as possible and ensure that its product (which costs on average $3-5 million to develop per platform) is as good as it possibly can be – and one of the suits from upstairs calls and asks me: “How should we be using biometrics as part of our user experience evaluation? The equipment is expensive, it’s labour-intensive to analyse and nobody seems to understand what the data means.” (This sentiment is not exaggerated; I once presented a set of fairly ambiguous psychophysiological data to a fellow researcher who nodded purposefully and said, “So the physiology stuff is voodoo.”)

Here’s a list of 10 things I would push for by way of a response.
