Neuroadaptive Technology Conference 2019


The international conference on Neuroadaptive Technology will be held on 16-18 July 2019 in Liverpool. This will be the second meeting on this topic; the first took place in Berlin two years ago. You’ll find a link at the top of this page for the schedule, registration costs and other details about the meeting.

In this short post, I’d like to give a little background for the meeting and say some things about the goals and scope of the conference. The original idea came from a conversation between myself and Thorsten Zander (my co-organiser) about the absence of any forum dedicated to this type of implicit closed-loop technology. My work on physiological computing systems was always multidisciplinary, encompassing psychological sciences, wearable sensors, signal processing and human-computer interaction. Work in the area was being submitted and published at conferences dedicated to engineering and computer science, but these meetings always emphasised one specific aspect, such as sensors, signal processing or machine learning. I wanted a meeting where all aspects of the closed loop were equally represented, from sensors through to interface design. Thorsten, on the other hand, had developed the concept of passive brain-computer interfaces, where EEG signals are translated into control at the interface without any intentionality on the part of the user.

We had at least two things in common: we were both interested in closed-loop control using signals from the brain and the body, and we were both frustrated that our work didn’t seem to fit comfortably into existing forums.

Thorsten took the first step and organised a passive BCI meeting at Delmenhorst, just outside Bremen, over two (very hot) days in August 2014. On the last day of that meeting, along with the other attendees, we batted around various names with which to christen this emerging area of work. If memory serves, nobody came up with a label that everyone in the room completely endorsed. The term ‘neuroadaptive technology’, which I appropriated from this 2003 paper by Lawrence Hettinger and colleagues, was the one that people were least unhappy about – and so, when it came time to organise the first conference, that was the name we ran with.

From the beginning, we decided to make the ‘neuro’ in the title of the conference as broad as possible, encompassing psychophysiological sensors/measures as well as those derived from neurophysiology. At that first conference, we also wanted to draw attention to the breadth of work in this field, and so we invited Rob Jacob as a keynote to talk about new modes of human-computer interaction and Pim Haselager to address the ethical implications of the technology, as well as speakers on EEG signal processing. A full list of abstracts and the schedule for that 2017 meeting are available here.

The fundamental thinking behind the neuroadaptive technology conference is that, despite the wide range of applications under consideration in this field, which runs from autonomous driving to marketing, researchers share a significant number of interests, such as: sensor design, signal processing methods in the field, machine learning for classification, designing implicit modes of human-computer interaction, and establishing methodologies for evaluation – and that’s far from an exhaustive list.

And so, in Liverpool this July, we’ll be doing it all again with a wide range of speakers from around the world. The deadline for abstract submission is 31st March 2019 and we’re in the process of organising keynote speakers and a clear route to publication for the work presented at the conference.

Full details will appear at the link from the top of this page over the next few months.


IEEE Computer Special Issue on Physiological Computing


The October 2015 edition of IEEE Computer magazine is devoted to the topic of Physiological Computing. Giulio Jacucci, Erin Solovey and I acted as co-editors, and the introduction for the magazine is available here.

The papers included in the special issue cover a range of topics, including: measurement of stress in VR, combining pupillometry with EEG to detect changes in operator workload, and using mobile neuroimaging to create attention-aware technologies.

There is also a podcast associated with the special issue featuring the guest editors in conversation with Robert Jacob from Tufts University on current topics and future directions in Physiological Computing – you can hear it here.


Special Issue of Interacting with Computers


I am one of the co-editors of a special issue of Interacting with Computers, which is now available online here. The title of the special issue is Physiological Computing for Intelligent Adaptation; it contains five full research papers covering a range of topics, such as: the use of VR for stress reduction, mental workload monitoring and a comparison of EEG headsets.


Book Announcement – Advances in Physiological Computing

It was way back in 2011, during our CHI workshop, that we first discussed the possibility of putting together an edited collection for Springer on the topic of physiological computing. It was clear to me at that time that many people associated physiological computing with implicit monitoring, as opposed to the active control that characterised BCI. When we had the opportunity to put together a collection, one idea was to extend the scope of physiological computing to include all technologies where signals from the brain and the body are used as a form of input. Some may interpret this relabelling of physiological computing as an all-inclusive category to be a provocative move. But we did not intend this option as a conceptual ‘land-grab’; rather, it was an attempt to be as inclusive as possible and to bring together what I still perceive to be a rather disparate and fractured research community. After all, we are all using psychophysiology in one form or another and share a common interest in sensor design, interaction mechanics and real-time measurement.

The resulting book is finally close to publication (tentative date: 4th April 2014) and you can follow this link to get the full details.  We’re pleased to have a wide range of contributions on an array of technologies, from eye input to digital memories via mental workload monitoring, implicit interaction, robotics, biofeedback and cultural heritage.  Thanks to all our contributors and the staff at Springer who helped us along the way.



The Epoc and Your Next Job Interview


Imagine you are waiting to be interviewed for a job that you really want. You’d probably be nervous, fingers drumming on the table, eyes darting restlessly around the room. The door opens and a man appears; he is wearing a lab coat and holding an EEG headset in both hands. He places the headset on your head and says, “Your interview starts now.”

This Philip K Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul. And thankfully a camera was present to capture this WTF moment for each candidate, so the video could be uploaded to Vimeo.

The rationale for the exercise is quite clear. The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test where candidates watch a series of acclaimed ads and the Epoc is used to measure their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way. Those who exhibit the greatest passion for adverts get the job (this is the narrative of the video; in reality, one suspects/hopes they were interviewed as well).

I’ve seen at least one other blog post that expressed some reservations about the process.

Let’s take a deep breath because I have a whole shopping list of issues with this exercise.

Continue reading


The Ultimate Relax to Win Dynamic

I came across an article in a Sunday newspaper a couple of weeks ago about an artist called xxxy who has created an installation using a BCI of sorts. I’m piecing this together from what I read in the paper and what I could see on his site, but the general idea is this: a person wears a portable EEG rig (I don’t recognise the model) and is placed in a harness with wires reaching up and up and up into the ceiling. The person closes their eyes and relaxes – presumably, as they enter a state of alpha augmentation, they begin to levitate courtesy of the wires. The more they relax, or the longer they sustain that state, the higher they go. It’s hard to tell from the video, but the person seems to be suspended around 25-30 feet in the air.
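To make the closed loop concrete, here is a minimal sketch of how such a ‘relax to win’ dynamic could work: estimate alpha-band power from the EEG, compare it against a resting baseline, and nudge a height setpoint up or down accordingly. This is purely my own illustration, not the artist’s system; the sampling rate, gain and winch interface are all assumptions.

```python
# Illustrative sketch of a 'relax to win' control loop (my own toy
# example; the artist's actual implementation is unknown).
import numpy as np
from scipy.signal import welch

FS = 256                  # assumed EEG sampling rate (Hz)
ALPHA_BAND = (8.0, 12.0)  # alpha frequency range (Hz)
MAX_HEIGHT_M = 9.0        # roughly the 25-30 feet seen in the video

def alpha_power(eeg_window: np.ndarray) -> float:
    """Mean power spectral density in the alpha band for one window
    (the window should be at least 2 s long at this nperseg)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS * 2)
    mask = (freqs >= ALPHA_BAND[0]) & (freqs <= ALPHA_BAND[1])
    return float(psd[mask].mean())

def height_command(current_alpha: float, baseline_alpha: float,
                   height: float, gain: float = 0.05) -> float:
    """Raise the harness while alpha exceeds baseline, else lower it."""
    if current_alpha > baseline_alpha:
        height += gain   # sustained relaxation -> ascend
    else:
        height -= gain   # alpha drops back -> descend
    return float(np.clip(height, 0.0, MAX_HEIGHT_M))
```

Run at a few updates per second, a loop like this would reproduce the dynamic described above: the longer the wearer sustains an elevated alpha state, the higher the setpoint climbs.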

Continue reading


Physiological Computing meets Augmented Reality in a Museum

First of all, an apology – Kiel and I try to keep this blog ticking over, but for most of 2011, we’ve been preoccupied with a couple of large projects and getting things organised for the CHI workshop in May.  One of the “things” that led to this hiatus on the blog is a new research project funded by the EU called ARtSENSE, which is the topic of this post.

Continue reading


Studentships in Physiological Computing

Liverpool John Moores University
PhD Studentships in Applied Neuroscience/Psychophysiology
School of Natural Sciences and Psychology

EDIT: Application closed

Please quote Ref: IRC544

Applications are invited for two PhD studentships in the School of Natural Sciences and Psychology. The studentships consist of a tax-free stipend (currently £13,590 per annum for the 2010-2011 academic year) and tuition fees.

We seek candidates with a strong research background and interest in physiological computing research (http://www.physiologicalcomputing.net/) for a new research project funded by the EU. Specifically, we are seeking to fund studentships in two areas associated with this project:
Continue reading


Revised Physiological Computing FAQ

This is a short post to inform regular readers that I’ve made some changes to the FAQ document for the site (link to the left). Normally, people alter an FAQ because the questions being asked have changed. In our case, it is my answers to those questions that have changed in the time since I wrote the original responses – hence the need to revise the FAQ.

The original document firmly identified physiological computing with affective computing/biocybernetic adaptation. There was even a question making a firm division between BCI technology and physiological computing. In the revised FAQ, I’ve dumped this distinction and attempted to view BCI as part of a broad continuum of computing devices that rely on real-time physiological data for input. This change has not been made to arrogantly subsume BCI within the physiological computing spectrum, but to reconcile perspectives from different research communities working on common measures and technologies across different application domains. In my opinion, the distinctions between research topics and application domains (including my own) are largely artificial, and the advancement of this technology is best served by keeping an open mind about mash-ups and hybrid systems.

I’ve also expanded the list of indicative references to include contributions from BCI, telemedicine and adaptive automation in order to highlight the breadth of applications that are united by physiological data input.

The FAQ is written to support the naive reader, who may have stumbled across our site, but as ever, I welcome any comments or additional questions from domain experts.


Valve experimenting with physiological input for games

This recent interview with Gabe Newell of Valve caught our interest because it’s so rare that a game developer talks publicly about the potential of physiological computing to enhance the experience of gamers. The idea of using live physiological data feeds to adapt computer games and enhance game play was first floated by Kiel in these papers way back in 2003 and 2005. Like Kiel, in my writings on this topic (Fairclough, 2007; 2008 – see publications here), I focused exclusively on two problems: (1) how to represent the state of the player, and (2) what the software could do with this representation of the player state. In other words, how can live physiological monitoring of the player state inform real-time software adaptation? For example, making the game harder, changing the music or offering help (a set of strategies that Kiel summarised in three categories: challenge me/assist me/emote me) – but making these adjustments in real time in order to enhance game play. A rough sketch of this loop appears below.
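To illustrate the loop, here is a toy sketch, entirely my own and not code from Valve or from Kiel’s papers, in which hypothetical physiological features are classified into a player state and each state is mapped onto one of the three adaptation strategies. The features, thresholds and state labels are placeholders for the sake of the example.

```python
# Toy sketch of the sense -> classify -> adapt loop described above.
# All features, thresholds and strategy text are illustrative placeholders.
from dataclasses import dataclass
from enum import Enum, auto

class PlayerState(Enum):
    BORED = auto()
    FRUSTRATED = auto()
    ENGAGED = auto()

@dataclass
class Features:
    heart_rate: float        # beats per minute
    skin_conductance: float  # microsiemens
    frontal_alpha: float     # normalised EEG alpha power, 0..1

def classify(f: Features) -> PlayerState:
    """Toy rule-based classifier; a real system would be trained per player."""
    if f.heart_rate > 100 and f.skin_conductance > 8.0:
        return PlayerState.FRUSTRATED
    if f.frontal_alpha > 0.7 and f.heart_rate < 70:
        return PlayerState.BORED
    return PlayerState.ENGAGED

def adapt(state: PlayerState) -> str:
    """Map player state to one of the three adaptation strategies."""
    if state is PlayerState.BORED:
        return "challenge me: raise difficulty, spawn tougher enemies"
    if state is PlayerState.FRUSTRATED:
        return "assist me: offer help, ease the current encounter"
    return "emote me: adjust music and lighting to deepen immersion"

# One pass through the loop with example readings:
print(adapt(classify(Features(heart_rate=110, skin_conductance=9.2,
                              frontal_alpha=0.4))))
```

In practice, the classifier would be trained on labelled data for each player and the adaptations smoothed over time, so the game does not twitch in response to every transient change in the signal.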

Continue reading
