Wednesday 19 December 2012

Paper accepted for Intelligent User Interfaces conference


Our paper entitled "Towards Cooperative Brain-Computer Interfaces for Space Navigation" has been accepted for presentation at ACM's Intelligent User Interfaces (IUI) 2013 conference. The review process was extremely selective, with only about 20% of submissions accepted for presentation. The paper is co-authored by Riccardo Poli, Caterina Cinel, Ana Matran-Fernandez, Francisco Sepulveda and Adrian Stoica.

Here is the abstract of the paper:


We explored the possibility of controlling a spacecraft simulator using an analogue Brain-Computer Interface (BCI) for 2-D pointer control. This is a difficult task, for which no previous attempt has been reported in the literature. Our system relies on an active display which produces event-related potentials (ERPs) in the user’s brain. These are analysed in real-time to produce control vectors for the user interface. In tests, users of the simulator were told to pass as close as possible to the Sun. Performance was very promising, on average users managing to satisfy the simulation success criterion in 67.5% of the runs. Furthermore, to study the potential of a collaborative approach to spacecraft navigation, we developed BCIs where the system is controlled via the integration of the ERPs of two users. Performance analysis indicates that collaborative BCIs produce trajectories that are statistically significantly superior to those obtained by single users.
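The abstract does not describe how the ERPs of the two users are integrated into a single command. Purely as an illustration, here is a minimal sketch in Python (the language of our BCI Mouse system): the linear projection, the equal-weight fusion rule and all function names are our assumptions, not the method used in the paper.

```python
import numpy as np

def erp_to_control(erp_epoch, projection):
    """Map an ERP epoch (channels x samples) to a 2-D control vector.

    The fixed linear projection is a hypothetical stand-in for the
    real-time ERP analysis mentioned in the abstract.
    """
    features = erp_epoch.mean(axis=1)      # average over time samples
    return projection @ features           # shape (2,)

def fuse_controls(v1, v2, w1=0.5, w2=0.5):
    """Integrate two users' control vectors by a weighted average,
    normalised to a unit-length steering command (illustrative only)."""
    v = w1 * np.asarray(v1, float) + w2 * np.asarray(v2, float)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v
```

With equal weights, two users pulling along different axes steer the craft along the diagonal between them.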

Monday 17 December 2012

Paper accepted for oral presentation at CogSIMA conference


Our paper entitled "Improving Decision-making based on Visual Perception via a Collaborative Brain-Computer Interface" has been accepted for oral presentation at the 2013 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA). The paper is co-authored by Riccardo Poli, Caterina Cinel, Francisco Sepulveda and Adrian Stoica.

Here is the abstract of the paper:


In the presence of complex stimuli, in the absence of sufficient time to complete the visual parsing of a scene, or when attention is divided, an observer can only take in a subset of the features of a scene, potentially leading to poor decisions. In this paper we look at the possibility of integrating the percepts from multiple non-communicating observers as a means of achieving better joint perception and better decision making. Our approach involves the combination of brain-computer interface (BCI)
technology with human behavioural responses. To test our ideas in controlled conditions, we asked observers to perform a simple visual matching task involving the rapid sequential presentation of pairs of visual patterns and the subsequent decision as to whether the two patterns in a pair were the same
or different. Visual stimuli were presented for insufficient time for the observers to be certain of the decision. The degree of difficulty of the task also depended on the number of matching features between the two patterns. The higher the number, the more difficult the task. We recorded the response times of observers as well as a neural feature which predicts incorrect decisions and, thus, indirectly indicates the confidence of the decisions made by the observers. We then built a composite neuro-behavioural feature which optimally combines these behavioural and neural measures. For group decisions, we tested the use of a majority rule and three further decision rules which weigh the decisions of each observer based on response times and our neural and neuro-behavioural features. Results indicate that the integration of behavioural responses and neural features can significantly improve accuracy when compared with individual performance. Also, within groups of each size, decision rules based on such features outperform the majority rule.
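The decision rules compared in the abstract can be illustrated with a toy weighted-voting sketch. This is not the paper's optimal combination: the confidence weights here are placeholders for values that would, in the real system, be derived from response times and the neural feature.

```python
def majority_rule(votes):
    """votes: +1 ('same') or -1 ('different') per observer."""
    return 1 if sum(votes) >= 0 else -1

def weighted_rule(votes, confidences):
    """Weigh each observer's vote by a confidence estimate
    (illustrative only; the paper's optimal neuro-behavioural
    weighting is not reproduced here)."""
    score = sum(v * c for v, c in zip(votes, confidences))
    return 1 if score >= 0 else -1
```

For instance, one confident observer can overturn two uncertain ones: `weighted_rule([1, -1, -1], [0.9, 0.2, 0.2])` returns +1, whereas `majority_rule([1, -1, -1])` returns -1.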

Friday 14 December 2012

Recent papers published within the project


We have published a number of papers with the support of this grant in the last few months. Here is a list:

[1].   J Cannan and H. Hu, A Multi-Sensor Armband based on Muscle and Motion Measurements, Proc. of IEEE Int. Conf. on Robotics and Biomimetics, Guangzhou, China, 11-14 December 2012, pages 1098-1103.
[2].   S. Wang, L. Chen, H. Hu and K. McDonald-Maier, Doorway Passing of an Intelligent Wheelchair by Dynamically Generating Bézier Curve Trajectory, Proc. of IEEE Int. Conf. on Robotics and Biomimetics, Guangzhou, China, 11-14 December 2012, pages 1206-1211.
[3].   E.J. Rechy-Ramirez, H. Hu and K. McDonald-Maier, Head movements based control of an intelligent wheelchair in an indoor environment, Proc. of IEEE Int. Conf. on Robotics and Biomimetics, Guangzhou, China, 11-14 December 2012, pages 1464-1469.
[4].   L. Chen, H. Hu and K. McDonald-Maier, EKF based Mobile Robot Localisation, Proc. of the 3rd International Conf. on Emerging Security Technologies (EST-2012), Lisbon, Portugal, 5-7 Sept. 2012, pages 149-154.
[5].   S. Wang, H. Hu and K. McDonald-Maier, Optimization and Sequence Search based Localization in Wireless Sensor Networks, Proc. of the 3rd International Conf. on Emerging Security Technologies (EST-2012), Lisbon, Portugal, 5-7 September 2012, pages 155-160.
[6].   Y. Kovalchuk, H. Hu, D. Gu, K. McDonald-Maier, D. Newman, S. Kelly, G. Howells, Investigation of Properties of ICmetrics Features, Proc. of the 3rd International Conf. on Emerging Security Technologies (EST-2012), Lisbon, Portugal, 5-7 September 2012, pages 115-120.
[7].   Y. Kovalchuk, H. Hu, D. Gu, K. McDonald-Maier, G. Howells, ICmetrics for Low Resource Embedded Systems, Proc. of the 3rd International Conf. on Emerging Security Technologies (EST-2012), Lisbon, Portugal, 5-7 September 2012, pages 121-126.
[8].   B. Lu, D. Gu, H. Hu and K. McDonald-Maier, Sparse Gaussian Process for Spatial Function Estimation with Mobile Sensor Networks, Proc. of the 3rd International Conf. on Emerging Security Technologies (EST-2012), Lisbon, Portugal, 5-7 September 2012, pages 145-150.
[9].   L. Chen, S. Wang and H. Hu, Bézier Curve based Path Planning for an Intelligent Wheelchair to pass a Doorway, Proceedings of the UKACC Int. Conference on Control, Cardiff, 3-5 September 2012.

Wednesday 12 September 2012

ROBOSAS Workshop – Robotics

We held a joint workshop in the Essex Robot Arena between JPL and Essex on the special topic of Robotics.

Here is the program of the day:


09:30 Huosheng Hu – Introduction of Essex robotics research
09:40 Adrian Stoica – Human-centred robotics: learning & multi-robot control
10:10 Yumi Iwashita – Visual recognition for robots
10:30 Adrian Clark – Learning to see
10:50 Theo Theodoridis – Multi-modality control of multiple robots
11:30 James Cannan – Robot management and wearable technology
11:50 Ericka Rechy-Ramirez – Facial expressions based control of wheelchair
12:10 Lunch
14:00 John Woods – Region based analysis of images for robots
14:30 Ling Chen – IMU/GPS based pedestrian localization
14:50 Hossein Farid Ghassem Nia – Bayesian decision theory for optical counting
15:10 Sen Wang – Localization in wireless sensor networks
15:30 Potential collaboration discussion in Room 1NW.3.7
17:30 End

Saturday 1 September 2012

Klaus's visit to JPL this Summer

Professor Klaus McDonald-Maier, who leads this collaborative EPSRC Global Engagement Project with NASA, visited JPL this Summer. Klaus is interested in increasing the reliability of hardware and software architectures in robotic systems. These systems include extra-terrestrial robotic rovers such as Curiosity, the Mars Science Laboratory rover, which recently landed successfully on Mars, where it will undertake a series of experiments. Curiosity relies on complex programmable control systems and an enormous amount of dedicated software, which must be extremely reliable as it is deployed in such a remote location.

Klaus commented ‘It was extremely exciting being at NASA JPL when Curiosity successfully landed and it will be very interesting as it embarks on fundamental experiments which should give us great insights into the structure and characteristics of Mars.’



Friday 10 August 2012

Maria's visit to JPL this Summer

Prof Maria Fasli, co-investigator in the project, visited Adrian Stoica at JPL between the end of July and early August this Summer. Maria is an expert in multi-agent systems and is looking at mixed-initiative systems for robot control with Adrian's group.

Friday 3 August 2012

Collaborative Brain Computer Interface for Simulated Spacecraft Control

While at NASA JPL in Summer 2012 working with Adrian Stoica, Riccardo, Caterina and Francisco experimented with the real-time control of a simulated spacecraft using the brain signals of two users.

Here is the setup


The subject on the left is Riccardo. The subject on the right is Luis Clark, a student volunteer.

Here is a close-up of the simulator


The objective was to pass as close as possible to the planet (intermittently visible in white). Here is a movie of one of the simulations


Development and experimentation with the software started some months earlier at Essex and are still ongoing. Preliminary results on this line of research will appear at the Intelligent User Interfaces conference in March 2013 in a paper entitled "Towards Cooperative Brain-Computer Interfaces for Space Navigation".

Here is the abstract:

We explored the possibility of controlling a spacecraft simulator using an analogue Brain-Computer Interface (BCI) for 2-D pointer control. This is a difficult task, for which no previous attempt has been reported in the literature. Our system relies on an active display which produces event-related potentials (ERPs) in the user's brain. These are analysed in real-time to produce control vectors for the user interface. In tests, users of the simulator were told to pass as close as possible to the Sun. Performance was very promising, on average users managing to satisfy the simulation success criterion in 67.5% of the runs. Furthermore, to study the potential of a collaborative approach to spacecraft navigation, we developed BCIs where the system is controlled via the integration of the ERPs of two users. Performance analysis indicates that collaborative BCIs produce trajectories that are statistically significantly superior to those obtained by single users.



Tuesday 31 July 2012

Do acupuncture points provide better or worse electrode locations for BCI?


One of the studies we did at JPL during the Summer was to investigate the effect of supposed acupuncture points on scalp electrode/skin impedance (which may result in unusual EEG readings). We could only do a limited number of measurements due to the limitations of the equipment we had with us.

Luis Clark (a student) volunteered to be the subject of the tests. Below is the montage we were able to put together. It places 6 electrodes approximately 1cm apart in the area of a well-known acupuncture point.



We found nothing significant (albeit with one participant only), but a larger study will be run in the future before the hypothesis can be ruled out.

Tuesday 24 July 2012

Some of our researchers visiting JPL

Here is a picture of some of our team members visiting NASA's Jet Propulsion Laboratory in July 2012, just as Curiosity was approaching Mars. For the occasion, JPL set up a display of replicas (some to scale, some 1:1) of the various rovers launched by NASA over the years.


From right to left: Riccardo Poli, Caterina Cinel, and Francisco Sepulveda (all from Essex), Adrian Stoica (our main collaborator at JPL) and Luis Clark (a student who acted a few times as a subject for our experiments at JPL).

Friday 25 May 2012

Riccardo and Francisco's visit to JPL

Francisco and Riccardo visited Adrian Stoica's lab at JPL in late May 2012, interacting particularly with Adrian but also with Christopher Assad (who is specifically interested in EMG applications to robotic control and prostheses) and others.

They were also invited to give a tutorial on BCI to bring people in the lab up to speed on the state of the art and recent developments in the field.

The seminar was entitled "Brain Signals + Evolutionary Computation = Human Competitive Brain Computer Interfaces". Here is a summary:

The keyboard and mouse provide us with reliable but unnatural forms of input, being primitive transducers of muscular movement. People who lack muscle control cannot use them. Wouldn't it be nice some day to be able to replace the mouse and keyboard with systems capable of directly interpreting the intentions of computer users from their brain activity?

This is the goal of the field of Brain-Computer Interfaces (BCI). Unfortunately, this goal is hampered by a number of problems: brain signals are typically extremely noisy, they vary in location and temporal dynamics from subject to subject, they depend on the age, tiredness, attention, and food and drug intake of subjects, etc. So, even the best BCIs are extremely slow and prone to misinterpreting user intentions.

In this seminar we will briefly review the different approaches to BCI, with particular attention to non-invasive EEG-based BCIs, highlighting their difficulties and limitations. We will then illustrate a number of cases from our own research in the Essex BCI group, where evolutionary algorithms and genetic programming in particular have helped develop systems which are competitive with human-designed ones, thereby accelerating the development of practical BCI technology.
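The noisiness mentioned in the summary is classically tackled by time-locked averaging of ERP epochs, so that activity unrelated to the stimulus tends to cancel out. A minimal sketch (the array shapes, window lengths and function name are arbitrary illustrative choices, not part of the seminar material):

```python
import numpy as np

def average_erp(eeg, event_samples, pre=50, post=150):
    """Average time-locked epochs, the classic ERP noise-reduction step.

    eeg: array of shape (channels, samples);
    event_samples: sample indices of stimulus onsets;
    pre/post: epoch window, in samples, around each event.
    """
    epochs = [eeg[:, s - pre:s + post]
              for s in event_samples
              if s - pre >= 0 and s + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)     # shape (channels, pre + post)
```

Averaging over N epochs attenuates zero-mean noise by roughly a factor of the square root of N while the event-locked component survives.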


Thursday 10 May 2012

Potential Simulators for the Project



Caterina and I (Riccardo) have searched for space flight simulators, focusing particularly on anything written in Python or that can be interfaced to Python (since this is the language we have used for our BCI Analogue Mouse system).

Here is a preliminary list:
  • JPL has developed its own simulators and used them in a number of missions. Two have caught my eye:
    • DARTS (which stands for Dynamics Algorithms for Real-Time Simulation), a general simulator for multi-body dynamics (on which other simulators are based). This could be very good for simulating the flying of a spacecraft.
    • ROAMS (which stands for Rover Analysis, Modeling and Simulation), which would be good for simulating rovers performing missions on a planet.
  • Vega Strike is a game of space exploration and missions. I've downloaded and installed version 0.5.1 on Ubuntu and it worked really well (note: I did not get it from the Ubuntu repositories, since that version is partly broken). Apparently one can create missions by Python scripting, but I haven't got round to testing that. It would give us the possibility of running really realistic simulations. (One issue to consider is whether our BCI Mouse software would interact properly with it, since it grabs the mouse pointer and goes full screen.)
  • Space Commander is a physically realistic simulator of space travel written entirely in Python. I've already committed and adapted version 0.4 of the simulator (see changesets r3 and r4). Because it is written in Python and is very compact, I would suggest using it for the first few experiments. (Incidentally, it uses pygame, which I think is the same package PsychoPy uses; PsychoPy handles stimulus presentation etc. in our BCI Mouse.)
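Whichever simulator we settle on, the control side amounts to feeding the BCI's 2-D pointer vector into a physics update each frame. A minimal Newtonian sketch (the function name, thrust and time-step values are illustrative assumptions, not taken from any of the simulators above):

```python
import numpy as np

def step(pos, vel, control, thrust=0.5, dt=0.1):
    """One Newtonian update of a 2-D spacecraft: the BCI-derived
    control vector is treated as a thrust direction.
    (Illustrative constants; not from DARTS, Vega Strike or
    Space Commander.)"""
    acc = thrust * np.asarray(control, dtype=float)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel
```

Starting at rest with control [1, 0], one step gives vel = [0.05, 0] and pos = [0.005, 0]; calling `step` once per frame with the latest fused control vector yields a trajectory.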

Friday 13 April 2012

Some Potential for Assisted Living

The research in the area of Brain Computer Interfaces to be carried out in this project may also have an impact on Assisted Living. We will try to exploit its results within the Ageing and Assisted Living Network of the University of Essex.

Thursday 12 April 2012

We will use our analogue BCI mouse for this project

In earlier research (partly funded by the Engineering and Physical Sciences Research Council, EPSRC) we developed an analogue Brain Computer Interface for cursor control which we call the Essex BCI Mouse. We have decided to use it as a starting point to build some of the control and navigation applications to be explored in this project.

Here is a short video we produced last year to illustrate the operation of our BCI mouse:


More information on the BCI Mouse project is available here.

Thursday 5 April 2012

Adrian Stoica and the JPL Advanced Robotic Controls Group

General

Here is Adrian's webpage and the Advanced Robotic Controls Group's page. They are interested in:
  • Human-robot interfaces and controls using biological signals such as EMG and EEG
  • Mobility algorithms, planning and navigation for underwater, surface, and aerial robotic platforms
  • Robot intelligence, behavior control, robot learning, cognition and decision making, including distributed reasoning and optimal control for cooperative robots

Interest in Biological Interfaces

In the area of biological interfaces, Christopher Assad and Michael Wolf have interests in the field. Also, one of their key projects, entitled Bioelectric Sensor Arrays for Reliable Prosthetic Interface, is sponsored by DARPA. It is mostly about EMG, but its objectives are similar to those of BCI.

Facilities

They have a number of robotic arenas and a very sophisticated simulator called DARTS.