5th Dutch Bio-Medical Engineering Conference 2015
22-23 January 2015, Egmond aan Zee, The Netherlands
15:00 (25 mins)   Keynote Lecture: Prof. Paul McCullagh (University of Ulster, Belfast)
THE SMART ENVIRONMENT CONTROLLED BY THE INTELLIGENT HUMAN: AT THE INTERFACE
Paul McCullagh
Abstract: This presentation discusses two areas of relevance to biomedical engineering: (i) advances in smart environments and pervasive computing and (ii) communication and control using scalp-recorded brain signals. Merging these areas offers the opportunity for a new, intuitive human-computer interface based on collaboration between human and environment. Realising this human-environment collaboration presents a significant technical challenge for biomedical engineering and, once successfully deployed, a wider societal challenge as we begin to understand the quantification of our brain processes.

The environment has never been smarter. Embedded sensor networks communicate as an ‘Internet of Things’, partially realising Mark Weiser’s vision from 1991: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” [1]. Artificial intelligence algorithms, the semantic web and big data stored in the cloud are the key computing technologies of 2015 and beyond; they could relieve us of repetitive domotic tasks. And yet Stephen Hawking, one of Britain's pre-eminent scientists, has recently remarked that efforts to create ‘thinking machines’ pose a threat to our very existence [2], reminding us of Stanley Kubrick's 1968 science-fiction film ‘2001: A Space Odyssey’ and HAL, its devious on-board computer.

The worldwide population is ageing rapidly. Many older people have multiple complex chronic conditions. Middle-aged people survive strokes and heart failure, living longer but often with a reduced quality of life and ever greater reliance on assistive technology. The cost of care is rising rapidly, partly due to the introduction of expensive new drugs and diagnostic equipment. Significant successes in biomedical engineering have contributed to this demographic shift. The current and future delivery of healthcare is a huge political issue in many countries.
According to Stephen Intille [3], “We must change the way we deliver healthcare. We need better, more holistic, life-long care at lower cost”. Biomedical engineering should help here, applying pervasive technology and smart environments to wellness monitoring and the management of chronic disease and normal ageing. The ‘quantified-self’ movement is in the vanguard of this paradigm shift.

So how will the human control (or be controlled by) these smart environments? Interaction must be intuitive, straightforward and reliable. Touch screen, voice control and gesture input have complemented the traditional keyboard and mouse. The Brain-Computer Interface (BCI) is a relatively new paradigm, which aims to empower a person’s capabilities by providing a reliable input modality that does not require the involvement of peripheral muscles. Our increased understanding of the function of the human brain, coupled with advances in electrodes, amplifiers and signal processing, has led to viable BCI systems outside the laboratory. BCI has hitherto been the subject of research as an assistive technology, but it could become a pervasive technology thanks to the introduction of new commercial headsets. Whilst the dream of thought-controlled devices lives on, current reality focuses on algorithms that can recognise specific patterns in acquired brain signals across the spatial, temporal and frequency domains. The most commonly employed data acquisition approach is measurement of the electroencephalogram (EEG). The mental state of the user is probed to generate electrical activity patterns, which vary in accordance with the operating protocol. Commonly employed approaches are Event-Related Desynchronisation/Synchronisation, Slow Cortical Potentials, Steady-State Visually Evoked Potentials, and the P300 and Error-Related Potential components of Event-Related Potential waveforms.
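As a minimal illustration of the frequency-domain pattern recognition described above, the sketch below detects a Steady-State Visually Evoked Potential by comparing spectral power at a set of candidate flicker frequencies. The sampling rate, epoch length, frequencies and noise level are all illustrative assumptions, not values from any particular BCI system:

```python
import numpy as np

def detect_ssvep(signal, fs, candidate_freqs):
    """Pick the candidate stimulation frequency with the most
    spectral power in the recorded epoch (simple FFT band-power)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in candidate_freqs:
        # Sum power in a narrow band (+/- 0.5 Hz) around each candidate.
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)
        powers.append(spectrum[band].sum())
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic 4-second EEG epoch: a 12 Hz SSVEP response plus noise
# (all parameters are hypothetical).
fs = 256                      # sampling rate in Hz
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

print(detect_ssvep(eeg, fs, [8, 10, 12, 15]))  # -> 12
```

A deployed SSVEP speller would add spatial filtering across channels and a confidence threshold, but the core decision, comparing band power across stimulation frequencies, is the one shown here.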
Current limitations are slow information transfer rates, inter- and intra-subject variability, inconvenient set-up procedures, and the need for carefully controlled environments. These obstacles have been addressed in EU-funded projects such as BRAIN, Michelangelo and BackHome. Such limitations adversely affect those who could potentially benefit most: individuals living with disease or traumatic injury, such as Amyotrophic Lateral Sclerosis, which in severe cases can lead to ‘locked-in syndrome’. Recent emphasis has been placed upon hybrid architectures combining different approaches (e.g. motor imagery and sensory stimulation) with complementary technologies, such as eye tracking. These systems have the potential to increase classification accuracy, thereby further promoting the acceptance and adoption of the technology. Thus hybrid BCI control through inexpensive, portable and commercially available devices can facilitate human-computer interaction within the smart environment. Better control can then be achieved by devolving (artificial) intelligence to the environment: the human initiates a task and leaves it to the sensor networks in the smart environment to complete it.

Quantifying the EEG could potentially allow a smart environment to evaluate the human. This can be beneficial, allowing the detection of tiredness (while driving) or even brain dysfunction and underlying conditions (the prevalence of dementia is set to rise with the ageing society). But at what point will the devolved artificial intelligence become the dominant factor? And could such an algorithm influence the human mind? These are the ethical concerns at the ‘interface’ that should be addressed as this technology matures. Echoes of HAL indeed, but we are not there - yet!
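The hybrid architectures mentioned above are often realised as late fusion: each modality produces its own per-command probability estimate and the decisions are merged. The sketch below combines a (hypothetical) EEG classifier and an eye tracker by weighted averaging; the weight, class labels and probability values are illustrative assumptions, not taken from any of the cited projects:

```python
import numpy as np

def fuse_hybrid(bci_probs, gaze_probs, w_bci=0.6):
    """Late fusion for a hybrid BCI: combine per-command probability
    estimates from an EEG classifier and an eye tracker by weighted
    averaging, then pick the most likely command.
    (Weight and command set are illustrative assumptions.)"""
    combined = (w_bci * np.asarray(bci_probs)
                + (1 - w_bci) * np.asarray(gaze_probs))
    return int(np.argmax(combined)), combined

# Four candidate domotic commands, e.g. lights / TV / door / heating.
bci = [0.40, 0.30, 0.20, 0.10]   # uncertain EEG classifier output
gaze = [0.10, 0.70, 0.10, 0.10]  # eye tracker strongly favours command 1
choice, probs = fuse_hybrid(bci, gaze)
print(choice)  # -> 1
```

Here the eye tracker resolves the ambiguity the EEG classifier alone could not, which is the accuracy gain the abstract attributes to hybrid systems; real fusion schemes would also weight each modality by its estimated reliability per user and session.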