Biosignal-Driven Art: Beyond Biofeedback
Miguel Ortiz / Niall Coghlan / Javier Jaimovich / Ben Knapp
Biosignal monitoring in interactive arts, although present for over forty years, remains a relatively little-known field of research within the artistic community compared to other sensing technologies. Since the early 1960s, an ever-increasing number of artists have collaborated with neuroscientists, physicians and electrical engineers to devise means of acquiring the minuscule electrical potentials generated by the human body. This has enabled direct manifestations of human physiology to be incorporated into interactive artworks. This article presents an introduction to this field of artistic practice and scientific research, which uses human physiology as its main element. A brief introduction to the main concepts and history of biosignal-driven art is followed by a review of various artworks and scientific enquiry developed by the authors, with the aim of giving a complete overview of the various strategies developed for biosignal-driven interactive art.
The Human Nervous System
It is possible to think of the human nervous system as a complex network of specialised cells that communicate information about the organism and its surroundings (Maton et al, 1994). In gross anatomy, the nervous system is divided into two sub-systems: the Central Nervous System (CNS) and the Peripheral Nervous System (PNS). The CNS is the largest part of the nervous system. For humans, it includes the brain and the spinal cord. It is responsible for coordinating the activity of all parts of the body. It processes information, is responsible for controlling the activity of the peripheral nervous system, and plays a fundamental role in the control of behaviour.
The PNS extends the CNS by providing a connection to the body’s limbs and organs. It provides a means for sensing the outside world and for manifesting volitional actions upon it. The PNS is further divided into the Autonomic Nervous System (ANS) and the Somatic Nervous System (SNS). The SNS is the component of the peripheral system concerned with sensing external stimuli from the environment and is responsible for the volitional control of the skeletal muscles that allow us to interact with the outside world (Knapp, Kim and André, 2010a). The ANS regulates the body’s internal state and its involuntary responses to internal and external events. It is further sub-divided into the Sympathetic Nervous System, which is responsible for physiological changes during times of stress, and the Parasympathetic Nervous System, which controls salivation, lacrimation, urination, digestion and defecation during the resting state. Figure 1 illustrates the taxonomy and organisation of the nervous system.
There are various techniques and methodologies available to monitor the operation of the nervous system. Changes in human physiology manifest themselves in various ways, ranging from changes in physical properties (e.g. dilatation of the pupils) to changes in electrical properties of organs or specialised tissues (e.g. changes in electrical conductivity of the skin).
Fig. 1 The Nervous System, Taxonomy and Organisation1
Biosignal is a generic term that encompasses a wide range of continuous phenomena related to biological organisms. In common practice, the term refers to signals that are bio-electrical in nature and that manifest as a change in electrical potential across a specialised tissue or organ in a living organism. They are an indicator of the subject’s physiological state. Biosignals are not exclusive to humans and can be measured in animals and plants. Excitable tissues that actively generate electrical activity include nerves, skeletal muscle, cardiac muscle and smooth muscle. Passive tissues that also manifest a small potential difference include the skin and the eyes; Valentinuzzi defines the latter as ‘non-traditional sources of bioelectricity’ (Valentinuzzi, 2004, p. 219).
Biosignal monitoring has a long tradition in health-care, ever since Italian physician Luigi Galvani discovered ‘animal electricity’ in 1791 (Galvani, 1791; Galvani, 1841; Piccolino, 1998), a finding confirmed three years later by Humboldt and Aldini (Aldini, 1794; Swartz and Goldensohn, 1998). For a more detailed definition of biosignals and their use in the fields of medicine, psychology and bioengineering instrumentation, see Cacioppo et al (2007) and Webster (1978).
Galvanic Skin Response (GSR)
GSR is the change of the skin’s electrical conductance properties caused by stress and/or changes in emotional states (McCleary, 1950). It reflects the activity of sweat glands and changes in the sympathetic nervous system (Fuller, 1977), and is an indicator of overall arousal state. The signal is measured at the palms of the hands or the soles of the feet using two electrodes, between which a small, fixed voltage is applied and the resulting current measured. Changes in the skin’s resistance are caused by activity of the sweat glands: for example, when a subject is presented with a stress-inducing stimulus, his/her skin conductivity will increase as the perspiratory glands secrete more sweat.
The GSR signal is easy to measure and reliable. It is one of the main components of the original polygraph or ‘lie detector’ (Marston, 1938) and is one of the most common signals used in both psycho-physiological research and the field of affective computing (Picard, 1997).
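As an illustration of how simple this measurement is, the following sketch recovers skin conductance from a basic voltage-divider circuit of the kind described above. The function name, drive voltage and sense-resistor value are illustrative assumptions, not parameters of any specific device:

```python
def skin_conductance_uS(v_drive, v_sense, r_sense):
    """Estimate skin conductance in microsiemens from a simple
    voltage-divider GSR circuit: the skin is in series with a known
    sense resistor, and a small fixed voltage drives the pair."""
    # Voltage across the skin is whatever remains of the drive voltage.
    v_skin = v_drive - v_sense
    if v_skin <= 0:
        raise ValueError("sense voltage must be below drive voltage")
    # Current through the divider (Ohm's law on the sense resistor).
    i = v_sense / r_sense
    # Conductance G = I / V, converted to microsiemens.
    return (i / v_skin) * 1e6

# Example: 0.5 V drive with 0.1 V across a 100 kOhm sense resistor
# implies a skin resistance of 400 kOhm, i.e. a conductance of 2.5 uS.
print(round(skin_conductance_uS(0.5, 0.1, 100_000), 2))  # 2.5
```

In practice an arousal feature would then be derived from changes in this value relative to a resting baseline, rather than from the absolute conductance.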
Electrocardiogram (ECG)
The ECG is a measurement of the electrical activity of the heart as it progresses through the stages of contraction. Figure 2 shows the components of an ideal ECG signal.
Fig. 2 Ideal ECG signal
In Human Computer Interaction (HCI) systems for non-clinical applications, Heart Rate (HR) and Heart Rate Variability (HRV) are the most common features measured. For example, an elevated HR can be indicative of physical effort. In affective computing research, if physical activity is constant, a high HRV is commonly correlated with a state of relaxation, whereas a reduced HRV is common in states of stress or anxiety (Haag et al, 2004).
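A minimal sketch of how HR and a common time-domain HRV measure (RMSSD, the root mean square of successive differences) might be derived from a list of beat-to-beat (RR) intervals; the interval values below are invented for illustration:

```python
def heart_rate_bpm(rr_intervals_s):
    """Mean heart rate in beats per minute from RR intervals (seconds)."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr

def rmssd_ms(rr_intervals_s):
    """RMSSD, a common time-domain HRV measure: root mean square of
    the differences between adjacent RR intervals, in milliseconds."""
    diffs = [(b - a) * 1000.0
             for a, b in zip(rr_intervals_s, rr_intervals_s[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# A perfectly steady 75 bpm heart (RR = 0.8 s) has zero RMSSD;
# jittered intervals raise it, reflecting beat-to-beat variability.
steady = [0.8] * 8
jittery = [0.78, 0.84, 0.76, 0.86, 0.79, 0.83, 0.77, 0.85]
print(heart_rate_bpm(steady))  # 75.0
print(rmssd_ms(steady))        # 0.0
print(rmssd_ms(jittery) > rmssd_ms(steady))  # True
```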
Electrooculogram (EOG)
EOG is the measurement of the Corneal-Retinal Potential (CRP) across the eye using electrodes. In most cases, electrodes are placed in pairs to the sides of, or above and below, the eyes. The EOG is traditionally used in HCI to assess eye gaze, and is normally used for interaction and communication by people with physical impairments that hinder their motor skills (Patmore and Knapp, 1998).
Electromyogram (EMG)
Electromyography is a method for measuring the electrical signal that activates the contraction of muscle tissue. It measures the muscle activity generated by the firing of motor neurons, classically studied during sustained isometric contractions (De Luca and Van Dyk, 1975). Motor Unit Action Potentials (MUAPs) are the individual components of the EMG signal that underlie our ability to control the skeletal muscles. Figure 3 illustrates a typical EMG signal and its amplitude envelope.
Fig. 3 Example of EMG signal
Because the EMG measures overall muscular tension rather than movement or specific coordinated gestures, EMG-based interfaces can recognise motionless, isometric gestures (Greenman, 2003) and can work across users with different muscle volumes without calibration. They are commonly used in the fields of prosthesis control and functional neuromuscular stimulation. For musical applications, EMG-driven interfaces have traditionally been used as continuous controllers, mapping the amplitude envelope to various musical parameters (Tanaka, 1993).
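The amplitude envelope used in such continuous controllers can be sketched as full-wave rectification followed by smoothing. The window length and sample values below are illustrative assumptions, not parameters from any cited system:

```python
def emg_envelope(samples, window=5):
    """Amplitude envelope of a raw EMG signal: full-wave rectify
    (absolute value), then smooth with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        seg = rectified[lo:i + 1]
        env.append(sum(seg) / len(seg))
    return env

# A burst of muscle activity (large alternating potentials) raises
# the envelope; quiescent samples keep it near zero.
raw = [0.0, 0.1, -0.1, 2.0, -2.1, 1.9, -2.0, 0.1, -0.1, 0.0]
env = emg_envelope(raw, window=3)
print(max(env) > 1.0, env[0] == 0.0)  # True True
```

In a performance system the envelope value would then be scaled and mapped onto the musical parameter of choice.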
Electroencephalogram (EEG)
The Electroencephalogram (EEG) monitors the electrical activity caused by the firing of cortical neurons across the brain’s surface. In 1924, German neurologist Hans Berger measured these electrical signals in the human brain for the first time and provided the first systematic description of what he called the Electroencephalogram. In his research, Berger noticed spontaneous oscillations in the EEG signals (Rosenboom, 1999) and identified rhythmic changes that varied as the subject shifted his/her state of consciousness. These variations, which would later be given the name alpha waves, were originally known as Berger rhythms (Berger, 1929, p. 355; Gloor, 1969; Adrian and Matthews, 1934).
The brainwave signal is extremely complex. In surface EEG monitoring, any given electrode picks up waves from a large number of firing neurons, each with different characteristics indicating different processes in the brain. The resulting volume of data representing brain activity makes it difficult for physicians and researchers to extract meaningful information.
Brainwaves have been categorised into four basic groups or bands of activity according to the frequency content of the signals: Alpha, Beta, Theta and Delta (Lusted and Knapp, 1996). Figure 4 shows each of the frequency bands as displayed by an EEG monitoring system. This categorisation, however, is a source of some controversy, as some researchers recognise up to six different frequency bands (Miranda et al, 2003). Furthermore, the exact frequencies at which the bands are divided are not set in stone, and one may find discrepancies of up to 1 Hz between texts dealing with the subject. The following categorisation is taken from the guidelines provided by the International Federation of Electrophysiology and Clinical Neurophysiology (Steriade et al, 1990):
- Alpha rhythm has a frequency range that lies between 8 and 13 Hz. Alpha waves have been thought to indicate both a relaxed awareness and the lack of a specific focus of attention. In holistic terms, it has often been described as a “zen-like state of relaxation and awareness”.
- Beta refers to all brainwave activity above 14 Hz and is further subdivided into three categories:
1. Slow beta waves (15-20 Hz) are the usual waking rhythms of the brain, associated with active thinking, active attention, focus on the outside world or solving concrete problems.
2. Medium beta waves (20-30 Hz) occur when the subject is undertaking complex cognitive tasks, such as making logical conclusions, calculations, observations or insights (Rosenboom, 1999).
3. Fast beta waves (over 30 Hz): this frequency band is often called Gamma and is associated with hyper-alertness, stress and anxiety (Miranda et al, 2003). It is found when performing a reaction-time motor task (Sheer, 1989).
- Delta waves are slow periodic oscillations in the brain that lie within the range of 0.5 to 4 Hz and appear when the subject is in deep sleep or under the influence of anaesthesia.
- Theta waves lie within the range of 4 to 7 Hz and appear as consciousness slips toward drowsiness. They have been associated with access to unconscious material, creative inspiration and deep meditation.
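The band categorisation above suggests a simple analysis strategy: estimate how much spectral power falls inside each band. The sketch below uses a naive DFT on a short synthetic window; the four-band table follows the list above (cut-offs vary by up to 1 Hz between sources, as noted), and the sampling rate is an arbitrary assumption:

```python
import math

# Conventional EEG band boundaries in Hz, per the categorisation above.
BANDS = {"Delta": (0.5, 4), "Theta": (4, 7), "Alpha": (8, 13), "Beta": (14, 30)}

def band_powers(samples, fs):
    """Naive DFT band-power estimate for a short EEG window: sum the
    squared magnitude of every frequency bin falling inside each band."""
    n = len(samples)
    powers = {name: 0.0 for name in BANDS}
    for k in range(1, n // 2):            # skip DC; positive bins only
        freq = k * fs / n
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n)
                  for i, s in enumerate(samples))
        for name, (lo, hi) in BANDS.items():
            if lo <= freq <= hi:
                powers[name] += re * re + im * im
    return powers

# A synthetic 10 Hz oscillation sampled at 128 Hz lands squarely in
# the alpha band, so alpha power should dominate.
fs = 128
alpha_wave = [math.cos(2 * math.pi * 10 * i / fs) for i in range(fs)]
p = band_powers(alpha_wave, fs)
print(max(p, key=p.get))  # Alpha
```

A real-time system would apply this (or an FFT-based equivalent) to successive short windows of the EEG stream, tracking the relative strength of each band over time.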
Fig. 4 EEG frequency bands
Biosignal-driven Interactive Arts
In 1919, the German-language poet Rainer Maria Rilke wrote an essay entitled Primal Sound, in which he stressed the visual similarity between the surface of the human skull and that of early phonograph wax cylinders (Rilke, 1978). He then speculated about the possibility of transducing the skull’s grooves into this primal sound.
Although Rilke never implemented the necessary interface to generate the primal sound, his idea is extremely seductive in its conception and in the artistic-aesthetic implications it proposes. Rilke’s text captures the fascination that many artists hold for the possibility of using physiological phenomena to create art. In the 1960s, a whole generation of artists indeed re-appropriated medical tools and developed systems to harness the subtle physiological changes of the human body. These pioneers gradually created a de-centralised movement that sought inspiration in medical science to create works relating to the human being at a physiological level.
Early Pieces and the Biofeedback Paradigm
In 1964, American composer Alvin Lucier began working with physicist Edmond Dewan and became the first composer to make use of biosignals in an artistic context. His piece Music for Solo Performer, scored for “enormously amplified brainwaves”, was premiered at Brandeis University in 1965 (Holmes, 2002).
Lucier’s piece explores the rhythmic modulations of the alpha band of brainwaves by means of direct audification, with the addition of percussion instruments, namely cymbals, drums and gongs, which were coupled to large loudspeakers (Teitelbaum, 1976). Bursts of alpha activity would cause the speakers to excite the acoustic instruments, effectively creating a disembodied percussion ensemble.
Lucier’s pioneering use of EEG signals for music composition was quickly adopted by other composers, most notably Richard Teitelbaum and David Rosenboom. Teitelbaum had been working in Rome since the early 1960s as part of the group Musica Elettronica Viva (MEV). In 1967, he presented his work Spacecraft, in which EEG and ECG signals of five performers were used to control various sound and timbre parameters of a Moog synthesiser (Arslan et al, 2005). During the following years, Teitelbaum explored biosignals further. His compositions: Organ Music and In Tune incorporated the use of the voice and breathing sounds in order to create a close relationship between the resulting music and the human body that generated it (Teitelbaum, 1976).
David Rosenboom carried on Teitelbaum’s explorations and, in 1970, presented Ecology of the Skin, a work that measures EEG and ECG signals of performers and audience members (Rosenboom, 1999). He was the first artist to undertake systematic research into the potential of brainwaves for artistic applications, creating a large body of works and developing a series of systems that increasingly improved the means of detecting cognitive aspects of musical thinking for real-time music making.
The following year, Musique Concrète pioneer Pierre Henry began collaborating with scientist Roger Lafosse, who was undertaking research into brainwave systems. This collaboration spawned a highly complex and sophisticated live performance system entitled Corticalart (Henry, 1971). During the same year, Manford Eaton, who was working at Orcus Research in Kansas City, published Bio-Music (Eaton, 1971), a manifesto describing in great detail the apparatus and methods needed to implement a full biofeedback system for artistic endeavours. In it, Eaton calls for a completely new biofeedback-based art in which the intentions of the composer are ‘fed directly’ to the listener by means of careful monitoring and manipulation of the listener’s physiological signals.
Eaton’s system presented both audio and visual stimuli to the listener, designed to elicit pre-defined psycho-physiological states controlled by the composer. His Bio-Music ethos therefore abandons the division between performer and audience: Bio-Music compositions are not to be ‘listened to’ or ‘witnessed’ by a large audience, but to be experienced by individual listeners. The composer/performer adapts his/her algorithms and the presented stimuli to the subject’s individual physiological responses, delivering a consistent ‘message’ or experience to each individual who experiences the work. In Eaton’s Bio-Music, the specific sounds or images presented to the listener are irrelevant as long as they succeed in modulating the subject’s physiological state to that desired by the composer.
Post biofeedback practice
Towards the end of the 1980s, the advent of digital signal processing systems and the wide availability of powerful personal computers made it possible for researchers to further develop existing techniques for biosignal analysis in real-time applications. In 1988, California-based scientists Benjamin Knapp and Hugh Lusted introduced the BioMuse system (Knapp and Lusted, 1988), which consisted of a signal-capturing unit that sampled eight channels of biosignals, which were then amplified, conditioned and translated into MIDI messages. The sensors were implemented as simple limb-worn velcro bands able to capture EMG, EEG, EOG, ECG and GSR signals. The BioMuse system facilitated not only the analysis of the signals, but also the ability to use the results of that analysis to control other electronics in a precise and reproducible manner that had not previously been possible (Knapp and Lusted, 1990). This allowed Knapp and Lusted to introduce the concept of biocontrol, an important conceptual shift from the original biofeedback paradigm that had reigned unchallenged during the 1970s. Whilst biofeedback allowed physiological states to be monitored and, relatively passively, translated to other media by means of sonification, biocontrol proposed the idea of, and the means to create, reproducible volitional interaction using physiological data as input (Tanaka, 2009).
In order to fully demonstrate the possibilities afforded by their system, Knapp and Lusted commissioned composer Atau Tanaka to write the first piece for the new interface. The BioMuse’s maiden concert took place in Stanford, California, in 1989. In that concert, Tanaka premiered Kagami, a piece that used EMG signals measuring muscular tension in his forearms (Keislar et al, 1993). This introduced a novel performance practice with a highly personal visual and sonic style of biosignal-driven music and stage presence, moving from the archetypal image of the motionless, centre-stage-seated bio-performer pioneered by Lucier to a dynamic musician exploring arm gestures in a highly engaging way.
In 1998, Teresa Marrin-Nakra and Rosalind Picard, who were carrying out research in the field of affective computing at the Massachusetts Institute of Technology (MIT), created The Conductor’s Jacket, a wearable computing device that facilitated the measuring and recording of physiological and kinematic signals from orchestra conductors (Marrin and Picard, 1998). Even though The Conductor’s Jacket was originally conceived as a recording and monitoring device for scientific enquiry, its ability to stream data in real time allowed Marrin-Nakra to use it in performance contexts, where it functioned not as a passive monitoring device but as a disembodied musical instrument.
The turn of the 21st century brought with it renewed worldwide interest in biosignals for artistic applications, as several favourable factors converged. On the one hand, personal computers became powerful enough to deal with these types of signals. Likewise, the evolution of the BioMuse and other biosignal measuring devices created by the affective computing team at MIT meant that it was now possible to measure physiological signals ecologically from performers in stage situations in a transparent and effective way. Moreover, commercially available medical equipment such as the g.MOBIlab2, Emotiv’s EPOC3, MindMedia’s Nexus4 units and Thought Technology’s Infiniti5 systems has become more affordable and easy to use.
This makes the issue of meaning and content more relevant than ever. The various technologies that facilitate the measurement of biosignals, as well as their correlates to human emotion, have developed considerably, yet the approaches and metaphors that artists use to create works with these technologies remain relatively unchanged.
Following on this tradition of biosignal-based research and creative practice, Benjamin Knapp founded the Music, Sensors and Emotion (MuSE) research cluster at the Sonic Art Research Centre (SARC), Belfast. This group attempts to fuse the fields of Art (with specific focus on sound and music) and Science (with specific focus on physiological and kinematic sensing) through biosignal-based installations and performances, supported by extensive research and experiments aimed at furthering our understanding of physiological sensing, in particular with regard to emotion and its physiological correlates.
MuSE’s goals and research interests are many but include the following:
- Development of emotion based performance instruments and interfaces (following the approach of the Integral Music Controller postulated by Knapp).
- Quantification of the physiological correlates of emotion.
- Understanding and quantification of factors behind emotional and physiological contagion.
- Greater understanding of the effects of mood and emotion on the performance and appreciation of music.
- Creation of affect-aware and affect-responsive artworks.
Current approaches and challenges
At the dawn of biosignal-driven art, the existing apparatus and analysis methodologies taken from the medical field were oriented towards the detection and diagnosis of pathologies, not towards emotional assessment or physiological performance interaction.
One of the challenges of physiologically based artwork is making the observer-artwork dialogue meaningful to the participants in the work; a question of translating the input biosignals to visual, auditory or experiential events. Despite the advances in the science and technology behind biosignal art, little progress has been made in the application of these signals in a meaningful artistic fashion.
The lack of conscious control over one’s biosignals means that the aesthetics of the interaction with the work require careful consideration in order to deliver a satisfying experience. Beyond explicit interactions (e.g. Heart Rate to Drumbeat, see figure 5) the mapping of high level behaviours and structures from low level control signals (e.g. pulse, respiration) is difficult to achieve in a fashion meaningful to the viewer.
Fig. 5 The hrtdrm (2009) by Craig Fahner converts heart rate to drum beats6
Artworks that use biosignals as components directly involve the ‘viewer’ in the creation of the work, going a step beyond interactivity; an interaction described by Koch et al (1990) as co-activity. Performers using these signals to create works have a relationship with the sensing system much like that of a performer to an instrument (Tanaka and Knapp, 2002), whereas a biosignal work aimed at a public audience may require more generalised and explicit interaction modes. However, the significance of audience comprehension of performer gestures and control should not be underestimated when developing a musical work.
Much of the satisfaction for users exploring biosignal works stems from the experience of exploring the boundaries and affordances of the interaction, learning to ‘play’ the work. In cases such as these, explicit mappings tend to make the most sense to the viewer/user.
In order to develop successful interactions in biosignal and emotion driven works, a deeper understanding of the physiological manifestation of emotion in performance and installation environments is required. Results obtained by research in other fields such as affective computing, psychophysiology and others are of great aid, but more specific and focused research is needed to fully understand what happens during performance situations so that better suited strategies and artistic approaches can be implemented. In the following section, we will discuss instances of such research.
The Paganini Experiments
The Paganini experiments were designed to investigate the physiological manifestation of emotion in musicians while performing. Owing to the complexity of human physiology and its relationship with both music and emotion, as well as the difficulty of investigating emotion in the laboratory, the experimental sessions were designed to answer several distinct questions. These experiments form the fundamental basis for the development of interfaces and architectures that add an emotion-driven interactive channel to musical performance.
Most of the experimental work pursued within the MuSE research group has been based on the results of the Paganini experiments (Jaimovich and Knapp, 2009). This work analysed the data collected in the experiments performed at Casa Paganini in October 2007, the first research to look into the physiological manifestation of emotion in musicians while performing.
In order to address this issue, a novel approach was used by the researchers, which involved an emotion induction procedure performed by a psychologist (Glowinski et al, 2008). This allowed the comparison between performances expressing a specific emotion (elation and sadness in this particular case) and the same performance when the performer is actually experiencing this emotion.
Preliminary results suggest several patterns between physiological signals and the musical score, as well as between the different emotional states. The strongest relationships exist in the performance’s tempo, HR, HRV and GSR (see figure 6 for an example).
Fig. 6 HRV during the performance of the same musical piece in different emotional states (2 takes each). A and B indicate two particular sections of the piece
Based on both the results of this first research and suggestions made by Sloboda regarding the structural characteristics of music associated with bodily manifestations of emotion (Juslin and Sloboda, 2001, p. 90), it became apparent that there is a need to explore specific musical features that seem to have a strong correlation with the physiological response of the musicians. Examples of these are rests, crescendos, harmonic changes; basically any musical structure related to expectation (Wishart, 2009).
For the second set of experiments in Casa Paganini, the authors worked with two different scenarios. On the one hand, an opportunity emerged to record with the renowned Quartetto di Cremona (figure 7, left), with whom we investigated the correlation between music and physiology. In parallel, a different experiment involving the emotion induction procedure was realised with two professional violin players (figure 7, right).
Fig 7. Quartetto di Cremona (left) and two violin performers (right) during experiments in Casa Paganini 2009
The experiment with the quartet focused on exploring the correlation between music and biosignals, using scores by the romantic composer Franz Schubert that were particularly expressive, with strong dynamic changes and sudden rests among other musical elements. Physiological data was recorded for the whole quartet, performing both individually and as a group. The intention in having the violinists play together was to reduce the possibility of external factors affecting their physiological state: if a particular reaction or pattern were found in the data of both subjects in a particular section of the piece, it would happen simultaneously.
The emotions studied in this session were elation and anger, with excellent results for the elation induction and only a minor effect for anger. The audience factor was also explored as a condition in this set of experiments.
Another set of experiments was implemented at SARC, Belfast, and at the International Music and Emotion Conference in Durham, 2009. For these experiments, the biosignals of three performers were captured in a real concert environment to explore the degree of emotional contagion between performer and audience.
Performances included a piano improvisation by Sarah Nicolls, the electroacoustic piece Imago diffused in real-time by Trevor Wishart, and Stem Cells, an interactive piece composed by Eric Lyon and performed by Ben Knapp.
Preliminary results, presented at ICME in Durham (Jaimovich et al, 2009), indicate a high degree of correlation between performer and audience even after simple analysis of the recorded data. Figure 8 shows GSR correlation during the performance of Stem Cells.
Fig. 8 Example of emotional correlation between performer and audience. The plot shows the galvanic skin response of Ben Knapp and an audience member during 5 minutes of Stem Cells
Preliminary results of these experiments as well as on-going artistic endeavours have allowed us to design interaction strategies that are better suited for the use of biosignals. We therefore propose some possible approaches that are better informed by empirical research on the physiological manifestation of emotion as it applies to music performance and installation mediums.
Biosignal artworks can create a dialogue with the viewer/user in which the work is capable of responding to the latter (monitoring affect) but also provoking her/his responses (engendering affect), in effect creating a feedback loop (see figure 9).
Fig. 9 Viewer 1 content feedback loop
The artist should consider strategies to counteract such biofeedback loops, in which the work amplifies the physiological or emotional state of the viewer, which in turn amplifies the response of the work, and so on. Possible strategies include responses that contradict the physiological state of the viewer, or time-limited or threshold-based rules governing system responses. The literal representation of physiological input is one of the most direct and comprehensible mappings available to the artist, e.g. visualisation of heart rate (see figure 10).
Fig. 10 Affectech (Coghlan et al, 2009): Visualisation of Heart Rate & Reactive Avatars
A further level of abstraction that is still relatively comprehensible is the mapping of biosignals to control non-literal sound or visual output, such as heart rate to tempo or muscle tension to timbre (Ortiz, 2010).
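Such a non-literal mapping can be as simple as a clamped linear map from a physiological range onto a musical one. The sketch below maps heart rate to tempo; the function names and the input/output ranges are arbitrary artistic choices for illustration, not values taken from the cited works:

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from one range to another, clamping to the output range."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = min(1.0, max(0.0, t))
    return out_lo + t * (out_hi - out_lo)

def hr_to_tempo(hr_bpm):
    """Map a resting-to-aroused heart-rate span (50-120 bpm) onto a
    musically useful tempo span (60-180 bpm). The spans are artistic
    choices, not physiological constants."""
    return map_range(hr_bpm, 50, 120, 60, 180)

print(hr_to_tempo(50))   # 60.0
print(hr_to_tempo(120))  # 180.0
```

Clamping matters in practice: sensor noise or movement artifacts can push the measured heart rate outside the expected span, and an unclamped mapping would drive the music to implausible tempi.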
The next step up is to use the viewers’ physiological information to assess emotional or affective state (Haag et al, 2004). This allows us to create emotionally aware works with the potential for deep resonances with the viewer; with the caveat that the more complicated the response, the greater the risk of alienating the viewer. A low-level example of this is the mapping of arousal level (GSR) to lighting colour and hue (D’Andrade and Egan, 1974) as implemented during a performance of The Reluctant Shaman (Knapp et al, 2008).
An even more complex strategy is to map affective state to related imagery, sound or text such as in Chameleon by Tina Gonsalves (Gonsalves, 2009). However it has been pointed out that in order to find accurate correlates of emotion using physiological signals alone, they ‘must be measured in meticulously controlled environments’ (Kreibig et al, 2007, p. 802; Knapp et al, 2010b). One option to counteract this is to work with material with strong psychological resonances such as phobias (e.g. to spiders) in order to provoke strong physiological responses.
Some artists have also chosen to make the biosignals themselves the focus and content of the work, such as the mapping of physiology on to geography seen in Christian Nold’s Emotional Cartography project (Nold, 2009).
The use of biosignal monitoring technologies in interactive art contexts has been present for over forty years. From Alvin Lucier’s pioneering work Music for Solo Performer to the current practice of biosignal-driven performance and sound installation, the field has advanced both in its technical implementations and in the artistic affordances the medium provides. Developments in medicine and psychophysiology allow us to better understand the meaning and implications of human-generated electrical signals and their correlation to emotion. Furthermore, the work carried out by the Affective Computing Group at MIT and the Music, Sensors and Emotion team at SARC has eased the technical aspects of biosignal monitoring for interactive artistic practice. It is now easier than ever to incorporate physiological measurement into stage performance; thus biosignal-driven art can now be carried out in a practical way, without the need for the large and expensive equipment used in the early 1960s and 1970s. This opens the door to deeper artistic and aesthetic explorations which, in our opinion, should become the central focus of creative work.
Adrian E D and Matthews B H C, 1934. ‘The Berger rhythm: potential changes from the occipital lobes in man’. Brain 57(4): 355-383.
Aldini G, 1794. De animali electricitate dissertationes duae. Typographia Instituti Scientiarum (Bonomiae, Italy).
Arslan B, Brouse A, Castet J, Filatriau J J, Lehembre R, Noirhomme R and Simon C, 2005. ‘From Biological Signals to Music’. Proceedings of the 2nd International Conference on Enactive Interfaces Enactive05, Genoa, Italy. Also online: http://tcts.fpms.ac.be/publications/papers/2005/enactive05_abbacjfjjlrnqsc.pdf. Accessed: 26/4/2010.
Berger H, 1929. ‘Ueber das Elektroenkephalogramm des Menschen. I. Mitteilung’. Archiv für Psychiatrie und Nervenkrankheiten 87: 527-570.
Cacioppo J T, Tassinary L G and Berntson G G, 2007. Handbook of psychophysiology. CUP (Cambridge, UK).
Coghlan N, Jaimovich J, Knapp R B, O’Brien D, Ortiz M and Quigley T, 2009. ‘AffecTech - an affect-aware interactive AV artwork’. Paper presented at the International Symposium on Electronic Arts 2009, Belfast.
D’Andrade R and Egan M, 1974. ‘The colors of emotion’. American ethnologist 1: 49-63.
De Luca C J and Van Dyk E J, 1975. ‘Derivation of some parameters of myoelectric signals recorded during sustained constant force isometric contractions’. Biophysical journal 15(12): 1167-1180.
Eaton M, 1971. Bio-Music: Biological feedback, experiential music system. Orcus (Kansas City, USA).
Fuller G D, 1977. Biofeedback: Methods and procedures in clinical practice. Biofeedback Press (San Francisco, USA).
Galvani L, 1791. De viribus electricitatis in motu musculari commentarius. Comment Bonon Scient et art Inst Bologna 7: 363-418.
Galvani L, 1841. Opere edite ed inedite del Professore Luigi Galvani raccolte e pubblicate dall’Accademia delle Science dell’Istituto di Bologna. Accademia delle Science dell’Istituto di Bologna (Dall’Olmo, Bologna, Italy).
Gloor P, 1969. Hans Berger on the electroencephalogram of man: The fourteen original reports on the human electroencephalogram. Elsevier (New York, USA).
Glowinski D, Camurri A, Volpe G, Chiarra N, Cowie R, McMahon E, Jaimovich J and Knapp R B, 2008. ‘Using induction and multimodal assessment to understand the role of emotion in musical performance’. Paper presented at the 4th workshop on emotion in HCI, Liverpool.
Gonsalves T, 2009. TINA GONSALVES DIGITAL FOLIO. http://www.tinagonsalves.com/INTERFrame.html. Accessed: 26/4/2010.
Greenman P E, 2003. Principles of manual medicine. Lippincott Williams & Wilkins (Philadelphia, USA).
Haag A, Goronzy S, Schaich P and Williams J, 2004. ‘Emotion Recognition Using Bio-Sensors: First Steps Towards an Automatic System’. In Affective Dialogue Systems. Springer (Berlin/Heidelberg, Germany): 36-48.
Henry P, 1971. Mise en musique du Corticalart de Roger Lafosse (CD). Prospective 21e Siècle: 6521 022.
Holmes T B, 2002. Electronic and experimental music: pioneers in technology and composition. Routledge (New York, USA).
Jaimovich J, Coghlan N and Knapp R B, 2009. ‘Feeling Music: A Quantitative Examination of Contagion Between Performer and Audience’. Paper presented at the International Conference on Music and Emotion, Durham, UK.
Jaimovich J and Knapp R B, 2009. ‘Pattern Recognition of Emotional States During Musical Performance from Physiological Signals.’ Proceedings of the 2009 International Computer Music Conference. ICMAPress (San Francisco): 461-4.
Juslin P N and Sloboda J A, 2001. Music and Emotion: Theory and Research. OUP (Oxford, UK).
Knapp R B, Kim J and André E, 2010. ‘Physiological signals and their use in augmenting emotion recognition for human-machine interaction’. In HUMAINE Handbook, HUMAINE Ed (in press).
Knapp R B, Ford G, Ponce M and Coghlan N, 2008. The Reluctant Shaman. Unpublished music composition.
Knapp R B and Lusted H S, 1988. ‘A real-time digital signal processing system for bioelectric control of music’. Acoustics, Speech, and Signal Processing, 1988. ICASSP-88., 1988 5: 2556-2557.
Knapp R B and Lusted H S, 1990. ‘A Bioelectric Controller for Computer Music Applications’. Computer Music Journal 14(1): 42-47.
Keislar D, Pritchard R, Winkler T, Taube H, Helmuth M, Berger J, Hallstrom J, and Garton B, 1993. ‘1992 International Computer Music Conference, San Jose, California USA, 14-18 October 1992’. Computer Music Journal 17(2): 85-98.
Koch E and Gaw D C, 1990. ‘Coactive aesthetics and control theory’. Proceedings of the 5th IEEE International Symposium on Intelligent Control, 1990: 93-97.
Kreibig S D, Wilhelm F H, Roth W T and Gross J J, 2007. ‘Cardiovascular, electrodermal, and respiratory response patterns to fear- and sadness-inducing films’. Psychophysiology 44(5): 787-806.
Lusted H S and Knapp R B, 1996. ‘Controlling computers with neural signals’. Scientific American 275(4): 82-87.
Marrin T and Picard R, 1998. ‘The Conductor’s Jacket: a device for recording expressive musical gestures’. Proceedings of the International Computer Music Conference, Ann Arbor. ICMAPress (San Francisco): 215–219.
Marston W M, 1938. Lie Detector Test. R.R. Smith (New York, USA).
Maton A, Hopkins J, Johnson S, LaHart D, Warner M Q and Wright J D, 1994. Human biology and health. Prentice Hall (Englewood Cliffs, USA).
McCleary R A, 1950. ‘The nature of the galvanic skin response’. Psychological Bulletin 47: 97-117.
Miranda E R, Sharman K, Kilborn K and Duncan A, 2003. ‘On Harnessing the Electroencephalogram for the Musical Braincap’. Computer Music Journal 27(2): 80-102.
Nold C (ed.), 2009. Emotional Cartography: Technologies of the Self. Softbook published under a Creative Commons Attribution licence. http://emotionalcartography.net/. Accessed: 26/4/2010.
Ortiz M, 2010. ‘Towards an Idiomatic Compositional Language for Biosignal Interfaces’. PhD Thesis. Queen’s University Belfast (Northern Ireland, UK).
Patmore D W and Knapp R B, 1998. ‘Towards an EOG-based eye tracker for computer control’. Proceedings of the third international ACM conference on Assistive technologies. ACM (New York, USA): 197-203.
Picard R W, 1997. Affective Computing. MIT Press (Cambridge, USA).
Piccolino M, 1998. ‘Animal electricity and the birth of electrophysiology: the legacy of Luigi Galvani’. Brain Research Bulletin 46 (5): 381-407.
Rilke R M, 1978. Where silence reigns: Selected prose. New Directions Publishing Corporation (New York, USA).
Rosenboom D, 1999. ‘Extended Musical Interface with the Human Nervous System: Assessment and prospectus’. Leonardo 32(4): 257-257.
Sheer D E, 1989. ‘Sensory and cognitive 40-Hz event-related potentials: behavioral correlates, brain functions and clinical application’. In Basar E and Bullock T H (eds), Springer Series in Brain Dynamics. Springer-Verlag (Berlin, Germany): 339-374.
Steriade M, Gloor P, Llinas R R, Lopes da Silva F H and Mesulam M M, 1990. ‘Basic mechanisms of cerebral rhythmic activities’. Electroencephalography and clinical Neurophysiology 76: 481-508.
Swartz B E and Goldensohn E S, 1998. ‘Timeline of the history of EEG and associated fields’. Electroencephalography and clinical Neurophysiology 106(2): 173-176.
Tanaka A, 1993. ‘Musical technical issues in using interactive instrument technology with application to the BioMuse’. Proceedings of the International Computer Music Conference, Tokyo. ICMAPress (San Francisco, USA): 124-126.
Tanaka A and Knapp R B, 2002. ‘Multimodal interaction in music using the electromyogram and relative position sensing’. Proceedings of the 2002 conference on New Interfaces for Musical Expression, Singapore: 1-6.
Tanaka A, 2009. ‘Sensor based Musical Instruments and Interactive Music’. In Dean R (ed.), The Oxford Handbook of Computer Music. OUP (Oxford, UK): 233-257.
Teitelbaum R, 1976. ‘In tune: Some early experiments in biofeedback music (1966-1974)’. Biofeedback and the Arts, Results of Early Experiments. Aesthetic Research Centre of Canada Publications (Vancouver, Canada): 35-56.
Valentinuzzi M E, 2004. Understanding the human machine: a primer for bioengineering. World Scientific Publishing Company (Singapore).
Webster J G, 1978. ‘Medical Instrumentation-Application and Design’. Journal of Clinical Engineering 3(3): 306.
Wishart T, 2009. ‘Transforming Sounds: Confirming and Confounding Expectations’. Paper presented at International Conference on Music and Emotion. Durham, UK.
1 Image source: http://en.wikipedia.org/wiki/File:NSdiagram.png
6 http://www.craigfahner.com/ Accessed: 17/5/2010