References
Papers from Digital Creativity online journal http://www.tandf.co.uk/journals/ndcr
Real-time tracking of the creative music composition process. Collins, David. Pages: 239
This paper discusses outcomes from a longitudinal single case study in which immediately retrospective verbal reporting was combined with real-time digital data collection procedures. By triangulating data from computer 'save-as' files, transcription of the verbal protocol, semi-structured interviews and verification sessions with the composer, it was possible to examine aspects of creative problem-solving and moments of insight in such activity. Mapping of the emerging structure was also possible and this, combined with analysis of the procedures and strategies outlined in the verbal protocol, allowed a view into the compositional 'flow' and a resultant hypothetical model of the composition process to be presented. Practical issues around the use of verbal protocol methodology are discussed.
Creative collaboration between audiences and musicians in Flock. Freeman, Jason - Godfrey, Mark. 2010, 21(2). Pages: 85
Using music and motion analysis to construct 3D animations and visualisations. Chen, Kuen-Meau - Shen, Siu-Tsen - Prior, Stephen. Pages: 91
The effects of the sound-image relationship within sound education for interactive media design. Yantaç, Asim Evren - Özcan, Oguzhan. 2006. Pages: 91
For this research, we examined the projects and related sound designs of students taking the Sound Design course in the Interactive Media Design Program of Yildiz Technical University and studied the effects of the way the sound-image relationship was handled. Based on the findings of the research, the article sets out criteria by which students can produce projects with more creative solutions regarding sound.
The Picturing Sound multisensory environment: an overview as entity of phenomenon. Williams, Ceri - Petersson, Eva - Brooks, Tony. Pages: 106
This paper presents three case studies selected from a sample of teenage children (n = 11) with severe disabilities. Personalised audiovisual environments are created with the goal of encouraging interaction, creativity and artistic expression from the teenagers. The feedback stimuli are directly linked to the child's gesticulations so that a sense of associated control is available for recognition. Gesture is sourced non-intrusively through camera data mapped to computer vision algorithms. Intervention strategies from staff and helpers within such user-centred environments are questioned. Results point to positive benefits for these children, such as increased eye-to-hand coordination, longer concentration spans and improved communication. These findings corroborate other research indicating the potential of such interactive multisensory environments in special schools and institutes as a supplement to traditional methods.
Real-time composition of image and sound in the (re)habilitation of children with special needs: a case study of a child with cerebral palsy. Azeredo, Maria. Pages: 115
Controlling musical emotionality: an affective computational architecture for influencing musical emotions. Livingstone, Steven - Mühlberger, Ralf - Brown, Andrew - Loch, Andrew. Pages: 43
Extending previous work on the modification of perceived emotions in music, our system architecture aims to provide reliable control of both perceived and induced musical emotions: its emotionality. A rule-based system is used to modify a subset of musical features at two processing levels, namely score and performance. The interactive model leverages sensed listener affect by adapting the emotionality of the music modifications in real time to assist the listener in reaching a desired emotional state. … The dynamic MIDI scheduling, emotion agent, rule engine and audio generation unit have all been written in Impromptu, a dynamic programming music environment for Mac OS X.
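The abstract above describes a rule engine that rewrites score-level and performance-level musical features to steer emotionality. A minimal Python sketch of that idea follows; it is not the authors' Impromptu implementation, and the emotion names, feature names and rule values are purely illustrative assumptions:

```python
# Hypothetical rule-based mapping from a target emotion to musical feature
# changes at two processing levels (score and performance), in the spirit of
# the architecture described in the abstract. All names/values are invented.

RULES = {
    # emotion: (score-level rule set, performance-level rule set)
    "happy": ({"mode": "major", "tempo_bpm": 140},
              {"articulation": "staccato", "velocity": 96}),
    "sad":   ({"mode": "minor", "tempo_bpm": 60},
              {"articulation": "legato", "velocity": 48}),
}

def modify_features(features: dict, target_emotion: str) -> dict:
    """Return a copy of `features` with both rule sets for the emotion applied."""
    score_rules, perf_rules = RULES[target_emotion]
    updated = dict(features)          # leave the input untouched
    updated.update(score_rules)       # score-level modifications
    updated.update(perf_rules)        # performance-level modifications
    return updated

base = {"mode": "major", "tempo_bpm": 100,
        "articulation": "legato", "velocity": 64}
print(modify_features(base, "sad")["tempo_bpm"])  # the "sad" rule lowers tempo
```

In the paper's interactive model, the target emotion would itself be updated in real time from sensed listener affect; here it is simply passed in as an argument.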
The Beatbug: evolution of a musical controller. Weinberg, Gil. Pages: 3
MusiCam: an instrument to demonstrate chromaphonic synesthesia. Yau, Derek - McCrindle, Rachel. Pages: 121