[KGVID poster="https://judithshatin.com/wp-content/uploads/2013/07/thumb_tmp/resize-GestureControl2-web_thumb23.jpg" width="550" height="310"]https://judithshatin.com/wp-content/vids/resize-GestureControl2-web.m4v[/KGVID]
Genesis & Development
Being in Time is a project whose exact genesis is, as usual, mysterious! An earlier Rotunda project was one spark. So, when the opportunity arose to apply for funding to put together a team to work on a project with collaborative elements, I immediately thought of a composition for wind ensemble with interactive electronics controlled by the conductor, and a visualization of the music based on amplitude and activity levels.
While I had been involved in interactive media prior to this, I had neither created nor found a work for such a large ensemble that combines interactive media and visualization into a multi-modal experience. The particular idea for the piece, represented by its title, refers to our experience of the flow of life, so wonderfully embodied in music. While Being and Time is a subject for philosophy, and being on time is a matter of etiquette, Being in Time refers to the rich experience of entering a state of flow.
Next was putting together the team! In addition to Prof. Pease, I contacted Ellen Bass, then Associate Professor of Systems and Information Engineering at UVA, and Dave Topper, Technical Director of UVA's Virginia Center for Computer Music. Graduate students are also an indispensable part of the team: Joe Adkins and Paul Turowski, both PhD students in Composition and Computing Technologies, and Nathan Trantham, who has just completed his MA in Engineering. As the project developed, we added Rachel Beaton, a graduate student in Astronomy with interests in photography as well. Undergraduates, including Monika Khot, have also joined the party!
We are looking forward to sharing the results of our efforts through a workshop in the spring of 2013, open to area college and high school instructors and students.
We began meeting as a team in January 2013 and set up biweekly meetings to discuss ideas and report on progress. I also reached out to the Wind Ensemble for volunteers, whose playing I recorded as the basis for the electronic music component. A variety of students responded, and I recorded a number of instruments that I would later process using a wide range of digital techniques. These included oboe, bassoon, saxophone, and trombone; I had previously recorded local performers on flute, clarinet, and percussion. Plenty to work with!
After discussion with the team members, we decided to use a Kinect controller to read the conductor's gestures, that is, to track changes in joint position, and to map them to the electronic music. Paul Turowski has been the point person for this program's development. We worked with Bill Pease to decide on the location of gestures that the Kinect could read but that would not interfere with his conducting. We determined that the gestures should mainly take place outside the conductor's 'box,' that is, the frame within which s/he normally moves. So the gestures generally involve the left hand. For instance, pushing down and back puts the conductor in control mode; then extending the left arm out to the far left triggers a sound file, and so on. We also decided to incorporate the option of panning, using a small icon that can be moved in the stereo field. We will make the program available on the Being in Time website for others to explore and use for their own non-commercial purposes.
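The control-mode and trigger logic described above can be sketched in a few lines. This is a minimal illustration, not the actual program Paul Turowski developed: the joint coordinates, thresholds, and the "fire once per excursion outside the box" behavior are all assumptions made for the example.

```python
class GestureController:
    """Hypothetical sketch of the conductor-gesture logic: pushing the
    left hand down and back arms control mode; extending the left arm
    past the edge of the conductor's 'box' fires one trigger per
    excursion. Coordinates are normalized relative to the torso."""

    def __init__(self, box_left_edge=-0.4):
        self.box_left_edge = box_left_edge  # x beyond which a gesture counts
        self.control_mode = False
        self._was_outside = False
        self.triggers = 0  # e.g. number of sound files launched

    def update(self, x, y, z):
        """Process one frame of left-hand joint data (x: left/right,
        y: up/down, z: depth toward/away from the sensor)."""
        # Pushing down and back (low y, large z) arms control mode.
        if y < -0.2 and z > 0.3:
            self.control_mode = True
        # Extending far left fires a trigger, but only on the frame the
        # hand first leaves the box, so holding the arm out fires once.
        outside = x < self.box_left_edge
        if self.control_mode and outside and not self._was_outside:
            self.triggers += 1
        self._was_outside = outside
```

In a real system, each trigger would call out to the audio engine (for example, starting a sound file) instead of incrementing a counter, and the thresholds would be tuned to the individual conductor's reach.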
My original idea for the visuals was inspired by the night sky and deep space. While I wanted to make sure we represented the instrumental families, I wanted the background to give a sense of space and contemplation. I also wanted to leave room for more explosive elements and sudden changes. I started looking at a variety of images online, especially those posted by NASA, and then started thinking about the possibility of local sources. This led me to the Astronomy Department, to my colleagues Bob O'Connell and Mark Whittle, and in turn to Rachel Beaton. We thought we would draw both on Rachel's photos and on a variety of public domain material as sources for the visualization. In the end, I decided that meshing time and place would be more effective, and based all of the background video on Rachel's photographs, processed to create a sense of flow that moves from the particular to the abstract. Some local elements, ranging from images of the Blue Ridge to our local observatory, can also be seen at times.
To visualize the data, we needed a way to collect it and then transform it. The team came up with the idea of using 8 mics to collect amplitude, activity, and frequency-band information. We tested this option with the Wind Ensemble, placing a mic by the principal player of each of 8 sections spread through the entire group. While there is some bleed-through of audio between the mics, for our purposes it should suffice. We are meeting biweekly, and are excited by the new ideas that keep percolating.
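To give a concrete sense of the kind of data each mic would yield, here is a small sketch that extracts an RMS amplitude and coarse frequency-band energies from one frame of audio. The band edges and frame size are arbitrary assumptions, and the naive DFT is used only to keep the example dependency-free; a real-time system would use an FFT library.

```python
import cmath
import math

def analyze_frame(samples, sample_rate=44100,
                  bands=((0, 500), (500, 2000), (2000, 8000))):
    """Return (rms, band_energies) for one mic's audio frame.

    rms measures overall amplitude; band_energies sums spectral power
    within each (low_hz, high_hz) band. Band edges are illustrative."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)

    # Naive DFT magnitude spectrum over bins 0 .. n//2.
    spectrum = []
    for k in range(n // 2 + 1):
        acc = sum(s * cmath.exp(-2j * math.pi * k * i / n)
                  for i, s in enumerate(samples))
        spectrum.append(abs(acc))

    bin_hz = sample_rate / n  # frequency width of each DFT bin
    energies = []
    for lo, hi in bands:
        e = sum(m * m for k, m in enumerate(spectrum) if lo <= k * bin_hz < hi)
        energies.append(e)
    return rms, energies
```

With 8 mics, running this per frame per channel yields an 8-row feature stream (amplitude plus a few band energies each) that the visualization can consume.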
In the fall of 2013, we tested and refined our ideas, trying them out on multiple occasions with the Wind Ensemble. I had the players go through a series of musical exercises, ranging from playing scales at different dynamic levels to air sounds, to see whether the gesture-control design would work and whether the interactive video elements functioned the way we hoped. We refined both, with quite a bit of work going into the interactive video. It quickly became apparent that a simple mapping of color, hue, and saturation would not create the kind of visual trajectory I had in mind. But after a great deal of experimentation, with the assistance of Paul Turowski and Joe Adkins, we came up with a visual design that changes over each of the eight sections, based on Rachel's photographs of the Charlottesville sky. During this period, I continued to shape the electronic music as well as the score for the wind ensemble.
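One way to picture why a single direct amplitude-to-color link falls flat, and what a section-based design adds, is a per-section palette with temporal smoothing. The palettes, parameter names, and smoothing constant below are all hypothetical; this is an illustration of the idea, not the design the team built.

```python
import colorsys

# Hypothetical palettes: (base hue, (min saturation, max saturation))
# per section of the piece, loosely evoking night-sky imagery.
SECTION_PALETTES = [
    (0.60, (0.2, 0.8)),  # deep blue
    (0.08, (0.3, 0.9)),  # warm dawn tones
    (0.75, (0.1, 0.7)),  # violet
    # ... one entry per section
]

def frame_color(section, amplitude, prev_value=0.0, smoothing=0.85):
    """Map one mic's amplitude (0..1) to an RGB color for a video frame.

    Each section pins its own hue, so the large-scale trajectory comes
    from the form of the piece rather than moment-to-moment amplitude.
    Brightness is exponentially smoothed so the image breathes instead
    of flickering with every attack."""
    hue, (s_lo, s_hi) = SECTION_PALETTES[section % len(SECTION_PALETTES)]
    value = smoothing * prev_value + (1 - smoothing) * amplitude
    saturation = s_lo + (s_hi - s_lo) * amplitude
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, value)
    return (r, g, b), value  # return value so the caller can feed it back
```

The point of the sketch is the split of responsibilities: the score's sections set the overall color trajectory, while the live audio only modulates within each section's range.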
Finally, we started putting the entire piece together. In January 2014, we had our first rehearsals of the wind ensemble part, and planned to add the electronic music and the visuals in late February/early March in preparation for the premiere. Due to so many snow days, the premiere has been rescheduled for the fall of 2014. Meanwhile, I will introduce the piece at Open Grounds on the Corner in Charlottesville on Friday, April 18, to give an idea of the overall design and the electronic and video elements. And, during the last weekend of May, I will present it to the National Band Directors' Association, hosted by Bill Pease here at the University of Virginia.