The creative lighting, strange sounds and odd look of the project attracted large crowds at the Computer-Human Interaction (CHI) conference. Called the Humanaquarium, the large plexiglass box housed two musicians whose performance could be controlled by audience interaction.
To see and hear the Humanaquarium in action, watch a video on YouTube.
“We’re trying to figure out ways of engaging people with interactive art,” said Robyn Taylor, a researcher with the Digital Interaction group at Culture Lab at Newcastle University. “We’re trying to express that performance is a dialog between the performer and audience members.”
Taylor, who sang inside the box with a fellow researcher, invited people to come up to the plexiglass and touch it. Their movements controlled different aspects of the performance.
“We shine infrared into the sides of the acrylic and when you touch it that lights up your finger and the camera sees your finger lit up,” explained John Shearer, a researcher with the same group.
“We use those points to affect various audio and visual effects.”
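The sensing Shearer describes resembles frustrated total internal reflection (FTIR) touch tracking: infrared light travels inside the acrylic, a touching finger scatters it, and a camera sees a bright blob at each contact point. The team hasn't published their code, so the sketch below is a hypothetical, simplified detector: it thresholds one grayscale camera frame and flood-fills bright regions into touch points. A real installation would run this per frame with a vision library such as OpenCV.

```python
# Hypothetical FTIR-style touch detector (illustrative, not the
# Humanaquarium's actual software). Bright pixels = scattered IR where
# a finger presses the acrylic; each connected bright blob is a touch.

def detect_touches(frame, threshold=200):
    """Return centroids (row, col) of bright blobs in a 2D grayscale frame."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the connected bright region.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid marks one touch point.
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                touches.append((cy, cx))
    return touches

# A tiny synthetic frame with two bright "fingertips".
frame = [[0] * 8 for _ in range(8)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2), (5, 6), (6, 6)]:
    frame[y][x] = 255
print(detect_touches(frame))  # → [(1.5, 1.5), (5.5, 6.0)]
```

Each centroid could then drive an audio or visual parameter, which is the kind of mapping the researchers describe.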
The touches control the audio on the outside and inside of the box and apply various effects to Taylor’s voice. Taylor’s background in both music and technology helped the team build the exhibit.
“I’ve written some software that visualizes my voice and visualizes different timbres in my voice,” she said. “If I change the vowel sounds or the brightness or the intensity I can change the color inside the box.”
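Taylor’s actual software isn’t public, but one common way to realize the idea she describes is to measure a sound’s “brightness” as its spectral centroid (the magnitude-weighted mean frequency) and map that value onto a color. Everything in this sketch, including the blue-to-red mapping and the 4 kHz range, is an illustrative assumption.

```python
# Illustrative voice-brightness-to-color mapping (an assumption, not
# the Humanaquarium's actual code). The spectral centroid is a standard
# brightness measure; here a naive DFT computes it for a short frame.
import math

def spectral_centroid(samples, sample_rate):
    """Magnitude-weighted mean frequency (Hz) of one audio frame."""
    n = len(samples)
    total_mag = weighted = 0.0
    for k in range(1, n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        total_mag += mag
        weighted += mag * (k * sample_rate / n)
    return weighted / total_mag if total_mag else 0.0

def brightness_to_rgb(centroid_hz, max_hz=4000.0):
    """Map brightness to a blue-to-red gradient (darker voice = bluer)."""
    t = max(0.0, min(1.0, centroid_hz / max_hz))
    return (int(255 * t), 0, int(255 * (1 - t)))

# A pure 2 kHz tone sits mid-gradient between blue and red.
rate, n = 8000, 256
tone = [math.sin(2 * math.pi * 2000 * t / rate) for t in range(n)]
print(brightness_to_rgb(spectral_centroid(tone, rate)))
```

In a live system the same mapping would run continuously on short microphone frames, so a brighter vowel or a more intense note shifts the color in real time.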
Shearer said the Humanaquarium has done a number of performances and that the team is working on another project that replaces human performers with robots.
Nick Barber covers general technology news in both text and video for IDG News Service. E-mail him at Nick_Barber@idg.com and follow him on Twitter at @nickjb.