Project Proposal - DAT201

We’ve got an exhibition coming up for DAT201, set in the cross-point space of the Roland Levinsky Building. I’ve been toying with the idea of merging music and technology in a more interactive way. I plan to use a Kinect (or an ordinary webcam) to detect the motion and movement patterns of the people walking beneath it. As someone enters the space, music will start playing, and as the number of people increases, further layers of the track can come up in the mix. It might also be possible to detect ‘gestures’ from the people walking below; if those patterns could be detected, the music could be controlled more intuitively — for instance, a person walking in circles could increase the tempo of the track. A rough sketch of the layering idea is below.
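As a first pass at the layering logic, here is a minimal Processing sketch assuming the Minim sound library. The number of stems, the filenames, and the countPeople() function are all placeholders — the real count would come from whichever camera/Kinect tracking ends up being used.

```processing
// A minimal sketch of the layering idea, assuming the Minim library.
// countPeople() is hypothetical: it stands in for real camera tracking.
import ddf.minim.*;

Minim minim;
AudioPlayer[] stems;   // one looping player per track stem
int numStems = 4;      // assumed number of stems

void setup() {
  size(200, 200);
  minim = new Minim(this);
  stems = new AudioPlayer[numStems];
  for (int i = 0; i < numStems; i++) {
    stems[i] = minim.loadFile("stem" + i + ".mp3");  // placeholder filenames
    stems[i].loop();   // all stems start looping together, staying in sync
    stems[i].mute();   // but begin silent
  }
}

void draw() {
  int people = countPeople();  // hypothetical: from webcam/Kinect tracking
  // Unmute one extra layer per person in the space; mute the rest.
  for (int i = 0; i < numStems; i++) {
    if (i < people) stems[i].unmute();
    else stems[i].mute();
  }
}

int countPeople() {
  return 0;  // stub: to be replaced by real detection
}
```

Starting every stem looping at once and only toggling mute keeps the layers aligned, which is simpler than trying to start and stop players in time with each other.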

Hopefully I’ll be able to acquire track stems (or compose some myself) for use in the project. I plan on using Processing for the bulk of the computing, as there are currently a number of open libraries that help with manipulating sound, and others that expose the Kinect’s capabilities, such as its depth sensing, which allows greater accuracy when detecting new people. It might also be possible to estimate the height of the people below and use that to manipulate the sound in some way; a rough sketch of that idea follows.
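The height idea might look something like the sketch below, assuming the SimpleOpenNI library and a Kinect mounted overhead pointing straight down. FLOOR_MM is an assumed calibration value, and treating the nearest depth pixel as the top of the tallest head is a deliberate simplification.

```processing
// A rough sketch of overhead height estimation, assuming SimpleOpenNI
// and a downward-facing Kinect. FLOOR_MM is an assumed calibration.
import SimpleOpenNI.*;

SimpleOpenNI context;
final int FLOOR_MM = 3500;  // assumed sensor-to-floor distance in mm

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
}

void draw() {
  context.update();
  int[] depth = context.depthMap();  // per-pixel distances in mm

  // The nearest valid pixel should be the top of the tallest head.
  int nearest = FLOOR_MM;
  for (int i = 0; i < depth.length; i++) {
    if (depth[i] > 0 && depth[i] < nearest) nearest = depth[i];
  }

  int tallestMm = FLOOR_MM - nearest;  // crude height estimate
  background(0);
  image(context.depthImage(), 0, 0);
  fill(255);
  text("tallest ~ " + tallestMm + " mm", 10, 20);
}
```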

As well as aural stimulation, I hope to provide visual stimulation in some way, either by generating visuals using Processing’s built-in graphics functions or by linking up an Arduino to control a series of lights; a sketch of the serial link is below.
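The Processing side of the Arduino link could be as simple as the sketch below, which uses Processing’s core serial library. It assumes the Arduino is the first listed serial device, and mouseX stands in for an activity level until the camera tracking is wired up.

```processing
// A minimal sketch of the lighting link, assuming an Arduino on the
// first serial port expecting one brightness byte at a time.
import processing.serial.*;

Serial arduino;

void setup() {
  size(200, 200);
  // Assumes the Arduino is the first listed serial device.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // Placeholder mapping: mouse position stands in for activity level.
  int brightness = (int) map(mouseX, 0, width, 0, 255);
  arduino.write(brightness);  // send one byte per frame
  background(brightness);     // mirror the value on screen
}
```

On the Arduino side, a sketch reading each byte with Serial.read() and feeding it to analogWrite() on a PWM pin would complete the loop.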

The majority of development can be done from a computer lab; the final stages of testing can then take place in the Roland Levinsky Building itself, as it’s open access.
