Spatialized sound reproduction for telematic music performances in an immersive virtual environment

Chabot, Samuel R.V.
Other Contributors
Braasch, Jonas
Xiang, Ning
Krueger, Ted (Theodore Edward), 1954-
Issue Date
August 2016
Subject
Architectural sciences
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.
Abstract
Telematic performances connect musicians and artists at remote locations to form a single, cohesive piece. As these performances become more common and more people gain access to very high-speed Internet connections, a variety of new technologies will enable artists and musicians to create entirely new styles of work. The development of the immersive virtual environment, including Rensselaer Polytechnic Institute's own Collaborative-Research Augmented Immersive Virtual Environment Laboratory, sets the stage for these original pieces. The ability to properly spatialize sound within these environments is essential to a complete set of tools. This project uses a local installation to demonstrate the techniques and protocols that make this possible. Using the visual coding environment MaxMSP as a receiving client, patches are created to parse incoming commands and coordinate information for engaging sound sources. Spatialization is performed in conjunction with the Virtual Microphone Control system and then mapped to loudspeakers through a patch portable to various immersive environment setups.
Department
School of Architecture
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Collection
Rensselaer Theses and Dissertations Online Collection
Access
Restricted to current Rensselaer faculty, staff and students. Access inquiries may be directed to the Rensselaer Libraries.
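The abstract describes a receiving client that parses incoming commands carrying source coordinates before handing them to the spatialization system. As an illustration only: the thesis implements this as MaxMSP patches with the Virtual Microphone Control system, and the message format, function name, and `/source/<id>` address scheme below are hypothetical, a minimal sketch of that kind of command parsing might look like:

```python
# Hypothetical sketch of the kind of message a receiving client might parse;
# the actual work uses MaxMSP patches and Virtual Microphone Control.
# Invented message format: "/source/<id> <x> <y> <z>"

def parse_source_message(message: str):
    """Parse a spatialization command into (source_id, (x, y, z))."""
    address, *args = message.split()
    if not address.startswith("/source/") or len(args) != 3:
        raise ValueError(f"unrecognized message: {message!r}")
    source_id = int(address.rsplit("/", 1)[1])  # trailing path segment is the id
    x, y, z = (float(a) for a in args)          # Cartesian source coordinates
    return source_id, (x, y, z)

print(parse_source_message("/source/3 1.5 -0.25 2.0"))
# → (3, (1.5, -0.25, 2.0))
```

In a real setup the parsed coordinates would drive per-source gains in the loudspeaker-mapping stage rather than be printed.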