Thursday, April 28, 2011

pure data live music visualizations


I really want to do something like this at my recital next year. If I can find or design a simple visualizer of my own, I'm going to try it this weekend with a few movements of Bach's Cello Suite No. 2. Most of the work I find in this style uses recorded music, or live performance of computer-generated music, to trigger the visualizations. What I'd like to do, and what this example demonstrates, is to use microphones and acoustic instruments as the triggers. The program will analyze different pitch classes and dynamic ranges to control the animation (a rough sketch of that analysis step follows below the quote).
From the link:
"Pianist Hugh Sung demonstrates the use of Pure Data/GEM as a method for computers to react to live audio input and generate responsive visualizations. In this clip, Hugh performs selections from Charles B. Griffin's 'Vernacular Dances'. This performance took place on Sept. 30, 2008 at Wallenstein Castle in Prague, Czech Republic, presented by Music Bridges International."
