Comment by joshavant
I want to build a large-format, real-time, physical music visualizer that could orchestrate an artistic light symphony for any song.
I'm imagining physical visualizers that are columns of multiple discrete light nodes, each with variable brightness and color.
The real-time music processing is the hard part (for me) to crack.
There are some standard tricks here: FFTs, bandpass filters, etc.
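Something like this is what I mean by the standard tricks: a rough Python/numpy sketch (the names, bands, and sample rate are just illustrative, not anything final) that splits each audio frame into coarse frequency bands and sums the energy per band, which could then drive a column's brightness.

```python
# Rough sketch: per-frame band energies via FFT. All constants here are
# placeholder choices (44.1 kHz mono input, three coarse bands).
import numpy as np

SAMPLE_RATE = 44100
BANDS_HZ = [(20, 250), (250, 2000), (2000, 8000)]  # bass / mids / highs

def band_energies(frame: np.ndarray) -> list[float]:
    """Return summed FFT magnitude in each band for one mono audio frame."""
    windowed = frame * np.hanning(len(frame))           # taper to reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in BANDS_HZ]

# e.g. map each band's energy to one light column's brightness:
# brightness_per_column = band_energies(latest_frame)
```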
But I want to do more: Real-time stem separation, time signature and downbeat tracking, etc.
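For the beat/downbeat side, an offline baseline is easy to prototype before tackling real-time. A rough sketch using librosa (my library choice, and the filename is just a placeholder), with a naive every-4th-beat guess standing in for a proper downbeat tracker:

```python
# Offline beat tracking sketch with librosa; a real downbeat tracker
# (e.g. a streaming model) would replace the crude 4/4 assumption below.
import librosa

y, sr = librosa.load("sweet_caroline.wav", mono=True)   # placeholder file
tempo, beat_times = librosa.beat.beat_track(y=y, sr=sr, units="time")
print("estimated tempo:", tempo, "BPM; first beats at", beat_times[:4], "s")

# Crude downbeat guess: assume 4/4 and take every 4th detected beat as a bar start.
downbeats = beat_times[::4]
```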
Imagine hearing Sweet Caroline: when the horns kick in, the whole installation 'focuses' on them, and bright yellow light jumps between the columns on each horn note before returning to tracking the bass line or something.
I've been noodling on this idea for a long time and slowly digging into the music and CS fundamentals. The rise of LLMs might finally be the piece that lets me close my intelligence gap and build this thing...