Quote:
Originally posted by rabidhamster
If you look at the box filter, it detects the beat. Now you could work out BPM from this (but not phase). If you multiply this by the incoming data though, it emphasises things that happen in time with the beat.

I'm not sure I follow your description, but from what I gather it is similar to a research paper I saw that detected BPM (but not phase). The method was to take an entire song and compute its self-similarity matrix from the FFT spectrum: each element (x, y) in the matrix is the dot product of the FFT vector at time index x and the FFT vector at time index y. You end up with a grid-looking image with lines spaced according to the BPM. I can't find the paper I'm thinking of, but here is one that uses something similar, and it has a picture of what I'm talking about:
http://www.ipem.ugent.be/mami/Public...rofuse2002.pdf
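The matrix construction is simple to sketch. This is just a minimal illustration of the dot-product idea described above, not the paper's actual implementation; the function name and the cosine normalization (so identical frames score 1.0) are my own choices.

```python
import numpy as np

def self_similarity(spectrogram):
    """Self-similarity matrix of an FFT spectrogram.

    spectrogram: array of shape (frames, bins), one magnitude-FFT
    vector per time index.  Returns S where S[x, y] is the dot
    product of frame x and frame y (here normalized to cosine
    similarity, so the diagonal is 1.0).
    """
    # Normalize each frame so the dot product becomes a cosine similarity
    norms = np.linalg.norm(spectrogram, axis=1, keepdims=True)
    normed = spectrogram / np.maximum(norms, 1e-12)
    # S[x, y] = <frame_x, frame_y>; with a steady beat this shows up
    # as a grid of bright lines spaced one beat period apart
    return normed @ normed.T
```

Plotting this matrix as an image for a song with a steady beat gives the grid pattern the paper shows; the line spacing in frames converts directly to BPM.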
Another approach is to use the FFT data to detect loud drum hits and the like, and then run some kind of comb-filter analysis to get a BPM and phase from where the peaks in this signal fall.
It is fairly complex, but it is also more robust to non-repetitive music than self-similarity methods are. There are some impressive results with this; check out this site and click the results page for some examples:
http://web.media.mit.edu/~eds/beat/
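A very crude sketch of the comb-filter idea: score each candidate (BPM, phase) pair by how much onset energy its comb of teeth picks up. This is a brute-force toy, not the resonator-based method on that site; the function name, the frame-rate parameter, and the scoring rule are all my own assumptions.

```python
import numpy as np

def comb_filter_bpm(onset, frame_rate, bpm_range=(60, 180)):
    """Brute-force comb-filter tempo/phase search (toy version).

    onset: 1-D per-frame onset-strength signal (e.g. spectral flux)
    frame_rate: onset frames per second
    Returns (bpm, phase_in_frames) whose comb collects the most energy.
    """
    best_score, best_bpm, best_phase = -np.inf, None, None
    for bpm in range(bpm_range[0], bpm_range[1] + 1):
        period = frame_rate * 60.0 / bpm  # frames per beat
        for phase in range(int(round(period))):
            # Comb teeth at phase, phase + period, phase + 2*period, ...
            idx = np.arange(phase, len(onset), period).astype(int)
            # Total (not average) energy under the comb, so a tempo
            # whose grid explains *all* the onsets beats its half-tempo
            # harmonic, which only covers every other onset
            score = onset[idx].sum()
            if score > best_score:
                best_score, best_bpm, best_phase = score, bpm, phase
    return best_bpm, best_phase
```

Real systems replace the brute-force loop with banks of resonant comb filters and handle tempo drift, but the scoring intuition is the same: the comb aligned with the true beat grid accumulates the most peak energy.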