How does it work?
24th July 2001, 02:12
I'm insanely curious - what sort of rendering technique are you using to do this? MilkDrop is the second plugin I've ever seen that uses 3D hardware acceleration (aside from the dancing-whatever-it-may-be plugins), and the first that uses it for anything other than scaling a "standard" non-accelerated plugin up to a larger size with antialiasing. (Specifically, the "jakdaw" plugin for XMMS: it renders the plugin output to an OpenGL texture, which is then mapped to the screen size.) That plugin still needs the CPU to handle the processing of delta maps (I'm using the term G-Force uses here - there are tons of other names for it, but it's essentially a map of vectors that says "pixel x1,y1 in frame n moves to pixel x2,y2 in frame n+1").
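To make the "delta map" idea concrete, here's a minimal sketch (my own illustration, not code from any of the plugins mentioned). In practice the map is usually stored in the inverse "gather" form - for each destination pixel, where to read from - since that makes every output pixel exactly one lookup. All names here are made up for the example.

```python
# Hypothetical sketch of a precomputed delta map, stored in gather form:
# delta[y][x] holds the (source_x, source_y) that destination pixel (x, y)
# copies from each frame.

WIDTH, HEIGHT = 8, 8

# A trivial "flow" field that drifts content toward the left edge
# (real plugins would put pinch/whirl formulas here instead).
delta = [[(min(x + 1, WIDTH - 1), y) for x in range(WIDTH)]
         for y in range(HEIGHT)]

def apply_delta(frame, delta):
    """Build the next frame by gathering each pixel from its mapped source."""
    return [[frame[sy][sx] for (sx, sy) in delta[y]] for y in range(HEIGHT)]

frame = [[0] * WIDTH for _ in range(HEIGHT)]
frame[3][7] = 255                 # one bright pixel at the right edge
for _ in range(7):                # note: edge pixels smear, a classic artifact
    frame = apply_delta(frame, delta)
print(frame[3][0])                # prints 255: the pixel has drifted to column 0
```

The expensive part is exactly what the post says: that gather is one memory read and one write per screen pixel, every frame, which is why doing it on the CPU eats so many cycles.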
How is Milkdrop doing these mapping functions? I'm interested in writing a similar application myself. (Don't get me wrong - Milkdrop is amazing. But it's Windows-only, and I'm a hardcore Linux user and don't want to have to reboot for my vis.)
Plus I'm currently doing something that's just plain impossible under Windows using the Linux port of Andy O'Meara's G-Force 1.1.6 (2.0 isn't open-source. :() - Currently, XMMS (Linux equivalent of Winamp) is running on my desktop machine, with sound coming from that machine. It's displaying visualization locally to its own display. The main XMMS window and playlist window are displayed on my laptop via VNC. Eventually I plan to use this for parties - The desktop in a secluded area hooked up to my LCD projector, with my laptop as a wireless remote control.
Windows has no native concept of remote displays, which makes tricks like this impossible. With MilkDrop and any other fullscreen vis plugin, you can only do basic control functions like play/stop/pause/prev/next, etc, unless you're running a multiheaded system.
The only deficiency my Linux box has compared to Windows for my purposes these days is music vis... G-Force 1.1.6 is cool, but not nearly as cool as G-Force 2.0, Geiss 4.x, or MilkDrop. I'd love to know how MilkDrop was written so I could try writing something similar for Linux. (Unless Ryan wants to try porting it. :)
27th July 2001, 08:36
I'm obviously not the authority on this, but from what I understand, Milkdrop uses the frame buffer on the vid card to store the texture while it calculates the next frame. It might also use the vid card to store z values for pixels (which can be translated into red/blue for 3D mode).
I do know that the reason Milkdrop renders so fast is that it only calculates a relatively small number of pixels using the actual per-frame and per-pixel code. This is the size of the matrix. Once all these pixels have their x/y velocity and x/y stretch calculated (and hence their position in the next frame), all neighbouring pixels have their velocities calculated using a weighted average of the nearby pixels that were actually calculated.
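A rough sketch of that coarse-matrix idea (my own reconstruction, not MilkDrop's actual source; `warp`, `MESH`, and friends are invented names): evaluate the expensive warp only at grid vertices, then bilinearly interpolate the displacement for every pixel in between.

```python
# Hypothetical sketch: expensive per-vertex math on a coarse grid,
# cheap bilinear interpolation everywhere else.
import math

WIDTH, HEIGHT = 64, 64
MESH = 8                                  # coarse grid cells per axis

def warp(u, v):
    """Stand-in for the expensive per-vertex code: returns (du, dv)."""
    return (0.05 * math.sin(6.0 * v), 0.05 * math.cos(6.0 * u))

# Only (MESH+1)^2 = 81 warp() calls, instead of one per pixel.
grid = [[warp(i / MESH, j / MESH) for i in range(MESH + 1)]
        for j in range(MESH + 1)]

def displacement(x, y):
    """Weighted average (bilinear) of the four surrounding grid vertices."""
    gx, gy = x / WIDTH * MESH, y / HEIGHT * MESH
    i, j = min(int(gx), MESH - 1), min(int(gy), MESH - 1)
    fx, fy = gx - i, gy - j
    (a, b), (c, d) = grid[j][i], grid[j][i + 1]
    (e, f), (g, h) = grid[j + 1][i], grid[j + 1][i + 1]
    du = (a * (1 - fx) + c * fx) * (1 - fy) + (e * (1 - fx) + g * fx) * fy
    dv = (b * (1 - fx) + d * fx) * (1 - fy) + (f * (1 - fx) + h * fx) * fy
    return du, dv
```

For a 64x64 frame that's 81 evaluations of the heavy math covering 4096 pixels; the rest is a handful of multiplies per pixel.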
I don't know if these calculations are done on the CPU or the vid card. Both devices are extremely good at doing vector math these days, though cards like the GeForce are hyper-accelerated for it (T&L engine and all that).
1st August 2001, 19:04
Well, what you say about calculating the x/y velocity and stretch for only a small matrix and using a weighted average for the other pixels helps a bit, but it doesn't address the application of the actual transform itself (copying all those pixels).
Even with vis plugins that use a set of fixed delta fields calculated only once at initialization (look at the source of some XMMS plugins, such as Infinity and Jakdaw, to get a good idea of the basics of vis), you still need a LOT of CPU just to push those pixels around from one frame to another.
MMX can help if you're doing the transforms in RGB space rather than a palettized color space (you can process all 3 color components of a pixel at once). With a palettized color space it's not very useful, as you're RARELY going to copy a row of 8 pixels to an identical row of 8 pixels somewhere else on the screen. (Trust me - I've tried. I forget whether it was Infinity or Jakdaw that used an RGB colorspace throughout, but I was able to get a small performance boost from it using MMX. I could probably get more, but NOTHING like MilkDrop's ability to run wicked-fast at 1152x864.)
Jakdaw actually does use hardware acceleration to a slight degree - it renders to an OpenGL texture, which is scaled up and antialiased by the hardware. (This raises the interesting thought of doing plugin work in YUV color spaces - it could make for some interesting effects, and most modern video cards support hardware scaling/conversion of YUV overlays to accelerate video playback, since almost every video compression codec in existence works in one of the YUV colorspaces.)
It seems to me like Ryan has somehow convinced the hardware to perform the inter-frame transform. (I'm sure the hardware can do it, as it's a similar technique to moving walls, etc. in 3D games).
A simple "waterfall" effect could be achieved by texture-mapping your waveform onto a square. Every frame, the square moves down the screen, and is faded a bit. (Don't know if the hardware can do the fading itself, but subtracting a small amount from every pixel in a given frame/texture CAN be easily accelerated a great deal with MMX, since all the pixels are staying together in the frame and it's easy to operate on 8-pixel blocks.) You'd have to keep moving your wave upwards on the texture map so that it always appears in the same position, and you'd also have to occasionally start a new square above the one scrolling off the screen.
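The per-frame fade is exactly a saturating subtraction - values clamp at zero instead of wrapping around - which is what MMX's PSUBUSB instruction does on 8 bytes at a time. Here's the same operation written out in plain Python just to pin down the semantics (the function name is mine):

```python
# Saturating-subtract fade: the scalar model of what PSUBUSB does
# to 8 packed bytes per instruction.
FADE = 4

def fade_frame(pixels, amount=FADE):
    """Subtract `amount` from every byte, clamping at 0 (no wraparound)."""
    return bytes(max(p - amount, 0) for p in pixels)

frame = bytes([0, 3, 4, 128, 255])
print(list(fade_frame(frame)))    # prints [0, 0, 0, 124, 251]
```

Because every pixel gets the identical operation regardless of position, this is the easy case for SIMD - unlike the delta-field gather, where neighbouring destination pixels read from scattered sources.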
Now the question is: is there a technique for making 3D hardware perform a 2D pinch/whirl/other transform given a set of X and Y shifts for each portion of the texture or object... Hmm, time to go figure out where my OpenGL book is and maybe actually start learning OGL. :)
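One plausible answer (a sketch of the general textured-grid trick, not confirmed MilkDrop internals): draw the previous frame as a texture on a tessellated quad, and encode the 2D transform as per-vertex texture coordinates. The GPU then does all the per-pixel copying and interpolation for free during rasterization. The whirl formula and names below are invented for illustration.

```python
# Hypothetical sketch: encode a 2D whirl as per-vertex texture coordinates
# on a (MESH+1) x (MESH+1) grid of vertices covering the screen.
import math

MESH = 16

def whirl_uv(u, v, strength=0.3):
    """Rotate each vertex's texture lookup around the centre, with the
    angle falling off toward the edges; sampling 'behind' the rotation
    makes the previous frame appear to swirl."""
    dx, dy = u - 0.5, v - 0.5
    r = math.hypot(dx, dy)
    angle = strength * (1.0 - r)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return (0.5 + dx * cos_a - dy * sin_a,
            0.5 + dx * sin_a + dy * cos_a)

# One (u, v) pair per vertex; in OpenGL you'd emit these with
# glTexCoord2f while drawing the grid textured with the last frame.
texcoords = [[whirl_uv(i / MESH, j / MESH) for i in range(MESH + 1)]
             for j in range(MESH + 1)]
```

The appeal is that the CPU only touches the grid vertices each frame, just like the coarse-matrix trick described earlier - the hardware's texture filtering replaces the per-pixel weighted average.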
6th August 2001, 10:34
Windows has no native concept of remote displays, which makes tricks like this impossible.
Yes it does. Netmeeting works excellently for something like that. Also Windows XP has a built-in Remote Desktop facility. (The security implications aside...)
30th August 2001, 01:13
I know of the application-sharing capabilities of Netmeeting, but that's very rudimentary. You can only share applications that are displaying on the local display also.
Same goes for VNC.
In both cases, you can't use VNC/Netmeeting as one remote display, while Milkdrop displays on the local monitor.
I'm not even sure if you could do this with NT Terminal Server, as the Win32 API just doesn't support applications choosing which display they want to connect to.
I only had to change 2 lines of code to have G-Force use a display other than the one used by XMMS. I don't think there's any way to do the same thing in Windows...
An interesting thought - someone could code a separate input/playlist-management GUI that displays using VNC's RFB protocol, bypassing the braindead-ness of the Win32 API.
vBulletin® v3.8.6, Copyright ©2000-2013, Jelsoft Enterprises Ltd.