If you simply drop in a song, it will render in realtime. You don't have to render it to a video. I am working on a "live" mode that will act much like Winamp, where you would select a playlist of songs and have it cycle effects.
Uneducated suggestion, feel free to ignore if inapplicable:
I don't know if this is common cross-platform, but my Windows machine has an audio input device called "Stereo Mix", which is simply all audio output from every application mixed together and looped back as an input. If you could set your live mode to listen on any input device, people could select Stereo Mix, use whatever software they already have for playing and playlisting, and have Astrofox produce the visuals. You'd be a sort of Winamp-less Milkdrop!
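This should already be doable with the plain Web Audio API. A minimal sketch of listening on a named input device, assuming the visuals read from an AnalyserNode (the device-selection flow here is my guess, not Astrofox's actual code):

```typescript
// Sketch: capture any input device (e.g. "Stereo Mix") with the Web Audio
// API and expose an AnalyserNode for the visuals to read. How Astrofox
// actually ingests audio is an assumption here.
async function listenOnInputDevice(deviceLabel: string): Promise<AnalyserNode> {
  // Device labels are only visible after a permission grant.
  await navigator.mediaDevices.getUserMedia({ audio: true });
  const devices = await navigator.mediaDevices.enumerateDevices();
  const input = devices.find(
    (d) => d.kind === 'audioinput' && d.label.includes(deviceLabel),
  );
  if (!input) throw new Error(`No input device matching "${deviceLabel}"`);

  const stream = await navigator.mediaDevices.getUserMedia({
    audio: { deviceId: { exact: input.deviceId } },
  });

  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser); // deliberately not connected to the destination
  return analyser;
}
```

The key detail is not connecting the source to the audio output, so the captured signal isn't played back and looped again.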
I guess I could just export the audio-visual effects, driven by the audio track, on a green background and then key them out. So it's not a major issue, just extra steps.
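For anyone going that route, the keying step itself is simple. A rough per-pixel sketch (the threshold is arbitrary; real keyers also despill and soften edges):

```typescript
// Minimal chroma-key sketch: make pixels close to pure green transparent.
// Operates on RGBA pixel data, e.g. from a canvas ImageData buffer.
function keyOutGreen(pixels: Uint8ClampedArray, threshold = 100): void {
  for (let i = 0; i < pixels.length; i += 4) {
    const [r, g, b] = [pixels[i], pixels[i + 1], pixels[i + 2]];
    // Euclidean distance from the key color (0, 255, 0)
    const dist = Math.hypot(r, 255 - g, b);
    if (dist < threshold) pixels[i + 3] = 0; // zero the alpha channel
  }
}
```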
Hi everybody, I'm the creator of Astrofox. Surprised to see this here, but I'll be glad to answer any questions.
Astrofox has been my side project for several years now. It's basically my playground for trying out things like Electron, React and WebGL. It's open-source, MIT licensed, and totally free to use.
This one took me less than 20 minutes from scratch (I'd never seen this app before). Rendering to 4K is another story: it takes hours and is very slow, which is probably understandable with web rendering involved.
A quick skim reveals it uses React and WebGL shaders to create these effects. The React part is easy. Is there anything recommended I could read about shaders so that I could contribute objects/effects to Astrofox?
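For context, each WebGL effect boils down to a small GLSL fragment shader run once per pixel. A toy sketch, assuming a three.js-style setup (the u_level uniform standing in for an audio value is my assumption, not Astrofox's actual naming):

```typescript
import * as THREE from 'three';

// Toy audio-reactive effect: a glow that pulses with the audio level.
// three.js supplies a default vertex shader, so only the fragment
// shader is needed here.
const material = new THREE.ShaderMaterial({
  uniforms: {
    u_level: { value: 0.0 }, // updated each frame from the analyser
    u_resolution: { value: new THREE.Vector2(1280, 720) }, // assumed size
  },
  fragmentShader: `
    uniform float u_level;
    uniform vec2 u_resolution;
    void main() {
      vec2 uv = gl_FragCoord.xy / u_resolution;
      float d = distance(uv, vec2(0.5));
      // Bright at the center, fading out at a radius driven by the audio.
      float glow = 1.0 - smoothstep(0.0, max(u_level * 0.5, 0.01), d);
      gl_FragColor = vec4(glow, glow * 0.4, glow, 1.0);
    }
  `,
});

// Per frame: material.uniforms.u_level.value = currentAudioLevel;
```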
The current standard for making live visuals at events (i.e. VJing) is Resolume. The one thing Resolume lacks is effective support for importing 3D objects.
It's nice to see blend modes in this app. That opens up the possibility of building up cool effects layer by layer.
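Blend modes are just per-channel formulas over the layer stack. For example, "screen" and "multiply", with channels normalized to [0, 1]:

```typescript
// "Screen" blend: result = 1 - (1 - base) * (1 - layer)
// Always lightens, so stacking layers builds up brightness.
function screenBlend(base: number, layer: number): number {
  return 1 - (1 - base) * (1 - layer);
}

// "Multiply" is the darkening counterpart: result = base * layer
function multiplyBlend(base: number, layer: number): number {
  return base * layer;
}
```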
Looks cool! Any plans to support headless operation? What I had in mind: design a scene in the UI (for example, for a daily podcast), then on the command line take the project file and the audio file and generate the video output.
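No such CLI exists yet as far as I can tell, but the output side is standard plumbing: render frames and pipe them to ffmpeg, which muxes in the audio. A hypothetical sketch (renderProject and the project format are invented for illustration; the ffmpeg flags are real):

```typescript
import { spawn } from 'node:child_process';

// Hypothetical headless pipeline: render RGBA frames from a project file
// and pipe them to ffmpeg, which encodes them and muxes in the audio.
async function renderToVideo(
  projectPath: string,
  audioPath: string,
  outPath: string,
  width = 1920,
  height = 1080,
  fps = 30,
): Promise<void> {
  const ffmpeg = spawn('ffmpeg', [
    '-y',
    '-f', 'rawvideo',
    '-pix_fmt', 'rgba',
    '-s', `${width}x${height}`,
    '-r', String(fps),
    '-i', '-',            // raw frames on stdin
    '-i', audioPath,      // mux in the original audio track
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',
    '-shortest',
    outPath,
  ]);

  for await (const frame of renderProject(projectPath, { width, height, fps })) {
    // frame: Buffer of width * height * 4 RGBA bytes
    if (!ffmpeg.stdin.write(frame)) {
      await new Promise((r) => ffmpeg.stdin.once('drain', r));
    }
  }
  ffmpeg.stdin.end();
  await new Promise<void>((resolve, reject) => {
    ffmpeg.on('close', (code) =>
      code === 0 ? resolve() : reject(new Error(`ffmpeg exited ${code}`)),
    );
  });
}

// Invented renderer stub: yields one RGBA frame per tick.
declare function renderProject(
  path: string,
  opts: { width: number; height: number; fps: number },
): AsyncIterable<Buffer>;
```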
Yes, the UI is all from scratch. My inspiration came from seeing other audio-reactive videos on YouTube, but those were rendered with either Adobe After Effects or a 3D program like Blender. I wanted to build something that was easy for the average person to use.