I took some time off from my day job to work on a creative tool for fractal and algorithmic art. Here’s what led me to this project.
Animation Workflow with Ultra Fractal
I’ve used Ultra Fractal (UF for short) for over eight years, first for making still images, but later I also became interested in fractal animations. While UF is very customizable and excels at making complex, layered fractal images, performance and timing-feature limitations make it a poor fit for complex animations. With simple formulas, a single layer, and a decent CPU, I can get a 940×540 preview at 10 fps while I work on the fractal parameters. Rendering a 30-second clip at 1280×720 can take anywhere from 15 minutes to several hours, and with more layers and more complex formulas, animations can take days or weeks to render.
UF can’t play audio, so I have to use a video editor to add a soundtrack to the animation clips. Synchronizing a fractal animation with music requires tedious timing calculations, and if the timing is off, changing the fractal and re-rendering the clip can take hours.
The Prototype Editor

At the heart of the application are brushes and the background flow field. Brushes are used to paint colors onto the flow field with the mouse; the painted colors then drift around and fade away on the flow field.
A flow field is a feedback texture effect that uses a mathematical formula to offset the texture contents of the previous frame. When the flow-field texture is updated, say, 30 times per second, with the offset adjusted accordingly, the colors appear to flow across the screen. How this offset is calculated determines where, and how fast, the colors move. Flow fields have been fairly popular in music visualizers in the past; cf. G-Force for iTunes and MilkDrop for Winamp.
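To make the mechanism concrete, here is a minimal CPU sketch of the feedback loop in Python. The rotation-based offset formula, the fade factor, and the circular brush are my own illustrative choices, not the tool's actual formulas; a real implementation would do the same sampling in a fragment shader.

```python
import math

W = H = 64      # texture size in texels
FADE = 0.98     # colors fade slightly on every update

def velocity(x, y):
    """Illustrative offset formula: rotate texels around the center."""
    dx, dy = x - W / 2.0, y - H / 2.0
    return -dy * 0.02, dx * 0.02      # perpendicular to the radius

def sample(tex, fx, fy):
    """Bilinear sample, clamped at the edges (as a GPU sampler would)."""
    fx = min(max(fx, 0.0), W - 1.0)
    fy = min(max(fy, 0.0), H - 1.0)
    x0, y0 = int(fx), int(fy)
    x1, y1 = min(x0 + 1, W - 1), min(y0 + 1, H - 1)
    tx, ty = fx - x0, fy - y0
    top = tex[y0][x0] * (1 - tx) + tex[y0][x1] * tx
    bot = tex[y1][x0] * (1 - tx) + tex[y1][x1] * tx
    return top * (1 - ty) + bot * ty

def step(prev):
    """One feedback update: read the previous frame slightly upstream,
    then fade, so colors appear to move along the velocity field."""
    new = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            vx, vy = velocity(x, y)
            new[y][x] = sample(prev, x - vx, y - vy) * FADE
    return new

def paint(tex, cx, cy, radius, value):
    """A brush stroke: stamp a filled circle of color onto the texture."""
    for y in range(H):
        for x in range(W):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                tex[y][x] = value

tex = [[0.0] * W for _ in range(H)]
paint(tex, 20, 32, 3, 1.0)      # dab some color left of the center
for _ in range(30):             # ~1 second at 30 updates per second
    tex = step(tex)
# The dab has now drifted around the center and faded a little.
```

The per-frame structure is the point: paint writes into the texture, and each update resamples the previous frame at offset coordinates, which is exactly what the feedback shader does on the GPU.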
The application provides a number of parameters for adjusting the visual output, including brush size and color, flow-field zoom speed and zoom center, the output color palette texture, and the palette scale and offset. Parameters can be adjusted with immediate feedback, and any changes can be recorded, replayed, and looped. Timing is expressed in beats instead of seconds, which simplifies synchronizing the animation with music, and I can pick an audio file to play in the background while I edit the parameters.
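The beat-based timing boils down to a small conversion. A sketch (the function names are mine): at 120 BPM one beat lasts half a second, so beat 16 lands exactly 8 seconds in, which is frame 240 at 30 fps.

```python
def beats_to_seconds(beats, bpm):
    """One beat lasts 60/bpm seconds."""
    return beats * 60.0 / bpm

def beats_to_frame(beats, bpm, fps=30):
    """Frame index at which a given beat position falls."""
    return round(beats_to_seconds(beats, bpm) * fps)

print(beats_to_seconds(16, 120))   # 8.0 seconds
print(beats_to_frame(16, 120))     # frame 240
```

Recording parameter changes against beat positions instead of wall-clock seconds means a tempo change rescales the whole animation without redoing any timing by hand.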
The application uses shaders to compute the graphics on the GPU, providing an immediate Full HD view of the animation. While UF offers arbitrary-precision arithmetic on the CPU, the GPU gives me only single-precision numbers, but it turns out you can do all kinds of cool stuff with them anyway.
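The single-precision limit is easy to demonstrate. Python's `struct` module can round a double to the nearest float32, which shows why deep zooms break down on the GPU: once the spacing between adjacent pixel coordinates falls below float32's resolution (about 1.2e-7 near 1.0), neighboring pixels collapse to the same value. The zoom figures here are illustrative.

```python
import struct

def to_f32(x):
    """Round a Python float (a double) to the nearest float32 value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# At a deep zoom around x = 1.0, adjacent pixels may be only ~1e-8 apart,
# which is below float32's resolution near 1.0 (epsilon ~1.2e-7).
pixel_step = 1e-8
print(to_f32(1.0) == to_f32(1.0 + pixel_step))   # True: the step vanishes
print(to_f32(1.0) == to_f32(1.0 + 1e-6))         # False: still resolvable
```

In practice this caps how far the flow field can zoom before the image pixelates, whereas UF's arbitrary-precision CPU arithmetic can keep going indefinitely.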
The prototype editor has a number of issues and limitations that make it unfit for public distribution. For example, adding new audio tracks, or new palette textures that determine the output colors, requires access to the source code. At the time of writing, I’m undecided whether this software is worth developing further.