Under the Hood of Dashboard Baby

I recently published a fractal animation named Dashboard Baby that also illustrates the capabilities of my fractal animation software prototype.

The creation process took over a month from the first animation draft to the final release, and along the way Dj Tabora and I exchanged a number of versions of both the animation and the soundtrack.

Interpolating Fractal Details

My previous post introduced brushes and the background flow field, the main visual components of the application. I’ve also developed a method to smoothly adjust the level of detail of a fractal based on the contents of the flow field.

You can see this varying level of detail in use throughout the animation. Most of the time there is more detail in the areas that I’ve painted with brushes. Just before the outro scene you can see the painted areas begin to show less detail than the peripheral areas. This is reversed again during the final outro scene.

The image below illustrates the idea of interpolating iterations.

The fractal set is shown for iterations ranging from 8 to 14. The flow field contents used to interpolate between iterations are shown at bottom left: black corresponds to 8 iterations and white to 14. The bottom right image shows smoothly interpolated iterations using the flow field on the left.

A couple of mathematical tricks were needed to hide the iteration boundaries and make the interpolated result look smooth. I’ll discuss these tricks and provide the associated Cg shader source code in a later post.
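The basic blend can be sketched as follows. This is a rough Python illustration of the idea only, using a plain divergent escape count as a stand-in fractal rather than the author's Nova Ducky formula or the smoothing tricks mentioned above; the function names are mine.

```python
def escape_value(c, max_iter, bailout=4.0):
    """Plain escape-time count for z -> z^2 + c (a stand-in fractal)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if z.real * z.real + z.imag * z.imag > bailout:
            return float(n)
    return float(max_iter)

def interpolated_value(c, field, lo=8, hi=14):
    """Blend escape values between two iteration counts. 'field' in
    [0, 1] is the flow-field brightness (0 -> lo iterations, 1 -> hi)."""
    n = lo + field * (hi - lo)   # fractional iteration count
    n0 = int(n)
    t = n - n0
    v0 = escape_value(c, n0)
    v1 = escape_value(c, min(n0 + 1, hi))
    return (1 - t) * v0 + t * v1
```

At field extremes this degenerates to the plain 8- and 14-iteration renders; in between, each pixel gets a weighted mix of two adjacent iteration counts.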

The Ducky Nova Fractal

The following presumes you are familiar with computing fractals, so buckle up and watch out for equations. The fractal set used throughout Dashboard Baby is available for Ultra Fractal (UF) with the name Nova Ducky. It combines a convergent fractal known as Nova with a mirroring feature that has been used in a number of UF formulas labeled as Ducky. The fractal set is generated by iterating the function

\hat{z}_{n+1} = z_n - R \frac{z_n^2-1}{2z_n} + C

z_{n+1} = |Re \; \hat{z}_{n+1}| + |Im \; \hat{z}_{n+1}| i

where R and C are complex parameters called relaxation and seed, respectively. The outro scene uses values R = (3.3, 2.68) and C = (-2.6, 6). New iteration values z_{n+1} are given by the componentwise absolute value of \hat{z}_{n+1}. This flips negative components back to the top-right quadrant of the complex plane, which gives rise to the symmetrical supercharged kaleidoscope appearance characteristic of Ducky/Talis fractals.
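As a sanity check of the formula, here is one iteration written out in Python (the actual renderer presumably does the same arithmetic in a shader; the function name is mine):

```python
def nova_ducky_step(z, R, C):
    """One Nova Ducky iteration: a relaxed Newton step for z^2 - 1 = 0
    plus a seed C, followed by the Ducky-style mirroring that folds
    both components into the top-right quadrant."""
    z_hat = z - R * (z * z - 1) / (2 * z) + C
    return complex(abs(z_hat.real), abs(z_hat.imag))
```

With R = 1 and C = 0 the step reduces to a plain Newton iteration for z² = 1, folded into the first quadrant; for example, starting from z = 1 + i one step lands on 0.75 + 0.25i.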

Here’s an upr for Ultra Fractal users from the final scene of Dashboard Baby.

 DashboardBabyFinalScene {  
  title="Dashboard Baby Final Scene" width=960 height=540 layers=1  
  credits="Kevin Kerttunen;11/19/2013"  
  caption="Background" opacity=100  
  center=8.7076952615/-0.451219253 magn=0.94692544 angle=-81.0557  
  maxiter=15 percheck=off filename="jh.ufm" entry="jh-NovaDuckyJulia"  
  p_bailout=1E-20 p_power=2/0 p_relaxation=3.3/2.68 p_seed=-2.6/6  
  p_symmetryMode=Abs p_absOffset=0/0  
  density=2 transfer=linear filename="jh.ucl"  
  entry="jh-InverseDistanceToPoint" p_point=0/0 p_offset=0  
  comments="Simple grayscale gradient." smooth=yes rotation=-200  
  index=10 color=16448508 index=51 color=5002588 index=87 color=0  
  index=204 color=2961975 index=-157 color=5792620 index=-94  
  smooth=no index=0 opacity=255  
}

Improved Fractal Animation Workflow

I took some time off from my day job to work on a creative tool for fractal and algorithmic art. Here’s what led me to this project.

Animation Workflow with Ultra Fractal

I’ve used Ultra Fractal (or UF for short) for over 8 years, first for making images, but later I also became interested in fractal animations. While UF is very customizable and excels at making complex, layered fractal images, performance and timing feature limitations make it a poor fit for complex animations. With simple formulas, only one layer and a decent CPU, I can get a 960×540 preview at 10 fps while I work on the fractal parameters. Rendering a 30-second clip at 1280×720 can take anything from 15 minutes to several hours. With more layers and more complex formulas, animations can take days or weeks to render.

UF can’t play audio, so I need to use a video editor to add a soundtrack to the animation clips. Tedious timing calculations are needed to synchronize the fractal animation with the music, and if the timing is off, changing the fractal and re-rendering the clip can take hours.

Improved Workflow

I wanted to make an application that simplifies the process of combining music with algorithmic animations. After 7 months of development I had a prototype that provides a more streamlined workflow.

At the heart of the application are brushes and the background flow field. Brushes are used to paint colors onto the flow field with the mouse. The brush colors then move around and fade away on the flow field.

A flow field is a feedback texture effect that uses a mathematical formula to offset the texture contents from the previous frame. Colors appear to flow on the screen when the flow field texture is updated, say, 30 times per second and the offset is adjusted accordingly. The way this offset is calculated determines where and how fast the colors move. Flow fields have been fairly popular in music visualizers in the past; see G-Force for iTunes and MilkDrop for Winamp.
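A minimal sketch of the feedback idea, on a NumPy grid with nearest-neighbor sampling (a real implementation would do this per pixel in a shader with bilinear filtering; the zoom formula here is just one example offset):

```python
import numpy as np

def flow_field_step(prev, rate=0.05):
    """One feedback update: every output pixel samples the previous
    frame at a position pulled toward the image center, so painted
    colors appear to drift outward (a zoom-in) frame by frame."""
    h, w = prev.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    sy = np.clip(np.round(cy + (ys - cy) * (1 - rate)).astype(int), 0, h - 1)
    sx = np.clip(np.round(cx + (xs - cx) * (1 - rate)).astype(int), 0, w - 1)
    return prev[sy, sx]
```

Changing how the sample positions are computed gives different motion: rotation, translation, or any formula-driven warp.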

The application provides a number of parameters to adjust the visual output. Examples include brush size and color, flow field zoom speed, zoom center, output color palette texture, palette scale and palette offset. The parameters can be adjusted with immediate feedback and any changes can be recorded, replayed and looped. Beat-based timing is used instead of seconds to simplify synchronizing animation with music. I can pick an audio file to be played on the background as I edit the parameters.
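For reference, mapping a beat position to a timeline position at a fixed tempo is a one-liner (the function name is my own, not the application's API):

```python
def beats_to_seconds(beat, bpm):
    """Position of a beat on the audio timeline, assuming constant tempo."""
    return beat * 60.0 / bpm
```

At 120 BPM a four-bar loop (16 beats in 4/4) spans exactly 8 seconds, so a looped parameter recording of that length stays locked to the music without any manual arithmetic.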

The application uses shaders to compute the graphics on the GPU, providing an immediate FullHD view of the animation. While UF provides arbitrary precision on the CPU, the GPU path only uses single-precision numbers, but it turns out you can do all kinds of cool stuff with them anyway.
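To illustrate that precision ceiling: single precision carries roughly seven significant decimal digits, so once a zoom needs finer coordinate differences than that, neighboring pixels collapse to the same value. A quick NumPy check:

```python
import numpy as np

# float32 resolution near 1.0 is about 1.2e-7 (machine epsilon),
# so a coordinate offset of 1e-8 is simply lost:
x = np.float32(1.0)
assert x + np.float32(1e-8) == x

# float64, closer to what a CPU renderer would start from, keeps it:
assert np.float64(1.0) + np.float64(1e-8) != np.float64(1.0)
```

This is why deep zooms are the one place where a GPU-only, single-precision pipeline genuinely falls short of UF's arbitrary-precision CPU rendering.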

The prototype editor has a number of issues and limitations that make it unfit for public distribution. For example, adding new audio tracks or new palette textures that determine the output colors requires access to the source code. At the time of writing I’m undecided whether this software is worth developing further.