Raspberry Pi Video Frame Manipulation @ Vibrations ’14

For Cairo Gallery’s Vibrations Fest ’14 I built a video synthesizer app for Raspberry Pi with openFrameworks.  Christian Petersen, Nick Bartoletti and I set up a video projection area in a clearing of the forest near the main stage in Volunteer Park, Seattle.


The video program was an interesting endeavor.  Raspberry Pis don’t offer tremendous processing power, and although openFrameworks compiles on the Pi, not everything works as expected.

I have written this sort of program at least three times, starting with my ‘Pungis’ series last year.

The basic Pungis algorithm was:


For each pixel: choose a nearby pixel to blend with.
For each pixel: blend with the selected neighbor.
For each pixel: shift its color (in some mathematical way that changes over time).
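Those three passes can be sketched in plain C++ over a flat RGB buffer. This is a simplification with hypothetical names (`pungisStep`, the 50/50 blend, the sine-based shift are all my stand-ins); the actual Pungis math is the author’s own and changes over time:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdlib>
#include <vector>

// One Pungis-style iteration over a w×h RGB frame (3 bytes per pixel).
// Hypothetical sketch: the real blend/shift math differs.
void pungisStep(std::vector<uint8_t>& px, int w, int h, float t) {
    const int n = w * h;

    // Pass 1: for each pixel, pick a random neighbor within one step.
    std::vector<int> neighbor(n);
    for (int i = 0; i < n; ++i) {
        int x = i % w, y = i / w;
        int nx = std::clamp(x + rand() % 3 - 1, 0, w - 1);
        int ny = std::clamp(y + rand() % 3 - 1, 0, h - 1);
        neighbor[i] = ny * w + nx;
    }

    // Pass 2: blend each pixel 50/50 with its chosen neighbor.
    for (int i = 0; i < n; ++i)
        for (int c = 0; c < 3; ++c)
            px[3 * i + c] = (uint8_t)((px[3 * i + c] + px[3 * neighbor[i] + c]) / 2);

    // Pass 3: shift color by a small amount that drifts with time.
    for (int i = 0; i < n; ++i) {
        int shift = (int)(8.0f * std::sin(t + i * 0.001f));
        for (int c = 0; c < 3; ++c)
            px[3 * i + c] = (uint8_t)std::clamp((int)px[3 * i + c] + shift, 0, 255);
    }
}
```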

Sometimes the program starts with a source image and slowly erodes it; other times it takes in frames over time from a webcam or some other source.  Additionally, when I have written this app in Processing, I have run the first step, choosing nearby pixels, on a background thread so that its relatively long running time doesn’t block the draw thread.
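In openFrameworks that background step would usually be an ofThread subclass, but the idea is portable. Here’s a minimal std::thread sketch (hypothetical names) that double-buffers the neighbor map so the draw loop never waits on the slow pass:

```cpp
#include <algorithm>
#include <atomic>
#include <cstdlib>
#include <thread>
#include <vector>

// Double-buffered neighbor map: the worker fills the "back" map while
// the draw loop reads "front"; they swap when a fresh map is ready.
struct NeighborMapper {
    int w, h;
    std::vector<int> front, back;
    std::atomic<bool> ready{false};
    std::atomic<bool> running{true};
    std::thread worker;

    NeighborMapper(int w_, int h_)
        : w(w_), h(h_), front(w_ * h_), back(w_ * h_),
          worker([this] { loop(); }) {}

    ~NeighborMapper() { running = false; worker.join(); }

    void loop() {
        while (running) {
            // The slow pass: pick a nearby pixel for every pixel.
            for (int i = 0; i < w * h; ++i) {
                int x = i % w, y = i / w;
                int nx = std::clamp(x + rand() % 3 - 1, 0, w - 1);
                int ny = std::clamp(y + rand() % 3 - 1, 0, h - 1);
                back[i] = ny * w + nx;
            }
            ready = true;
            // Wait until the draw thread consumes this map.
            while (ready && running) std::this_thread::yield();
        }
    }

    // Called from the draw thread: swap in a fresh map if one is done.
    bool swapIfReady() {
        if (!ready) return false;
        front.swap(back);
        ready = false;
        return true;
    }
};
```

The draw loop calls `swapIfReady()` once per frame and keeps blending with whatever map is in `front`, so a slow neighbor pass just means the map updates less often, not that frames drop.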

My first hope for this project was to optimize the program by doing some or all of the per-pixel work in shaders, because the Raspberry Pi uses OpenGL ES, which I am fairly familiar with from my work on iOS and Android.  I managed to wrangle the of_v0.8.0 GL shader example to ‘work’ on the Pi.  However, I couldn’t reliably get incoming webcam frames out of openFrameworks and onto the video card as textures, nor did the shaders themselves appear to work properly.  In addition, while I was still pursuing this approach I was getting extremely erratic capture times (between 0.3 and 60 seconds).

All in all, I had to fall back to an implementation algorithmically almost identical to my previous ones: on the Pi, a background thread does the per-pixel work, and a cached sine function keeps performance reasonable with the webcam giving me a 160×120 image.  Also, because I am less familiar with the image tools available in openFrameworks, I imagine there are tools and techniques I could use to further boost performance.
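The cached sine is just a lookup table, precomputed once so the per-pixel color shift never calls sin() directly. A sketch of the idea (the table size and names here are my guesses, not necessarily what the repo uses):

```cpp
#include <array>
#include <cmath>

// Sine lookup table: one std::sin call per entry at startup, then a
// cheap array read per pixel instead of a transcendental call.
// TABLE_SIZE of 4096 is an assumption, not the repo's actual value.
constexpr int TABLE_SIZE = 4096;
constexpr double TWO_PI = 6.283185307179586;

struct CachedSine {
    std::array<float, TABLE_SIZE> table;

    CachedSine() {
        for (int i = 0; i < TABLE_SIZE; ++i)
            table[i] = (float)std::sin(TWO_PI * i / TABLE_SIZE);
    }

    // Nearest-entry approximation of sin(x), x in radians, any sign.
    float operator()(float x) const {
        int i = (int)(x / TWO_PI * TABLE_SIZE) % TABLE_SIZE;
        if (i < 0) i += TABLE_SIZE;
        return table[i];
    }
};
```

At 4096 entries the worst-case error is tiny compared to 8-bit color quantization, which is why this trade works so well for pixel shifting.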

In the end the code, ‘raspberry pi video test,’ manages to run a new version of the ‘pungis’ algorithm at real-time speeds on low-res webcam input.  It runs fantastically on a typical Mac laptop or desktop.

Check out the app code at:

https://github.com/BenVanCitters/RaspberryPiVideoTest
