Video Feedback Kinetic Sculpture Overview (Phase I)

On mobile, tap image to play or VIEW ON VIMEO

This contraption uses two HD monitors, a Nikon DSLR camera, and a sheet of beam splitter glass to produce video feedback and fractals in real time. You'll see naturally occurring Sierpiński triangles and other patterns and shapes found in nature.

This is part performance art, part kinetic sculpture. In this video you see me operating the device, and the images that are created by it.

The unique thing about this is that it uses HD video and not one but two monitors, plus a sheet of beam splitter glass to create a reflection that gets folded back into the image.

It’s a delicate art to operate the device, an interplay between the camera and monitors, the position of the monitors, and the monitor control dials (hue, saturation, brightness and contrast). Doing controlled feedback like this requires these control dials, but most HD TVs and monitors don’t have analog knobs like old CRT TVs did, making it difficult to create controlled feedback in HD.

Now, for the first time, I’ve been able to do this. I found used older HD field monitors at a relatively good price (they cost thousands of dollars new). These HD monitors have the analog control dials, plus the additional benefit of the dials being in a separate module. The cable to this module was very short, but I was able to lengthen it so the dials can be operated at the same time as the camera's position is changed relative to the monitors.

All the images in this video are created by video feedback only - no computers are involved. The upper and lower monitors both display the same thing - the image from the camera, which is looking at the upper monitor. This creates a video feedback loop (much like a microphone next to a speaker creates an audio feedback loop).

Although what's displayed on each monitor is the same, the lower monitor's image is mirrored when it gets mixed in with the upper monitor by the half-mirrored (aka beam splitter) sheet of glass that sits at an angle between the two monitors. This mirroring depends on the rotation of the lower monitor (you can see me turning the lower monitor from 360° to 180° in the video). Some fractals are created with the lower monitor at 90° - at a right angle to the upper monitor (a configuration not shown in this video).
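The loop described above can be thought of as an iterated image transform: each video frame, the camera sees the upper monitor through the glass blended with a mirrored reflection of the lower monitor, and that blend becomes the next frame on both monitors. Here is a minimal NumPy sketch of that idea - the grid size, blend weights, and 2x zoom are hypothetical stand-ins for the real optics, not measurements of the device:

```python
import numpy as np

def mirror(frame):
    # The angled glass shows the lower monitor as a mirror image
    return frame[:, ::-1]

def shrink(frame):
    # Stand-in for the camera framing the monitor smaller than its full
    # view: 2x2 average-pool, then recenter the half-size image
    h, w = frame.shape
    small = frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    out = np.zeros_like(frame)
    out[h // 4:h // 4 + h // 2, w // 4:w // 4 + w // 2] = small
    return out

def feedback_step(frame, t=0.5, r=0.5):
    # 50/50 beam splitter: transmission t of the upper monitor plus
    # reflection r of the mirrored lower monitor (both display `frame`)
    return np.clip(shrink(t * frame + r * mirror(frame)), 0.0, 1.0)

# "Spark of life": one bright pixel on an otherwise dark screen
frame = np.zeros((64, 64))
frame[32, 16] = 1.0

for _ in range(6):
    frame = feedback_step(frame)

# Whatever survives the mirrored mix is left-right symmetric
print(np.allclose(frame, frame[:, ::-1]))  # → True
```

Even this toy version shows why the mirroring matters: after one pass, any pattern the loop sustains has the glass's reflection symmetry baked into it.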

The angle of the glass (which can be made askew), the height of the upper monitor and position and rotation of the lower monitor, plus the position of the control dials of both monitors, all affect the final image that's created. It looks like magic, but it's really mathematics (maybe magical mathematics).

The last few images were created with the two-input incarnation of the device. More of this can be seen here: Dual Integrated Feedback Loops

Credit goes to Peter King for his 1997 diagram of a sheet of glass used between two monitors to create fractals that inspired this configuration of the device.

Music by Nils Petter Molvær (closing loops)

Phase II - Synchronized Rotating Blue Screen/Monitor Keyed 2nd Input Proof of Concept


On mobile, tap image to play or VIEW ON VIMEO

This is the first test using the keyed input from the synchronized rotating blue screen and monitor. 

The Roland switcher/keyer gets Input 2 from the Canon, which is looking at the rotating blue screen/Elecrow monitor. The Elecrow monitor gets its input from the phone. This image is keyed over the switcher’s Input 1. 

The switcher’s Input 1 is from the Nikon, which is looking at the upper Panasonic monitor, which gets its input from the switcher’s Program Out. The lower Panasonic monitor gets its input from the Nikon.

This adds movement synchronized to the Nikon to an otherwise static keyed object.

Check out the video HERE to see this setup. 

Music by Rubba

Phase II - Flower Power

On mobile, tap image to play or VIEW ON VIMEO

Playing around with Phase II of the Device

Music by Greta Van Fleet

Phase II

The beginning of Phase II HERE

Perfecting the Image

On mobile, tap image to play or VIEW ON VIMEO

In this video I play around with variations on one camera / monitor setup.

This marks the end of Phase I

Stranger things to come in Phase II...

Music from Bach's Musical Offering as a tribute to Douglas Hofstadter and his book Gödel, Escher, Bach

A Creature Comes to Life - Audio Integration Weirdness

On mobile, tap image to play or VIEW ON VIMEO


The audio on this was made at the same time the feedback was. To see how check THIS out.


Read Rich Walkling's full review HERE

Feedback Sculpture with Dual Integrated Feedback Loops

On mobile, tap image to play or VIEW ON VIMEO

In this video the upper and lower monitors have different input sources. Instead of both monitors displaying what the Nikon camera sees, now the lower monitor is getting its input from an iPhone camera. Check THIS out to get a better idea of what's going on.


Originally, both monitors were showing the same thing - the output of the Nikon camera on the rod (what I'll call the main camera). This created the feedback loop between that camera and the upper monitor. The glass between the two monitors is beam splitter glass with a rating of 50/50 - that is, 50% transmission, 50% reflection (this is the kind of glass used in teleprompters). This glass allows the camera to see through to the upper monitor, and also see a reflection of the lower monitor.

Depending on the rotation of the lower monitor you get different fractal patterns. The jellyfish-like images are created when the lower monitor is positioned so the upper and lower monitors' bottom Panasonic logos are next to each other. This makes the upper monitor's top right corner appear reflected in the top left corner of the glass. The fern-like patterns are created when the lower monitor's bottom logo is facing out. This makes the upper monitor's top right corner appear reflected in the bottom right corner of the glass. The Sierpiński triangles are created when the bottom monitor is perpendicular to the upper monitor. This might be a bit confusing to read, but the photo makes it easier to understand.


In this video, instead of the lower monitor having the same input as the upper monitor, it gets its image from an iPhone. Originally, I was going to put something on the lower monitor like a photo of a face, and have that, in effect, be half-dissolved with the feedback in the upper monitor. The way this works is the main camera would be looping with the upper monitor, but also see the face from the iPhone in the lower monitor in the reflection of the glass. This image of the face would influence the feedback. When I made the above-mentioned video on my tumblr site explaining the new second input setup, I realized that having two feedback loops would be more interesting than just having the image of a face on the lower monitor.
Phone camera looks through glass to lower monitor
The phone was looking down, through the beam splitter glass, to the lower monitor, and the phone was close to perpendicular to the lower monitor. I zoomed that camera in so it started making a feedback pattern. So, the thing being mixed in with the feedback loop of the main camera and the upper monitor is another feedback loop - the one between the phone's camera and the lower monitor.

Since the phone's camera is looking down through the beam splitter glass, it is not only feeding back with the lower monitor, it's influenced by what is on the upper monitor in the reflection of the glass. And, since the main camera is also looking through the glass, it is not only feeding back with the upper monitor, it's influenced by what is happening on the lower monitor in the reflection of the glass - which, of course, is influenced by what is happening on the upper monitor, and on and on...
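One way to picture these coupled loops is as two images updated together each frame: each camera-monitor pair feeds back on itself while also receiving the other monitor's reflection through the glass. A rough NumPy sketch under the same invented assumptions as before (grid size, 50/50 blend, and a 2x shrink standing in for camera framing):

```python
import numpy as np

def mirror(f):
    # Reflection in the angled glass flips the image left-right
    return f[:, ::-1]

def shrink(f):
    # Toy stand-in for a camera framing its monitor at half scale:
    # 2x2 average-pool, then recenter
    h, w = f.shape
    small = f.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    out = np.zeros_like(f)
    out[h // 4:h // 4 + h // 2, w // 4:w // 4 + w // 2] = small
    return out

def step(upper, lower, t=0.5, r=0.5):
    # One video frame: each camera sees its own monitor through the glass
    # (transmission t) plus the other monitor's reflection (r)
    new_upper = shrink(t * upper + r * mirror(lower))  # main camera loop
    new_lower = shrink(t * lower + r * mirror(upper))  # phone camera loop
    return new_upper, new_lower

upper = np.zeros((64, 64)); upper[20, 20] = 1.0  # spark in the main loop
lower = np.zeros((64, 64)); lower[40, 10] = 1.0  # spark in the phone loop

for _ in range(4):
    upper, lower = step(upper, lower)
```

After a few iterations each image carries light that originated in the other loop, which is the "feedback loop of a feedback loop" effect: neither loop's state can be separated from the other's history.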

This creates a feedback loop of a feedback loop. I'm not sure if this is the infinite squared or the infinite to the power of infinity, or what. I just discovered this setup a few days ago, and haven't really been able to wrap my mind around everything that's going on here. I'm not sure if anything like this has been tried before.

There are so many more things I'd like to do with this new two-input setup. For instance, it might be interesting for the second camera to be looking at the entire structure while I'm using it, or just at the main camera as it moves around, or even at my hand on the rod as it moves, so the movement of the structure as it is being used influences the feedback made by the structure.

And it occurs to me that pointing the second camera at the device itself (either the full thing or a piece of it), is also a feedback loop of sorts - not a direct one like a camera looking at a monitor that is displaying what the camera sees - but one involving the operator as an intermediate step. So let's say the second camera is looking at my hand on the rod that moves the main camera. This image is mixed in, and influences the feedback created by the main camera, which I, as the operator then see, which influences how I move my hand (and on and on).

More to come.


I chronicled the building of the new Video Feedback Kinetic Sculpture. You can check that out in REVERSE CHRONOLOGY or FORWARD CHRONOLOGY.

Practice Four

On mobile, tap image to play or VIEW ON VIMEO

This is one of the first tests of the new HD device using both monitors.

Practice Three

On mobile, tap image to play or VIEW ON VIMEO

This is a test of the HD device using only one monitor. 

It's difficult to maintain feedback in an interesting way without the screen going all white or all black. Here's a pretty good 20-minute block.

Practice Two

On mobile, tap image to play or VIEW ON VIMEO

Testing the HD device with one monitor.

Practice One (2020 Incarnation)

On mobile, tap image to play or VIEW ON VIMEO

I've built a third incarnation of the Video Feedback Device. This one uses two High Definition monitors and a Nikon D810 for the camera. This video is practice using the device with just one monitor.

For a complete chronicling of this build, go HERE for reverse chronology and HERE for forward chronology.

Original Light Herder Header (2010)

"The Video Feedback Machine allows the Operator to create a small universe in a plexi-glass box. The plexi-glass box contains a small HD camera and HD monitor that displays what the camera sees. This creates a video feedback loop.

You may have seen video feedback as wild spinning colors in '70s Hendrix videos. But when the feedback is tightly controlled, as with the Feedback Machine, it can be quite sophisticated and intricate, creating beautiful morphing organic shapes found in nature.

This amount of control comes from the camera's ability to move smoothly in relation to the monitor. The Operator sits in the chair and uses something much like a yoke on an airplane to very smoothly move the camera forward and backward, and rotate it a full 360 degrees around its axis. Small changes in degrees of rotation and distance create amazing changes in the feedback image.

On this yoke are four control dials: Brightness, Contrast, Color saturation and Tint. These affect the monitor in the plexi-glass box and allow even more control over the created image.

The image being created on the monitor in the plexi-glass box is mirrored on the large HD monitor, which is what the Operator (and others in the room) will be viewing. I say this creates a small "universe" because the world we live in is a complex feedback loop. All biological functions operate on a feedback loop and it is no wonder that the images created using video feedback are so organic looking.

Ecosystems, geological systems and social systems all operate on feedback loops, and they operate according to the inherent rules of that system. With the Feedback Machine, these rules, or laws of the universe, are the camera's angle, distance from the monitor and control dial positions.

But where does the image come from, you might be thinking, and why does it actually exist? It comes from itself, and exists only because it exists. Something worth pondering.

To me this must be what the brain is like. What I mean is, if some scientists, say from another planet, came into the room where the feedback was being created, and tried to derive where it came from, they'd be at a loss. If they dissected the apparatus, the monitor, the camera, the wires, they'd find no clue to the origin of the pattern they saw on the screen.

This is like opening up the brain, poking around, and trying to find the mind or soul. These things grow through iterations, through cycles that start small and flourish, and can't be seen once they are gone (or the screen has gone blank). Imagine a dark room where the camera is looking at a dark screen. It will stay dark this way forever (no life, no soul) until a "spark of life" (say the lighting of a match) brings forth an image, which will then perpetuate itself on and on. But, then imagine something blocking the camera's view of the screen, just for an instant. All of a sudden, the image goes out (death), never to return exactly as it was. To me, this is what the mind is, just a complex pattern.

So the question of "where do you go when you die" is as meaningless as the question "where does the snowflake pattern go when the snowflake melts?" To complete the analogy, the Universe is the monitor, matter and energy are the image, and God is the camera, the all seeing eye in the sky. Or something like that."

One (2007 Incarnation)

On mobile, tap image to play or VIEW ON VIMEO

This is old-school analog light herding using a standard definition prototype of the Feedback Machine. No computers were used (or injured) in the creation of these images. Music by Einstürzende Neubauten.

This was created with the second incarnation of the device in 2007 with an old standard definition CRT TV and a small standard definition security camera. 

Two (2007 Incarnation)

On mobile, tap image to play or VIEW ON VIMEO

Music by Jerry Sneede & Harmon Leste (yes, lots of feedback in the music, too).

This was also made with the second (2007) device.

Three (2003 Incarnation)

On mobile, tap image to play or VIEW ON VIMEO

This one starts with a "spark of life." Music by Russill Paul.

This was made with the first feedback device in 2003 with a standard definition CRT TV and a Sony Mavica MVC-CD500.

First Prototype

This is the first incarnation (2003). The second (2007) had the television's brightness/contrast/saturation/tint knobs de-soldered, rewired, and mounted on the yoke.

This one is in the back house of the Watership Inn, Provincetown, Massachusetts, in 2003. This photo is slightly faked. Since I had only one camera, this picture was taken with the camera that would normally be sitting in the wooden box (with the output wire going to the TV), and the image on the TV is playback from VHS of a previous recording session.

The camera in this first prototype was a Sony Mavica MVC-CD500, and the camera in the second prototype was a small mid-90s era security camera.