Interactive Art

Most of these projects are my own personal experimental research and development into interactive art, combining skeletal tracking with the Xbox Kinect sensor, MIDI from a hardware MIDI controller, and MIDI routed from Ableton Live for some audio-visual fun. These were created between 2012 and 2014, so the graphics are limited.

Here is an explanation of what is happening in the video:

Firstly, my right hand controls the XYZ coordinates of the cube.

My left hand's Y position controls the rate at which the cube spins; it also controls the rate of a low-frequency oscillator modulating the cutoff frequency of the synthesizer you can hear in the video.

My left hand's X position controls the hue of the cube.

My head's Z position controls the feedback effect on the cube and also the pitch of the oscillator generating the initial sound.
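Conceptually, each of these mappings is a linear rescale of a tracked joint coordinate into the 0–127 range of a MIDI control change value. A minimal sketch of that idea (the coordinate ranges and example values here are illustrative assumptions, not the exact values used in the patches):

```python
def coord_to_cc(value, lo, hi):
    """Linearly rescale a joint coordinate from [lo, hi] into a MIDI CC value 0-127."""
    value = max(lo, min(hi, value))          # clamp to the tracked range
    return round((value - lo) / (hi - lo) * 127)

# Illustrative mappings (coordinate ranges are assumed, not measured):
left_hand_y = 225.0                          # e.g. hand raised partway
spin_rate_cc = coord_to_cc(left_hand_y, 0.0, 900.0)     # cube spin / LFO rate
left_hand_x = -200.0
hue_cc = coord_to_cc(left_hand_x, -600.0, 600.0)        # cube hue
```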

This is how it was achieved: I used the Synapse Kinect application and several Max for Live patches (within Ableton Live) to convert my hands' and head's XYZ coordinates into MIDI control change (CC) signals. I then used the CC signals to manipulate parameters on a virtual VST synthesizer within Ableton Live, as well as routing the same MIDI signals over the IAC (Inter-Application Communication) bus to control the visual software VDMX.
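On the wire, each of those control change signals is just a three-byte MIDI message: a status byte (0xB0 plus the channel), a controller number, and a value. A minimal sketch of constructing one (the controller numbers you would use are whatever the Max for Live and VDMX mappings expect):

```python
def cc_bytes(control, value, channel=0):
    """Build a raw 3-byte MIDI control change message: status, controller, value."""
    if not (0 <= channel <= 15 and 0 <= control <= 127 and 0 <= value <= 127):
        raise ValueError("MIDI CC fields are 7-bit (channel is 4-bit)")
    return bytes([0xB0 | channel, control, value])
```

Any MIDI library (or the IAC bus itself) is ultimately shuttling messages of this shape between applications.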

Within VDMX I had a Quartz Composer patch I created with several published inputs (the XYZ coordinates for the cube) that were also controlled via these MIDI CC signals. In addition, I controlled a Quartz feedback effect within VDMX with the CC signals.

Credit goes to Ryan Challinor for making the original Synapse Kinect software and Max for Live devices that I used to create this: http://synapsekinect.tumblr.com/

This is the fifth in a series of experimental research pieces focusing on realtime 3D, interactive visuals.

In this video the Xbox Kinect depth-sensor camera is used to track the user's hands, which in turn control the 3D geometry in realtime.

The right hand's XYZ position controls the first sphere; the left hand's XY position controls the feedback/trails effect.

As with the previous videos, the initial 3D geometry was created in Maya and the realtime composition in Quartz Composer, but now with motion tracking using NI mate and an Xbox Kinect.

This is a work-in-progress development preview of our interactive realtime 3D visual show.

Some of the features demonstrated in this video are:

Realtime 3D geometry with dynamic interactive lighting, which can be controlled via MIDI, OSC, and/or sound-reactive control.

Projection mapping of these realtime 3D visuals.

Dynamic interactive video textures pre-UV-mapped onto the geometry.
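Sound-reactive control of a parameter such as light intensity usually comes down to an envelope follower: smooth the incoming audio level so the parameter rises quickly on hits but falls gently. A minimal sketch (the attack and release coefficients are illustrative assumptions, not values from the show):

```python
def envelope_follower(samples, attack=0.3, release=0.05):
    """Track a smoothed amplitude envelope of an audio signal.

    Rises quickly (attack) and decays slowly (release), so a light driven
    by the envelope responds to hits without flickering. Coefficients
    are per-sample smoothing factors and are illustrative.
    """
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        coeff = attack if level > env else release
        env += coeff * (level - env)
        out.append(env)
    return out
```

The resulting 0–1 envelope can then be rescaled to whatever range the lighting parameter expects, just like a MIDI or OSC control value.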

This is the fourth in a series of experimental research pieces focusing on realtime 3D, interactive visuals.

In this video a MIDI controller is used to manipulate the 3D geometry in realtime.

As with the previous videos, the initial 3D geometry was created in Maya and the realtime composition in Quartz Composer.


This is a quick video of a composition I made in Quartz Composer, using an Xbox controller to control the position of a sphere in 3D space. The small sphere also controls the position of the point light in the scene, as well as the larger sphere's shatter-style effects, based on its proximity.
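The proximity-driven part can be sketched as a distance check between the two sphere centres, rescaled into a 0–1 effect amount. The thresholds below are illustrative assumptions, not values from the patch:

```python
import math

def shatter_amount(small_pos, large_pos, start=3.0, full=0.5):
    """Map the distance between two sphere centres to a 0-1 effect amount.

    No effect beyond `start`, full effect at or inside `full`
    (arbitrary scene units; both thresholds are illustrative).
    """
    d = math.dist(small_pos, large_pos)
    if d >= start:
        return 0.0
    if d <= full:
        return 1.0
    return (start - d) / (start - full)
```

The returned amount would drive the shatter effect's intensity input, so the large sphere breaks apart progressively as the small sphere approaches.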

This is really just a test video to show a proof of concept working.

This is in no way a finished project, merely a test.

In this video I demonstrate two things:

Projection mapping - warping images to fit the geometry of the objects I'm projecting onto.

Control of video triggering using an external sequencer - in this case MIDI sequenced in Ableton Live (a music sequencer), which sends the MIDI data to Resolume Avenue (VJ software), which triggers the playback of the images in realtime.
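The triggering chain is essentially a lookup: a sequenced MIDI note-on arrives and is mapped to a clip to start. A minimal sketch (the note numbers and clip names are hypothetical; in Resolume the actual note-to-clip mapping is configured in its MIDI preferences):

```python
# Hypothetical note-to-clip mapping, mimicking how sequenced MIDI notes
# trigger clip playback in a VJ app. Note numbers and names are illustrative.
CLIP_MAP = {
    60: "clip_intro.mov",
    62: "clip_logo.mov",
    64: "clip_texture.mov",
}

def handle_note_on(note, velocity):
    """Return the clip to trigger for a note-on, or None to ignore it."""
    if velocity == 0:        # a note-on with velocity 0 acts as a note-off
        return None
    return CLIP_MAP.get(note)
```

Each step of the sequencer's piano roll then corresponds to a clip launch, which is what keeps the projected video locked to the music's timing.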
