Autonomous Mobile Camera/Drone Interaction | 2016
Jieliang (Rodger) Luo | rluo [at] mat.ucsb.edu
In Collaboration with Weihao Qiu, Christopher Chen, Dan Wang, Mengyu Chen, and Prof. George Legrady
Inspection is a drone-robotic interactive installation that explores the boundaries between the physical and virtual worlds. A continuously evolving virtual 3D photomontage results from the interaction between a drone, a ground robot, and eight ultrasonic sensors. The dynamic virtual 3D assemblage, constructed by an adaptive Voronoi algorithm, not only explores novel computational aesthetics in space but also evokes questions about the relationship between objects and space.
The ground robot is programmed to move autonomously within a restricted area and take photos at intervals. The robot's approximate position is calculated in real time from the readings of eight pairs of ultrasonic sensors embedded in a handmade sculpture at the center. The position data are sent to the drone, which continuously turns to face the robot and takes a picture with each turn. The images taken by both the robot and the drone are reassembled in a virtual space to construct an aesthetic reflection of the physical world, a reactive collective system that connects the two worlds.
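The position estimate described above can be sketched as a least-squares trilateration from the sensor distances. The sensor layout, radius, and robot position below are illustrative assumptions, not the installation's actual geometry or code:

```python
import math

# Assumed layout: eight sensors evenly spaced on a circle of radius
# 1.5 m around the central sculpture (a hypothetical geometry).
SENSORS = [(1.5 * math.cos(2 * math.pi * k / 8),
            1.5 * math.sin(2 * math.pi * k / 8)) for k in range(8)]

def estimate_position(distances):
    """Least-squares trilateration from sensor-to-robot distances.

    Each sensor i gives a circle (x - xi)^2 + (y - yi)^2 = di^2.
    Subtracting the first circle from the others linearizes the
    system; the overdetermined result is solved via normal equations.
    """
    (x0, y0), d0 = SENSORS[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(SENSORS[1:], distances[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Normal equations: (A^T A) p = A^T b, solved for p = (x, y).
    s_xx = sum(ax * ax for ax, _ in A)
    s_xy = sum(ax * ay for ax, ay in A)
    s_yy = sum(ay * ay for _, ay in A)
    t_x = sum(ax * bi for (ax, _), bi in zip(A, b))
    t_y = sum(ay * bi for (_, ay), bi in zip(A, b))
    det = s_xx * s_yy - s_xy * s_xy
    return ((s_yy * t_x - s_xy * t_y) / det,
            (s_xx * t_y - s_xy * t_x) / det)

# Simulated reading: robot at (0.4, -0.7) inside the sensor ring.
robot = (0.4, -0.7)
dists = [math.dist(robot, s) for s in SENSORS]
print(estimate_position(dists))
```

With real ultrasonic readings the distances are noisy, which is why using all eight pairs in a least-squares fit is preferable to intersecting just two circles.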
The idea behind the visual representation comes from assemblage, an artistic practice of rearranging 3D elements to explore new meanings. Unlike traditional assemblage, in which artists arbitrarily pick and place objects, this project is interested in how algorithms can generate new visual aesthetics in terms of spatial structure. All the images taken by the drone and the robot are placed asymmetrically in a 3D virtual environment according to an adapted Voronoi diagram. As the structure keeps evolving by incorporating the most recently taken image, it also serves as a non-linear narrative documenting what has just happened in the physical environment. The animation below shows how the dynamic structure is constructed.
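The evolving partition can be illustrated with a discrete 3D Voronoi assignment: each image owns a seed point, every location in the volume belongs to its nearest seed, and adding a newly captured image re-partitions the space. This is a minimal sketch of the general technique under assumed seed positions, not the installation's actual adapted algorithm:

```python
import math
import random

def voronoi_cells(seeds, grid_pts):
    """Assign each grid point to its nearest seed (discrete 3D Voronoi)."""
    return [min(range(len(seeds)), key=lambda i: math.dist(p, seeds[i]))
            for p in grid_pts]

# Sample the unit cube on a coarse voxel grid.
n = 10
grid = [(x / n, y / n, z / n)
        for x in range(n) for y in range(n) for z in range(n)]

# Four existing images, each anchored at a random seed point.
random.seed(1)
seeds = [(random.random(), random.random(), random.random())
         for _ in range(4)]
before = voronoi_cells(seeds, grid)

# A newly captured image adds a seed at the center; neighboring cells
# shrink, so the whole assemblage re-partitions as the piece evolves.
seeds.append((0.5, 0.5, 0.5))
after = voronoi_cells(seeds, grid)
changed = sum(1 for a, b in zip(before, after) if a != b)
print(f"{changed} of {len(grid)} voxels reassigned to the new image's cell")
```

Because a voxel only changes owner when the new seed is strictly closer, every reassigned voxel belongs to the newest image, which is what lets the structure grow without disturbing distant cells.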