“Channels” is a full-body immersion installation in which you can navigate through a virtual water scene by physically interacting with tanks of water. Sit in a boat and experience being “on the water” by organically controlling your virtual environment with natural gestures — paddle, row, and float your way through space and time.
Control your environment through physical, organic touch, and transport yourself to a magical place. This installation invites visitors to connect with nature, creating a sense of tranquility and serenity.
HOW IT WORKS
Sit down on a boat seat between two water tanks, in front of a virtual water world, and use your hands to paddle in buckets of real water. Moving the real water with your hands controls your speed and direction as you explore the scene: the faster you paddle, the faster you glide forward, and paddling in reverse glides you backwards. The left and right buckets are also sensed independently, mimicking the physics of a real boat, so if you paddle more on the left side, your direction shifts right in the virtual scene, and vice-versa. A counter in the top right corner of the screen tracks your speed and distance traveled, so you can row to the horizon and back!
As you navigate through the 3D world, you hear nature and wildlife sounds and you encounter occasional objects in the water, such as floating ducks, cattails and lily pads, and other objects that you can collide with and paddle around, such as a bridge, trees, rocks, and logs. Along the way, you can choose whether to paddle closer to these things, or continue on the water without disturbing the natural order. To give you a sense of space in the virtual world, some of the objects, such as birds, fireflies, and fish, naturally fly, flit, or swim across the scene, making you feel truly immersed in a place that is teeming with life.
Watch our demo video:
This physical interface — a boat and water — allows you to actually move through virtual 3D space, while paddling in real water. You can change your direction the same way you would in a real boat, by paddling more on one side than the other, or by paddling in reverse. You can slow down the same way, as well.
To detect the speed, or rate of flow, of the water in the buckets, we designed our own simple sensor: a flex sensor strip with half of a plastic spoon taped to its end. We tested many different designs, attaching paddles of various sizes and weights to the flex sensor, and this one proved the most accurate and reliable.
We connected our two flex sensors to the Arduino microcontroller as analog inputs, each in a voltage divider circuit. We started out with 220 ohm resistors but were not getting very nuanced readings; because a flex sensor's resistance is far higher than 220 ohms, the divider's output barely moved. Switching to 10K ohm resistors, closer to the sensors' own resistance, gave us a much wider, more usable range of values. Once we were getting dependable readings, we tested the flex sensors in water to see if we could detect subtle changes in water pressure and speed.
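To see why the resistor value matters, here is a back-of-the-envelope calculation in plain Java. The flex sensor's resistance range (roughly 10K ohms flat to 20K ohms bent) and the divider wiring are assumptions for illustration, not measured values; the point is that a fixed resistor near the sensor's own resistance gives the 10-bit ADC a much wider swing than a 220 ohm one.

```java
public class DividerRange {
    // ADC counts for a 10-bit ADC reading the midpoint of a voltage divider:
    // fixed resistor to ground, flex sensor to Vcc (one common wiring; an assumption).
    static int adcCounts(double rFixed, double rFlex) {
        return (int) Math.round(1023.0 * rFixed / (rFixed + rFlex));
    }

    public static void main(String[] args) {
        // Assumed flex-sensor range: ~10K ohms flat to ~20K ohms fully bent.
        double flat = 10_000, bent = 20_000;
        for (double rFixed : new double[] {220, 10_000}) {
            int swing = Math.abs(adcCounts(rFixed, flat) - adcCounts(rFixed, bent));
            System.out.println((int) rFixed + " ohm resistor -> swing of "
                    + swing + " ADC counts");
        }
    }
}
```

With the assumed sensor range, the 220 ohm divider spans only about a dozen ADC counts, while the 10K divider spans well over a hundred, which matches the more nuanced readings we saw after the change.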
Then we added serial communication between the Arduino and our virtual water scene in Processing, and mapped the incoming sensor values to ranges of numbers in our sketch representing “speed” and “direction”. As each sensor value arrives over serial, Processing converts it with the map() function to a speed along the “z” axis. Processing then adds the two speeds together to get a total speed, which moves the user forward or backward in the scene: [speed = (speed1/10) + (speed2/10);] From the difference between the two values, Processing also calculates how far left or right the user is moving along the “x” axis: [pan = ((speed2 - speed1)/10)*2;]
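The mapping logic above can be sketched in plain Java (Processing sketches are Java underneath). The raw 0–1023 input range and the 0–50 output range are placeholder assumptions, and map() is re-implemented here so the snippet stands on its own; the speed and pan formulas are the ones from the sketch.

```java
public class PaddleMapping {
    // Re-implementation of Processing's map(): rescale a value from one range to another.
    static float map(float value, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (value - inLo) / (inHi - inLo);
    }

    // Total forward/backward speed from the two per-bucket sensor speeds.
    static float speed(float speed1, float speed2) {
        return speed1 / 10 + speed2 / 10;
    }

    // Left/right drift along the x axis: paddling harder on one side
    // turns the boat toward the other, as in a real boat.
    static float pan(float speed1, float speed2) {
        return ((speed2 - speed1) / 10) * 2;
    }

    public static void main(String[] args) {
        // Hypothetical raw readings in an assumed 0-1023 range,
        // mapped to an assumed 0-50 speed range along the z axis.
        float speed1 = map(700, 0, 1023, 0, 50);
        float speed2 = map(400, 0, 1023, 0, 50);
        System.out.println("speed = " + speed(speed1, speed2));
        System.out.println("pan   = " + pan(speed1, speed2));
    }
}
```

Equal readings on both sides produce a pan of zero, so the boat tracks straight; any imbalance steers it away from the stronger side.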
We also added a collision algorithm to detect when a user is about to run into objects in the 3D world, such as rocks, trees, bridges, and logs, and to block the user from passing through these solid objects. This makes the paddling experience more realistic and encourages the user to maneuver around obstacles.
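The post does not spell out the collision test itself, but a minimal version, assuming each obstacle is approximated by a bounding circle on the ground plane, might look like this in plain Java (all names and radii are illustrative assumptions):

```java
public class CollisionCheck {
    // Hypothetical sketch: treat the boat and each obstacle as circles
    // on the x/z ground plane, and report overlap.
    static boolean collides(float boatX, float boatZ, float boatR,
                            float obsX, float obsZ, float obsR) {
        float dx = boatX - obsX;
        float dz = boatZ - obsZ;
        float minDist = boatR + obsR;
        return dx * dx + dz * dz < minDist * minDist;
    }

    // Only advance the boat along z if the proposed new position is clear.
    static float tryMoveZ(float x, float z, float dz,
                          float boatR, float obsX, float obsZ, float obsR) {
        float newZ = z + dz;
        return collides(x, newZ, boatR, obsX, obsZ, obsR) ? z : newZ;
    }

    public static void main(String[] args) {
        // Boat at the origin paddling toward a rock two units ahead.
        float z = 0;
        for (int step = 0; step < 5; step++) {
            z = tryMoveZ(0, z, 0.5f, 0.5f, 0, 2.0f, 0.5f);
        }
        System.out.println("boat stops at z = " + z);
    }
}
```

Checking the proposed position before committing the move is what stops the boat at the obstacle instead of letting it glide through, which is the behavior described above.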
Read more in previous blog posts about our work building the 3D scene in Processing with OpenGL (Open Graphics Library), which lets us draw a three-dimensional scene from our 2D primitive graphics.
We created our own audio piece combining sound effects of rippling water, wind, and birds, and used the Minim library imported into Processing to play the mp3 file in a continuous loop while the water scene runs.
We tested our physical construction and user interaction models over roughly 10 days, and we received valuable feedback along the way. People, especially women, reacted negatively to the cold temperature of the water, so we made sure to fill the buckets with room-temperature water.
People also really liked the surprise image of an old bridge in our virtual scene, and they requested that we put more one-of-a-kind objects in the scene. So we added fireflies, ducks, and frogs on logs as “Easter Eggs” in the scene.
We tested the size of the boat / seat with a man who is over 6 feet tall, and he was able to fit inside the construction. Wow, success!
We also asked several people to paddle through the scene and navigate around various objects and under the bridge, to test whether our sensor mappings were accurate. Based on these tests, we concluded that we should add semi-flexible plastic flaps inside the metal buckets to create more resistance against the back-current, enabling finer control. With the flaps in place, people could steer through the water even more precisely. At one point, I actually managed to steer between the bridge pillars.
Read about the background and inspiration for this project in previous blog posts.
Our project will be featured in the ITP Winter Show 2010 on December 19th and 20th!