Tuesday, May 12, 2009

Final ReacTable Mock-Up

Below is a mock-up of what my ReacTable will look like. The MacBook on the bottom shelf powers the whole experiment, running the ReacTIVision software and Pure Data. Pure Data is a visual programming language that can receive messages via Open Sound Control (OSC), which carries the main control signal for my ReacTable. ReacTIVision is programmed to recognise the fiducials on the table top; when these are moved or rotated through user interaction, Pure Data changes the level of the sine wave and so distorts the sound. This creates a flexible interface that a number of users can play at the same time to create a piece of music. It is exactly what I wanted to make: an example of interaction without the need for spoken language, breaking down barriers and crossing boundaries for people all over the world.
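As a rough sketch of the idea (not my actual Pure Data patch), the mapping from a fiducial's rotation to the sine wave's level can be written in a few lines. The function names and the linear scaling here are my own assumptions for illustration:

```python
import math

def rotation_to_amplitude(angle_rad):
    # Hypothetical mapping: one full turn (0 to 2*pi) of the fiducial
    # scales the sine wave's amplitude linearly from 0.0 up to 1.0.
    return (angle_rad % (2 * math.pi)) / (2 * math.pi)

def sine_sample(freq_hz, t, amplitude):
    # One sample of the sine wave at time t (in seconds).
    return amplitude * math.sin(2 * math.pi * freq_hz * t)
```

In the real table, ReacTIVision reports the angle over OSC (the TUIO protocol) and Pure Data's oscillator objects generate the tone; this snippet only illustrates the arithmetic behind the interaction.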




Wednesday, May 06, 2009

Sound and Reactivision

After days of struggling to get the sound working with ReacTIVision, XML and Ableton Live, I finally decided to use Pure Data with ReacTIVision, linking the fiducials to the sound in a Pure Data patch with help from Musa's blog.

This proved much easier than using Ableton, as ReacTIVision sends a message over Open Sound Control to play an oscillator, so when a fiducial is rotated the level of the sine wave increases.
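For reference, the OSC messages behind this are just small binary packets: a padded address string, a type tag, then the arguments. A minimal encoder (my own stdlib-only sketch, not ReacTIVision's actual code, with a made-up address for the example) looks like:

```python
import struct

def _pad(b):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address, value):
    # Encode an OSC message carrying a single float32 argument,
    # e.g. osc_message("/oscillator/freq", 440.0).
    packet = _pad(address.encode("ascii"))
    packet += _pad(b",f")               # type tag string: one float
    packet += struct.pack(">f", value)  # big-endian float32 argument
    return packet
```

In practice Pure Data's OSC objects parse packets like these and hand the float straight to the patch, so the sine wave can follow the fiducial in real time.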

Now I will have to make my own smaller version of the ReacTable to discover whether the table brings people together to make a song without spoken communication, purely through interacting together and using the senses.

Composition on the Table



Four white tables have various user interfaces such as switches, dials, turntables and sliding boards that a player can touch. Projectors suspended from the ceiling project computer-generated images onto the tables and interfaces. The projected images change in real time as if they were physically attached to the interfaces when players operate them, and sounds are produced in relation to the movement of the images.

Jam-O-Drum


By combining velocity-sensitive input devices and computer graphics imagery into an integrated tabletop surface, six to twelve simultaneous players are able to participate in a collaborative approach to musical improvisation. The Jam-O-Drum was designed to support face-to-face audio and visual collaboration: players strike drum pads embedded in the surface to create rhythmical music and effect visual changes together, using the community drum circle as a metaphor to guide the form and content of the interaction design. The Jam-O-Drum is a permanent exhibit in the heart of Sound Lab at the EMP.

FTIR Musical Application


A musical interface for a large-scale, high-resolution, multi-touch display surface based on frustrated total internal reflection. Multi-touch sensors permit fully bi-manual operation as well as chording gestures, offering the potential for great input expression. Such devices also inherently accommodate multiple users, which makes them especially useful for larger interaction scenarios such as interactive tables.

AudioTouch


AudioTouch is an interactive multi-touch interface for computer music exploration and collaboration. The interface features audio-based applications that make use of multiple users and simultaneous touch inputs on a single multi-touch screen. A natural user interface, where users can interact through gestures and movements while directly manipulating musical objects, is a key aspect of the design; the goal is to be able to interact with the technology (specifically music-based) in a natural way. The AudioTouch OS consists of four main musical applications: MultiKey, MusicalSquares, Audioshape sequencer and Musical Wong.

MUSICtable


A ubiquitous system that utilises spatial visualisation to support exploration and social interaction with a large music collection. The interface is based on the interaction semantic of influence, which allows users to affect and control the mood of the music being played without the need to select a set of specific songs.

realsound



An interactive installation combining image and sound through a tangible interface. By operating the 24 buttons on the table top, visitors can create an audio-visual composition of their own design. Each button has its own unique sound that is linked to an image.