The Vibe was part of the MABOS Exhibition at Hanover Quay in Dublin. Lots of people and good vibes.
The “Social Vibes” project renders collective human emotional states, mined from the Twitter network, as musical tones on a physical artifact residing in a public space. The physical artifact, known as the ‘Vibe’, presents itself as both sound sculpture and musical instrument.
When the ‘Vibe’ extracts users’ current emotional states from their tweets and plays them unprompted, it can be perceived as an ambient musical instrument/sculpture. It can also be taken over and ‘played’ directly as an instrument by Twitter followers. Hijacking the system allows the user to indulge in playfulness, artistic exploration and mischief. The ‘Vibe’ mimics human characteristics by both enticing people to come and play with it and deliberating on the current emotional status of the collective tweeters, posting computer-generated responses to its own Twitter account.
As part of our Master’s degree we exhibited our projects in a public space in UL. These are a few photos from the exhibit.
This video demonstrates the Vibe being played ambiently by mining the Twitter network for emotional expressions in people’s tweets. The mapping between the emotions expressed and the tones played is displayed on the LCD. Once a remote user tweets the Vibe directly, they take over from the Twitter stream and in a way hijack the instrument. Once the direct tweet has been played, the Vibe automatically returns to playing ambiently, mining human expressions from the Twitter stream.
The “Social Vibes” project comprises a physical artifact designed and created specifically for an installation, to be situated in a public place. It is a custom-designed and built physical structure, adopting the fundamental sound mechanisms used in a vibraphone, also known as a ‘Vibe’.
The instrument consists of twelve musical tones of different pitches. The music created on the instrument is derived from a continuous stream of input from multiple users on Twitter, and from the explicit interaction of Twitter users tweeting the instrument directly at the project’s “@vibe_experiment” Twitter account. Data associated with the emotional status of Twitter users is mined from the Twitter network via Twitter’s public application programming interface (API).
For example, if a user tweets “The sun is out, I’m happy”, the code I’ve written strips out key words and strings associated with the user’s emotional state within the tweet, e.g. “I’m happy”, and translates this to a musical notation. Mining Twitter’s API allows a continuous stream of data. These emotional states are then mapped to specific notes on the physical musical instrument, located in a public space. The tempo of the musical expression is based entirely upon the speed and volume of the incoming tweets on the Twitter API.
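The keyword matching could be sketched roughly like this (a Python illustration — the project’s actual code was written in Processing, and the trigger-phrase lists here are my own assumptions, not the real dictionaries):

```python
# Sketch of emotion-keyword extraction from a tweet's text.
# The trigger-phrase lists are illustrative assumptions, not the
# project's actual dictionaries.
EMOTION_PHRASES = {
    "happiness": ["i'm happy", "so happy"],
    "anger": ["i'm angry", "so angry"],
    "hate": ["i hate", "i really hate"],
    "love": ["i love"],
}

def extract_emotions(tweet):
    """Return every emotion whose trigger phrase appears in the tweet."""
    text = tweet.lower()
    return [emotion for emotion, phrases in EMOTION_PHRASES.items()
            if any(phrase in text for phrase in phrases)]

print(extract_emotions("The sun is out, I'm happy"))  # ['happiness']
```

Each extracted emotion would then be handed on to the note-mapping stage described below.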
Twitter users, whether followers or non-followers of the instrument’s Twitter account (@vibe_experiment), can tweet the instrument directly, and this direct interaction is given precedence, allowing users who tweet directly to have their emotional state ‘played’. This lets users hijack or take over the instrument and experiment with it in a playful manner, but also gives those with musical knowledge the potential to compose simple musical arrangements. When users are not tweeting the instrument directly, the instrument reverts to mining the Twitter API.
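The precedence rule amounts to a simple two-queue scheduler. A minimal sketch, assuming direct tweets and ambient stream matches are queued separately (the original logic lived in the Processing sketch):

```python
from collections import deque

def next_emotion(direct_queue, ambient_queue):
    """Direct tweets to @vibe_experiment always jump ahead of the
    ambient Twitter stream; the Vibe reverts to the stream once the
    direct queue is empty."""
    if direct_queue:
        return direct_queue.popleft()
    if ambient_queue:
        return ambient_queue.popleft()
    return None  # silence: nothing to play

ambient = deque(["hope", "sadness"])
direct = deque(["anger"])
print(next_emotion(direct, ambient))  # 'anger' -- the direct tweet hijacks the Vibe
print(next_emotion(direct, ambient))  # 'hope'  -- then it reverts to the stream
```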
To entice users to interact and observe the instrument in action, there is a live streaming broadcast of the instrument via Twitcam on the @vibe_experiment account. Twitcam is Twitter’s built-in live-streaming platform; it simply requires a webcam and a valid Twitter account.
The instrument constantly tweets updates to its own Twitter account, not only to inform people of its general status but also to engage users to interact directly with the ‘Vibe’.
When tweets come in they’re parsed for certain expressions within twelve emotional states, namely anger, hate, anxiety, fear, surprise, hope, love, happiness, shame, sadness, despair and desire. Tweets containing expressions such as “I’m happy about….”, “I’m angry ….. that..”, “I really hate….” or “I love it when…. ” are stripped of the emotional expression, and this emotional state is mapped to its own specific tone on the instrument. Each time someone expresses one of the twelve emotions, a note is played. The small LCD screen on the Vibe shows a viewer which emotions are being ‘played’.
If a person tweets the Vibe directly at its Twitter account, “@vibe_experiment”, their tweet is displayed on the screen and the emotional expression is ‘played’. This allows a user to create a musical composition on the Vibe remotely. Or just piss off the viewer by creating annoying repetitious sequences of tones. Once a person tweets the Vibe directly they take it over, giving them the main stage for their 140-character concerto!
Because there is a broad difference between emotional states, notes were specifically chosen across four different octaves rather than one. The lower tones represented emotions like sadness, shame and despair; the mid-range tones, which are more sonically pleasing, represented emotional states such as happiness, hope and love; and the higher-end tones, whose pitches tended to be more intense and sharp, represented the emotions that lie in a significantly more agitated state, such as anxiety, hate and anger.
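That grouping can be sketched as a simple lookup. In this Python illustration the specific MIDI note numbers are my own assumptions — only the low/mid/high grouping follows the description above:

```python
# Twelve emotions spread across roughly four octaves, low to high.
# The MIDI note numbers are illustrative assumptions; only the grouping
# (low = sad, mid = pleasant, high = agitated) follows the write-up.
EMOTION_TO_MIDI = {
    "despair": 48, "sadness": 50, "shame": 52,      # low tones
    "fear": 55, "surprise": 57, "desire": 59,       # mid range
    "love": 60, "hope": 62, "happiness": 64,        # mid range
    "anxiety": 72, "hate": 76, "anger": 79,         # high, agitated
}

def midi_to_freq(note):
    """Equal-temperament frequency for a MIDI note (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

print(round(midi_to_freq(EMOTION_TO_MIDI["love"]), 1))  # 261.6 (middle C)
```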
I cut a 20-foot length of 3″ copper pipe into twelve calculated lengths, one for each of the resonators.
The finished parts after drilling, routing and sanding.
Resonators cut into lengths corresponding to the wavelengths of the key frequencies.
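Since each pipe is plugged at one end, it behaves as a closed tube resonating at a quarter wavelength, which is how the cut lengths follow from the key frequencies. A rough calculation, assuming the standard open-end correction of about 0.3 × the bore diameter:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 deg C

def resonator_length(freq_hz, bore_m=0.076):
    """Quarter-wave length for a tube closed at one end (each pipe is
    plugged at the bottom), minus a standard open-end correction of
    ~0.3 x the bore diameter. 0.076 m matches the 3-inch pipe."""
    return SPEED_OF_SOUND / (4.0 * freq_hz) - 0.3 * bore_m

# A resonator for middle C (261.6 Hz) comes out at roughly 0.305 m:
print(round(resonator_length(261.6), 3))
```

Doubling the frequency roughly halves the tube, which is why the twelve lengths step down so visibly across the instrument.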
I drilled and cut the resonators to hold the butterfly valves’ axles.
Each resonator is sealed and plugged inside with a copper headed plug.
The supports for the aluminum keys are cut and sanded.
I cut the keys from 4-foot lengths of 7075 alloy aluminum. They were cut slightly longer than the length required for their target frequency.
I arched the underside of the keys by grinding away material. This lowers the tone of a key by up to an octave, which allows a shorter bar and raises its volume.
The bars were continuously ground at the ends and along the arch to achieve the required tone. Grinding at the ends raises the pitch; grinding the underside lowers the pitch.
Assembly of the three primary parts.
Supports for the keys were added above the resonators. The top support beam holds the solenoids that sit above each of the keys. When activated, they strike the keys.
The keys are added. They are suspended above the support beam by custom-made bolts. Each key is drilled at its nodal points (0.224 × length from each end), threaded with nylon rope and tied to the bolts.
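That 0.224 figure is the standard node position for the fundamental of a free-free bar — drilling and suspending the key there leaves the tone undamped. A quick sketch of the drill positions:

```python
def nodal_points(bar_length):
    """Node positions of a free-free bar's fundamental: 0.224 x length
    from each end -- the points where the key can be drilled and
    suspended without damping the tone."""
    near = 0.224 * bar_length
    return near, bar_length - near

# e.g. a 400 mm key:
print(tuple(round(p, 1) for p in nodal_points(400)))  # (89.6, 310.4)
```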
The solenoid support beams are added.
The solenoids were added. I attached felt dampers to some of the solenoid hammers. The butterfly valves were set into the tops of the resonators, and I linked a 12V 60RPM motor to each of the butterfly valves. This creates a tremolo effect for each of the tones.
Finally I assembled all three parts into a frame and suspended the entire piece within another custom built steel frame.
Testing the code. I wrote all the code in Processing and Arduino. Twelve circuits control the twelve solenoids, and a single Arduino microcontroller drives all twelve circuits.
The finished Vibe. A GLCD display was added, controlled by a second Arduino microcontroller. Everything is coordinated by the Processing sketch.