Build a Pop Song

Build a Pop Song is another great example of technology serving as the first line of audience, but also of the object being split into two different activity systems: the performance and the mix. Both are aspects of composition and converge on the same object, the digital track, but each activates different aspects of it.

Crystal Method

This shot, uploaded to Facebook by Crystal Method, is the perfect image for stomp box logic, object-oriented rhetoric, or materialist composition: the first line of addressed audience is the gathered technologies. Human bodies are objects in the background, co-producing tertiary affects that feed back into the performative system.

Crystal Method - Live

Stomp Box Logic – Live!


Björk’s Biophilia

Björk’s Biophilia is the first-ever app-album. The idea originated in 2008 as a plan for a physical building in Iceland that would function like a museum or school where kids could study music and nature. But when the iPad came out, it offered a new way to combine education, music, and interactivity in a 3D virtual world that would be workable, affordable, and available worldwide.

Björk began writing songs for Biophilia as a kind of curriculum, with lyrics about crystal formation, plate tectonics, and DNA replication, and started contacting top app developers to build iPad games for each song. Each song has a small visual toolbox that interactively combines some facet of music theory with a lesson on nature. For example:

  • “Mutual Core” lets users arrange geological layers to form chords.
  • “Virus” lets users fight off an army of mutating rhythms by swiping invading pathogens away from an innocent cell.
  • “Hollow” lets users build a basic drum machine from DNA bases by lining up various proteins; different proteins change the rhythm, time signature, tempo, or beats per minute (see the sketch just after this list).
  • “Crystalline” lets users tilt and swivel the iPad to add colorful crystals to a growing aggregate as they fly through neon tunnels.
  • “Dark Matter” lets users take control of a sound-creation tool, tapping pools of light to combine and mix tones and scales.
  • “Thunderbolt” lets users augment the song with flashes of lightning and booms of thunder to create a kind of beat box.
  • “Solstice” lets users take control of vocals and layers of harps to create their own remixes.
  • “Moon” lets users play a harp-like instrument.
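To give a rough sense of the kind of interaction “Hollow” sets up, here is a minimal sketch in Python of a toy step sequencer where lining up different “proteins” changes the tempo, time signature, and beat pattern. The Protein class, the parameter values, and the mappings are my own hypothetical stand-ins, not anything from the actual app.

# Illustrative sketch only: a toy step sequencer in the spirit of "Hollow,"
# where lining up different "proteins" changes tempo, time signature, and beats.
# All names and mappings here are hypothetical stand-ins, not Biophilia's code.

from dataclasses import dataclass

@dataclass
class Protein:
    name: str
    tempo_bpm: int          # how fast this protein's bar plays
    beats_per_bar: int      # time signature numerator
    pattern: list           # which steps trigger a hit (1) or a rest (0)

# A few made-up protein types, each nudging the groove differently.
PROTEINS = {
    "A": Protein("A", tempo_bpm=90,  beats_per_bar=4, pattern=[1, 0, 0, 0]),
    "B": Protein("B", tempo_bpm=120, beats_per_bar=3, pattern=[1, 0, 1]),
    "C": Protein("C", tempo_bpm=140, beats_per_bar=4, pattern=[1, 1, 0, 1]),
}

def build_sequence(protein_order):
    """Line up proteins (e.g. ["A", "C", "B"]) and return a list of
    (time_in_seconds, hit) events -- the toy drum machine's output."""
    events, clock = [], 0.0
    for key in protein_order:
        p = PROTEINS[key]
        step_len = 60.0 / p.tempo_bpm            # seconds per beat at this tempo
        for step, hit in enumerate(p.pattern[:p.beats_per_bar]):
            events.append((round(clock + step * step_len, 3), bool(hit)))
        clock += p.beats_per_bar * step_len      # next protein starts after this bar
    return events

if __name__ == "__main__":
    for t, hit in build_sequence(["A", "C", "B"]):
        print(f"{t:6.3f}s  {'hit' if hit else '---'}")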


On the “Mother App,” or main interface, a 3D galaxy unfolds that lets users twist, zoom, and pan through the digital space. Each of the 10 major stars represents a song. When users tap a star, they are given ways to explore and interact with the music and concepts. As users pan through the terrain, sound loops from the 10 major stars overlap and fade into and out of each other, creating emergent sonic compositions.
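One way to picture that emergent layering is as proximity-based mixing: each star carries its own loop, and a loop’s level in the mix rises and falls with the user’s distance from that star. The sketch below is a hypothetical illustration of that idea in Python, assuming made-up star positions and a simple inverse falloff; it is not the app’s actual implementation.

# Hypothetical illustration of proximity-based loop mixing, in the spirit of
# the Biophilia "Mother App": each star has a loop, and loops fade in and out
# as the listener pans closer to or farther from them. Not the app's real code.

import math

# Made-up positions for a few stars in a 2D slice of the galaxy, one loop each.
STARS = {
    "Crystalline": (0.0, 0.0),
    "Virus":       (4.0, 1.0),
    "Hollow":      (1.5, 3.5),
}

def loop_gains(listener_pos, falloff=2.0):
    """Return a gain (0..1) for each star's loop based on the listener's
    distance from it: nearby loops dominate, distant ones fade toward silence."""
    gains = {}
    for name, (x, y) in STARS.items():
        dist = math.dist(listener_pos, (x, y))
        gains[name] = 1.0 / (1.0 + (dist / falloff) ** 2)   # smooth inverse falloff
    return gains

# As the user pans across the galaxy, the balance of overlapping loops shifts.
for pos in [(0.0, 0.0), (2.0, 1.5), (4.0, 1.0)]:
    print(pos, {name: round(gain, 2) for name, gain in loop_gains(pos).items()})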


Biophilia is significant not only because it gives the stalled music industry a glimpse of a possible future, but also because it is based on an assemblage of human agents (musicians, programmers, visual artists, and users) together with nature, music, and technology, all co-producing an emergent, affective experience through a physical object and a layered digital environment. Most of the musical arrangements are fairly spare and unmelodic so that users can play with and layer the music. The game player is expected to feed back into the system to create more complex compositions and co-produce a feeling, a vision, a passion, or an idea—not just through sound and words but also through the digital-material tools.


Ashanti’s Beat Jazz

Onyx Ashanti is a musician who is developing a genre he calls “beat jazz” through his own technological inventions. His earlier rig, displayed in his TED talk performance, included two handheld controllers, or boxes, an iPhone, and a mouthpiece. Accelerometers on each hand read hand position. Colored lights on the boxes signify the instrument being played: red for drums, blue for bass, green for chords, orange for lead solos, and purple for pads. The mouthpiece—a button glued onto a guitar pick with another guitar pick on top to help him click—provides another controller. And the smartphone serves as a display for system parameters. Here is his TED talk audition video.
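To make that mapping a little more concrete, here is a hedged sketch in Python of how a rig like this might route controller state to sound: the colored mode selects the instrument, accelerometer tilt shapes pitch and expression, and the mouthpiece button triggers the note. The function, number ranges, and color-to-instrument table are my assumptions based on the description above, not Ashanti’s actual software.

# Hypothetical sketch of a gestural controller mapping in the spirit of
# Ashanti's rig: a color-coded mode selects the instrument, accelerometer
# tilt shapes the sound, and the mouthpiece button triggers notes.
# All names, ranges, and mappings are assumptions, not his actual code.

# Color-coded modes on the handheld boxes (as described above).
COLOR_TO_INSTRUMENT = {
    "red": "drums",
    "blue": "bass",
    "green": "chords",
    "orange": "lead",
    "purple": "pads",
}

def handle_event(color, tilt_x, tilt_y, mouth_button_down):
    """Turn one frame of controller state into a playable message.

    color             -- current light color on the box (selects the instrument)
    tilt_x, tilt_y    -- accelerometer readings, roughly -1.0 .. 1.0
    mouth_button_down -- True while the mouthpiece button is pressed
    """
    instrument = COLOR_TO_INSTRUMENT.get(color, "drums")
    # Map tilt to pitch and expression: a crude stand-in for gestural sound design.
    pitch = 48 + int((tilt_x + 1.0) * 12)                 # MIDI-style note, 48..72
    expression = max(0.0, min(1.0, (tilt_y + 1.0) / 2.0))
    if mouth_button_down:
        return {"instrument": instrument, "note": pitch, "level": expression}
    return None  # no note while the mouthpiece is released

# Example: green mode (chords), slight tilt, mouthpiece pressed.
print(handle_event("green", 0.2, -0.3, True))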

He has since developed more recent controllers and physical interfaces, but I like this earlier one because the controllers are more clearly boxes. Whatever the controllers, the genre of beat jazz combines:

  • Live Looping
  • Jazz Improvisation
  • Gestural “Sound Design”

What I like about this example is the clear object-human-digital-soundscape assemblage and its primarily improvisational and emergent mode of operation. Ashanti selects instruments with the hand controller, plays them with his mouth and hand gestures, and loops them through a digital pedal or sequencer. The audience for his gestures and “speech” is primarily the technologies. The human response to music is a tertiary after-affect of the “secondary” audience—the object-human-digital-sound assemblage. Here the body in 3D space adds another layer onto the 3D space of the soundscape. (We could make a similar “assemblage” argument for the Boss RC-30, but it is foregrounded more in this example.)

Boss’ RC-30

The Boss RC-30, like other pedals of its kind, allows musicians to record short segments of music on multiple tracks and layer them, either to build songs on their own or to add color to a band’s performance.

The Boss RC-30

This Boss pedal has two synchronized stereo tracks, three hours of recording time, 99 onboard memory presets, and a rhythm guide with a selection of real drum beats whose tempo and time signature can be adjusted. It has multiple jacks for instruments and microphones, and a USB jack for importing and exporting WAV files to and from a computer.
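A rough way to picture what a pedal like this does when loops are layered: the first recorded phrase fixes the loop length, and each later pass is summed, sample by sample, onto what is already stored. The sketch below illustrates that general looping-and-overdubbing technique in Python under my own simplified assumptions; it is not the RC-30’s firmware or Boss’s actual signal chain.

# Bare-bones sketch of loop-pedal layering: the first recorded phrase fixes
# the loop length, and every overdub is summed onto the existing loop.
# An illustration of the general technique only, not the RC-30's firmware.

class LoopTrack:
    def __init__(self):
        self.loop = []          # the stored loop, as a list of audio samples

    def record(self, phrase):
        """First pass: the recorded phrase becomes the loop and sets its length."""
        self.loop = list(phrase)

    def overdub(self, phrase):
        """Later passes: sum the new phrase onto the loop, wrapping at loop length."""
        for i, sample in enumerate(phrase):
            self.loop[i % len(self.loop)] += sample

    def play(self, repeats=2):
        """Playback simply cycles the layered loop."""
        return self.loop * repeats

track = LoopTrack()
track.record([0.1, 0.0, 0.2, 0.0])       # beatbox kick pattern sets the loop
track.overdub([0.0, 0.3, 0.0, 0.3])      # bass line layered on top
track.overdub([0.05, 0.05, 0.05, 0.05])  # pad wash layered on again
print([round(s, 2) for s in track.play(repeats=1)])   # -> [0.15, 0.35, 0.25, 0.35]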

There are tons of demo videos online, both professional and user-generated. Boss also hosts the Loop Station World Championships every year, with many video performances online. This one, from Dub FX, is one of the most direct and succinct when it comes to showing how the loops are layered to produce songs. Dub FX hosts a number of NAMM (National Association of Music Merchants) demos for Roland and Boss with the RC-30, but I like this one from the street.


What we have in these looping pedals is a physical object with a digital architecture that is based on loops and layers. So much of our thinking about digital writing is two-dimensional, or influenced by a flat visual space, whether for images or words. These kinds of technologies for digital sound open up a layered three-dimensional space that produces feedback as part of the writing itself rather than waiting for audience feedback after the text has been produced and circulated.