Onyx Ashanti is a musician who is developing a genre he calls "beat jazz" through his own technological inventions. His earlier setup, displayed in his TED talk performance, included two handheld controllers, or boxes, an iPhone, and a mouthpiece. Accelerometers on each hand read hand position. Colored lights on the boxes signify the instrument being played: red for drums, blue for bass, green for chords, orange for lead solos, and purple for pads. The mouthpiece, a button glued onto a guitar pick with another guitar pick on top to help him click, provides another controller. And the smartphone is a display for system parameters. Here is his TED talk audition video.
He has developed some more recent controllers and physical interfaces, but I like this earlier one because the controllers are more clearly boxes. Regardless of controllers, the genre of beat jazz combines:
- Live Looping
- Jazz Improvisation
- Gestural “Sound Design”
What I like about this example is the clear object-human-digital-soundscape assemblage and its primarily improvisational and emergent mode of operation. Ashanti selects instruments with the hand controller, plays them with his mouth and hand gestures, and loops them through a digital pedal or sequencer. The audience for his gestures and "speech" is primarily the technologies themselves. The human response to the music is a tertiary after-effect of the "secondary" audience, the object-human-digital-sound assemblage. Here the body in 3D space adds another layer onto the 3D space of the soundscape. (We could make a similar "assemblage" argument for the Boss RC-30, but the assemblage is foregrounded more in this example.)
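The select–play–loop flow described above can be sketched in code. This is a hypothetical illustration, not Ashanti's actual software: the color-to-instrument mapping follows the lights described earlier, and the `Looper` class is a minimal stand-in for a loop pedal or sequencer that layers (overdubs) new events onto a repeating cycle of steps.

```python
# Illustrative sketch only: a color-coded instrument mode map plus a
# minimal overdub looper. All class and variable names are invented
# for this example.

MODES = {
    "red": "drums",
    "blue": "bass",
    "green": "chords",
    "orange": "lead",
    "purple": "pads",
}

class Looper:
    """Minimal overdub looper: the loop repeats every `steps` steps,
    and each new phrase is layered on top of what is already there."""

    def __init__(self, steps=16):
        self.steps = steps
        self.layers = [[] for _ in range(steps)]  # recorded events per step

    def overdub(self, step, event):
        # Wrapping with modulo means later passes layer onto earlier ones.
        self.layers[step % self.steps].append(event)

    def events_at(self, step):
        return self.layers[step % self.steps]

looper = Looper(steps=8)
looper.overdub(0, (MODES["red"], "kick"))   # first pass: a drum hit
looper.overdub(8, (MODES["blue"], "C2"))    # second pass: bass lands on the same beat
print(looper.events_at(0))  # both layers now sound together on step 0
```

The point of the sketch is the assemblage's feedback structure: the performer addresses the machine (mode selection, overdubbing), and the audible music is the accumulated result of those layered exchanges.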