I liked music as a kid because it felt creative and spontaneous. Ideas for songs popped out of my brain and surprised me. But as I muddled through piano lessons, squinting at inky lines that read like formulas while my teacher tsked over my shoulder, I started to realize that music is about structure. Soon, all that structure was programmed into me. Certain notes belong together in a scale and certain chords are meant to follow one another. You can write something creative, but if you want it to sound good, better run it through the structure-o-meter first.
Composers throughout history have relied on patterns and rules as they write. But in an age of superfast technology and sophisticated computer algorithms, art and science are fusing to produce some very interesting music. I saw this firsthand at a performance by the students and faculty of UCSC’s Electronic Music Studios called “Making the Electrons Dance.”
To my ear – liberated from piano lessons but still married to my good ol’ fashioned music theory values – these compositions broke a lot of rules. Some harmonies were jarring, many sounds were manipulated beyond recognition, and most pieces were quite unpredictable. But they also followed their own sets of rules. Many drew inspiration from principles of computer science and math that weren’t obvious on the surface.
For the first performance, called “Swing,” the program’s director Peter Elsea played an instrument of his own design: a pendulum hanging from an upside-down cup-shaped sensor. As Elsea swung the pendulum or tipped the instrument from side to side, chimes and ethereal tones rang out from the speakers above the stage.
As he explained to me later, the instrument built harmonies using fuzzy logic. Unlike variables in binary systems – the kind behind the flood of zeros and ones you imagine driving most computer programming – variables in fuzzy logic can take any value between zero and one. In Elsea’s instrument, algorithms decide which notes come next based on the position of the pendulum: how far, and how quickly, it deviates from its resting point in the center. Because he knew the rules, Elsea had a certain virtuosity with this instrument, even though its outputs weren’t entirely predictable.
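To give a rough sense of how that might work, here is a minimal sketch in Python. To be clear, this is not Elsea’s code: the note palette, the membership functions, and the mapping from swing to pitch are all my own inventions for illustration, and it uses only displacement, though the real instrument also tracked how quickly the pendulum moved.

```python
import random

# Hypothetical note palette; the instrument's actual note sets are unknown.
NOTES = ["C4", "D4", "E4", "G4", "A4", "C5"]

def membership_near(x):
    """Fuzzy degree (0..1) to which displacement x counts as 'near center'."""
    return max(0.0, 1.0 - 2.0 * x)

def membership_mid(x):
    """Fuzzy degree to which x counts as 'mid swing' (peaks at x = 0.5)."""
    return max(0.0, 1.0 - abs(x - 0.5) * 2.0)

def membership_far(x):
    """Fuzzy degree to which x counts as 'far from center'."""
    return max(0.0, 2.0 * x - 1.0)

def choose_note(displacement):
    """Weight low, middle, and high notes by fuzzy membership, then sample.

    Unlike a binary rule ('if past the midpoint, play high notes'),
    every rule fires to some degree, so the output shifts gradually
    as the pendulum swings wider.
    """
    near = membership_near(displacement)
    mid = membership_mid(displacement)
    far = membership_far(displacement)
    # Low notes are favored near center, high notes at wide swings.
    weights = [near, near, mid, mid, far, far]
    if sum(weights) == 0:
        weights = [1.0] * len(NOTES)  # fallback guard
    return random.choices(NOTES, weights=weights, k=1)[0]

# Example: as displacement grows, the sampled notes drift upward.
for x in (0.1, 0.5, 0.9):
    print(x, [choose_note(x) for _ in range(5)])
```

The appeal of the fuzzy approach is that no single rule wins outright: as the pendulum swings wider, the low-note rules fade out while the high-note rules fade in, so the harmony drifts rather than switches.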
In another surprising and pattern-inspired performance, Louis Johnston programmed his piece to play itself on a lone piano onstage. The keys hammered out his work, reaching intervals and combinations of notes that a set of human fingers would be hard-pressed to press. Elsea told me this piece drew inspiration from Markov chains, another mathematical concept for generating sequences.
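For the curious: a first-order Markov chain picks each new note by looking only at the current one, using a table of transition probabilities. Here is a toy sketch, with a transition table I invented for illustration (not Johnston’s actual piece):

```python
import random

# Toy first-order Markov chain: each note maps to possible next notes
# and their probabilities. These numbers are invented for illustration.
TRANSITIONS = {
    "C": [("C", 0.1), ("E", 0.5), ("G", 0.4)],
    "E": [("C", 0.3), ("G", 0.5), ("A", 0.2)],
    "G": [("C", 0.6), ("E", 0.2), ("A", 0.2)],
    "A": [("G", 0.7), ("E", 0.3)],
}

def next_note(current):
    """Sample the next note given only the current one (the Markov property)."""
    notes, probs = zip(*TRANSITIONS[current])
    return random.choices(notes, weights=probs, k=1)[0]

def generate_melody(start="C", length=16):
    melody = [start]
    for _ in range(length - 1):
        melody.append(next_note(melody[-1]))
    return melody

print(" ".join(generate_melody()))
```

Because each step depends only on its predecessor, the output wanders in ways that feel both structured and unpredictable, which may be exactly the appeal.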
“Musicians have always been at the cutting edge of technology,” he pointed out, “be it the industrial revolution or the computer revolution.” He said many of the technologies behind that night’s performance have only become available in the past few years.
Computer-driven music has made some people uneasy, not just for the experimental sounds it can produce, but for what it says about human composers. UCSC professor David Cope drew attention and criticism in the ‘90s when he debuted software called EMI (Experiments in Musical Intelligence), which used sets of rules to write music in the style of various masters. Some found these pieces indistinguishable in style from the musical greats they imitated, raising questions about the role of human creativity in an age of artificial intelligence.
But if the UCSC performance was any indication, smart computers aren’t going to elbow smart humans out of the music scene. While guest artist Ronald Alford rolled around the stage in his wheelchair, letting sensors in the wheels and casters send signals to his laptop, I wasn’t thinking about the computer code, but about the impulses in the mind of the performer. Computer science offers adventurous musicians a bewildering array of tools. Those tools can open up new creative spaces… if you’re willing to learn the rules.