Is there a corner of the technology sector that has been taking everyone else to school without sounding off about it?
PARTS 1–3 COMBINED.
This Easter weekend don’t feel bad about sugar. It’s been nature’s honey trap for millions of years – it’s exactly how the bees do it, tempted in with sweet sweet nectar to fill their boots and spread the flower love. You’re wired to want it.
Just like you’re wired to want what tech is seducing you with. If only it was to help life bloom.
There is a view of technology as one of the very roots of our age of crisis. It’s my view – of the dominant culture of tech as we mostly have to work with it, at least. Only, musicians have been working with it rather differently, and for a long time.
Which it’s taken me, a synth nerd, a long time to realise could teach the world to sing very differently, as we try to picture human tech futures.
You might have camped outside the Apple Store in your time, but I think what we’re all judging you on there, as we join you in the queue, is that you’re hooked on the eternal promise of “better” and the dirty addictive hit of shiny and new. Which you know. I’d argue, as I trample you through the doors, that we’re a globalised culture of addicts to a very unhealthy view of the world – a robot world view, built by a kind of cult of engineering. Not of flower power.
But while our shared economics might have been driven significantly by the push to make and improve and throw away products with no instincts to biological, social or emotional contexts, music technology developers have quietly made some remarkable examples of how we COULD do it.
Matt Ballantine asked me to appear on the WB40 Podcast this week, and tempted me with this idea, one I’d not thought about at all before: “How did the big music technology companies get everyone to sign on to the development of MIDI? And why has it lasted so well?”
Musical Instrument Digital Interface sounded like the 1980s were going to finally kill music. Many classically trained musicians and a lot of guitarists thought so at the time and had car stickers to prove it. Even I, as a teenager discovering that I felt oddly drawn to those already creepy-sounding 70s synth hero records that were saturated with the arpeggios of ancient-seeming modular systems and the woody tape glitch of echo boxes and Mellotrons – even I thought the new sixteen-channel five-pin sync system seemed just too reductive and plastic and soulless to do much for wild musical creativity.
But MIDI is now essentially forty-year-old technology. And it still connects products from all the world’s manufacturers, so that gear heads can run hugely complex live setups and all finish on the same streamer-cannon pop.
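Part of why it has endured is sheer simplicity: the whole protocol runs on messages of just a few bytes. Here’s a minimal Python sketch – not tied to any real MIDI library, just illustrating the wire format the 1983 spec defines – of a Note On message: one status byte carrying the channel, then a note number and a velocity.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI Note On message.

    Status byte: 0x90 (Note On) OR'd with the channel (0-15),
    followed by a note number (0-127) and a velocity (0-127).
    Those sixteen channels are how one five-pin cable can drive
    a whole rack of different manufacturers' gear at once.
    """
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of MIDI range")
    return bytes([0x90 | channel, note, velocity])

# Middle C (note 60) on channel 1 (index 0), struck moderately hard:
msg = note_on(0, 60, 100)   # -> b'\x90\x3c\x64'
```

Any synth built since 1983 that sees those three bytes plays the note – which is the whole quietly remarkable point.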
This is not how Apple does it. How Microsoft does it. TV manufacturers might share HDMI leads but they certainly don’t share app platforms for watching the shows made by different content studios. But all the soft and hard music technology names expect you to mix up everyone’s products and get creative.
How did they do it?
I’d say forget thinking like tech or business. Look at the wider human context the work is being developed in.
Are you selling a product or creating an experience? As the history of music tech often shows, good design thinking can’t help but foster collaboration, because it taps into the same thing as any storytelling – shared emotional context.
It’s hard to see much difference between the nerdy, open-minded enthusiasm of some computer giant founders in the late 60s and that of music pioneers at the time. There are always big corporations setting standards and scrappy young upstarts blindsiding them by spotting a passionate opportunity, and passions were running high in the summers of late modernism, dreaming of new social, environmental and technological freedoms. But maybe there is one key difference in the experiences these two market groups created. A musical instrument is made to make music. Computing tech is made to do a million other practical things as well.
Music is a transportive, emotional, ineffable encounter with your feelings. Whatever the theory, practice, coding or soldering that enables your fingers to make sound, the point of all of that is to lift the soul or jump start the imagination. It’s not project management software.
After Covid, millions of us are longing for the experience of sharing music in a big crowd again, because it can – when it works – make group experience memories that shape part of your identity. When you’re using music tech to actually make that music with other people, it can help you feel born to do it. It adds up to a flow that can’t be measured by the sum of its parts. At least up until Steve’s solo.
Now, at this point you might rightly be deeply cynical about the music business, probably most passionately if you’ve ever been part of it – “one big Ponzi scheme”, as Matt reported a 70s music person saying to him once, off air on last week’s WB40 podcast that sparked all this. But as I heard Factory Records legend Tony Wilson say directly: “People think industry execs go to work for the money and musicians go to work to live near the art. Let me tell you, in my experience it’s more often the other way around.”
I wonder if this was much of the energy driving Ikutaro Kakehashi – founder of Roland, and perhaps the key instigator of MIDI, that resiliently effective universal synchronisation system.
He was not musically trained, and like me had a passion for music anyway. But I also can’t help wondering about his even deeper emotional context, as to why he pushed to make musical instruments that would enable millions of people to take part in creation – and an open source technical interface to power it.
Young Ikutaro’s parents both died of tuberculosis when he was small. Raised by grandparents, he explored an interest in electrical engineering through much of his childhood, before his home was destroyed by Allied bombing during the war. He failed to get into university on health grounds and moved to a new city to open a clock repair shop. Moving back to Osaka to try for university again, he lived through the city’s mass food shortage and caught TB himself, spending years in a sanatorium until his health finally improved thanks to clinical trials of a new drug effectively donated to him. Loving radio – that evocative combination of music, engineering and far-off places – and fixing home organs on the side, at 28, so the story goes, he opened a new store, determined to develop the ideal electronic musical instrument.
If you want to know why, in 1980, Kakehashi approached various instrument makers to help him work up a new interface for electronic instruments – thankfully including Dave Smith of Sequential Circuits, based across the Pacific in San Francisco – and how he convinced the bosses of Kawai, Yamaha and Korg to join in, I think it seems a bit clearer in his personal context: the lack of shared standards was not only holding back business technically but keeping out a lot of potential new music makers. Untrained, technically excluded, passionate artists in waiting. Who just might find common creative language in future-facing music, no matter where in the world they were from.
After co-launching MIDI by hooking up a Roland JP-6 with a supposed rival Sequential Prophet 600, with Smith, to gasps at NAMM 1983, his products under Roland and Boss went on to define a new era of inclusive music making and, with MIDI, essentially unlocked epic worlds of creativity to my generation of unschooled dreamers.
Big business finds it eternally hard to stay collegiate, of course. Yamaha bought Dave Smith’s iconic Sequential company in 1989 and shut it down. But twenty-five years later, it was an elderly Ikutaro Kakehashi who convinced the technology giant to simply give the name back to Smith to revive the legendary brand behind the Prophet-5. Allegedly saying: “I feel that it’s important to get rid of unnecessary conflict among electronic musical instrument companies. That is exactly the spirit of MIDI.”
Imagine if Amazon, Apple, Microsoft, Google… name them all, acted like this from the centre.
Right to repair isn’t even a thing in most tech on Earth in 2021, yet music tech seems to go way beyond it – to cherishing technical heritage as a generationally relevant relationship. And, oh yeah, most music tech still just works.
Music has often been at the cultural forefront. Giving expression to social and technical advances alike. Music artists might suffer under streaming valuations today but the internet has also freed them from the need for a traditional record label for reach. And in doing so, plenty of them have demonstrated better marketing knowledge than most businesses – find your niche and love them back. Enough to make real community, listening to each other.
Relationships are about truth, just like storytelling – which music is. Bedroom sessions now make the musical world go round too, and micro businesses thrive. Blockchain is already prompting artists to re-evaluate the way they define their work, with more of them turning pieces into NFTs – objects of value more like exhibits in a gallery – alongside amazing festival experiences and soundtracks to our formative years.
All this may be fundamentally tech enabled, and it might make good business. But it’s driven by more richly connected human rhythms than a less emotionally nutritious lust for shiny.
And there’s one other decidedly Not Typical Tech expression of the music tech scene I find interesting. The right to repair, and beyond that – passion for old kit as living icons.
It’s interesting as I reflect that the synth scene, of which I am instinctively a part, absolutely adores “vintage” gear. And not as some fashion fad, but as a timeless love of particular instrumentation. The only weird fashion blip towards synths in the past nearly sixty years of their existence was the last few years of the 1980s, when the technology of making noises took a conspicuous digital shift and had musos selling analogue beasts with huge sounds and playful physical interfaces for peanuts, in exchange for fiddly little black rack-mounts with tiny LCDs and a line of thin black buttons. It’s when I could afford to buy my first ever synthesiser – the blessed Moog Liberation. Such junkstore prices didn’t last long.
Today, though all music production can be done on the drive of a laptop, synth nerds half my age love connecting keyboards with big knobs to A-frames of beat boxes and physical sequencers so they can play easily and expressively. And they’ll retrofit old machines with MIDI to be able to use the originals as well as the beautiful new machines made in their image.
As deliberately artificial as electronic music delights in sounding, there is something fundamentally human to be learned by other tech developers from this.
Of course, as a last point I’d add one thing. If you’re paying attention here, you’ll realise I’m talking about all this in the context of centuries of western music making. A technological codification of tone developed over a thousand years that robot thinking today can easily appreciate – 12 notes to the octave, fixed tuning, recognised time signatures. What about all the musical traditions OUTSIDE this, going back thousands of years before Pope Gregory’s music school, Guido d’Arezzo’s notation, or the orchestrations of the European Enlightenment?
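That codification really is robot-friendly – concrete enough to write down in one line. Here’s a small sketch of the standard 12-tone equal temperament formula that MIDI note numbers assume, with A4 pinned at the conventional 440 Hz reference (the numbers, not the code, are the point – any names here are just illustration):

```python
def midi_to_hz(note: int) -> float:
    """Frequency of a MIDI note number in 12-tone equal temperament.

    Each semitone step multiplies frequency by 2^(1/12), and
    A4 (MIDI note 69) is fixed at 440 Hz by convention.
    """
    return 440.0 * 2 ** ((note - 69) / 12)

print(midi_to_hz(69))  # A4 -> 440.0 Hz
print(midi_to_hz(60))  # middle C -> ~261.63 Hz
```

Every tradition outside that grid – microtonal maqam, gamelan tunings, unmetered raga – simply falls between the integers that formula accepts, which is rather the point of the question above.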
NOW you might be asking a good future technology question. A question asked, incidentally, by some music technology artists 90 years ago.
How can we design technology that reflects the human art instinct for connection, community, testimony, continuity? And go to all the effort of manufacturing tech products that tempt us with the sugar-rush of being real pollinators, not consumers?
Matt, Chris and I ride through a stone/beats-skipping history of music from Pythagoras and tetrachords to the TR808 and multi-timbral polyphony, before wondering why music tech might be a bit different to the rest of the gear sector.