Bone Conduction: Adventures in Speakerless Sound
Producer Joanie Thompson tests out our bone conduction prototype.
At Bluecadet, we’ve made our name designing and building content-rich interactive media in the cultural space. What this has meant in a lot of cases is the introduction of visual media—a.k.a. screens. While we love screens and the interactions they make possible, we also realize that our daily lives have become ever more saturated with them. As an agency, we want to ensure that when our audience visits an institution like a museum, they’re met with a powerful and unique experience they couldn’t get elsewhere. More and more, this drive has led us to non-traditional canvases in search of tangible interactions and surprising non-screen-based media.
Recently, a client tasked Bluecadet with looking at a suite of technologies to integrate interpretive media into a public, outdoor space that would augment, but not distract from, the natural landscape. The client came to us with a few great ideas, including the use of bone conduction to integrate sound in surprising ways.
If you’re unfamiliar with bone conduction, it describes the method of using the bones in your head to transmit sound into your inner ear, bypassing your eardrum completely by applying vibrations directly to your skull. Imagine sitting in a quiet room, pressing your head to a surface—like a table or a wall—and hearing music. That is the potential of this technology, and our research turned up some amazing projects (including the Laurie Anderson illustration below we found on kunstradio.at) using bone conduction to transmit sound in unexpected and powerful ways.
Laurie Anderson, “The Handphone Table,” 1978.
The more we looked into bone conduction, the more excited we got about it. First, it’s an audio-only, physical interaction capable of establishing a hyper-personal relationship between visitors and content without adding screens. Second, it requires hardware that most people don’t have access to, so we would get to experiment and do some prototyping. And third, the end product is a novel, hyper site-specific experience with the potential to surprise and challenge an audience to think differently about a space. In short, it checked a lot of boxes for what we look for as we imagine the future of cultural institutions and social spaces. So we rolled up our sleeves and started building.
Building an early bone conduction prototype in our New York office.
Prototyping is a process. It requires time, energy, and, more than anything, the willingness to allow the process to lead you into unexpected territory. The range of projects we tackle at Bluecadet means we experiment with many kinds of prototypes: building low-fidelity cardboard mockups of physical exhibits, crafting paper prototypes of digital UI elements, coding “MVP” software sketches, testing the feasibility of specific technical hardware…the list goes on.
Our prototyping practice is not a “one-size-fits-all” methodology, and often our only goal is to answer the question: “Is this hardware any good?” Just as likely, we might be frantically testing out lateral solutions to a very specific software bug. (Just kidding, our software is always perfect!) The truth is, we love both of these scenarios, and everything in between.
Occasionally, clients will ask us to tackle prototyping as a project in and of itself. These jobs are a dream, allowing us space for unbound experimentation and creative problem-solving, and our bone conduction work was one of these moments: a chance for us to explore, with our client, ways this unique interpretive tool might work, from functional, technical, and experiential perspectives.
Through our research, we uncovered some amazing resources that led us to select our initial hardware to play with: a medium-sized bone conduction transducer. But, as often happens when building prototypes, each decision led to a new set of questions.
Our first prototype.
What else do we need to make this work?
First, a backtrack. The transducer we selected came after we experimented with more directly engineered bone conduction elements. These are great when applied directly to the bone behind your ear—which is how many bone conduction headphones and hearing aids work—but our dream was to achieve a bigger range of interactions (more on that below), so we upgraded to a larger transducer.
Bone conduction transducers convert electrical signals (sound, in this case) into mechanical energy, specifically vibration. The more intense the vibration, the “louder” a sound will be in your head. To get the transducer vibrating strongly enough to work, it became immediately obvious that we would need an amplifier to boost the sound signal. Without the amp, our transducer just whispered tiny sounds to us (adorable, but useless!).
So, our general parts list ended up being short + sweet (drop me a note if you want the full detailed shopping list):
- Medium-sized bone conduction transducer
- Small amplifier
- A Raspberry Pi to play sounds with some logic
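The Raspberry Pi’s job—playing sounds “with some logic”—can be sketched in a few lines. This is a hypothetical illustration, not our production code: the class name, track filenames, and the `play_file` hook are all made up for the example. On a real Pi, `play_file` might hand off to `pygame.mixer` or a command-line player like `aplay`.

```python
# Hypothetical sketch of the Pi-side playback logic: cycle through a
# playlist, advancing one track each time a trigger fires (e.g., a
# button press or sensor event wired into the Pi's GPIO).

class TrackPlayer:
    """Cycles through a fixed playlist, one track per trigger."""

    def __init__(self, tracks, play_file=print):
        self.tracks = list(tracks)
        self.index = -1            # no track played yet
        self.play_file = play_file # playback hook; print is a stand-in

    def on_trigger(self):
        """Advance to the next track (wrapping around) and play it."""
        self.index = (self.index + 1) % len(self.tracks)
        track = self.tracks[self.index]
        self.play_file(track)
        return track

# Illustrative filenames echoing the kinds of content we gathered:
player = TrackPlayer(
    ["soldier_story.wav", "campfire_songs.wav", "coyote_howls.wav"]
)
```

Each call to `player.on_trigger()` starts the next clip and wraps back to the first once the playlist is exhausted.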
Testing a couple transducers.
So, what’s the best way to conduct sound into your head?
Does it really need to be directly against your head? It became really clear, really quickly, that the answer is “yes.” The fact is, the transducer’s job is to vibrate something, anything, and turn it into a speaker. Placing any material between the transducer and your head causes that material to become a speaker. The bigger the item (for example, the metal bar pictured above), the more speaker-y it got.
We learned a lot by starting with what others have done in the past—specifically the “elbow” interaction (explored by Anderson, Markus Kison’s “Touched Echo,” and even KFC). In this model of interaction, the transducer vibrates a surface, the vibrations travel up from your elbow through your wrist, and your wrist bones transmit them to your head. As you might imagine, you lose a certain amount of fidelity this way, but for music, it was totally fine. It even felt magical with the right content.
For our project, the aim was to tell a larger range of stories about the history of our client, including first-person stories from soldiers, groups of people singing around a campfire, and howls of coyotes. Some of the content was atmospheric, but it still needed to be clear and comprehensible in order to be a satisfying experience. So, this prototype told us we’d need to stick with the tried-and-true direct-head-contact model.
And yes, these phrases still sound as weird to me as they must to you.
A diagram of our final circuit. The tiny speaker at the bottom represents our transducer; the rest of the elements were specific to our application, including controls to change tracks, a small pressure sensor, etc.
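One way the pressure sensor in that circuit might be used is to decide when a visitor is actually resting against the transducer. The sketch below is a hedged illustration of that idea, not our actual firmware: the class name, threshold, and debounce count are invented, and real readings would come from an ADC attached to the Pi rather than the integers passed in here.

```python
# Hypothetical contact-detection logic for a pressure sensor: require
# several consecutive readings above a threshold before reporting
# contact, so a brief bump doesn't start playback.

class ContactDetector:
    """Reports head contact only after `debounce` consecutive
    above-threshold readings; any low reading resets it."""

    def __init__(self, threshold=512, debounce=3):
        self.threshold = threshold  # made-up ADC value for "pressed"
        self.debounce = debounce    # consecutive readings required
        self._streak = 0
        self.in_contact = False

    def update(self, reading):
        """Feed one sensor reading; return current contact state."""
        if reading >= self.threshold:
            self._streak += 1
            if self._streak >= self.debounce:
                self.in_contact = True
        else:
            self._streak = 0
            self.in_contact = False
        return self.in_contact
```

Feeding it a stream of readings, contact flips on only after three high values in a row and drops immediately when pressure is released.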
What makes bone conduction interesting? Challenging?
Now that our prototype was working and we had an idea of what it could and couldn’t do, we started to dive into content possibilities. We cast a wide net, working with sound designer Andy Green to gather a range of samples, from single-person interviews to music to wildlife sounds. The results were fascinating.
Remember, what we’re doing is turning sound into vibration. Anyone who’s ever been to a loud concert or stood near a car with beefy subwoofers knows that bass shakes things. High sounds, not so much. As obvious as it seems, we didn’t think about this principle until we tried the sound of a bird tweeting. Sadly, our bird’s tweets were too high; too high = no vibration = no bone conduction! Conversely, though, sounds with rich dynamics, like people talking or instruments playing, were really effective. As is often the case in prototyping, we were testing both the technical parameters and the experience itself. Luckily, in this case, the experiential and technical aligned, bird sounds notwithstanding.
So, our constraints had become clearer: we needed sounds with good dynamic range, and, like it or not, we needed to find a way to get our visitors to put our transducers right against their heads.
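A rough way to vet a clip before loading it onto the hardware is to check where its energy sits in the spectrum. The snippet below is an illustrative sketch (the function name and the 440 Hz vs. 4 kHz example frequencies are our own choices, not a stated conduction cutoff): it finds the strongest frequency in a clip with a plain discrete Fourier transform, using only the standard library.

```python
import math

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the strongest DFT bin.
    Plain O(n^2) DFT -- fine for short clips, illustrative only."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC, stop below Nyquist
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n

# A low "instrument" tone vs. a high "bird tweet", both at 16 kHz:
rate = 16000
tone = [math.sin(2 * math.pi * 440 * i / rate) for i in range(800)]
tweet = [math.sin(2 * math.pi * 4000 * i / rate) for i in range(800)]
```

Here `dominant_frequency(tone, rate)` lands at 440 Hz and `dominant_frequency(tweet, rate)` at 4000 Hz (both frequencies fall exactly on a DFT bin for this clip length); a real pipeline would compare that result against whatever low-frequency range the transducer handles well.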
"Bodystorming" ideas for interaction.
What are ways to seamlessly integrate the technology into the experience?
This leads us to our final question: How, exactly, can we get visitors to interact with these pieces in a way that makes sense? To figure this out, we used experiential prototyping, or “bodystorming” (to borrow IDEO’s term), which simply means we tried to inhabit the mindset of our users, acting out (improv style) different scenarios.
As you can see, we tried a host of interactions. Some were fascinating and, we felt, worth exploring at a later date, like the tree interaction (top left, inspired by “Listen Tree”). Others were fun but asked too much of our audience, like embedding speakers in the ground and requiring people to lie down. In the end, we all loved scenarios around a park bench (bottom right), a classic feature of communal, outdoor spaces. We found that creating a deliberate “head rest” with an embedded transducer allows for a natural and comfortable way to deliver bone conduction, while still allowing the user the option to opt in…or not.
Our final “bench” prototype.
Prototyping can be messy, frustrating, fun, and rewarding; it is often all of those things at the same time. Taking the time to “sketch” experiences with bone conduction technology allowed us to expand our knowledge of the topic and steer clear of dead ends and traps during upcoming design phases. We now feel confident with the technology and ready for (dare I say “excited” about?) the challenges that still await us as the project moves forward.