Where’s the sound design in UX design?
Last Thursday, I went to the Designing products conference and started thinking again about sound in web design, and on our devices in general.
First, a little background. I have two passions in life (besides my wife and daughter): music and computers. I decided a long time ago that music would not be my source of income, just because I wanted it to stay, well… a passion. But when I started to look for a job, my first intention was to try and find work as a sound designer. Life had another plan for me though… and I’ve now been building websites and other multimedia projects for over 13 years.
Why sound over other senses?
The first and simplest reason is: because we can do something about it. Most of our devices today address only two of our senses, sight and hearing, and that's pretty much it.
Yes, we can talk about the feel (and the buzzing!) of a device, but it's minor compared to the other two, and nobody has tried to introduce a vanilla-flavored smartphone yet.
The second reason is linked to the first: if there's a hierarchy in our senses, sight comes first by far, but hearing is a clear second. It has a powerful way of immersing a user in a given universe. I'm thinking about movies of course, but also video games, or even stores. Another good example is the effort car makers put into the clunk of a closing car door, so that it makes you feel safe and cozy, whether or not that's true.
Finally, today's bandwidth, combined with modern audio compression and low-bitrate encoding, makes it possible for us to enrich our projects with delightful and immersive sounds and music.
The sound of music
The palette of sound is enormous: from a tiny glitch to an entire symphony, from sound designers to classical composers to tribal percussionists, so much material has been made that could be incorporated in our projects. Yet few web designers know what to do with sound. They have always been more inclined to open a graphic art book than a music dictionary. As a result, audio is often considered only when a bit of development time is left at the end of a project, rather than being a key element from the beginning. But that's not the only problem…
Stop that noise!
Trouble is: it would be a nightmare if all of our devices made sounds all the time. I'm a strong defender of the user's peace of mind, and I don't think a 10-second music loop in the background of your site was ever a good way to immerse your user in your universe. Plus, it touches on another problem: people have strong opinions about music. Trying to shovel music into your users' ears might turn them off forever. People love music, but they hate music too. Remember the last time on your commute you were annoyed by the guy next to you and his smartphone bawling awful rap/rock/folk/younameit. That's when you wish God had made eyelids for your ears.
So, sound is a powerful tool, but a dangerous one. It's interesting to observe how we assume sounds will be present in a video game interface but not on a bank website: conventions already exist. Will we have to change them? I think we will for some of them, because the digital world is still young, and because the possibilities offered by new hardware will force us to.
Use of sounds and music should be smarter and better thought out. Here are a few questions whose answers could change your audio strategy:
- does the user have headphones on?
- is there already audio playing on this device?
- how loud is the environment?
- do you have to play a sound one time (like an alert) or several times in a row?
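The questions above can be sketched as a small decision function. This is only an illustration of the strategy, not a real browser API: the `ctx` fields (`headphones`, `otherAudioPlaying`, `ambientNoiseDb`, `alreadyPlayedCount`) are hypothetical inputs that today's browsers expose, at best, only partially.

```javascript
// A sketch of context-aware audio, assuming we could gather a `ctx`
// object describing the user's listening situation (hypothetical fields).
function pickAudioStrategy(ctx) {
  // Never fight audio the user is already listening to.
  if (ctx.otherAudioPlaying) return "silent";
  // Repeated alerts get quieter, then stop entirely.
  if (ctx.alreadyPlayedCount >= 3) return "silent";
  if (ctx.alreadyPlayedCount > 0) return "quiet";
  // With headphones on, normal volume is private and safe.
  if (ctx.headphones) return "normal";
  // On speakers, adapt to the environment: give up in places too loud
  // for the sound to be heard anyway, stay discreet everywhere else.
  if (ctx.ambientNoiseDb > 80) return "silent";
  return "quiet";
}

// Example: a first alert, on speakers, in a quiet room.
console.log(pickAudioStrategy({
  headphones: false,
  otherAudioPlaying: false,
  ambientNoiseDb: 40,
  alreadyPlayedCount: 0,
})); // "quiet"
```

The point is not the exact thresholds but the shape of the logic: the same alert should sound different, or not sound at all, depending on context.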
I believe these sorts of considerations will be decisive for the future success of web applications – big or small – and should be incorporated by development teams sooner rather than later.