Why Our Brains Love Physical Buttons
Ever since Douglas Engelbart first popularized the computer mouse in his legendary 1968 Mother of All Demos, people have tried to replace it. It hasn’t gone well.
Gesture recognition, trackpads, voice commands (remember Dragon NaturallySpeaking?), styluses — all these have come and gone. The modern mouse is wireless, laser-equipped, and bristling with buttons. But its core design and functionality are basically identical to those of the original mouse Engelbart demoed more than 50 years ago.
It took a change of format to create something that even comes close to rivaling the mouse — the touchscreen on your smartphone. Touchscreens are perfectly suited to a tiny mobile device like the iPhone. But like the mouse, they’re still fundamentally a physical, tactile technology. Voice assistants like Siri and Alexa are nice for some things, but how often do you use them to navigate an app or send an email?
Physical technologies like the mouse and touchscreen have persisted because the best tech products aren’t purely digital. They’re hybrids — technologies that blend the digital and the analog, the virtual and the physical, or the automated and the human. Understanding our love of hybrids requires delving deep into our evolution and brain anatomy. It explains why your iPhone feels like an extension of your arm, why the Amazon Kindle continues to exist, and why humans can’t help but love big, chunky, physical buttons.
The Original Kindle Was Crazy
What the design of the first popular e-book reader can teach us about innovation
I’m a photographer, so I spend a lot of time thinking about and using cameras. I shoot both analog film and digital photographs, so I own both vintage cameras and modern digital ones. My favorite camera, though, is somewhere in between: the Leica Q. The Q is a digital mirrorless camera, but its design and user interface are based not on fancy new technology but on the original controls of a Leica film camera from 1935 — including shutter speed and aperture dials and a physical shutter.
Is this some kind of hipster throwback — impractical, but intended to impress other photographers? No. Leica built the Q that way because it’s the most logical way to set up a camera. All the functions you need are laid out for you in an intuitive, physical way. After taking 550,000+ photos on my Q, I can literally operate it with my eyes closed. I know the increments on the shutter speed selector by heart and can click it from speed to speed without having to look at it. When I press the shutter button, I can feel the tiny click of the physical shutter firing. If it doesn’t fire, I know the Q hasn’t locked focus, and I’ll have to override its autofocus and find a focus point manually.
The Q is such a fantastic camera in part because it’s a hybrid. It blends the best elements of the digital world (a digital sensor, insane light sensitivity) with the best elements of the analog one (an intuitive set of physical controls, a real shutter). Other great tech products are hybrids, too. Steve Jobs reportedly hated buttons, and did his best to eliminate them from Apple’s devices. But even he couldn’t kill Apple’s prominent Home button, which takes users backward in menus, or all the way back to a consistent home screen (and continues to exist even on the latest generation iPads, though it was eliminated on iPhones starting with the iPhone X).
Having the grounding and reversibility of a Home button makes navigating a new app or website feel like meandering down a winding garden path in the physical world, rather than drowning in the infinite set of possibilities presented by an open-ended voice interface. You can try out new functions in an unfamiliar app, emboldened by the knowledge that you can always go Home with one press on a button that’s right in front of you.
Other devices exist entirely in order to give digital information a physical, analog presence. Most of us are totally capable of reading e-books on a cellphone screen. But many of us (myself included) are perfectly willing to plunk down up to $350 for the Amazon Kindle, a digital device that does its best to capture the experience of reading a physical book. High-end Kindles use E-Ink displays, which use electric charges to move actual ink particles around inside a fluid-filled panel, duplicating the look and resolution of a real printed page.
The newest Kindle even comes with an optional premium leather case that makes the device look (and smell) like a handsome bound volume from some aging professor’s personal library. Amazon says that the case “develops a natural patina, making each cover unique” and “opens and closes just like a book.” The Kindle is a perfect hybrid — it combines the best aspects of digital books (portability, ease of purchase) with the reading experience of physical ones.
Why do people love hybrids so much? It likely comes down to our brains. Humans are highly tactile, physical creatures. Our skin contains four different kinds of mechanoreceptors, which allow our brains to perceive touch. Three of them detect basic touch sensations, like pressure and the stretching of our skin. But we also have a fourth kind of special, exquisitely sensitive touch receptor: the Pacinian corpuscle.
Pacinian corpuscles detect vibrations. Unlike other mechanoreceptors, they adapt almost instantly to sustained pressure, so steady, direct skin contact quickly fades from their perception. But their sensitivity to vibration allows us to do something remarkable — it allows us to interact with tools as if those tools were part of our own bodies.
Imagine you’re holding a hammer. When you use it to hit a nail, tiny vibrations travel up its handle and into your hand. Your Pacinian corpuscles detect those minuscule vibrations, encode them into electrical signals, and send them up to your brain. Your brain then processes them using the same regions it uses to process stimuli that directly touch your skin. It’s as if you’ve embedded mechanoreceptors into the hammer itself. Especially if you hammer things a lot, your brain becomes adept at understanding the vibrations that your Pacinian corpuscles detect. The hammer starts to feel like an extension of your body.
Pacinian corpuscles allow you to interact with other kinds of tools, too. They let you feel the surface of the road while you’re driving (every tiny bump generates a vibration that travels through the steering wheel into your hands), read Braille with your fingertips, and distinguish the rough texture of a piece of wool from the smoothness of a glass windowpane — even the glass screen of your iPad. Pacinian corpuscles likely evolved to allow us to wield physical tools, like a rock hammer (or a sewing needle). But today, they’re engaged (both passively and actively) when we use the physical interfaces of tech tools. When I turn the shutter speed dial on my Leica Q (and especially when I feel the tiny “thunk” of its shutter firing), I’m using my Pacinian corpuscles. The same is true when I press the physical page-turn button on my Kindle and feel its satisfying click.
Even tech devices without real physical controls find ways to engage your Pacinian corpuscles and other mechanoreceptors. Cellphones’ virtual keypads are easy to use in part because they produce artificial vibrations when you touch a virtual key. Scientists have tuned these artificial vibrations obsessively so that pressing a fake on-screen key or button feels (to your Pacinian corpuscles and thus to your brain) remarkably like touching the real thing.
The field responsible for optimizing these physical sensations is called haptics (as are the sensations themselves), and is a subset of the field of ergonomics. Knowing that haptics can make or break a device or app, hardware companies like Apple provide detailed guides for other designers about haptic best practices. A good haptic, for example, should be completely predictable. As Apple says in their guide, “If a haptic doesn’t reinforce a cause-and-effect relationship, it can be confusing and seem gratuitous.”
This predictability allows your Pacinian corpuscles and your brain to learn the patterns of sensations that a specific device or app produces (much as a physical Home button allows for predictable paths through an unfamiliar interface). Over time, these patterns become more and more ingrained in your brain, enhancing the perception that the app or device is an extension of your body. If haptics are unpredictable, the sensations risk confusing your brain. If a hammer vibrated in a different way every time you picked it up, you’d have a hard time learning to use it deftly. The same applies to an app that changes its haptics with each new update.
Good haptics — or good physical design, in the case of devices with real physical interfaces — is what makes great hybrid products so pleasurable to use and so successful. Via our Pacinian corpuscles, these devices interface directly with our brains, leveraging the same hardware that evolved over millennia to allow us to use tools.
The lack of haptics also explains why virtual assistants can seem like a strange, alien presence. Interacting with Alexa doesn’t feel like anything. Even a baby can pick up a physical device like an iPad and immediately begin interacting with it. Voice interfaces, though, take practice. The mouse has persisted for so long because it makes your arm and your fingers feel like part of the computer you’re controlling, and piggybacks on neural hardware humans have used for eons. That kind of connection is hard to replicate with any other input device — especially one that relies on audio alone.
To create great tech products, designers must deliberately build hybrids. This means embracing the digital advantages that tech products bring, while also taking heed of the basic, human need for something physical to pick up, hold, and touch. Tiny details — like how a button clicks or the haptics of a virtual keyboard — may seem minor. But they connect to something deep, physical, and fundamentally human. Getting hybrid interactions right can mean the difference between building a decent tech product and creating a life-changing one.