
Come on Feel the Data (And Smell It)

Africa Studio/Shutterstock.com

In February of 1960, Scent of Mystery flopped. The film was touted as the first Hollywood production to deploy Smell-O-Vision, inventor Hans Laube’s automated system for introducing scents into a movie theater in coordination with the pictures on screen. While critics savaged Scent of Mystery’s plot, they also complained that Laube’s glitch-prone machine wasn’t worth the hype: “Motion pictures and synthetic smells do not mix,” fumed the New York Times. Smell-O-Vision became a punchline.

Scent of Mystery print ad, featuring prominent mention of Smell-O-Vision

More than fifty years later, Haruka Matsukura and his colleagues at Tokyo University of Agriculture and Technology announced they had reinvented scented media for the digital age. In March of 2013, Matsukura unveiled a scent-producing system built into LCD advertising screens, creating the illusion that a Big Mac smells as good as it looks. Meanwhile, Spanish chef Andoni Luis Aduriz recently announced a collaborative project with Scentee, a smartphone accessory that signals alerts and alarms via scent. Like Smell-O-Vision before them, these products suggest more potent possibilities for our digitally mediated lives: transforming a sea of disembodied information we struggle to interpret visually or aurally into more “visceral” data that we see, hear, feel, breathe and even ingest.

The Internet of Things promises to bring network connectivity and ubiquitous digital sensors to a wide variety of everyday materials and devices. This plethora of inputs produces data, and lots of it. We are already stretched to the limit processing, internalizing, and understanding the data we have today. In the future, the sophisticated data visualizations—graphs, flowcharts, and infographics—that are staples of contemporary digital media products will become increasingly insufficient. Instead, the burgeoning Internet of Things will rely increasingly on what I call “data visceralizations”: representations of information that rely not solely on sight or sound, but on multiple senses, including touch, smell, and even taste, working together to stimulate our feelings as well as our thoughts.

Of course, it’s possible to prompt a visceral reaction through sight alone. Designer Sha Hwang recently suggested that designers strive to make data visualizations hit harder emotionally by reframing the scope of their graphics and building more emotion into their products. But seeing by itself isn’t enough: influential design theorist Donald Norman’s theory of emotional design helps explain why.

In his 2004 book Emotional Design, Norman suggests that design elicits an emotional response at three different levels: the visceral, the behavioral, and the reflective. The reflective level connects a design with our conscious mind; the behavioral level is how we use a product; the visceral level, according to Norman, is made up of a design’s “look, feel and sound,” its “heft” and “sensuality.” Great visceral designs, like Apple’s original iMac, are all about “immediate emotional impact” because they engage multiple senses: Apple’s candy-colored computers were also rounded and smooth to the touch, distinguishing their feel as well as their look.

Experiments in visceral design have a long history in the creation of computing technologies. In the middle of the last century, cybernetics pioneer Grey Walter was exploring the possibilities of multi-sensory perception and interaction with computing. In the early 1950s Walter developed Flicker—a device that used EEG hookups to synchronize a user’s brainwaves with colored lights and music in a self-regulating feedback loop. Though his project never made it into large-scale production, today’s scientist-entrepreneurs are bringing a similar vision to the market.

The Muse EEG headband was exhibited at the 2014 Consumer Electronics Show in Las Vegas and is available now for pre-order. It allows a smartphone or tablet to take a person’s brainwaves as direct input. For an earlier product, Toronto-based InteraXon created an app that gives a person instant feedback on their brain’s state in order to teach meditation techniques that calm and focus the mind. The company imagines a wide range of uses, from immersive gaming to pouring a “nice cold mug of beer” “simply by using the power of your brain.”

Introducing Muse: Changing The Way The World Thinks from InteraXon on Vimeo.

The theory of sentics, developed by Dr. Manfred Clynes in the early 1970s, offers another prescient example. A neuroscientist and musician, Clynes developed a machine that correlated and tracked the relationship between bodily gestures and different emotional states. Clynes argued that particular motions were universally associated with particular emotions, and that these movement patterns could be seen in art and even heard within musical compositions.

Along with his colleague Nathan S. Kline, Clynes had coined the term “cyborg” in 1960. While he made engineering contributions to a number of medical fields, his vision of engaging multiple senses in pursuit of a richer experience and understanding of the data we produce was decades ahead of developments in consumer electronics.

Nintendo Rumble Pak attached to controller. (Wikimedia Commons)

Today, however, mainstream digital media products and interaction designs increasingly strive to create visceral reactions as a primary experience. Part of this new emphasis stems from attending to multiple channels of interaction: not just visual and aural cues, but also gestural and touch-based (haptic) feedback. Digital gaming has played a major role in incorporating tactile feedback, among other visceral design methods, into its products. The Rumble Pak, introduced commercially by Nintendo in 1997, makes game controllers shake and vibrate in time with the on-screen action. While initially derided as a clunky frill, rumble quickly became a standard feature of game play.

Contemporary mobile devices like the iPhone and iPad lack active tactile feedback, but their gestural interfaces have allowed the designers of games like Angry Birds and Candy Crush to exploit visceral design principles effectively. Game designer Kyle Gabler uses the term “juiciness” to describe the positive feedback games like Candy Crush provide their players: shimmering colors and visual elements that move organically, with slight built-in resistance, are hallmarks of a “juicy” gaming interface. Increasingly, interface designers of all kinds are combining this visual “juiciness” with appealing feedback for the other senses: products like Sifteo interactive tiles combine tactile, auditory and visual reactions in an attempt to captivate users by injecting the principles of “juicy” feedback into the domains of touch and hearing.

Sifteo Cubes (Sifteo Inc.)

All of the design strategies described above engage one sense or another to trigger a physical or emotional reaction in users, making an end-run around rational thought. A commitment to the concept of visceral data, in contrast, involves focusing on how these different tools and technologies can work together to give abstract information a meaningful visceral impact on users—one that’s appropriate and compelling for the context and the data involved.

Take online cookies, the markers websites place on our computers when we visit, which are often used to track our browsing behavior. Cookies are left on our machines without any warning: what if, instead, our devices shuddered slightly, made a noise of protest, or in extreme cases—as prototyped by Robert R. Morris and Daniel McDuff, two MIT students, under the name Pavlov Poke—delivered an electric shock to the user? By keeping the idea of making data visceral, and not just visible, in mind as a design principle, future technologists will be able to make products that aren’t just more compelling, but also more supportive of the choices we want to make in our online and offline lives.
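The cookie-shudder idea is already sketchable with today’s browser APIs. Here is a minimal illustration, not a product design: it assumes a page can only count its own cookies via document.cookie (third-party trackers aren’t directly visible to script), and it uses the real Vibration API (navigator.vibrate), which many mobile browsers support. The function names are hypothetical, chosen for this sketch.

```javascript
// Hypothetical sketch: map a cookie count to a vibration pattern,
// so more cookies produce a longer "shudder." Kept as a pure function
// so the mapping can be tested outside a browser.
function shudderPattern(cookieCount) {
  if (cookieCount === 0) return []; // nothing to report, no vibration
  // One 50 ms pulse per cookie, capped at 10 pulses, with 100 ms gaps.
  const pulses = Math.min(cookieCount, 10);
  const pattern = [];
  for (let i = 0; i < pulses; i++) {
    pattern.push(50, 100); // vibrate 50 ms, pause 100 ms
  }
  pattern.pop(); // drop the trailing pause
  return pattern;
}

// In a browser, count this page's own cookies and trigger the haptic
// feedback via the Vibration API, where available.
function shudderForCookies() {
  const count = document.cookie ? document.cookie.split(';').length : 0;
  if (navigator.vibrate) navigator.vibrate(shudderPattern(count));
}
```

Calling shudderForCookies() on page load would give each tracked visit a small physical signature; the cap keeps heavily cookied pages from becoming a nuisance rather than a signal.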

Why does making data more visceral matter? Simply put, we’re too often disconnected from the information we put out into cyberspace. As digital connections become ubiquitous, from pacemakers to refrigerators, we’re poised to produce new sets of data in new situations: whenever we use our Internet-enabled coffee maker or crockpot, for instance. Interfaces that prompt us to engage multiple senses in understanding this data, including novel ones like taste and smell, promise to rise above mere gimmickry.

For one, interfaces that make data sets viscerally engaging could support a more holistic process of individual decision-making, grounded in both our thoughts and our feelings. While visceral design in material products is often intended to produce feelings or desires that overcome our reasoned second thoughts, visceral data has the potential to balance our reactions in the opposite direction: in addition to appreciating a problem or issue rationally, users prompted to engage viscerally will have a well-rounded sense of their own intellectual, emotional, and physical stance on the matter at hand. BevLab, a Toronto startup, recently showed off a machine that matched emotional keywords on Twitter with fruit juice flavors and mixed them together for a unique “taste” of Twitter’s current mood—visceral data that gives “juiciness” a whole new meaning.

Perhaps most importantly, interfaces that present data viscerally have strong potential to improve our understanding of broader social and political problems, and by extension to encourage us to act. Ever since the advent of social media, there has been an ongoing debate over whether platforms like Facebook and Twitter encourage civic action and protest, or divert our attention and enthusiasm into digital dead ends. Giving data sets a visceral dimension could help upend this dichotomy: thoughtful use of these design strategies could prolong and strengthen our engagement with social causes at the level of everyday practice. Two colleagues at NYU, Michael Karlesky and Xiaochang Li, are currently creating SensD, a neckband that translates GPS and environmental data into physical sensations—lights, sounds, or vibrations. While intended for sailors and cyclists, such a device could also translate flows of personal information into stimuli close to the skin, making it easier to keep tabs on particular data flows. Online privacy is easier to appreciate when you can feel its importance on your body every day.

Visceralizing data isn’t a panacea for the future of interaction design, especially if it’s executed crudely or clumsily. But making data visceral through multiple sensory channels is a design principle worth exploring as the Internet of Things takes shape. Exploring all the ways humans experience and perceive the world isn’t just a good business opportunity: it’s also a chance to make digital technologies more surprising, engaging, and stimulating for all sorts of human endeavors. In the age of the Internet of Things, “seeing is believing” just won’t cut it.

