New Tech Promises to Predict Your Moods. That Might Not be a Good Thing

If your phone could warn you of impending stormy internal weather, you could theoretically do the emotional equivalent of grabbing an umbrella on a cloudy day to ensure you don’t get doused later.

That’s the basic idea behind a number of new technologies, many still in development, that attempt to predict emotions from certain biomarkers. Psychologists and technologists are working together to build emotional databases that teach machines to read human feelings, compiling data on the biological signals that precede mood shifts so that software can forecast them. Your wristband or phone would serve as a sensor, helping you ward off depression, supporters of the new technology say.
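
The details differ from project to project, but the basic recipe these teams describe is the same: turn raw phone and wristband streams into daily features, pair them with self-reported moods, and train a model to forecast tomorrow from today. Here is a minimal sketch of that recipe, using entirely hypothetical features and synthetic data rather than anything from an actual study:

```python
# Illustrative sketch only: hypothetical features derived from phone/wristband
# signals (sleep, movement, skin conductance, typing cadence) are paired with
# next-day self-reported moods to train a simple forecasting classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each row is one person-day: [hours_slept, steps, skin_conductance, typing_change]
X = rng.normal(size=(500, 4))
# Label: 1 if the participant reported a low mood the *next* day, else 0
# (a toy rule standing in for real self-reports)
y = (X[:, 0] < -0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))

# The "forecast": probability that tomorrow brings a low mood, given today's signals
today = np.array([[-1.2, 0.3, 0.8, -0.4]])
print(f"Chance of a low mood tomorrow: {model.predict_proba(today)[0, 1]:.0%}")
```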

For some of these researchers, it’s a personal mission, informed by their own struggles. “I suffered from depression early in my career and I do not want to go back there,” Rosalind Picard, an electrical engineer and computer scientist at MIT, tells Nature. “I am certain that by tracking my behaviors with my phone I can make it far less likely I will return to that terrible place.”

Picard works with Matthew Nock, a psychology professor at Harvard University who was inspired by a friend’s suicide and who leads a team tracking the moods of undergraduates at New England universities using phones and wristbands. Nock and Picard say their research has shown that digital mood forecasting can predict episodes of sadness a day before students feel the symptoms themselves.

An app for that

So far, the research on these technologies is scant, as Scientific American pointed out in October. For example, a California company called Mindstrong has raised $29 million in venture capital but hasn’t produced any publicly verified results. “Health by Mindstrong” is available in Apple’s App Store and Google’s Play store, but only participants in the company’s studies can get an access code for it. Mindstrong says it will soon publish studies in peer-reviewed journals demonstrating the promise of its technology.

Beyond the technological and scientific hurdles that must be overcome before these tools can be used reliably, there’s also a philosophical problem. Should we, as a society, be building machines that teach us to read feelings when reliance on technology already seems to be turning us into addicts and making us more anxious? We are becoming more aware of the links between heavy technology use and anxiety and depression, yet by force of habit we keep turning to new tools to contend with the very effects technology is amplifying.

Relying on machines to tell us how we feel also leaves us vulnerable. For one, it means we must always be connected to devices for information about ourselves, and we are already arguably overly reliant on these tools. A tool that predicts moods and alerts you to a possible crash could be useful in some instances, but it doesn’t foster emotional independence; it reinforces the sense that we are lost without a connection to an external source of information.

Even if the technology were perfect, you’d still have to stay connected to an electronic device most or all of the time, and through it probably to the web as well, for it to collect enough data to be meaningful. “The accuracy and validity of a device is only the very first step. If it wasn’t, we would all buy bathroom scales and immediately lose weight,” Maribeth Gandy Coleman, director of the Interactive Media Technology Center at Georgia Tech, tells the American Psychological Association.

She argues that a stress detector is of no use if a person stops wearing it, or doesn’t change their behavior when alerted to a dangerous mood. In other words, the information alone won’t teach people to deal with their moods and adjust their actions to avoid bad feelings. Just as wearing a Fitbit doesn’t actually make you more active, mood-predicting tech won’t necessarily make you healthier.

Know thyself

On the other hand, learning to understand and manage our own emotions, to become self-aware, is a skill we can take anywhere and use forever. It requires no wifi or web access and stands a good chance of actually changing behavior. Self-knowledge frees us from reliance on external devices, and self-awareness fosters independence rather than helplessness. When we learn what makes us feel better or worse, we’re more likely to respond to internal alarms than to yet another notification from a device that’s also telling us about a friend’s birthday and new likes on social media.

There are ancient technologies that already accomplish what the new apps purport to do. Meditation, for example, is a method for studying the self, examining emotions as they arise, and learning to recognize mood signals early. Meditators develop their own personal datasets, which they use to manage their moods.

There’s science to back that natural approach. Lisa Feldman Barrett, a neuroscientist and psychology professor at Northeastern University, received a National Institutes of Health Director’s Pioneer Award for her work on emotions and the brain. In a TED Talk last December she explained her conclusions after 25 years of scientific research: “[Emotions] are guesses that your brain constructs in the moment where billions of brain cells are working together, and you have more control over those guesses than you might imagine that you do.” In other words, what’s happening when we feel is that we’re unconsciously predicting what might be, based on past experience.

That’s why, she believes, tech companies are on the wrong track when they try to build digital tools to measure feelings. “[They] are spending millions of research dollars to build emotion-detection systems, and they are fundamentally asking the wrong question,” says Feldman Barrett.

Physical signals have no intrinsic emotional meaning, in her view, so measuring them yields no reliable information. Our brains construct the meanings that dictate our moods from past experience and guesswork, she argues, and the better we get to know ourselves, the easier it is to adjust our own moods and interpret contexts as positive.

We aren’t prisoners to our feelings, in other words. Instead, we can be our own mood monitors, transforming sensations of “wretchedness,” say, into the ingredients for a good day. She explains:

Now I am not suggesting to you that you can just perform a couple of Jedi mind tricks and talk yourself out of being depressed or anxious or any kind of serious condition. But…you have the capacity to turn down the dial on emotional suffering and its consequences for your life by learning how to construct your experiences differently. And all of us can do this and with a little practice, we can get really good at it, like driving. At first, it takes a lot of effort, but eventually it becomes pretty automatic.

Emotional focus

Current mood-predicting technology forces us to focus on negative emotions and could even compound them. If your app tells you that you are about to feel bad based on the way you’re banging on your phone, that may not be such a great thing.

Psychologist Victor Johnston, author of Why We Feel: The Science of Human Emotions, argues that emotions are like magnets, drawing attention to particular issues. Giving too much attention to negative signals could lead to a preoccupation with bad moods before they even arise, and send us into a negativity spiral.

Even if the mood apps signal accurately and help us develop a kind of mindfulness, rather than making us more stressed and depressed, there’s still the problem of what, if anything, to do with the information they provide.

Harvard biostatistics professor Jukka-Pekka Onnela has developed a smartphone-based research platform that collects information about behavioral patterns, such as sleep, social interactions, physical activity, and speech, to establish baselines for individuals and then predict dangers based on changes in signals. Onnela and John Torous of the digital psychiatry program at Beth Israel Deaconess Medical Center are trying to determine whether sensor data can predict relapses in people with schizophrenia.

“We notice that before people become ill, they have changes in their own daily patterns. We’re not finding a universal signal, but each person might have his or her own personal relapse signature,” Torous tells the American Psychological Association. But even if the platform can predict a relapse, there’s no guarantee this will help. Much will still depend on individual patients and their own motivation. “In a lot of mental illnesses, where lack of motivation is a symptom, people may be even less engaged with the technology,” he says.
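
It’s easy to sketch what such a personal baseline might look like in code. The toy example below, which assumes nothing about the actual platform’s methods, watches a single behavioral signal and flags days that drift well outside a person’s own recent norm:

```python
# Rough illustration of the "personal relapse signature" idea: establish each
# person's own rolling baseline for a behavioral signal, then flag days that
# deviate sharply from it. The signal, window, and threshold are assumptions.
import numpy as np

def flag_deviations(daily_signal, window=14, z_threshold=2.5):
    """Return indices of days that fall far outside the person's own
    rolling baseline (mean +/- z_threshold standard deviations)."""
    signal = np.asarray(daily_signal, dtype=float)
    flagged = []
    for day in range(window, len(signal)):
        baseline = signal[day - window:day]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(signal[day] - mu) > z_threshold * sigma:
            flagged.append(day)
    return flagged

# Toy example: hours of sleep per night, with a disruption near the end
sleep = [7.5, 7.0, 7.8, 7.2, 7.4, 7.6, 7.1, 7.3, 7.5, 7.2,
         7.4, 7.6, 7.0, 7.3, 7.5, 4.0, 3.5, 7.2]
print(flag_deviations(sleep))  # days where sleep broke from this person's norm
```

In practice such platforms combine many signals, but the principle is the one Torous describes: the alarm is calibrated to the individual, not to a population average.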

The technology won’t replace the need for training and motivation, for experts who help individuals understand themselves better, or for personal efforts to change behavior when it seems that stormy weather is coming. And how should we respond to news of an impending bad mood when the “right response” varies from person to person?

It’s possible new mood-predicting technology could be combined with traditional mindfulness methods to create a holistic approach to emotional intelligence. Perhaps a person who gets messages about their moods from the apps will become more conscious of the nuances in their feelings and start to develop a sense of how to contend with their fluctuating emotions.

The data can be helpful to clinicians, psychiatrists, and public health workers because it gives them a sense of where risks arise, according to Munmun De Choudhury, an assistant professor of interactive computing at Georgia Tech. But, she tells PBS’s Nova, “currently the landscape is really, for lack of a better word, ‘primitive,’ in how algorithmic inferences can be incorporated into interventions.” And ultimately, intervention is the goal, not just knowing that people’s moods fluctuate, sometimes dangerously.

Data-privacy invasions

Collecting mood data also raises ethical and legal questions. Will health insurers of the future judge us on our mood data? Will they want to review records of our gloom? Even proponents of mood-prediction tech note there could be privacy concerns. Picard at MIT believes regulators need to consider these issues now to ensure that corporations’ access to this new kind of data is limited in the future, so that they can’t use it to target consumers with advertising and insurers can’t demand to review it before setting prices. Your insurer can’t look at your mood journals now to determine what kind of mental health risks you present, but once lots of data is aggregated in convenient, relatively easy-to-read ways, that may change.

Yaniv Altshuler of the MIT Media Lab, a researcher who helped pioneer “reality mining,” the use of smartphones to study human behavior, warned MIT Technology Review in 2014 that mobile data troves would have downsides: using mobile devices to collect sensitive information about people raises new privacy risks.

We know from other contexts that data which seem innocuous can end up being used in ways that violate legal protections. Take cell-phone location data, which providers collect from users’ devices and which reveals nothing substantive about users’ conversations. For years, law enforcement authorities could request this data from corporations because it wasn’t considered private information and obtaining it wasn’t deemed a constitutional violation. Then in June, the U.S. Supreme Court ruled that law enforcement needs a warrant to see these records, because the cumulative effect of the collection, the aggregation of all those data points, reveals private information about where people have gone that would not otherwise be available.

We may still be a long way from facing mood-data privacy invasions in our own lives, but the possibilities and risks of the predictive technology are worth considering now. The power to predict our emotions and work with them already exists, free to anyone willing to get to know themselves. And by the time sophisticated, effective mood-predicting wristbands become widely available, you could already be your own best emotional meteorologist.