It’s Time For a Serious Talk About the Science of Tech “Addiction”

To hear Andrew Przybylski tell it, the 2016 US presidential election is what really inflamed the public’s anxiety over the seductive power of screens. (A suspicion that big companies with opaque inner workings are influencing your thoughts and actions will do that.) “Psychologists and sociologists have obviously been studying and debating about screens and their effects for years,” says Przybylski, who is himself a psychologist at the Oxford Internet Institute with more than a decade’s experience studying the impact of technology. But society’s present conversation—”chatter,” he calls it—can be traced back to three events, beginning with the political race between Hillary Clinton and Donald Trump.

Then there were the books. Well-publicized. Scary-sounding. Several, really, but two in particular. The first, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, by NYU psychologist Adam Alter, was released March 2, 2017. The second, iGen: Why Today’s Super-Connected Kids are Growing Up Less Rebellious, More Tolerant, Less Happy – and Completely Unprepared for Adulthood – and What That Means for the Rest of Us, by San Diego State University psychologist Jean Twenge, hit stores five months later.

Last came the turncoats. Former employees and executives from companies like Facebook worried openly to the media about the monsters they had helped create. Tristan Harris, a former product manager at Google and founder of the nonprofit “Time Well Spent,” spoke with this publication’s editor in chief about how Apple, Google, Facebook, Snapchat, Twitter, Instagram—you know, everyone—design products to steal our time and attention.

Bring these factors together, and Przybylski says you have all the ingredients necessary for alarmism and moral panic. What you’re missing, he says, is the only thing that matters: direct evidence.

Which even Alter, the author of that first bellwether book, concedes. “There’s far too little evidence for many of the assertions people make,” he says. “I’ve become a lot more careful with what I say, because I felt the evidence was stronger when I first started speaking about it.”

“People are letting themselves get played,” says Przybylski. “It’s a bandwagon.” So I ask him: When WIRED says that technology is hijacking your brain, and the New York Times says it’s time for Apple to design a less addictive iPhone, are we part of the problem? Are we all getting duped?

“Yeah, you are,” he says. “You absolutely are.”

Of course, we’ve been here before. Anxieties over technology’s impact on society are as old as society itself; video games, television, radio, the telegraph, even the written word—they were all, at one time, scapegoats or harbingers of humanity’s cognitive, creative, emotional, and cultural dissolution. But the apprehension over smartphones, apps, and seductive algorithms is different. So different, in fact, that our treatment of past technologies fails to be instructive.

A better analogy is our modern love-hate relationship with food. When grappling with the promises and pitfalls of our digital devices, it helps to understand the similarities between our technological diets and our literal ones.

Today’s technology is always with you: increasingly, a necessary condition of existence itself. These are some of the considerations that led MIT sociologist Sherry Turkle to suggest avoiding the metaphor of addiction when discussing technology. “To combat addiction, you have to discard the addicting substance,” Turkle wrote in her 2011 book Alone Together: Why We Expect More from Technology and Less from Each Other. “But we are not going to ‘get rid’ of the Internet. We will not go ‘cold turkey’ or forbid cell phones to our children. We are not going to stop the music or go back to the television as the family hearth.”

Food addicts—who speak of having to take the “tiger of addiction” out of the cage for a walk three times a day—might take issue with Turkle’s characterization of dependence. But her observation, and the food addict’s plight, speak volumes about our complicated relationships with our devices and the current state of research.

People from all backgrounds use technology—and no two people use it exactly the same way. “What that means in practice is that it’s really hard to do purely observational research into the effects of something like screen time, or social media use,” says MIT social scientist Dean Eckles, who studies how interactive technologies impact society’s thoughts and behaviors. You can’t just divide participants into, say, those with phones and those without. Instead, researchers have to compare behaviors between participants while accounting for variables like income, race, and parental education.
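To make that adjustment concrete, here is a minimal sketch of the kind of covariate-controlled regression such observational studies lean on. It is illustrative only: the file name and every column name (screen_time, depression, income, race, parental_education) are invented for this example, not drawn from any real study.

```python
# Minimal sketch: regression adjustment in observational screen-time research.
# All data and column names below are hypothetical stand-ins for survey data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical: one row per adolescent

# Naive model: depressive symptoms regressed on screen time alone.
naive = smf.ols("depression ~ screen_time", data=df).fit()

# Adjusted model: the same association after accounting for income, race,
# and parental education, the confounders Eckles mentions.
adjusted = smf.ols(
    "depression ~ screen_time + income + C(race) + parental_education",
    data=df,
).fit()

print("naive coefficient:   ", naive.params["screen_time"])
print("adjusted coefficient:", adjusted.params["screen_time"])
```

Even a carefully adjusted model can only account for the confounders someone thought to measure, which is exactly why purely observational work is so hard to interpret.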

Say, for example, you’re trying to understand the impact of social media on adolescents, as Jean Twenge, author of the iGen book, has. When Twenge and her colleagues analyzed data from two nationally representative surveys of hundreds of thousands of kids, they calculated that social media exposure could explain 0.36 percent of the variance in depressive symptoms among girls.

But those results didn’t hold for the boys in the dataset. What’s more, that 0.36 percent means that 99.64 percent of the variation in the girls’ depressive symptoms had nothing to do with social media use. Przybylski puts it another way: “I have the data set they used open in front of me, and I submit to you that, based on that same data set, eating potatoes has the exact same negative effect on depression. That the negative impact of listening to music is 31 times larger than the effect of social media.”

In datasets as large as these, it’s easy for weak correlational signals to emerge from the noise. And a correlation tells us nothing about whether new-media screen time actually causes sadness or depression. These are the same problems scientists confront in nutrition research, much of which is based on similarly large, observational work. If a population develops diabetes but surveys show they’re eating sugar, drinking alcohol, sipping out of BPA-laden straws, and consuming calories to excess, which dietary variable is to blame? It could just as easily be none, or all, of the above.
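As a rough illustration (a simulation, not the survey data itself): 0.36 percent of variance explained corresponds to a correlation of roughly 0.06, and at survey scale even a signal that faint sails past conventional significance thresholds.

```python
# Sketch: a correlation explaining 0.36% of variance (r = sqrt(0.0036) ≈ 0.06)
# still comes out wildly "statistically significant" in a large sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, r = 100_000, 0.06  # survey-scale sample size and a tiny true correlation

x = rng.standard_normal(n)                               # e.g., social media use
y = r * x + np.sqrt(1 - r**2) * rng.standard_normal(n)   # e.g., symptom score

r_hat, p_value = stats.pearsonr(x, y)
print(f"r = {r_hat:.3f}, variance explained = {r_hat**2:.2%}, p = {p_value:.1e}")
```

A p-value that small says only that the correlation is unlikely to be exactly zero; it says nothing about whether the effect matters, or which way the causal arrow points.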

Decades ago, those kinds of correlational nutrition findings led people to demonize fat, pinning it as the root cause of obesity and chronic illness in the US. Tens of millions of Americans abolished it from their diets. It’s taken a generation for the research to boomerang back and rectify the whole baby-bathwater mistake. We risk similar consequences, as this new era of digital nutrition research gets underway.

Fortunately, lessons learned from the rehabilitation of nutrition research can point a way forward. In 2012, science journalist Gary Taubes and physician-researcher Peter Attia launched a multimillion-dollar undertaking to reinvent the field. They wanted to lay a new epistemological foundation for nutrition research, investing the time and money to conduct trials that could rigorously establish the root causes of obesity and its related diseases. They called their project the Nutrition Science Initiative.

Today, research on the links between technology and wellbeing, attention, and addiction finds itself in need of similar initiatives. The field needs randomized controlled trials to establish causal links between the design of our interfaces and their effects, and funding for long-term, rigorously performed research. “What causes what? Is it that screen time leads to unhappiness or unhappiness leads to screen time?” says Twenge. “So that’s where longitudinal studies come in.” Strategies from the nascent Open Science Framework—like the pre-registration of studies and data sharing—could help, too.
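Twenge’s question maps onto a standard longitudinal technique, the cross-lagged panel analysis, which checks whether each variable at wave one predicts the other at wave two. A minimal sketch, again with invented file and column names rather than any real dataset:

```python
# Sketch of a cross-lagged panel analysis on hypothetical two-wave survey data.
# File and column names (screen_t1, happy_t1, ...) are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("two_wave_panel.csv")  # hypothetical longitudinal survey

# Does screen time predict later unhappiness, beyond earlier unhappiness?
forward = smf.ols("happy_t2 ~ screen_t1 + happy_t1", data=df).fit()

# Or does unhappiness predict later screen time, beyond earlier screen time?
reverse = smf.ols("screen_t2 ~ happy_t1 + screen_t1", data=df).fit()

print("screen time -> later happiness:", forward.params["screen_t1"])
print("happiness -> later screen time:", reverse.params["happy_t1"])
```

Comparing the two cross-paths hints at the direction of the relationship, though even this design cannot fully rule out some third factor driving both.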

But more than any of that, researchers will need buy-in from the companies that control that data. Ours is a time of intense informational asymmetry; the people best equipped to study what’s happening—the people who very likely are studying what’s happening—are behind closed doors. Achieving balance will require openness and objectivity from those who hold the data; clear-headed analysis from those who study it; and measured consideration by the rest of us.

“Don’t get me wrong, I’m concerned about the effects of technology. That’s why I spend so much of my time trying to do the science well,” Przybylski says. He’s working on a proposal process through which scientists could apply to conduct specific, carefully designed studies using proprietary data from major platforms. Proposals would be assessed by independent reviewers outside the control of Facebook and its peers. If a study showed the potential to answer an important question about a discipline or a platform, the researchers outside the company would be paired with the ones inside.

“If it’s team based, collaborative, and transparent, it’s got half a chance in hell of working,” Przybylski says.

And if we can avoid the same mistakes that led us to banish fat from our food, we stand a decent chance of keeping our technological diets balanced and healthy.

Your Technology and You

  • Wired’s editor-in-chief Nick Thompson spoke with Tristan Harris, the prophet behind the “Time Well Spent” movement that argues our minds are being hijacked by the technology we use.

  • One writer takes us through his extreme digital detox, a retreat that took him offline for a whole month.

  • Technology is demonized for making us distractible, but the right tech can help us form new, better digital habits—like these.
