
If tech addiction is screwing up our kids, what should tech giants be doing?

Teens on their phones (Klaus Vedfelt/Digital Trends)
A strange thing happened when New York Times tech writer Nick Bilton interviewed Apple CEO Steve Jobs in 2010. At the end of the conversation, Bilton asked Jobs what his kids thought of Apple’s new tablet, news of which was dominating websites, newspapers, and magazines. Jobs’ answer surprised him: it turned out Steve’s kids hadn’t tried the iPad yet. “We limit how much technology our kids use at home,” Jobs said.

Bilton, stunned, reached out to Walter Isaacson, Jobs’ hand-picked official biographer, to find out whether he believed this to be true. Isaacson said that it was. “Every evening Steve made a point of having dinner at the big long table in their kitchen, discussing books and history and a variety of things,” he said. “No one ever pulled out an iPad or computer. The kids did not seem addicted at all to devices.”

Teenagers who spend upwards of five hours a day on electronic devices are more likely to have a risk factor for suicide.

It would be easy to write off Jobs’ behaviour as unique to him among tech executives. After all, wasn’t Apple’s iconic co-founder famous for “thinking different”? But he’s not alone. In 2007, the year the modern smartphone emerged as its own distinct entity, former Microsoft CEO Bill Gates put a screen time cap on his 10-year-old daughter when he feared she was getting addicted to a particular video game. He also barred his kids from getting cell phones until they turned 14: at least four years later than the average age at which children get their first one.

As people working at the leading edge of technology, both Steve Jobs and Bill Gates would more than qualify as the kind of tastemakers marketing expert Geoffrey Moore calls “early adopters.” Ten years later, however, it seems that a large number of other people are starting to catch up with their concerns about what technology is doing to us — and particularly to our kids.

In her book iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy, and Completely Unprepared for Adulthood, and What That Means for the Rest of Us, psychologist Jean Twenge lays out some of her concerns about the impact that tech addiction, particularly to smartphones, is having on the so-called iGeneration. For those keeping track at home, that refers to the post-millennial generation (also sometimes called Generation Z), born between the mid-1990s and the mid-2000s.

Exposure-response curve of hours of electronic device use and % with at least one suicide-related outcome, bivariate and with demographic controls, YRBSS survey of 9th to 12th graders in the U.S. (from Clinical Psychological Science article out today). pic.twitter.com/zjekXk6oyO

— Jean Twenge (author of GENERATIONS, iGEN) (@jean_twenge) November 14, 2017

“There are three primary concerns,” Twenge told Digital Trends, summarizing her arguments. “First, digital media use seems to be decreasing the time we spend socializing with people face-to-face. Second, screen time interferes with sleep. Third, there are the direct effects of digital media, such as the social comparison of social media where we all think other people’s lives are more glamorous than ours. All of these are linked to less happiness and more depression.”

The book is filled with statistics backing up these claims — such as the finding that teenagers who spend upwards of five hours a day on electronic devices are 71 percent more likely to have a risk factor for suicide than those who spend under one hour a day. While correlation is not necessarily causation, iGen nonetheless paints an unsettling picture of a generation whose ever-connected world, and the lack of real-world socialization that comes with it, is having a significant negative effect.

What role do tech companies play?

The question, therefore, is what should be done about it. Exactly what responsibility do tech giants have to us, and to society as a whole? Ironically, tech firms are far more likely to cast their work in these terms than just about any other industry. We don’t hear Wal-Mart or ExxonMobil talking about what they do in utopian terms, but Google has no issue putting its work in moral terms (“don’t be evil”), while Apple CEO Tim Cook happily waxes lyrical about making Apple “a force for good” in the world.

Recently, we got a glimpse of the kind of shareholder pressure that may force tech companies’ hands. Two investor groups holding a combined $2 billion in Apple shares sent the company an open letter voicing their concerns about this subject. Activist shareholders are not, as a rule, vocal about social change — which makes this something of a momentous occasion. They want Apple to do two things: develop software that lets parents limit their kids’ phone use, and carry out a study investigating the impact of smartphone overuse on mental health. Apple quickly responded to say that at least the first of these two goals is in the works.

“It’s a social-validation feedback loop […] you’re exploiting a vulnerability in human psychology.”

Right now, it is still early days for this topic. Books like Twenge’s (and some notable others) have begun to join the dots, but there are still accusations that examples are being cherry-picked to suit an agenda. Nevertheless, people are speaking out. Recently, Sean Parker, the first president of Facebook, told the news website Axios that “God only knows what it’s doing to our children’s brains.”

Expanding on the subject of social media addiction, Parker said: “It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.”

Should findings like Twenge’s be borne out in subsequent research — most notably if a causal link can be established — tech giants could find themselves occupying a similar space to fast food giants or tobacco companies. True, both fast food and tobacco remain powerful industries, but they have also been subject to far more scrutiny. Tobacco advertising, for example, is now among the most heavily regulated forms of marketing. In the European Union, all tobacco advertising and sponsorship on television has been banned since 1991, and only Germany and Bulgaria still allow it on billboards. In the U.S., billboard and public transportation advertising of cigarettes is banned in 46 states, and there are stringent laws prohibiting advertising aimed at young people.

First president of Facebook, Sean Parker (Theo Wargo/Getty Images)

The fast food industry isn’t so stringently governed, but it is easy to see how many of the concerns — particularly the promotion of sedentary lifestyles among customers — could be extrapolated to the tech world. The responses of the two industries are certainly similar. Pepsi and McDonald’s have both attempted to counteract accusations by sending representatives to schools to promote the benefits of regular exercise. Coca-Cola, meanwhile, launched a fitness campaign depicting two people sitting together, cuddled up, on a beach. “Are you sitting on a solution?” the ad read. An article published on Alternet scoffed: “The thing is, they’re drinking the problem: Coca-Cola.”

Possible solutions to the problem

If we sympathize with the fundamental disconnect of a fast food or sugary beverage company also telling us to live a healthy life, should we apply that same skepticism to tech giants? What is the difference between the actions of Coca-Cola and, say, the Apple Watch’s regular reminders to stand up or go outside? Part of the reason we are sitting around looking at screens, instead of going out, is because of companies like Apple, which helped popularize first the personal computer and then, perhaps more fundamentally, the smartphone.

“Ideally, Apple could integrate the age of the user into the set-up process for the phone”

Twenge said that she is not pinning the blame on tech giants, whether those are the companies which make the phones or the ones that run the social media platforms used on many of them. The right answer, she suggests, is a combination of parenting and, perhaps, a bit more social awareness on the part of today’s tech leaders. “To be clear, it’s not that companies are responsible for this,” she said.  “It’s that companies should give parents better tools for limiting their kids’ screen time.”

“Ideally, Apple could integrate the age of the user into the set-up process for the phone,” she continued, giving an example of one possible solution. “If you say the phone is for a 12-year-old, for example, it could give you the option to restrict the apps used, shut down the phone at night, limit the number of hours it could be used, and/or allow communication only with a short list of phone numbers. Parents might be more willing to buy their children smartphones if they were easier to regulate.”
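To make that idea concrete, here is a minimal sketch of what such an age-aware setup step might look like. The ParentalControlsPolicy and Downtime types, the age thresholds, and the default values are all hypothetical illustrations of Twenge’s suggestion, not an actual Apple API.

```swift
// Hypothetical sketch only: none of these types correspond to a real Apple API.
// They illustrate the kind of age-based policy Twenge describes.

/// Hours of the day (24-hour clock) during which the phone is shut down.
struct Downtime {
    let startHour: Int
    let endHour: Int
}

/// Restrictions a parent could choose during device setup.
struct ParentalControlsPolicy {
    var blockedAppCategories: [String]   // e.g. social networking, games
    var nightlyShutdown: Downtime?       // nil means no overnight shutdown
    var dailyLimitHours: Double?         // nil means no daily cap
    var allowedContacts: [String]?       // nil means any contact is allowed
}

/// Suggest a default policy from the age entered during setup.
/// Age thresholds and defaults are invented for illustration.
func defaultPolicy(forAge age: Int) -> ParentalControlsPolicy {
    switch age {
    case ..<10:
        return ParentalControlsPolicy(
            blockedAppCategories: ["Social Networking", "Games"],
            nightlyShutdown: Downtime(startHour: 20, endHour: 7),
            dailyLimitHours: 1,
            allowedContacts: [])         // parents add a short whitelist
    case 10...13:
        return ParentalControlsPolicy(
            blockedAppCategories: ["Social Networking"],
            nightlyShutdown: Downtime(startHour: 21, endHour: 6),
            dailyLimitHours: 2,
            allowedContacts: nil)
    default:
        // 14 and over: unrestricted by default; parents can still opt in.
        return ParentalControlsPolicy(
            blockedAppCategories: [],
            nightlyShutdown: nil,
            dailyLimitHours: nil,
            allowedContacts: nil)
    }
}

let policy = defaultPolicy(forAge: 12)
print(policy.blockedAppCategories)       // ["Social Networking"]
```

Real parental controls would, of course, need secure enforcement on the device itself; the point here is simply that an age collected at setup could drive sensible defaults that parents then adjust.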

It will be fascinating to see what happens next. Will tech companies offer tokenistic gestures to placate concerned parents, or will this represent the beginning of a bigger change? If tech figures like Mark Zuckerberg are eyeing a possible career in politics, we’d hope for the latter.

As Twenge’s iGen book points out, one of the most notable characteristics of today’s young people — in addition to their love of technology — is their emphasis on the importance of safety and mental health. When these two areas clash, which is going to win out? The question of how much responsibility tech companies actually have when it comes to shaping the world is a battle that is still being fought. From whether Facebook has any responsibility for the news it helps disseminate to whether iPhones play a role in depression among young people, these are complex issues to be unpacked.

We’d certainly like to see tech giants live up to their world-changing ideals by addressing them head-on, though.

Luke Dormehl
Former Digital Trends Contributor