Advertisers can no longer turn blind eye to social media harms, prof says
When American sociologist and economist Herbert A. Simon first introduced the idea of the attention economy in 1971, the model was based on the “information-rich” mediums of the day — television, radio and print.
Boiled down, the theory posits that in a world with more and more information for humans to digest, the scarcity of their attention becomes a valuable commodity.
Over 50 years later, while older media survive and claw for scraps of advertising revenue, today’s attention economy exists predominantly in the online ecosystem and nowhere more so than on social media sites.
Like TV and radio before them, as largely free-to-use services, behemoths like Meta, Snapchat, TikTok, and others don’t generate their billions in ever-growing annual revenue from users — it comes from advertisers. And with children and teenagers among the largest cohorts of users, all manner of companies have shown little compunction about using the websites and apps to market their products and services specifically to one of society’s most vulnerable populations.
In the matrix of harms posed to Canadian youth through regular or excessive social media use, targeted advertising may not be as worrisome as cyberbullying, predators, pornography or the host of other dangers, but it’s no less pervasive.
The risk of harm doesn’t come from the targeted advertising-based business model alone, according to Vikram Bhargava, but from the “conjunction” of those ads simultaneously being “hyper-tailored in real time toward the individual.”
“When you take the two of them … it renders it such that the user is just a sort of entity to be manipulated with respect to the advertiser’s interests,” the assistant professor of strategic management, public policy and philosophy (by courtesy) at George Washington University in D.C. told National Post in an interview.
“And this is, in some ways, what generates this wide range of problems that have come about in part due to these platforms.”
But while tech CEOs and leaders are semi-regularly called to testify before the U.S. Congress about troublesome aspects of their products, often at the expense of their companies’ reputations and their own, Bhargava said advertisers seem to escape that scrutiny, even though they are the primary social media customers.
“Many people would think that if you were sort of consuming from a different kind of business that was engaged in other untoward practices, you’re, at least in some way, expressively endorsing, supporting or contributing to their ability to undertake these untoward practices,” he said.
“In the context of addiction, you might think of it as akin to a beer company advertising at an Alcoholics Anonymous convention.”
‘Using you against yourself’
The addictive nature of social media is a problem that’s long been on Bhargava’s radar.
Unlike with traditional advertising, every time users log on and start scrolling — offering up scores of data about their likes and dislikes — they willingly but unknowingly train adaptive algorithms to serve up more of the former, keeping them dialled in and exposed to more targeted ads.
“That’s what adds this insult to injury, this using-you-against-yourself dimension,” Bhargava said.
Their addictiveness was established in a California courtroom last week, where a jury ruled in favour of the complainant in a civil case who said she became addicted to Google’s YouTube and Meta’s Instagram due to their design, eventually leading to depression and suicidal ideation that required medical intervention. The 20-year-old was awarded $6 million in damages; she had previously settled out of court with TikTok and Snapchat, which she had also sued.
A day earlier, a case brought forward by New Mexico ended with Meta ordered to pay $375 million after a jury determined it knowingly harmed children’s mental health and concealed knowledge of child sexual exploitation on its platforms.
Google is appealing, and Meta plans to appeal both verdicts.
Bhargava went on to say that social media exerts a “unique kind of exploitiveness” over everyone, not just children and teens, because its use is ingrained in school settings, community engagement, how people stay well-informed citizens, and even how people look for romance online.
“Even if you’re not yet addicted, it’s so entrenched in different parts of life that there are just sort of innumerable opportunities to take advantage of this fact.”
As part of its campaign to have social media use restricted to Canadian kids 16 and older, Unplugged Canada is advocating for legislation that strengthens youth privacy rights by banning the sharing and sale of children’s data and which limits “profiling and ad targeting.”
Robin Sherk, a mother of four and member of the parent-led movement’s national advocacy team, said targeted advertising to young users can be problematic, particularly as it relates to the most vulnerable among them.
“What’s different between social media ads and traditional ads you saw on Saturday morning cartoons is that these are tuned to the person,” Sherk told National Post. “And it can be tuned objectively to those insecurities as well.”
She used the example of a young girl self-conscious about her appearance being served ads for makeup, plastic surgery or dieting.
“When they’re younger, and they’re still developing their sense of self, sense of who they are in this world, there’s some lines that shouldn’t be crossed in terms of influencing or making them feel that they need a product to feel a certain way or be a certain way,” Sherk said.
If there’s hope for the future, it may be that young people are “far more self-reflective and self-aware” about how much time they spend in front of a screen, sociologist Kara Brisson-Boivin said, drawing on her experience as director of research for MediaSmarts, Canada’s Centre for Digital Media Literacy — a charitable organization that develops digital and media literacy resources for homes, schools and communities.
And when educated on how social media companies use their data, they bristle with the same indignation as older generations would upon learning of untoward tactics by the government, media or big business.
As part of MediaSmarts’ “Young Canadians in a Wireless World” research study on youth attitudes, behaviours and opinions regarding the internet, technology and digital media, they’ve used a program to educate kids about algorithms and how they work.
In the first phase, participants would often describe algorithms as a friend serving up a steady dose of the content they like most. In phase two, it’s explained how social media companies profit from content, including advertisements. But by phase three, as they learned how their age and gender data is used in concert with “inferred data” based on their content preferences, Brisson-Boivin said their “tone had changed significantly.”
“Once they started better understanding how algorithms were learning from their behaviours, from their data, so often without their permission or awareness, it drastically shifted from the algorithm as my friend, to ‘I have deep, deep concerns for the kinds of biases that algorithms are perpetuating.’ They were describing the kinds of privacy violations as theft, as criminal, as wrong.”
Consumerism and consent
But MediaSmarts, a not-for-profit always fighting for funding to continue its advocacy and research, is only able to enlighten so many kids about this potential harm and others.
The reality is that most youth, even those whose parents are wise to the industry’s tactics, are so inundated that they have to figure it out for themselves, and Brisson-Boivin said they’re not as equipped to do so.
“The primary shift is that consumerism is now embedded in the experience,” she said, highlighting unboxing videos, content that mirrors advertising and product placement as egregious offenders.
“The mechanism for delivery has made it so that it’s really difficult for kids to recognize when they’re being advertised to.”
She said there’s still a lot of work to do in that regard, but isn’t sure it will be seen as the responsibility of the platforms or the advertisers.
For his part, Bhargava said advertisers should consider that children and teens, whose brains and identities are still developing, are not the same consumers as adults.
“The whole market economic system and its justification depends on the assumption of voluntary interactions between consenting parties,” he said, “but it’s not exactly clear whether children or early teens have the full capacity to consent in these kinds of ways.”
The Canadian Marketing Association’s code of ethics and standards explicitly states that “marketers must not knowingly exploit the credulity, lack of knowledge or inexperience of any consumer,” particularly those from vulnerable groups, including children and teens.
Specific to that cohort, it also underscores the responsibility to use age-appropriate marketing techniques and to foster trust by not exploiting “their naivety, lack of experience, sense of loyalty, their impressionability, or susceptibility to peer or social pressures.”
But the sheer volume of advertising on social media and the lack of strict age restrictions muddies those waters significantly.
Sherk said finding the balance for advertisers and protecting youth will require “collective action.”
“It’s the question of what age is it okay for children to be on these platforms, and then afterwards, at what age is it okay to start targeting them and exposing them to what I’ll call more sophisticated or more compulsive use features.”
The age restriction conundrum
The potential harm of targeted advertising to youth on social media is just one of several that Sherk and Unplugged Canada say demonstrate the need for an age restriction, a move made by Australia late last year, with France, Denmark and other nations expected to follow soon.
Here in Canada, Saskatchewan Premier Scott Moe said he plans to engage with parents on the topic but added it would be best if the federal government leads the charge. In Ottawa, Prime Minister Mark Carney has said the topic will be debated during the April caucus meetings.
Meanwhile, three-quarters of Canadians — and 70 per cent of those with kids at home — support a full social media ban for those under 16, according to an Angus Reid survey this week. While not their chief concern, 82 per cent of respondents caring for youth cited privacy and data collection among their worries.
Sherk also highlighted a joint investigation into TikTok by Canada’s Office of the Privacy Commissioner and the provinces of B.C., Alberta and Quebec, which found that despite explicitly stating the platform is not for those under 13, the company has been collecting and using personal information of hundreds of thousands of Canadian children on the platform each year.
“It also found that TikTok did not adequately explain its data practices to teen and adult users, nor did it obtain meaningful consent for the collection and use of vast amounts of user data, including sensitive data of younger users, as required under Canadian privacy laws,” the office wrote in a news release.
Sherk said an age restriction is one solution, but there are further considerations around youth privacy and data.
That “very fine-grained data,” Bhargava said, should give the tech giants the capacity to easily implement restrictions, even for kids enterprising and sharp enough to circumvent them.
“It doesn’t take the revelation of exactly how old they are for their algorithms to be able to detect with a high probability what age band they fall under,” he said. “There’s going to be extremely straightforward data-related tells related to that.”
Like Sherk, MediaSmarts sees “age gating” as one solution, but Brisson-Boivin said data indicates it’s an “imperfect” one and noted Australia’s initiative has yielded mixed efficacy reports.
Moreover, in their experience, the language of banning alone can cause youth to seek out other potentially “less regulated and more problematic” platforms or engage in “closeted” use, something she said kids are already reporting.
The renewed desire for stricter restrictions, she said, points “to the appetite that parents and caregivers and educators and so many of us have for change in the kinds of environments that our children are engaging in and I think that we can’t not acknowledge that.”
In the absence of a restriction, Brisson-Boivin advocated for increased education for youth and said social media companies can make better design choices to make their platforms safer and more child-friendly.
“One of those examples can be choosing to dis-embed the advertising experience from the platform experience and making it more of an obvious choice that this is an advertisement,” she said.
The question of complicity
To what degree huge corporations like Canadian Tire or Lululemon, small and medium businesses, or even independent influencers are blameworthy for their part in the negative effects and potential harms of social media is difficult to establish.
But, in the face of growing societal concern, Bhargava argued they can no longer turn a blind eye to alleged wrongdoings.
“If you were to consider how advertisers respond to other things, if an athlete or an actor or an actress does something untoward, advertisers will immediately sort of pull their endorsements related to that person,” he said.
But because the relationship on the surface feels like it’s primarily between the user and the platform, he said it’s “resulted in advertisers not fully appreciating the degree to which their contributions to these platforms may render them … sort of constitutive of contributing to these very wrongdoings in an important way.”
To that extent, he said, “they plausibly might be considered as complicit in the harms that come about.”
In terms of practical advice to advertisers, Sherk, like Bhargava, said they should be “thoughtful in the platforms” they choose to reach kids and find healthier spaces to be associated with instead.
In the meantime, she suggested they funnel advertising dollars away from the internet and into the community.
“Whether it’s sports teams or different types of community clubs or libraries, there are lots of other places that children are also going to that could use other types of sponsorships or other types of connections or ways to support as well,” she said.
Research lagging
Firmly embedded in the camp that believes there is nothing about social media that is “child safe,” Sherk said she doesn’t need another 20 years of research to prove that point.
In fact, she highlighted a recent preliminary report out of the University of Washington that found almost half of social media studies published in top journals had ties between authors and the industry — funding, collaboration or employment — and a third of those were not disclosed.
The other experts, however, said more research on social media’s ill effects in general is needed to fully grasp the extent of the problem.
Bhargava said an effort similar to the campaign against smoking and big tobacco from 30 years ago is warranted.
“There is mounting evidence and there’s mounting data, but the case is not decisive yet in the way that it was with cigarettes, and I think that is going to be important to continue to invest in,” he said.
MediaSmarts has been at the forefront of research on youth and the digital world for more than 20 years, and Brisson-Boivin said their Young Canadians in a Wireless World project is Canada’s longest-running study on young people’s technology use. But funding has dried up in recent years.
“On the one hand, what we hear about almost every day are concerns about young people’s well-being online and yet there just seems to be a misalignment with wanting to invest in actually studying that,” she said.
Brisson-Boivin added that it doesn’t help that social media companies are unwilling to hand over their troves of data that might better inform research on the ever-shifting and “incredibly opaque” target.
Another hurdle in studying kids’ online well-being is that the “conclusions preclude the research.”
“There’s a lot of information, I’ll say, in the political sphere right now that ‘social media is bad, full stop, therefore, we don’t need to study it,'” she said.
“And I think that’s doing, especially young people, a huge disservice. Because we don’t know enough about in what ways it’s harmful for whom. And we know that there are different equity needs and implications, but we don’t know exactly what those are.”