Australian children are being treated like data farms by digital platforms such as Instagram, TikTok and Snapchat, as complex terms and conditions make it difficult for kids to understand how their information is being used, experts fear.
Professor Elizabeth Handsley, a children’s media expert at Western Sydney University, says urgent intervention is needed to protect children from a future where companies could use their data for life-changing decisions like university admissions and access to insurance schemes.
“This data is going to be floating around for years and decades into these people’s lives,” Professor Handsley told The New Daily.
“Someone that’s eight now will be in the world for the next 80 years. What will be the potential uses of data then?”
No informed consent
A new report by advocacy group Reset Australia has found terms and conditions (T&Cs) offered by large digital platforms are overly complex and impossible for many users to understand – if they read them at all.
The analysis of T&C agreements offered by digital platforms found the likes of Google, Facebook, Twitter and Fortnite owner Epic Games are hiding dozens of product documents from users when they sign up.
It found the terms and conditions shared would take hours to read in full and required a tertiary education to comprehend.
“Ten of the most popular digital services accessed by Australia’s young people do not appear to present T&Cs in accessible ways that promote meaningful consent,” Reset Australia researchers said in their report.
“Troublingly, they also seem to employ design techniques that actively nudge users to act against their own best interests (‘dark patterns’).”
TikTok, Google, Facebook T&Cs open to catastrophe
Despite failing to obtain meaningful consent, digital platforms can treat children like data farms once those children agree to sign up, Professor Handsley said.
Platforms like Instagram and TikTok are allowed to share user data with others for advertising and other commercial purposes such as analytics.
Google collects search terms, videos watched, views and interactions with content and ads, voice and audio information, purchase activity, communications with friends, and data from third-party websites.
Timothy Graham, a senior lecturer in digital media at Queensland University of Technology, said the scope of the terms was so broad that it was unclear where user data actually went.
“When you dig into these terms and conditions … most [of the time] you basically sign away every right you have to your content,” he told The New Daily.
“There’s unavoidable catastrophic consequences that can come because of you agreeing legally to do [that] when you don’t have the statutory ability to agree to those kind of things because you’re a child.”
In one example, Dr Graham suggested social media posts made by kids today could affect their ability to apply for jobs later in life, particularly as large-scale data processing technology becomes more widespread.
“A lot of other financially interested companies amass data from these platforms,” he said. “They have lots of data archives that includes posts by individuals, and some of them are children.”
TikTok, Snapchat, streaming platform Twitch and Epic Games are also allowed to share user data with other companies if they are sold, merged or go bankrupt in future.
“If we look a little way into the future and make some educated guesses, access to health services and insurance could be affected, access to educational and work opportunities could also be affected,” Professor Handsley said.
“What if this data gets sold to a university or an insurer?”
Key issue of our time
Similar concerns to Professor Handsley’s are already surfacing.
Over the past year, the ACCC has tussled with tech giant Google over its decision to buy wearables company Fitbit, as the deal will see user health data handed to Google.
The ACCC was concerned that Google would use the data for advertising, without having obtained informed consent from Fitbit users.
Dr Graham said Australia lacked the legal safeguards to protect children online and called the problem a “key issue of our time” that requires policymakers to write new laws to protect children and adults alike.
He said legislating a right to be forgotten would be a good initial step.
Professor Handsley also urged policymakers to legislate key protections like a right to be forgotten, saying it was unlikely digital platforms would act of their own accord.
“You just cannot expect the tech companies to do it,” she said.
“They exist to make a profit, they exist to do what they can do to make a profit. Unless there’s a law against something, they’re going to do it.”
Sunita Bose, managing director of the Digital Industry Group (DIGI), a body that represents Facebook, Google and Twitter, said its members had protections in place for minors, including age restrictions, depending on the nature of the services being offered.
“DIGI has been engaging with the Government’s privacy law reform, as well as code development related to the safety of minors,” Ms Bose said in a statement on Tuesday.
“Developing proper terms of service and privacy policies can be a difficult balancing act, as they are legally required to include certain information, and need to be comprehensive as well as clear.”