26 November 2024 | Press release
The safety of our community, especially our younger users, is a top priority for TikTok, and something we take incredibly seriously. This year, we expect to invest more than US$2 billion in our trust and safety work globally. This investment supports tens of thousands of trust and safety experts who work around the clock, across the globe, to keep our community safe. As part of this work, between April and June 2024 alone, we removed more than 20 million suspected underage accounts globally.
At TikTok, we know there is no finish line when it comes to improving safety for our community. We will continue to invest in our people and systems. For the Australian TikTok community to thrive, its members need a safe and authentic digital experience. That experience looks different for users of different ages, including through considered product design. As well as proactively moderating content at scale, we empower families and young people with the tools they need to manage their own experience every step of the way. We encourage all parents and caregivers to consult our Guardian's Guide for more information. TikTok's recent submission to the Joint Select Committee on Social Media and Australian Society also considered related safety matters. Of note for this Committee are our extensive, industry-leading protections and policies that are specifically designed to keep our younger users safe with age-appropriate experiences.
The Online Safety Amendment (Social Media Minimum Age) Bill 2024
We have significant concerns with the process that has culminated in the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the Bill). Unfortunately, this Committee has not been given sufficient scope to address the many complex questions associated with this piece of legislation, let alone broader matters related to protecting children online.
Even within our time-limited review, we have identified a range of serious, unresolved problems that the Government must address to assure the Parliament that the Bill will not cause unintended consequences for all Australians. In particular, we encourage the Parliament to undertake proper and detailed engagement with experts, platforms, mental health organisations, young people and their families. Where novel policy is put forward, it is important that legislation is drafted in a thorough and considered way, to ensure it is able to achieve its stated intention. This has not been the case with respect to this Bill.
Issues with the Bill's definitions
The Bill turns on an important, but broad and unclear, definition that determines the types of businesses and services that could potentially be captured. As drafted, almost every online service could fall within the definition of 'age-restricted social media platform', including fitness apps and music streaming services. This lack of clarity effectively scopes in thousands of businesses and digital services that Australians rely on every day.
The Government must provide clarity on a range of outstanding questions about which services and businesses are captured. More broadly, the Bill's definitions need considerable work to ensure they are clear, enforceable, and applied fairly and as expressly intended by the legislation.
The Bill creates a 'licence to be online' and hinges its enforcement on an age assurance trial
There are extensive references in the Explanatory Memorandum to the outcome of the Government's age assurance trial, including that it "will be instructive for regulated entities, and will form the basis of regulatory guidance issued by the Commissioner, in the first instance". An exchange at Budget Estimates on 5 November 2024 gives the Committee insight into how the age assurance trial is intended to work in practice.
As the Government's admissions in Budget Estimates make clear, age-restricted social media platforms will need to undertake age assurance for each and every Australian user in order to remove age-restricted users from their services. This effectively creates a mechanism whereby Australians need a 'licence to be online'.
It appears that this Bill hinges on a trial that has not yet been completed, the outcome of which will likely require all Australians to be vetted through a yet-to-be-determined age assurance system. Many questions about the trial itself, let alone its possible outcomes and conclusions, have yet to be answered. Given the impact of mandating that all Australians who wish to use social media platforms be subject to such an age assurance system, we urge the Parliament to consider the broader implications of legislating such an outcome without knowing any details of the system itself.
This also raises one of many significant, outstanding questions that impact the privacy of Australians online.
'Privacy' provisions could undermine both privacy and safety
The Bill's stated intention is to improve safety. However, the Bill contains a range of inconsistencies, including in relation to privacy, some of which conflict with its purported safety objectives. Certain other proposed changes in the Bill seek to cover ground already addressed in the Privacy Act, creating confusion about how the conflicting provisions are intended to operate. The Government should provide the Committee with guidance on these issues, and some examples of these tensions within the Bill are outlined below.
With respect to section 63F(3)(a), it is unclear from the drafting of this deletion requirement whether platforms would be permitted to retain information about the individual's age itself (e.g. to retain the information that the user associated with an account, email address or phone number is 21 years old), or whether platforms are required to collect an individual's age each and every time they use the platform. In practice, the latter would seem at odds with privacy principles and efficient data collection.
The above are just some of the challenges this Bill presents, noting the short timeframe provided to scrutinise the legislation. However, it is clear that the Bill will impact all Australians and that its rushed passage poses a serious risk of further unintended consequences. As countless online safety experts and mental health organisations have highlighted, many questions remain unanswered and many concerns unresolved; chief among them are the Bill's poorly drafted and unworkable definitions, its unclear privacy safeguards, and its dependence on the Government's yet-to-be-completed age assurance trial.
We urge the Committee and the Parliament to treat this legislation as it would any other significant reform proposal, and to take the time to listen to the experts, many of whom have clearly and repeatedly expressed their concerns with this legislation.