Strengthening online safety

It is time for tech companies to empower parents with the tools they need to protect their children online. By Nick Cater.

The following is the Executive Director’s introduction to the MRC’s online safety report whose recommendations will be adopted should the Coalition return to government after the 2021 Federal Election.

The extreme measures introduced to control the coronavirus pandemic have brought dramatic changes to the way we raise our kids. As yet, we have no way of knowing how permanent these changes will be, or what the long-term consequences will be for children who have been forced to spend their formative years in online classrooms and virtual playgrounds. Their safety in this digital environment remains paramount.

Concerns have been raised about the safety of children online since the commercialisation of the internet in the 1990s. The opportunities for children to go online unsupervised multiplied with the arrival of smart mobile technology and personal devices.

Data from the United States shows that the proportion of children who spend four hours a day or more using electronic devices has more than doubled during the pandemic, from 21 per cent to 44 per cent. Twenty-six per cent of pre-school children crossed the four-hour threshold, as did 44 per cent of 5-10 year-olds and 47 per cent of 11-13 year-olds.

There have been benefits from the development of online classrooms, which have the potential to enhance education even after bricks-and-mortar institutions permanently reopen. It is what happens outside school hours that raises the greatest concern.

Concern in the community about protecting children online is widespread. A recent survey found that 79 per cent of Australians were more concerned about the dangers of leaving children unsupervised online than they would be about leaving them unsupervised in a playground.

The question of who is responsible for supervising the online playground, and how they are equipped to do it, is vital if we are to safeguard our kids. Under English common law, primary responsibility naturally belongs to parents, with the state granted limited powers of parens patriae: the power to intervene against an abusive or negligent parent. The provisions of in loco parentis allow the responsibility of biological parents to be transferred in part or in whole to another responsible adult or institution, such as a school.

In the brave new virtual world in which our children are increasingly being raised, big tech assumes the powers of in loco parentis by default in the absence of parental control. That means Apple and Google, which control the dominant mobile operating systems and app marketplaces, are deciding what our children can and cannot do online. That they prioritise their commercial objectives over community expectations has now been clearly established by regulatory reviews across the globe, including the ACCC's Digital Platforms Inquiry. The mobile market has developed into a global duopoly, split between the corporations controlling the rival Android and iOS platforms.

Evidence of practices clearly misaligned with community expectations includes:

  • In 2018, Apple unilaterally removed parental control and screen time management apps from the App Store in a blatant move to stifle competition;

  • Apple and Google, in particular, support a two-tiered app developer environment in which business customers can access materially more performant and functional safety features than parents;

  • Google and Apple have declared that the age of consent is effectively 13. This is the age at which users are able to establish their own accounts rather than being part of a family sharing account.

More troubling still are the actions of tech companies that restrict the ability of parents to monitor the online activity of their own children. In 2018, Apple systematically removed 11 of the leading third-party parental control apps from its platform. The restrictions coincided with the launch of Apple’s own control software, Screen Time. Attempts by independent developers to build apps that draw on the capabilities of Screen Time have been frustrated by compliance hurdles.

The action of the tech corporations in thwarting the development and marketing of software to assist parents and safeguard children is at odds with community standards. Polling shows that 74 per cent of Australians believe the duty of care belongs to parents. Only 11 per cent think the tech companies themselves would do a better job.

The competition issues around digital platforms are complex. Hundreds of billions of dollars’ worth of apps are sold every year in a global marketplace mostly controlled by just two players. The ACCC’s ongoing Digital Platform Services Inquiry is the Australian government’s contribution to a worldwide push by lawmakers to protect the interests of consumers and third-party developers.

Australia can and must take a lead by taking proactive steps to introduce greater competition into the digital platform market. The serious concerns highlighted in this report should be the catalyst to do so.

While the advent of digital platforms presents novel challenges to competition regulation, the situation is not dissimilar to the one Australia faced in ensuring competition in mobile telephony 20 years ago.

The Howard government introduced innovative, industry-specific regulatory measures including mobile number portability, universal service guarantees, the expansion of the powers of the ACCC and the establishment of a Telecommunications Industry Ombudsman.

While the competition challenges presented by digital platforms are very different, the underlying principle is the same. Competition is the most effective way to better serve the consumer and encourage innovation.

As the report recommends, the complexity of regulating digital platforms within the broader competition policy framework points to the need for sector-specific measures, including the establishment of a digital platforms ombudsman to handle complaints.

Ensuring that parents are given every assistance in helping their children avoid the perils of the online world should be a priority. The recommendations in this report set out a path to accomplish that aim.