
Children’s Internet Code and How Social Platforms are Preparing For It

A new ‘ground-breaking’ code to create a safer and more suitable internet browsing experience for children came into force in the UK yesterday. TikTok and YouTube are among those that have acted quickly to make changes to their platforms to align with the code.

In September 2020, the UK’s independent data authority, the Information Commissioner’s Office, introduced the Age Appropriate Design Code, allowing companies a year to comply following concerns relating to privacy, inappropriate advertising, and tactics to keep children online for long periods of time.

The Information Commissioner’s Office determined that if social media, gaming, video, and music platforms do not have safeguards in place to protect children’s data, those children could be harmed physically, emotionally, and financially in the future. A year on, the code is now officially in force.

How will the code help? 

Online companies whose services are likely to be accessed by children must focus on the following:

  • Ensure that services are designed to be age-appropriate and in the best interests of young people.
  • Carefully consider whether the use of data on the site keeps young people safe from commercial and sexual exploitation.
  • Implement a high level of privacy by default.
  • Avoid design features that encourage young people to share more data.
  • Switch off geolocation services that track where a young person is based.
  • Map the personal data collected from UK children.

For the aim of the code to be successfully achieved, companies online must play ball. Social media and video platforms such as Instagram, YouTube, and TikTok are extremely popular among young people, and fortunately these platforms have been quick to implement helpful changes.

YouTube will turn off default auto-play on videos for all children, meaning that children will only watch videos they select themselves, rather than the auto-play feature selecting recommended videos that may not be appropriate. Ad targeting and personalisation will also be blocked for all children.

TikTok will stop sending notifications to 13-to-15-year-olds after 21:00 and to 16-and-17-year-olds after 22:00. Stopping notifications in the evenings puts a greater focus on the welfare of children, encouraging them to spend less time on their mobile devices at bedtime.

In 2019, Instagram made it a requirement that new users would have to provide their birthdate when signing up to the app, but now, older users who were using the app before this requirement came into play will have to provide theirs in order to continue using the app. If you haven’t previously provided your date of birth, you will be prompted as soon as you open the app. This requirement aims to ensure that all Instagram users are over the age of 13.

This code focuses not just on children’s safety online, but on the wider welfare that their internet usage could affect.

Will brands and influencers have to make changes?

Many influencers and creators on the social platforms mentioned above have large Gen Z followings, many of whom fit into the 13–18 age category. There are often debates as to whether influencers should have to alter their content to suit a younger audience, or whether authenticity is paramount: if an influencer’s content is not suitable for younger audiences, then younger audiences shouldn’t be consuming it.

Ultimately, the decision on whether to alter content to focus on the safety of younger audiences is down to the individual influencer. However, the brands they work with may feel a greater obligation to adapt content as well as altering publishing times in order to reach as many people as possible – especially with the introduction of no notifications after a certain time for children on TikTok.
