Instagram rolls out new parental controls and safety guidelines

Instagram on Wednesday launched a series of updates aimed at making the platform safer for teenage users by giving parents more supervision tools and resources in the app.


What you need to know

  • Instagram on Wednesday launched a series of updates aimed at making the platform safer for teenage users by giving parents more supervision tools and resources in the app.
  • The updated supervision tools, which are live now in the US and will roll out globally over the coming months, can be found in the new Family Center section of the app.
  • Supervision is optional, and the teen must agree to participate; rules can be ended by the teen or a parent at any time.
  • Parents and guardians can now see how much time teens spend on Instagram and can set app time limits.

The updated supervision tools, which are live now in the US and will roll out globally over the coming months, can be found in the new Family Center section of the Instagram app. Parents of teens aged 13-17 can now see how much time their kids spend on Instagram and set time limits, can be notified when their teen reports a user on the app, and can view and receive updates about the type of content their teen engages with on Instagram.

Supervision is optional, and the teen must agree to participate; the rules can be ended by the teen or a parent at any time. The controls expire automatically when the user turns 18, based on the date of birth provided when the account was created.

“For now, teens will need to initiate supervision in the app on mobile devices, and we’ll be adding the option for parents to initiate supervision in the app and on desktop in June,” Adam Mosseri, the head of Instagram, wrote in a blog post. “Teens will need to approve parental supervision if a parent or guardian requests it.”

The Family Center will also serve as an education hub where parents and guardians can learn about the new tools, with videos and other resources on “how to talk to teens about social media.”

Under Instagram’s terms of use, people under the age of 13 are not permitted to use the app. Meta, Instagram’s parent company, says children often gain access to its platforms by lying about their age, but says it removes accounts when it determines that users are underage.

Mosseri added that Wednesday’s announcement is “just one step on a longer path,” as similar parental controls will be rolled out across all other Meta platforms.

Over the coming months, the company plans to roll out parental supervision tools on its Quest VR headsets. Starting in April, parents will be able to block their teens from purchasing apps they deem inappropriate. By May, teens will automatically be blocked from downloading apps rated as inappropriate for their age.

Meta and social media platforms in general have faced increasing pressure from lawmakers and parents alike to enhance safety on their sites.

The US Surgeon General in late 2021 issued a rare public advisory that addressed, in part, media companies and their potential role in harming young people’s mental health, saying that while some programming “can have a strong impact on young people,” misleading or exaggerated media narratives “can perpetuate misperceptions and stigma against people with mental health problems or substance abuse.”

The report urged social media companies to take a number of steps to reduce their negative impact on teens’ mental health. The first of those steps, report officials said, should be providing the government with more accurate data on the behavioral effects of time spent online.

Meta has updated its user guidelines several times since the report was published. In December, Instagram launched a “Take a Break” feature, which lets teen users know if they have been focused on a particular topic for too long or have been using the app for an extended period of time.