
What is the Online Safety Act, and how can you keep children safe online?


The Online Safety Act requires tech firms to do more to protect children when they use the internet.

New regulations, however, won’t take effect until 2025, and some feel they don’t go far enough.

How many hours a day does the average youngster in the United Kingdom spend online?
A study by the communications regulator Ofcom indicates that children aged eight to seventeen spend two to five hours online each day, with older children spending more time online than younger ones.


Almost all children over the age of 12 have access to a mobile phone, and the vast majority of them watch videos on platforms such as YouTube or TikTok.

More than 80% of online teens have used an AI product such as ChatGPT or Snapchat’s My AI.


According to Ofcom, the majority of youngsters over the age of 12 believe that spending time online is beneficial to their mental health.

A significant minority, however, have a different experience. Among children aged eight to seventeen, one in eight reported experiencing online harassment or bullying.

In a survey of 13-year-olds, the Children’s Commissioner found that half had seen “hardcore, misogynistic” pornographic content on social media.

Where can I find the various online parental controls?


Two-thirds of parents (66%) say they use parental controls to limit what their children see online, according to Internet Matters, a safety organization set up by major UK internet firms.

It provides a list of the various parental controls, along with instructions on how to use each one.

YouTube, for instance, is the most popular site among UK children, so parents concerned about their children seeing inappropriate content can enable the “kids” version, which hides adult videos.

If you have older children who use the main site, you can set up supervised accounts to see what they are doing online.

Facebook Messenger’s Family Centre is another place to set up supervision.

TikTok says parents can use the app’s Family Pairing feature to keep their teenager’s account private.

Instagram’s parental controls include daily time limits, scheduled break periods, and the ability to see which accounts a child has reported.

These controls are not foolproof, however: according to Ofcom, roughly 20% of children use workarounds.



How do the controls on mobile phones and consoles work?
Phone networks may block access to some adult websites until a user proves they are 18 or over.

Some handsets also come with parental settings that let you choose which websites your children can access.

There are also apps and services for both Android and Apple devices that parents can use. These can monitor web browsing, block or restrict access to certain apps, limit exposure to explicit material, and prevent purchases.

Google offers Family Link, while Apple has Screen Time; comparable apps are also available from third-party developers.

Broadband providers also offer parental controls that can restrict access to certain categories of content.

Controls on game consoles also allow parents to monitor their children’s gaming activity and manage any in-game purchases.

What is the best way to teach your kids to be safe while using the internet?


As the NSPCC points out, it’s crucial to have conversations with kids about being safe when using the internet and to show genuine interest in what they do while online.

It suggests weaving these conversations into everyday discussion, much like talking about a child’s day at school, so that children feel more comfortable sharing their worries.

How will tech companies have to change under the new regulations?


The government says that under the Online Safety Act, which is due to take effect in the second half of 2025, social media companies and search engines must take responsibility for protecting children from some legal but harmful content.

Platforms will also need to show that they will remove illegal material, including:

Child sexual abuse, extreme sexual violence, controlling or coercive behavior, the sale of illegal drugs or weapons, animal cruelty, and terrorism.

Sites that host pornographic material will be required to use age-verification measures to prevent children from accessing them.

New offenses have also been created, including:

Cyber-flashing, the practice of sending sexually explicit images to people without their consent, and sharing “deepfake” pornography, which uses AI to insert a person’s likeness into pornographic content. The act also makes it easier for bereaved parents to access their children’s personal data held by tech companies.

Ofcom, the regulator, has been given additional enforcement powers and has published draft codes of practice to ensure that companies comply with the new rules.

It says companies must adjust the algorithms that determine what content users see, so that the most harmful material is kept out of children’s feeds and other damaging content is made less visible and prominent.

Any business found breaking the rules could be forced to raise its minimum user age to 18, according to Ofcom chief executive Dame Melanie Dawes.

Meanwhile, Technology Secretary Michelle Donelan emphasized the need for big tech to implement the codes carefully:

“Work with us and get ready,” she urged.

“Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”

Who has voiced opposition to the new regulations?


Critics, including some parents whose children died after exposure to harmful online content, have labeled the new regulations “insufficient” and criticized the delay before they take effect.

In an open letter to Prime Minister Rishi Sunak and Leader of the Opposition Sir Keir Starmer, a number of bereaved parents, including Ian Russell (Molly’s father) and Esther Ghey (Brianna’s mother), demanded further measures.

They want mental health and suicide prevention included in the new school curriculum, as well as a pledge to strengthen the Online Safety Act within the first half of the next parliament.

“While we will study Ofcom’s latest proposals carefully, we have so far been disappointed by their lack of ambition,” the letter states.

What have tech companies said in response to the new regulations?
Snapchat and Meta both pointed to their existing parental controls and said they have additional safeguards in place for users under the age of 18.

“As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience,” a spokesperson for Snapchat stated.

A Meta spokesperson said the company’s goal is for young people “to connect with others in an environment where they feel safe.”

“Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules – and we remove that content when we find it,” they added.

Many other tech firms contacted by BBC News declined to comment on the changes.
