Alberta teens, advocate support social media ban, want more urgency from Ottawa



Two Alberta teens and a Calgary-based online safety advocate are welcoming the news that Ottawa is considering age restrictions for social media and artificial intelligence chatbots, but say the government needs to act faster to pass online safety legislation.

“I’m pleased to see that members of Parliament all the way up to the minister and prime minister are seriously considering legislation to protect our children,” said Sara Austin, founder and CEO of Children First Canada. 

“But the debate is over. We know that social media, AI chatbots, gaming … they’re harming our kids. We need legislation that looks at the full scope of harm and that treats this as the crisis that it is.”

Last week, federal Culture Minister Marc Miller said the government is “very seriously” considering whether to impose an age restriction on access to social media and AI chatbots, following two non-binding resolutions that received majority support at the Liberal Party policy convention earlier this month.

The first resolution called on the government to obligate social media companies to prevent users under 16 from using their platforms, similar to a law Australia passed last year.  

The second urged the government to ban anyone under the age of 16 from using AI chatbots, like ChatGPT, and other forms of AI deemed harmful.

Austin said Children First has been at the forefront of online safety advocacy over the last decade. The organization is also part of a coalition of advocacy groups calling on the government to reintroduce online harms legislation. 

While age limits are an important step in protecting kids online, she said, they cannot be imposed without online safety legislation in place first.

“There needs to be a duty of care to prevent harm in the first place,” she said, adding that many of the countries that have social media age restrictions already had some form of online safety legislation in place.

“Canada has nothing on the books. There’s absolutely nothing protecting our children in the digital world. That’s unacceptable,” she said. 

Sara Austin is the founder and CEO of Children First Canada. She says she’d like to see the federal government work faster to implement online safety legislation in Canada. (Chelsey Mutter/CBC)

The Liberals are poised to reintroduce elements of the Online Harms Act, which died on the order paper when Parliament was prorogued last year. 

Last month, the government reconvened a panel of online safety experts and advocates to advise on how to approach online safety and regulate tech giants. But federal government officials said last week they are still weighing the best way to implement an online safety bill.

Social media ‘not a bad thing’

Thirteen-year-old Naba Kazi, who lives in Lethbridge, said she isn’t super active on social media, and mostly uses it to communicate with her friends. Still, she said there is a degree of pressure to be on it in order to keep up with current trends.

“If you’re not up to par with trends and all that, you can be seen as someone who, you know, doesn’t know enough, isn’t really included,” said Kazi, who is also a member of Children First’s Youth Advisory Council. 

LISTEN | What should online safety legislation look like?

The House | 10:18 | After years of studies, what should a revived Online Harms Act look like?

The federal government has said it is gearing up to revive online harms legislation which could include Australia-like age restrictions on social media use. Josephine Maharaj – a 12th grader who testified at a parliamentary committee on online safety this week – and law professor Suzie Dunn tell host Catherine Cullen what they think the government should do to make the internet safer.

But Kazi said she doesn’t see social media as inherently bad.

“Social media isn’t necessarily a bad thing, it just can be used in a bad way,” she said, adding that most platforms were not designed with youth safety in mind.

For that reason, she would like to see stronger regulations that put the onus on media companies to make their platforms safer for young users. While age restrictions could help, Kazi said they feel more like a “quick fix” than a solid solution.

AI needs to be addressed too

Seventeen-year-old Calgarian Lane Koei said she too has felt pressure to be active on social media and use AI chatbots. She started using social media when she was nine and AI when she was in grade six.

“People are using chatbots to vent, to get information and they’re not really connecting with people,” Koei said. “I was struggling with that and that had an impact on my mental health, so I haven’t used AI in a while now.”

Koei said limited access to chatbots and social media could have helped her develop healthier relationships with the platforms early on. She also worries about the environmental effects of AI infrastructure, and participated in a protest against it in March.

Still, Koei said she’d like to see future regulations find a way to strike a balance between the positive uses of social media and AI, and the potential harms.

WATCH | Breaking down the climate impact of AI:

Breaking down the climate impact of AI

The energy needed to generate artificial intelligence leaves behind a sizable carbon footprint, but it’s also increasingly being used as a tool for climate action. CBC’s Nicole Mortillaro breaks down where AI emissions come from and the innovative ways the technology is being used to help the planet.

“I think that just a blanket social media ban won’t necessarily fix the issues that it has caused, and people can find other ways [to use it],” Koei said. 

“But if we can find a way to regulate social media a little more just for the sake of safety, or find a way that people can still share what they want to share without that same risk, then I think we should go for it,” she added.

That’s what Austin is hoping to push for when she heads to Ottawa for a national day of action Children First is organizing at the end of the month. She would like to see an independent regulator who could assess risks and hold tech companies accountable, alongside legislation built around preventing future harm.

“It would also be safety by design to really consider the unique needs of children and youth and design their products with the safety of children at the heart of it,” she said.
