Facebook whistleblower, Iowa City native calls on Congress to set social media boundaries


Former Facebook data scientist Frances Haugen speaks during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Tuesday, Oct. 5, 2021, in Washington. (Jabin Botsford/The Washington Post via AP, Pool)

Iowans, along with billions of others around the world, today live in an information environment increasingly mediated by algorithms.

And every day, social media companies make subtle choices about how their products operate and what people see on their screens, choices that have profound and lasting impacts on users’ physical and mental health.

Iowa City native Frances Haugen said social media users, in the U.S. and globally, should have the right to peek behind the curtain and “crash test” the algorithms.

“The consequences of that are when our economy is run more by opaque systems, instead of by transparent ones, it means that inside those companies there’s a constant temptation to begin to have the story the company tells deviate from what is the truth of that company — what it’s actually experiencing and what its actual impact is on the world,” Haugen said.

The former Facebook employee-turned-whistleblower delivered a keynote address Thursday as part of The Gazette’s Iowa Ideas Conference 2023.

Since revealing herself as the source behind tens of thousands of pages of leaked internal documents showing Facebook executives were indifferent to the harms caused by the amplification and spread of hateful and false information on its platform, Haugen has advocated for laws in the U.S. and abroad that aim to make social media safer for kids.

The data scientist has testified before governments around the globe and played key advocacy roles in the European Union’s passage of landmark design rules for social media companies in April 2022 and California’s approval of a new online kids’ safety law.

Haugen launched a nonprofit, Beyond The Screen, to advance a new, open-source effort aimed at holding social networking platforms accountable for harmful practices and reducing social media’s negative impacts.

The initiative, “Standard of Care,” seeks to pool expertise from researchers, nonprofit leaders, litigators and data scientists who study the harms created and exacerbated by social media, to identify gaps in research about online harms and devise ways to fill them.

“I learned that Facebook had told a very simple story that was misleading,” Haugen said. “They had said the way we make social media safe is we take down bad content. We do it after the fact, instead of building systems that are safer by design.”

For example, she said, forcing users to click links to other websites before resharing a post reduces the spread of misinformation by 10 percent to 15 percent and fosters more thoughtful communication on the network.

“When we have systems that prioritize what’s known as engagement — those clicks, those comments, those reshares, those likes — Facebook itself has been public since 2018 saying systems like that end up prioritizing, amplifying the most extreme content, because people are drawn to click,” Haugen said. “Even when you ask them afterward, ‘Did you like that?’ They say, ‘No.’”

She gave the example of TikTok, where videos pushed widespread disinformation and violent, ethnically discriminatory narratives ahead of general elections last year in Kenya.

“If we want to not look forward 20 years and see these cycles of violence continue, we need to step up and say, ‘We need transparency for these systems,’” she said. “And we have conversations about how they operate.”

Following Haugen’s leak of Facebook documents, the social media company built teams to focus more on social harms and hired more moderators in more languages.

“They got more equitable and more effective,” she said. “And, unfortunately, Elon Musk bought Twitter (now known as X), and came in and fired half the company. And, then, another 25 percent of the company left.”

Musk also dissolved the social media platform’s Trust and Safety Council, the advisory group of civil, human rights and other organizations the company formed in 2016 to address hate speech, child exploitation, suicide, self-harm and other problems on the platform.

“And there weren’t any real consequences,” Haugen said.

The same holds for artificial intelligence, she said.

“The two main things with social media we got wrong, first, was transparency. But, the second thing is incentives,” Haugen said. “Whatever moves the fastest is being rewarded with the most additional investment, the most customers. … When it comes to AIs, we’re back in the era of ‘move fast and break things.’”

As a result, “safety isn’t always obvious” and there are no incentives “for them to do the responsible things,” she said.

“What if the Fortune 500 came out and said, ‘Hey, we need to change the incentives’?” Haugen said. “‘We’re going to be putting trillions of dollars into these systems in the next couple of years. We will only buy products from people who develop their AIs in responsible ways, safe ways. They make their data available. They have published safety plans.’”

Guardrails also are needed in the United States to protect children from harmful content, Haugen said. Several states, including Iowa, are investigating TikTok and its possible harmful effects on young users’ mental health.

Related article: Why Iowa’s bringing in a law firm to investigate TikTok

Government officials and child-safety advocates have cited TikTok’s practices and computer-driven promotion of content they say can endanger the physical and mental health of young users, such as video content that promotes eating disorders, self-harm and suicide.

Haugen noted TikTok, a wholly owned subsidiary of Beijing-based ByteDance, limits teen users in China to 40 minutes a day and cuts off access after 10 p.m. Why can’t the same be done, or required, in the United States, she asked.

Iowa lawmakers this year advanced, but failed to approve, limits on teens’ use of social media platforms.

Haugen said she’s cautiously optimistic a warning issued by the U.S. Surgeon General in May about social media’s impact on youth mental health will lead to new laws and better tools to protect children online.

She said Congress can maximize the benefits and minimize the harms of social media platforms to create safer, healthier online environments for children.

“The Surgeon General’s warning is like a first step in the right direction,” Haugen said.

Iowa Ideas Conference

The Gazette’s Iowa Ideas conference began Thursday. It will continue Friday with more than two dozen panel discussions and a closing keynote address featuring biotech CEO and University of Iowa graduate Leslie J. Williams.

The conference is entirely virtual and free. Register at iowaideas.com.

Video replays of Thursday’s keynote addresses and sessions are available online at www.iowaideas.com/replays/2023.

Comments: (319) 398-8499; [email protected]
