San Francisco (AFP) – Elon Musk’s talk of slimming Twitter’s staff and letting people post anything allowed by law is expected to clash with the reality of fending off hackers, trolls, police and regulators, experts say.
If Musk guts Twitter staff or mass resignations hit the platform, it could mean “doom,” said Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University.
“No matter Musk’s big vision, you need a highly skilled, knowledgeable workforce capable of (re)building a viable platform and responding to EU obligations,” Tromble told AFP.
Along with engineers, that includes legal and policy teams that keep user data safe and guard against dangerous posts.
“There really, truly are almost countless ways that Twitter as a company has to think about safeguarding its users,” Tromble said.
Cybersecurity issues range from lone hackers out to cause mischief to organized groups and attacks by nation states.
Then there are “bad actors” who gang up to abuse targets on Twitter in a tactic referred to as “dog-piling.”
“One of my greatest fears at the moment is that a sort of large-scale firing or even large-scale resignations will mean that the already imperfect system will just backslide,” Tromble said.
Losing people from teams that fight intrusive demands by police or other government agencies for Twitter user data means experience walks out the door with them, Tromble added.
Musk is in for a wake-up call when it comes to taking a laissez-faire approach to content moderation, according to Emma Llanso of the Center for Democracy and Technology.
US law is permissive in terms of letting social media platforms decide content policies and not holding them accountable for what users post, but that could soon change, Llanso said.
The US Supreme Court, in a decision with potentially far-reaching ramifications, is set to hear two cases challenging the legal immunity of internet companies from liability for content posted by their users.
The top court in the United States may well decide to roll back how much social media firms like Twitter are immune to blame for content “recommended” to users.
“There are any number of decisions content sorting algorithms must make regarding which tweets a user sees,” Llanso said.
“Does that make them recommended?”
Musk has said he wants to rely more on software and less on people for content moderation.
The Supreme Court is also to consider cases concerning whether states can dictate content rules at social media platforms.
And while there is currently strong legal footing for Musk to do as he wants with content moderation in the United States, laws are more restrictive in Europe and elsewhere.
“Many countries around the world are really looking at cracking down on the broad leeway social media services have had till now on setting content policy the way they see fit,” Llanso said.
Varying content moderation laws will also mean that Twitter has to figure out in real time what can be shown where.
With Musk at the helm for barely 24 hours, malicious characters were already testing the limits of Twitter’s systems, Tromble noted.
“And when hate speech, doxxing and harassment slip through the cracks, real harm occurs,” Tromble said.
“Doxxing” is the publication of private or identifying information about a person, often with malicious intent.
Even if there aren’t legal consequences for letting Twitter turn foul, there are business consequences, said India McKinney, director of federal affairs at the Electronic Frontier Foundation.
“People are looking for a place to go,” McKinney said of the search by some users for an alternative to Twitter.
“It is an opportunity for someone, that’s for sure.”
© 2022 AFP