By Amole Olatunde

Mark Zuckerberg, the founder of Facebook, one of the world’s biggest social apps, has announced plans to unplug online terrorists as part of initial steps to make the online community better and safer.

The explosive growth of social media and online social communities, including Facebook, Badoo, Twitter and WeChat, has created a niche for business advertising, e-banking, online retailing and other positive developments.

But the flip side is that the online community has also become a veritable space for sociopaths, terrorists and other non-state armed groups to incite violence, recruit new members and create mass fear.

There is global concern among internet stakeholders and social platform owners about how to checkmate these evils without interfering with the creative use of the internet. Facebook says it will implement a policy to review violence-tainted posts, remove them from its platform and blacklist those making such posts.

The internet has long been an extension of terrorists’ war zones. In Nigeria, the Boko Haram terrorist group has for over six years posted video streams of its atrocities to incite mass fear among the population and to elicit bargaining attention from the authorities. Graphic images of people being slaughtered have spread fear among the population in a way that has weakened citizens’ belief in the ability of the state to destroy the terrorists.

“Often the perception is created that the terrorists have a strong foothold in the state, which is not always the case. But it is the terrorists that benefit from such streaming,” said Tom Ilesanmi, a Lagos-based social media expert.

ISIS, another terrorist group operating in Iraq, Syria and other Middle Eastern countries, has used live streams of beheadings to instil fear in millions across the world.

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down,” said Zuckerberg in a statement.

“Over the next year, we’ll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week and improve the process for doing it quickly.”

“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they’re about to harm themselves, or because they’re in danger from someone else.”

He also added that “in addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.”

All these measures are being put in place after a report reached Zuckerberg that someone had gone Live on the platform to say he was considering suicide. Zuckerberg said he and his team immediately reached out to law enforcement, and they were able to prevent the man from hurting himself.

In other cases, he said, they were not so fortunate. “No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need,” he added.
