Facebook and Twitter ‘refusing to take down Islamophobic posts’ in wake of Rotherham scandal

File picture of Rotherham, South Yorkshire.

Facebook and Twitter have defended their policies after it was claimed they are refusing to take down hundreds of inflammatory Islamophobic postings despite being alerted to them by anti-racism groups.

Online postings accusing Muslims of being rapists, paedophiles and comparable to cancer are said to have increased significantly in the wake of the Rotherham sex scandal, which revealed at least 1,400 children had been abused in the town over 16 years, mainly by men of Pakistani backgrounds.

The murders of British hostages held by the Islamic State group are also said to have contributed to the increase in posts.

A report in The Independent said that in most cases those behind the abusive posts have not had their accounts suspended or the posts removed.

Muslim groups said they had brought dozens of accounts and hundreds of messages to the attention of the social media companies, but despite this, most of the accounts reported are still easily accessible.

Fiyaz Mughal, director of Faith Matters, an interfaith organisation which runs a helpline called Tell MAMA, for victims of anti-Muslim violence, said he was disappointed by the attitude of both firms.

“It is morally unacceptable that social media platforms like Facebook and Twitter, which are vast profit-making companies, socially engineer what is right and wrong to say in our society when they leave up inflammatory, highly socially divisive and openly bigoted views,” he said.

“These platforms have inserted themselves into our social fabric to make profit and cannot sit idly by and shape our futures based on ‘terms and conditions’ that are not fit for purpose.”

A Facebook spokeswoman said: “We take hate speech seriously and remove any content reported to us that directly attacks others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.

“With a diverse global community of more than a billion people, we occasionally see people post content which, whilst not against our rules, some people may find offensive.

“By working with community groups like Faith Matters, we aim to show people the power of counter speech and, in doing so, strike the right balance between giving people the freedom to express themselves and maintaining a safe and trusted environment.”

A Twitter spokesman said: “We review all reported content against our rules, which prohibit targeted abuse and direct, specific threats of violence against others.”