The ‘eight faces’ of anti-Muslim trolls on Twitter (Wired UK)

Andreas Eldh / CC BY 2.0

A criminology lecturer has
published a report dissecting online Islamophobia, identifying what
he believes are the eight types of troll responsible for much of
the hate speech.

Imran Awan, a senior lecturer at Birmingham City University, was
fairly new to Twitter when he began examining the 140-character
dialogues unfolding from January 2013 to April 2014. But the dates
were key. Awan wanted to know whether the brutal murder of Fusilier
Drummer Lee Rigby on 22 May 2013 affected the level of hate speech.

“As somebody who had joined Twitter myself, I was shocked at the
level of abuse I was getting,” Awan said. “I write for
the Huffington Post and the Guardian, and I get a
lot of strong comments and harassment. I researched and talked a
lot about Islamophobia, but it seemed nobody had tested it in the
online world.”

Awan’s report, published in Policy & Internet, is an appraisal of the state of
affairs and a call to action, and one that brings to the fore for
the first time the behaviours of some of the main perpetrators
online. But as noted in his paper, there has already been some data
gathering, particularly by Tell Mama (Measuring Anti-Muslim
Attacks). The organisation is a platform that allows anyone who has
experienced or witnessed anti-Muslim abuse in England to report it
securely and anonymously. It means that Tell Mama can build up an
aggregate picture of abuse suffered by Muslims across England, with
details and locations recorded.

The most recent report, collated by Matthew Feldman and Mark
Littler of Teesside University’s Centre for Fascist, Anti-fascist,
and Post-Fascist studies, revealed that 734 hate crimes were
committed between 1 May 2013 and 28 February 2014 — 599 of these
were online, and 135 were street-based. According to the
report, more than a third of the online attacks had links to the
far-right movement, and in the week following Rigby’s murder Tell
Mama estimated a spike in incidents of 373 percent over the
preceding week.

“The Woolwich attack in May 2013 has led to a spate of hate
crimes committed against Muslim communities in the United Kingdom,”
Awan writes in the paper. “These incidents include Muslim women
being targeted for wearing the headscarf and mosques being
vandalised. While street level Islamophobia remains an important
area of investigation, an equally disturbing picture is emerging
with the rise in online anti-Muslim abuse.”

Awan, wanting to look behind these statistics, began studying the
content and context of some of the most deeply abusive tweets. He
started with three hashtags, #Woolwich, #Muslim and #Islam, to
search and trawl through the tweets; all three had trended in the
UK around the same period.

Awan then focussed on the tweets that were mainly from the UK
and “the most Islamophobic”, which amounted to around 100 Twitter
accounts. From these 100 accounts, 500 tweets were examined. It’s a
very small sample size, but this was what Awan wanted, at least
to begin with, so that he could “try and find out how Muslims are
viewed by perpetrators of online abuse, who are targeting them
through social media sites such as Twitter”. Of those tweets
examined, 72 percent were from males (determined “through various
factors such as Twitter Analytics and contents of tweets”) and 75
percent exhibited particularly strong anti-Muslim abuse.

Awan found repeated phrases including  “Muslim Paedos” (30
percent), “Muslim terrorists” (22 percent), “Muslim scum” (15
percent), “Pisslam” (10 percent) and “Muslim pigs” (9 percent).
Awan notes how much of the hate speech seemed to be related to news
of grooming rings in the UK that appeared to be run by men of Asian
origin.

“However, in some cases people simply used Twitter as a means to
antagonise and create hostility; with some accounts using
derogatory terminology by referring to Muslims as ‘Muzrats’ (a
demeaning word to describe Muslims as vermin or comparing them to a
rat).”

Awan, taking these tweets, developed eight different categories
that he feels the perpetrators fit into. “When I started to examine
the number of tweets, similar characteristics emerged. So I
dissected all tweets and got this typology”: 

The Trawler: targets anyone with a Muslim connection
The Apprentice: new to Twitter, they target
people with the help of more experienced online abusers
The Disseminator: tweets and retweets messages,
pictures, and documents of online anti-Muslim hate
The Impersonator: users with fake profiles they
use to target individuals
The Accessory: joins in other conversations to
target vulnerable figures
The Reactive: following a major incident, such as
Woolwich, begins a campaign against an individual or group
The Mover: changes their Twitter account
regularly to target individuals
The Professional: has a huge following (and
multiple accounts) and launches a major campaign of hate against an
individual or group (Awan says Tommy Robinson, the cofounder of the
English Defence League, was one such individual) 

To be fair, this could just be a list of descriptions of
different types of troll. The target of their abuse is obviously
key to this article, but the behaviours are pretty much universal
Twitter behaviours.

However, Awan believes this specific issue needs to be addressed
immediately, by law enforcement and government.

“I see the dangers as being a lot worse. Right now government
looks at offline behaviour — in the online world in some cases the
police are probably slightly afraid. They probably feel it’s a grey
area, but it’s under the Malicious Communications Act.” It’s
certainly a grey area. But law enforcement is on its way to
catching up, this year sentencing 25-year-old John Nimmo and
23-year-old Isabella Sorley for harassing Caroline Criado-Perez on
Twitter. It certainly helped that the feminist campaigner is a
prolific Twitter user and refused to let the matter lie. The
reality is that a lot of people will not report the abuse; it is
partially why organisations like Tell Mama have to exist.

Awan says Twitter is not doing enough, but that its blocking and
reporting functions are a “positive way” of dealing with it.
Ultimately he is seeking what anyone who has seen this kind of
reactive abuse would want: “a change of mindset and culture — to
think about what impact it will have on victims.” This is a call to
all facets of society, and a reminder that people should not
overlook this kind of abuse.

It is important to note, however, that this paper is based on a
very small test sample. Furthermore, HateBot, a computer program
that has been scouring Twitter for hate speech, found that the most
prevalent terms did not relate to the Muslim community; instead,
the terms “gypo” and “pikey” accounted for 15 percent. In fact, no
anti-Muslim hate speech made it into the top ten words.


