By Martin Evans, The Telegraph
Facebook has helped introduce thousands of Islamic State of Iraq and the Levant (Isil) extremists to one another, via its ‘suggested friends’ feature, it can be revealed.
The social media giant – which is already under fire for failing to remove terrorist material from its platform – is now accused of actively connecting jihadists around the world, allowing them to develop fresh terror networks and even recruit new members to their cause.
Researchers, who analysed the Facebook activities of a thousand Isil supporters in 96 countries, discovered users with radical Islamist sympathies were routinely introduced to one another through the popular ‘suggested friends’ feature.
Facebook's algorithms are designed to connect people who share common interests.
The site automatically collects a vast amount of personal information about its users, which is then used to target advertisements and also direct people towards others on the network they might wish to connect with.
The extent to which the ‘suggested friends’ feature is helping Isil members on Facebook is highlighted in a new study, the findings of which will be published later this month in an extensive report by the Counter Extremism Project, a non-profit that has called on tech companies to do more to remove known extremist and terrorist material online.
Gregory Waters, one of the authors of the report, described how he was bombarded with suggestions for pro-Isil friends after making contact with one active extremist on the site.
Even more concerning was the response his fellow researcher, Robert Postings, got when he clicked on several non-extremist news pages about an Islamist uprising in the Philippines.
Within hours he had been inundated with friend suggestions for dozens of extremists based in that region.
Mr Postings said: “Facebook, in their desire to connect as many people as possible, have inadvertently created a system which helps connect extremists and terrorists.”
Once initial introductions are made, Facebook’s failure to tackle extremist content on the site means jihadists can quickly radicalise susceptible targets.
In one example uncovered by the researchers, an Indonesian Isil supporter sent a friend request to a non-Muslim user in New York in March 2017.
During the initial exchange the American user explained that he was not religious, but had an interest in Islam.
Over the following weeks and months the Indonesian user began sending increasingly radical messages and links including pro-Isil propaganda, all of which were liked by his target.
Mr Postings said: “Over a period of six months the [US based user] went from having no clear religion to becoming a radicalised Muslim supporting Isil.”
The study also examined the extent to which Facebook was failing to tackle terrorist material on its site.
Of the 1,000 Isil-supporting profiles examined by researchers, fewer than half had been suspended by Facebook six months later.
Mr Postings said: “Removing profiles that disseminate IS propaganda, calls for attacks and otherwise support the group is important…the fact that the majority of pro-IS profiles in this database have gone unremoved by Facebook is exceptionally concerning.”
Even when terrorist material was identified and the offending posts removed, the user was often allowed to remain on the site.
In one case a British terror suspect had his Facebook account reinstated nine times after complaining, despite being accused of having posted sick Isil propaganda videos.
Mr Waters said: “This project has laid bare Facebook’s inability or unwillingness to efficiently address extremist content on their site.
“The failure to effectively police its platform has allowed Facebook to become a place where extensive IS-supporting networks exist, propaganda is disseminated, people are radicalised and new supporters are recruited.”
Mr Postings added: “Even when profiles or content is removed, it is not always done fast enough, allowing Isil content to be widely shared and viewed before getting removed.”
Mr Waters said: “The fact that Facebook’s own recommended friends algorithm is directly facilitating the spread of this terrorist group on its site is beyond unacceptable.”
Simon Hart, a Conservative MP who sits on the Culture, Media and Sport Select Committee, said: “The idea that Facebook is inadvertently providing an introduction service for terrorists is quite extraordinary.”
“It is another terrifying example of the unintended consequences of this sort of technology.”
“If you design a system for one thing and it becomes another it is hard to police.”
“Nobody will have set out to provide a network for terrorists to connect, but the important thing is how Facebook responds now this matter has been raised with them.”
A spokesman for Facebook said: “There is no place for terrorists on Facebook.”
“We work aggressively to ensure that we do not have terrorists or terror groups using the site, and we also remove any content that praises or supports terrorism.”
“Our approach is working – 99 per cent of ISIS and Al Qaeda-related content we remove is found by our automated systems. But there is no easy technical fix to fight online extremism.”
“We have and will continue to invest millions of pounds in both people and technology to identify and remove terrorist content.”