By David Gilbert, VICE News
Everitt Aaron Jameson had a plan.
The 25-year-old tow truck driver was going to travel to the remote mountain region near his home in Modesto, California, and build bombs from PVC pipe, gunpowder, and nails.
Then, on Christmas Day 2017, he planned on carrying out an attack on San Francisco’s Pier 39, in the name of Islamic State.
Jameson told all this to an undercover FBI agent posing as one of ISIS’ top leaders, so the attack never happened.
But, according to a new report from the Counter Extremism Project, Jameson was part of an active and closely-knit Facebook group of ISIS-supporting Americans who regularly discussed these topics.
The report reveals that the leaders of the American group had been Facebook friends with Jameson before the planned attack, and that he had taken part in the weekly Facebook Live meetings the group holds, where members discuss overtly extremist and terrorist topics.
Following Jameson’s arrest, the leaders lamented the fact that “someone should’ve took him under his wing to show how to spot undercover agents” and announced the next weekly Facebook Live meeting would “contain tips on how to avoid getting busted…we gotta do better to help new converts.”
(The FBI says a former Marine sharpshooter planned a Christmas Day massacre at San Francisco’s Pier 39. The suspect is Everitt Aaron Jameson, a tow-truck driver from Modesto. Courtesy of CBS SF Bay Area and YouTube. Posted on Dec 22, 2017)
Facebook claims it has aggressively targeted terrorist content, eliminating 99 percent of terror-related posts before anyone even reports them.
But the 90-page report from the Counter Extremism Project, entitled “Spiders of the Caliphate,” lays out the shortcomings in Facebook’s approach, and how ISIS supporters avoid detection by using Facebook Live to host meetings and linking to banned material in comments, tricks that avoid Facebook’s automated flagging tools.
Even worse, the report shows how Facebook’s algorithmically-powered “recommended friends” feature is helping connect disparate groups of ISIS supporters across the globe.
The researchers warn that it is only a matter of time before these online communities help organize a real-world attack.
(Facebook is being accused of inadvertently helping Islamist extremists connect and recruit new members. A new report in The Telegraph cites research suggesting that the social media giant connected and introduced thousands of extremists through its “suggested friends” feature. J.M. Berger, a fellow with the Counter-Terrorism Strategic Communications program and author of “Extremism,” joins CBSN with some perspective on the report. Courtesy of CBS Evening News and YouTube. Posted on May 6, 2018)
“[ISIS’s] presence on Facebook is pervasive and professionalized, contrary to the tech company’s rhetoric and efforts to convince the public, policymakers, and corporate advertisers from believing otherwise,” the researchers say in their report.
The U.S. group is one of 15 communities of ISIS supporters the researchers have identified in regions like the Middle East, Europe, Turkey and Africa.
These groups, which are also interconnected, contain at least 1,000 unique members located in 96 countries around the world.
The researchers identified the accounts by trawling Facebook for recently released ISIS propaganda shared using positive terminology, searching for geographic-based names, and examining pro-ISIS Facebook pages.
They then searched the friends lists of the accounts they found to establish where networks were located and how they communicated with each other.
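This is essentially manual snowball sampling: start from seed accounts surfaced by keyword searches, then expand outward through public friends lists, keeping only accounts whose location can be established. The sketch below illustrates the general idea; `get_friends` and `get_location` are hypothetical stand-ins for the researchers’ manual inspection of public profiles, not real Facebook APIs.

```python
# Illustrative sketch of the snowball-sampling approach described above.
# get_friends() and get_location() are hypothetical stand-ins for manual
# inspection of public profiles -- not real Facebook APIs.
from collections import deque

def expand_network(seed_accounts, get_friends, get_location, max_accounts=1000):
    """Breadth-first expansion from seed accounts through public friends lists."""
    seen = set(seed_accounts)
    queue = deque(seed_accounts)
    network = {}  # account id -> self-reported location

    while queue and len(network) < max_accounts:
        account = queue.popleft()
        location = get_location(account)
        if location is None:
            continue  # accounts with no public location were omitted from the report
        network[account] = location
        for friend in get_friends(account):
            if friend not in seen:
                seen.add(friend)
                queue.append(friend)
    return network
```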
And these accounts represent only part of a much larger presence: the researchers told VICE News that hundreds of other accounts were left out of the report because they did not publicly disclose their location.
“We believe that the 1,000 accounts collected here represent only a small fraction of those on the site,” David Ibsen, executive director of the Counter Extremism Project, told VICE News.
The Counter Extremism Project is not the only group which has found terrorist content lingering on the social network, despite the promises of Facebook’s leadership.
A recent report published by the Digital Citizens Alliance and the Global Intellectual Property Enforcement Center found hundreds of violent and disturbing terrorist videos, images and posts on Facebook and other social networks, including Instagram and YouTube.
“If digital platforms are unable to effectively police their own content when they are under scrutiny from Congress and state policymakers and facing serious trust issues with their users, it’s time for someone else to do it for them,” Tom Galvin, executive director of the Digital Citizens Alliance, said.
(Learn More. Tom Galvin, of the Digital Citizens Alliance, weighs in on if he thinks the internet is backfiring on us amid Facebook’s recent data controversy. Courtesy of WKYC Channel 3 and YouTube. Apr 10, 2018)
Facebook did not respond to questions about the report.
Last month Facebook said it removed 1.9 million pieces of ISIS and al Qaeda related content from its network in the first three months of 2018.
Of this, 99 percent was detected by its automated systems before users reported it. What the report shows, however, is that there are major gaps in the company’s approach to dealing with terrorist content.
Ibsen says that Facebook is simply not doing enough to counteract the spread of Islamic State propaganda on its platform: “For all of this to exist on the site despite Facebook’s claims…indicates the massive scale in which IS supporters are active on the site.”
Of the 1,000 accounts Counter Extremism Project identified, Facebook has now removed just over half, 537, according to Greg Waters, one of the researchers.
RADICALIZE AND RECRUIT
Facebook relies heavily on photo-, text-, and video-matching artificial intelligence to automate the removal of terror-related content from its network, an approach that works for a large proportion of that content.
But ISIS supporters have found several loopholes in Facebook’s system, including posting videos, pictures, or links to ISIS propaganda in comments rather than in the posts themselves, which appears to evade Facebook’s detection algorithms.
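The comment loophole is essentially a question of scanning scope. The minimal sketch below is an illustration of that gap under simplified assumptions, not Facebook’s actual pipeline: real systems use perceptual hashing and classifiers rather than exact hashes, and the `Post`/`Comment` types and banned-hash set here are invented for the example.

```python
# Minimal illustration of why scanning scope matters. The exact-hash check,
# the banned-hash set, and the Post/Comment types are simplified assumptions,
# not Facebook's real matching system.
import hashlib
from dataclasses import dataclass, field

@dataclass
class Comment:
    media: bytes = b""

@dataclass
class Post:
    media: bytes = b""
    comments: list = field(default_factory=list)

BANNED_HASHES = set()  # hashes of known propaganda images/videos

def is_banned(media: bytes) -> bool:
    return hashlib.sha256(media).hexdigest() in BANNED_HASHES

def scan_posts_only(posts):
    """Flags banned media in post bodies but never looks inside comments."""
    return [p for p in posts if p.media and is_banned(p.media)]

def scan_posts_and_comments(posts):
    """Also checks media attached to comments, closing the loophole."""
    flagged = []
    for p in posts:
        if (p.media and is_banned(p.media)) or any(
            c.media and is_banned(c.media) for c in p.comments
        ):
            flagged.append(p)
    return flagged
```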
“[Facebook] continues to tell the public that they are employing human and technological solutions to stop terrorist accounts and posts to their platforms,” Eric Feinberg, CEO of Global Intellectual Property Enforcement Center, said.
“This research shows that despite their promises their efforts are not cutting it and they still have a long way to go.”
The report states that the terror group has “a structured and deliberate strategy of using Facebook to radicalize, recruit, support, and terrorize individuals around the world” and that a limited number of individuals work to magnify the group’s presence on the platform.
The network is organized so that no single ISIS Facebook account is so important that taking it offline would collapse the whole structure.
(Learn More. CEP Communications Director Steven Cohen discusses the increase in use of vehicles as weapons of terror due in part to extremist propaganda and online radicalization. Courtesy of The Counter Extremism Project, ABC7 KRCR and YouTube. Posted on Apr 26, 2018)
Facebook says it has a counter-terrorism team of 200 employees, who should have a solid understanding of the extremist groups they are fighting, including the terminology, colloquialisms, and iconography used by ISIS and al Qaeda supporters.
Ibsen said he and his colleague Robert Postings manually collected their data in their spare time over a six-month period, and suggested Facebook’s in-house team could follow the same strategies, “to actively search out ISIS accounts, removing them manually while making the necessary adjustments to their AI programs to counter IS users’ attempts at avoiding detection.”
FRIENDING TERROR
The researchers claim Facebook’s unique ability to connect disparate groups of people is automating connections between ISIS supporters, with the recommended friends feature surfacing introductions that might otherwise never happen.
“The recommended friends feature must be reworked so that it does not actively connect ISIS supporters to each other or to users who are at risk of being radicalized and recruited,” Ibsen said.
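Facebook has not disclosed how its recommendations work, but “people you may know” features are commonly assumed to lean on mutual-friend counts. The toy sketch below illustrates that assumed dynamic only: two clusters that share even a single “bridge” account start being suggested to one another, which is the pattern the report describes.

```python
# A common "people you may know" heuristic ranks candidates by mutual-friend
# count. This is an assumption about how such features generally work, not
# Facebook's actual algorithm; friends_of is a toy dict of friend sets.
from collections import Counter

def suggest_friends(user, friends_of, top_n=5):
    """Rank non-friends by how many mutual friends they share with `user`."""
    mutual_counts = Counter()
    for friend in friends_of[user]:
        for candidate in friends_of.get(friend, set()):
            if candidate != user and candidate not in friends_of[user]:
                mutual_counts[candidate] += 1
    return [c for c, _ in mutual_counts.most_common(top_n)]
```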
The report details the case of an Indonesian ISIS supporter who sent a friend request to a non-Muslim user in New York in March 2017.
While the U.S.-based user said he was interested in Islam, he also said he was non-religious.
However, interactions between the two over several months saw the Indonesian user send increasingly radical messages and links, including pro-ISIS propaganda — all of which were liked by his target.
To date, the U.S.-based network of IS supporters has not been able to connect with any official member of the terror group — though not for lack of trying, the researchers say.
The very public and blatantly pro-ISIS discussions held by this group led the researchers to suggest it will sooner or later attract the attention of ISIS leadership.
“It is only a matter of time before these pre-existing sophisticated networks are taken advantage of by experienced IS operatives to conduct terror attacks within the United States,” the report says.
Original post: https://news.vice.com/en_us/article/59qgaz/american-isis-supporters-are-organizing-on-facebook-heres-how
Learn More…
FB Accused of Introducing Extremists through ‘Suggested Friends’ Feature