Facebook has “made little progress” in dealing with the problem of auto-generated pages for terror groups, according to reporting from The Associated Press.

The story is a follow-up on an April report that concluded the pages “are aiding Middle East extremists and white supremacists in the United States.”

Hear more at today’s congressional hearing on the topic, which began at 10 before the Committee on Commerce, Science, and Transportation. Check back here for a full report.

The NWC has filed a whistleblower petition with the Securities and Exchange Commission. Here’s what National Whistleblower Center director John Kostyack had to say in the AP story:

The issue was flagged in the initial SEC complaint filed by the center’s executive director, John Kostyack, which alleges the social media company has exaggerated its success in combating extremist messaging.

“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.”

Facebook released an update on its anti-terror efforts yesterday and is making its case on the Hill. The update did not mention auto-generated pages. Written testimony submitted at the hearing stated that official Facebook policy prohibits calls for violence and hate speech. It listed Facebook’s programs to combat crime on the platform, including AI and machine-learning systems designed to assess whether a post violates Facebook standards. The company says it has a staff of 350 whose “primary job is dealing with terrorists and dangerous groups.”

Follow tweets from the event at @counteringcrime, the feed of an organization battling organized crime on the internet. They are not impressed.

More from the NWC Twitter feed: