Just before the 2016 US presidential election, a Russian-backed group with 217,000 Facebook followers posted an illustration that depicted Hillary Clinton as a devil fighting Jesus.
The caption: "'Like' if you want Jesus to win!"
The illustration was one of many Russian-linked propaganda efforts displayed Wednesday during a Senate Intelligence Committee hearing.
Sen. Mark Warner, the committee's vice chairman, showed the illustration, shared by the Russia-affiliated "Army of Jesus" Facebook page, as evidence of how Russian actors used social media posts and ads to sow divisiveness and spread propaganda.
Facebook, along with other web platforms such as Twitter and Google, is facing hearings this week on Capitol Hill that will look into the ways their platforms were exploited.
As many as 126 million people, or roughly one-third of the United States population, may have seen material posted by a Russian troll farm under fake Facebook identities between 2015 and 2017, according to testimony by Facebook's general counsel.
It is the largest figure disclosed so far about the possible reach of Russian operatives, including during last year's presidential election.
Testimony before the Senate Judiciary Subcommittee on Crime and Terrorism revealed that the Russian attempt to influence U.S. voters, which drew on the power of social media platforms and an understanding of hot-button social issues, was much broader than originally thought.
Facebook General Counsel Colin Stretch said the goal was "to try to sow division and discord and to try to undermine our election process."
Stretch called the content of the Russian-bought ads "deeply disturbing."
The hearings and new disclosures cast a harsh spotlight on the immense power of the tech companies at a time when there is renewed interest in greater regulation for the industry.
"Many of the ads and posts we've seen so far are deeply disturbing, seemingly intended to amplify societal divisions and pit groups of people against each other. They would be controversial even if they came from authentic accounts in the United States. But coming from foreign actors using fake accounts they are simply unacceptable," Stretch's testimony said.
"Our goal is to bring people closer together. These foreign actors sought to drive people apart," said Stretch.
Under pressure in advance of hearings on Russian election interference, Facebook is moving to increase transparency for everyone who sees and buys political advertising on its site.
Executives for the social media company said Friday they will verify political ad buyers in federal elections, requiring them to provide correct names and locations, and will create new graphics that users can click to find out more about who is behind an ad.
Google, which had previously declined to comment on its internal investigation, broke its silence in a recent blog post, confirming that it discovered about $4,700 worth of search-and-display ads with dubious Russian ties.
It also reported 18 YouTube channels associated with the Kremlin's disinformation efforts, as well as a number of Gmail addresses that "were used to open accounts on other platforms."
Yet one of Google's biggest challenges, much like Facebook's and Twitter's, is its handling of organic content, including videos uploaded by RT, a Russian government-funded news network that has been called a propaganda arm of the Kremlin and whose videos have millions of views on YouTube. In its investigation, however, Google said it "found no evidence of manipulation of our platform or policy violations." As a result, Google said that RT and other state-sponsored media outlets remain "subject to our standard rules."
Twitter, for its part, recently banned RT from advertising on its platform, though the publication is still allowed to tweet there. Facebook has announced no change.
Twitter also plans to unveil two key findings during its testimony to Congress, sources told Recode on Monday. Chief among them: the company's acting general counsel, Sean Edgett, said Twitter had discovered and suspended roughly 2,752 accounts tied to known Kremlin trolls.
Edgett said, "Trust and safety are our two top priorities and we are making changes."