November 1: The Congressional grilling continues
On Wednesday, the general counsels of Facebook, Twitter, and Google returned to Capitol Hill for two more hearings before the Senate and the House Intelligence Committees.
During the morning session, Sen. Dianne Feinstein had some harsh words for the tech representatives.
"You created these platforms ... and now they're being misused," she said. "And you have to be the ones who do something about it — or we will."
Many political observers, including members of Congress, also expressed disappointment that the companies' top leaders declined to publicly answer questions about the serious charge that a foreign actor used their networks to meddle in the US election.
Sens. Martin Heinrich of New Mexico and Angus King of Maine were among those who reprimanded the companies' CEOs for not showing up.
"I'm disappointed you're here and not your CEOs," King told the companies' lawyers.
October 31: Facebook testifies before Congress
Republicans and Democrats grilled Facebook executives during a hearing over the company's slow response during and after last year's election. Senators said the social networking powerhouse had been too slow in exposing Russian propaganda.
"Why has it taken Facebook 11 months to come forward and help us understand the scope of this problem, see it clearly for the problem it is, and begin to work in a responsible legislative way to address it?" Sen. Chris Coons of Delaware asked.
During the hearing, Facebook General Counsel Colin Stretch admitted the company could have done more early on.
"In hindsight, we should have had a broader lens," Stretch said. "There are signals we missed."
October 30: Russian posts on Facebook reached 126 million Americans
Just before Facebook was due to testify on Capitol Hill, the social network revealed that the Internet Research Agency, a shadowy Kremlin-backed company, made 80,000 posts from 2015 to 2017 that reached up to 126 million Americans.
Most of the posts focused on divisive social and political messages such as race relations and gun rights.
October 2: 'We take responsibility'
Joel Kaplan, Facebook's vice president of global policy, announced he would be giving 3,000 Russian-linked ads to Congress for review.
He also outlined new policies, including tightening restrictions on ad content, increasing authenticity requirements, and hiring more than 1,000 people to review and flag improper ads.
"We care deeply about the integrity of elections around the world," Kaplan said. "We take responsibility for what happens on our platform and we will do everything we can to keep our community safe from interference."
September 30: 'I ask for forgiveness'
At the end of Yom Kippur, Zuckerberg, who is Jewish, asked for forgiveness from those he hurt.
"For the ways my work was used to divide people rather than bring us together, I ask forgiveness and I will work to do better," he wrote in another Facebook post.
September 27: Zuckerberg regrets dismissing Russian misinformation
In another Facebook post, Zuckerberg said he regrets not taking Russian interference seriously in the beginning.
"After the election, I made a comment that I thought the idea that misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive and I regret it," Zuckerberg said. "This is too important an issue to be dismissive."
September 21: Zuckerberg confirms Facebook is fully cooperating with the government
In a post on Facebook, Zuckerberg said the company was actively working with the government on its ongoing investigations into Russian interference in the election.
"We have been investigating this for many months, and for a while we had found no evidence of fake accounts linked to Russia running ads," Zuckerberg said. "When we recently uncovered this activity, we provided that information to the special counsel."
He also said Facebook was looking into other Russian groups and former Soviet states, and expanding its review of campaigns, to further uncover nefarious activity online.
September 6: Evidence of Russian-backed ads emerges
Alex Stamos, Facebook's chief security officer, announced that an internal review found "approximately $100,000 in ad spending ... associated with roughly 3,000 ads" tied to accounts likely operated out of Russia.
Stamos added that the ads violated Facebook's policies because they came from inauthentic accounts.
This was the first piece of hard evidence that Facebook made available to the public showing Russia's efforts to manipulate opinion online.
July: 'No evidence Russian actors bought Facebook ads'
Top investigators in the Senate's Russia investigation looked to Facebook for answers about the Kremlin's involvement in spreading propaganda online. Sen. Mark Warner of Virginia said he met with Facebook officials in California to discuss Russia's election interference.
Facebook agreed to cooperate with the investigation.
But in a statement to CNN, a company spokesperson said "we have seen no evidence that Russian actors bought ads on Facebook in connection with the election."
April 27: Facebook says it is cracking down on misinformation campaigns
The company's security executives announced that they expanded their security focus "from traditional abusive behavior, such as account hacking, malware, spam, and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people."
The executives also acknowledged that "these are complicated issues and our responses will constantly evolve, but we wanted to be transparent about our approach."
February 16: The Facebook Manifesto
Zuckerberg released his vision of Facebook's role in the world in a lengthy open letter to the public.
"It is our responsibility to amplify the good effects and mitigate the bad — to continue increasing diversity while strengthening our common understanding so our community can create the greatest positive impact on the world," he wrote.
January 18, 2017: Facebook blocks Russian state news network
Just days before Trump's inauguration on January 20, Facebook temporarily blocked RT, the Kremlin-backed news organization, from posting links and other media on its Facebook page.
Russian officials lambasted the social media company, calling the ban "unacceptable," and warning of "active countermeasures."
The 72-hour ban was lifted shortly after Trump was inaugurated. Some pundits wondered whether this was the beginning of Facebook's efforts to crack down on groups that share hoaxes and propaganda on its platform.
January 6, 2017: US intelligence report concludes Russia interfered in the election
A declassified intelligence report directly accused Russian President Vladimir Putin of ordering "an influence campaign in 2016 aimed at the US presidential election," and concluded that social media played a major role.
"Russia’s goals were to undermine public faith in the US democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency," the report said.
There was no immediate response from Facebook.
December 15, 2016: Zuckerberg announces plan to fight fake news
Amid growing criticism, Facebook said it would partner with fact-checking organizations, including Snopes, ABC News, PolitiFact, and FactCheck.org, to combat fake news.
Some free speech activists worried that fact-checking wouldn't be applied equally to both sides of the political spectrum and that unpopular opinions could be suppressed.
"I understand how sensitive this is and I have instructed our team to proceed carefully and focus on fighting spam, not flagging opinions," Zuckerberg said. "For example, we're focused on obvious hoaxes with headlines like 'Michael Phelps just died of a heart attack' designed to get people to click on the stories and see ads."
November 10, 2016: Mark Zuckerberg dismisses Russia's influence
Just two days after voters elected President Donald Trump, Facebook CEO Mark Zuckerberg said there was only a "small amount" of fake news on his platform.
He also downplayed Facebook's role in influencing voters.
"To think it influenced the election in any way is a pretty crazy idea," he said. "I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news."
Source: Business Insider India