Facebook security chief to leave after “a difficult three years” ahead of US midterms

Facebook is gearing up to tackle potential Russian interference in the US’s upcoming midterm elections. The move comes as part of Facebook’s much-publicised strategy to prevent a repeat of 2016, when the social media platform conceded its “enormous responsibility” for Russian interference in the US presidential elections.


In the wake of Facebook’s change of security tack comes the news that the company’s security chief, Alex Stamos, is leaving the company. Stamos, who has been at the firm since 2015, will leave to join Stanford University as a fellow. Speaking to the New York Times, the outgoing security boss described his term at Facebook as “a difficult three years”. 


Stamos went on to detail in a Facebook post on Wednesday that the experience had been enriching, if more than a tad challenging: “I have been proud to work with some of the most skilled and dedicated security professionals in the world in one of the most difficult threat environments faced by any technology company,” he wrote. 

The decision comes at a critical time for Facebook; with the US midterm elections around the corner, the company is upping its game in a bid to clamp down on Russian interference.

A post on the Facebook newsroom entitled “Removing Bad Actors from Facebook” reveals how the social media site has removed a network of 32 suspected Russian-linked accounts and pages designed for political organisation in the US. The discovery reportedly marks the biggest effort by suspected Russian actors to interfere in US politics in the run-up to the elections.


While Facebook hasn’t confirmed that the network is of Russian origin, the company revealed that it has “found evidence of some connections between these accounts” and the ones run by Russian fraudsters ahead of the 2016 election. The primary group in question is the Russian-based Internet Research Agency (IRA). Facebook maintained, however, that “we are not going to attribute this activity to any one group right now.”

This stems from the platform’s current inability to unearth the true identity and origin of the pages; Facebook writes that the ominously termed “bad actors” have been “more careful to cover their tracks,” something the company says is “in part due to the actions we’ve taken to prevent abuse over the last year”.


Meanwhile, the content of the pages in question was varied, with reports that one promoted a “No Unite the Right 2” march – a counter-demonstration against the follow-up to 2017’s “Unite the Right” rally in Charlottesville, at which a woman was killed. The page reportedly drew in real activists, who “unwittingly helped build interest in the event,” posting information on location, transport and materials for the protests. Facebook expects more details to come to light as the investigation continues.

Why is it so important that Facebook is proactive? Midterm elections are mammoth political events in the US: a third of the Senate is up for election, along with the entire House of Representatives. With the capacity to strengthen or weaken the President’s clout in government, the results will set the immediate trajectory of US politics. For Facebook to sway those results by providing a platform for fake news and Russian interference would be to corrupt the course of democracy.

You can read more on what Facebook is doing to cull the platform of “bad actors” in its newsroom post here.

Image credit: San Antonio Express News 
