British research firm Cambridge Analytica has hit the headlines this week after an undercover investigation by Channel 4 News, alongside evidence given to the press by a whistleblower with ties to the organisation, uncovered some rather unsavoury working practices. There’s a lot to digest, especially if you’re not familiar with social networks and how digital data is used for advertising and campaigning. If that’s the case, we’ve put together this quick-fire guide to get you up to speed on an issue that could have profound implications for how we protect our digital data going forward.
Cambridge Analytica is a British-based consumer data research organisation that was established in 2013 by Alexander Nix to help businesses and political parties to understand and exploit audience behaviour.
Its researchers take the vast amounts of data that can be collated on people online – such as the information contained in their social media accounts – and apply behavioural science to help its customers target and influence those people.
Some of Cambridge Analytica’s most successful early campaigns involved US elections. In 2013, the company received $15 million in funding from Republican party donor Robert Mercer, and went on to win contracts with various Republican candidates contesting political seats, including Ted Cruz’s presidential campaign. Mercer, a fervent Trump supporter, also became a shareholder in Cambridge Analytica as the company continued to forge strong links with right-wing customers eager to influence audiences and election results.
“We collect up to 5,000 data points on over 220 million Americans and use more than 100 data variables to model target audience groups and predict the behaviour of like-minded people,” Cambridge Analytica boasts. In 2016, the company was hired by the soon-to-be-successful Trump presidential campaign.
Facebook’s very business model is founded on harvesting user data and selling advertisers access to the audiences that data reveals, but the social network’s relationship with Cambridge Analytica has taken that notion to an unprecedented level.
In 2014, Cambridge Analytica – which was in the process of mapping users’ online identities to determine their political biases, likes and dislikes, and potential voting intentions – bought into an app developed by British-based academic Aleksandr Kogan for Facebook users that acted as a personality survey. To gain access to the app, which was marketed as a fun, shareable personality quiz akin to the hundreds of other quizzes and games common on Facebook, Kogan required Facebook users to allow it to scrape their personal profile details – an action now banned by Facebook, but one that was all above board at the time.
As a result – as further revealed this month by whistleblower Christopher Wylie, who worked on the app with Kogan – the app provided Kogan and Cambridge Analytica with the raw profiles of more than 50 million Facebook users. While only around 270,000 people actually used the app, Kogan was able to access the profiles of those users’ friends to gather even more data. Ethical considerations around using personal data to influence elections and political campaigns, such as Brexit, aside, the big question now is: did Cambridge Analytica and Kogan break the law?
For starters, British MPs have sent a formal request to Facebook CEO Mark Zuckerberg asking him to answer questions regarding the acquisition and use of private user data.
Last Friday, Facebook itself issued a statement saying that it had removed Kogan’s app from Facebook after it had learned that his research had been given to Cambridge Analytica.
At the time, the data from the app was gathered through Facebook’s own platform and was not taken illegally, but Facebook argues that Kogan was not authorised to share the data with others. Another legal point of contention is that users of the app were not made aware that their profile information might be used in Donald Trump’s election campaign.
Cambridge Analytica itself claims that it never used the data and deleted it when requested to by Facebook. Verifying that claim is now the crux of the investigation.
Facebook said in a statement: “Several days ago, we received reports that, contrary to the certifications we were given, not all data was deleted. We are moving aggressively to determine the accuracy of these claims. If true, this is another unacceptable violation of trust and the commitments they made. We are suspending SCL/Cambridge Analytica, Wylie and Kogan from Facebook, pending further information.”
The UK’s Information Commissioner, who is responsible for the digital data privacy of UK citizens, is applying for a warrant to search the London offices of Cambridge Analytica.
But there’s more. Britain’s Channel 4 News also this week broadcast hidden camera footage of its reporters posing undercover as potential Cambridge Analytica customers. The footage captured the firm’s boss Nix suggesting he could use customer data collected online to help discredit politicians ahead of elections – very sleazy.
Nix has denied any wrongdoing. “I must emphatically state that Cambridge Analytica does not condone or engage in entrapment, bribes or so-called ‘honeytraps’, and nor does it use untrue material for any purpose,” he said in a statement.
Zuckerberg is also due to testify before Congress in the US about the further actions Facebook will take to protect its users’ data, and the European Parliament is now investigating Facebook’s potential data misuse as well.
And regardless of the eventual legal outcomes, for now it’s not good news for Facebook or for the privacy of its hundreds of millions of users. At a time when the social network is already under heavy fire for failing to deal with fake news, unfairly influencing political campaigns in its own right, and not acting quickly enough to shut down terrorist propaganda, its partnership with Cambridge Analytica is only fanning the flames for a company in turmoil.