The scandal that has erupted around Cambridge Analytica’s alleged harvesting of 50m Facebook profiles, assembled from data provided by a UK-based academic and his company, is a worrying development for legitimate researchers.
Political data analytics company Cambridge Analytica – which is affiliated with Strategic Communication Laboratories (SCL) – reportedly used Facebook data after it was handed over by Aleksandr Kogan, a lecturer at the University of Cambridge’s department of psychology.
Kogan, through his company Global Science Research (GSR) – separate from his university work – gleaned the data from a personality test app named “thisisyourdigitallife”. Roughly 270,000 US-based Facebook users voluntarily responded to the test in 2014. But the app also collected data on those participants’ Facebook friends without their consent.
This was possible due to Facebook rules at the time that allowed third-party apps to collect data about a Facebook user’s friends. The Mark Zuckerberg-run company has since changed its policy to prevent such access to developers.
Whistleblower Christopher Wylie, who previously worked as a contractor at Cambridge Analytica, told the Guardian that the company used the data to target American voters ahead of President Donald Trump’s victory in 2016. He claimed that Cambridge Analytica was a “full-service propaganda machine”.
Cambridge Analytica has denied any wrongdoing and said that the business tactics it used are widespread among other firms. For his part, Kogan insists that what he did was at all times compliant with the law – and also says, according to CNN, that he would be happy to testify before US Congress and talk to the FBI about the work he did for the company.
Facebook said on March 18 that it had suspended SCL, alleging that Kogan had “lied to us and violated our platform policies by passing data from an app that was using Facebook login to SCL/Cambridge Analytica.” Facebook states under part three of its platform policy that developers do not have permission to “transfer any data that you receive from us (including anonymous, aggregate, or derived data) to any ad network, data broker or other advertising or monetisation-related service.”
In a statement to Cambridge News, the University of Cambridge said:
We are aware that Dr Kogan established his own company, Global Science Research (GSR), of which SCL/Cambridge Analytica was a client. It is not uncommon for Cambridge academics to have business interests, but they must satisfy the university that these are held in a personal capacity and that there are no conflicts of interest.
It is our understanding that the thisisyourdigitallife app was created by GSR. Based on assurances from Dr Kogan as well as the evidence available to us, we have no reason to believe he used university data or facilities for his work with GSR, and therefore that there is no reason to believe the university’s data and facilities were used as the basis for GSR’s subsequent work with any other party.
A day after the Cambridge Analytica scandal hit, Facebook’s shares plummeted on Wall Street amid the privacy backlash. But could the incident affect legitimate academic research?
Social media data is a rich source of information for many areas of research in psychology, technology, business and humanities. Some recent examples include using Facebook to predict riots, comparing the use of Facebook with body image concern in adolescent girls and investigating whether Facebook can lower levels of stress responses, with research suggesting that it may both enhance and undermine psycho-social constructs related to well-being.
Researchers and their employers rightly value research integrity. But instances where an academic betrays that trust – even if, as here, data gathered for university research purposes was not itself involved – will damage participants’ willingness to continue trusting researchers. Such cases also have implications for research governance and for companies’ willingness to share data with researchers in the first place.
Universities, research organisations and funders govern the integrity of research with clear and strict ethics procedures designed to protect participants in studies, such as where social media data is used. The harvesting of data without permission from users is considered an unethical activity under commonly understood research standards.
The fallout from the Cambridge Analytica controversy is potentially huge for researchers who rely on social networks for their studies, where data is routinely shared with them for research purposes. Tech companies could become more reluctant to share data with researchers. Facebook is already extremely protective of its data – the worry is that it could become doubly difficult for researchers to legitimately access this information in light of what has happened with Cambridge Analytica.
Clearly, it’s not just researchers who use profile data to better understand people’s behavioural patterns. Marketing organisations have been profiling consumers for decades – if they know their customers, they will understand the triggers that prompt a purchase of their product, enabling them to adjust marketing messages to improve sales. This has become easier with digital marketing – people are constantly tracked online, their activities are analysed using data analytics tools and personal recommendations are made. Such methods are core to the business strategies of tech giants such as Amazon and Netflix.
Information from online behaviour can be used to predict people’s mood, emotions and personality. My own research into Intelligent Tutoring Systems uses learner interactions with software to profile personality type so it can automatically adapt tutoring to someone’s preferred style. Machine learning techniques can combine theories from psychology with new patterns found – such as Facebook “likes” – to profile users.
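The kind of like-based profiling described above can be illustrated with a minimal sketch. This is not Kogan’s actual model – the page names, weights and trait are invented for illustration. In real research of this kind, the weights would be learned from survey-labelled training data (e.g. by regression), but the basic idea is the same: treat each “like” as a binary feature and map the feature vector to a trait score.

```python
import math

# Hypothetical weights: how strongly liking each page is assumed to signal
# the "openness" personality trait (positive = more open). These values are
# invented for illustration; real studies estimate them from labelled data.
WEIGHTS = {"page_philosophy": 0.9, "page_skydiving": 0.6, "page_knitting": -0.2}
BIAS = -0.5

def openness_score(likes):
    """Map a set of liked pages to a score in (0, 1) via a logistic link."""
    z = BIAS + sum(WEIGHTS.get(page, 0.0) for page in likes)
    return 1.0 / (1.0 + math.exp(-z))

# Two toy users with different "like" profiles.
user_a = {"page_philosophy", "page_skydiving"}
user_b = {"page_knitting"}

print(openness_score(user_a) > openness_score(user_b))  # prints True
```

With a few hundred such weighted features, models of this shape have been reported to predict personality traits surprisingly well – which is precisely why access to friends’ like data, collected without consent, was so valuable.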
Eli Pariser, CEO of viral content website Upworthy, has been arguing against personalisation tools since 2011. He has warned against the dangers of information filtering, and believes that the use of algorithms to profile people and show them information tailored to personal tastes is bad for democracy.
While these fears appear to be borne out by some allegations levelled against Cambridge Analytica, it’s worth noting that there has been no evidence to show that US votes were swung in favour of Trump due to Cambridge Analytica’s psychometric tool.
However, given his academic status, Kogan’s apparent decision to transfer Facebook data for commercial ends in violation of the social network’s policies could yet have explosive consequences, not least because researchers might find it more difficult to get Facebook – and its users – to agree to hand over the data for research alone.