Jamie Bartlett

Finally, politicians have realised how to hold Facebook to account

This week, the Digital, Culture, Media and Sport Committee looking into fake news convened for a special session. For the first time since 1933, when the joint committee on Indian constitutional reform included parliamentarians from India, politicians from nine other countries joined Damian Collins and other MPs to cross-examine Facebook and others. A couple of days earlier, Collins had used an arcane parliamentary procedure to send a representative of the Serjeant at Arms to a hotel room to squeeze documents from an app developer who made software to locate Facebook photos of people wearing bikinis. (You read that right.) This, it seems, is what it takes to hold large international tech firms to account these days.

I’m often asked whether politicians can really keep up with tech. Isn’t it too complicated, and politicians too techno-illiterate? After watching the summer’s hearings, where Facebook boss Mark Zuckerberg waltzed through absurd show-trials at the US Senate and at the European Parliament, it seemed so. After all, technology is complex beyond words; I doubt even Zuck himself understands most of Facebook’s code. And following data trails around can be infuriating. The Information Commissioner’s Office is currently trawling through 700 terabytes of data relating to Cambridge Analytica. (That’s millions of pages.) When Zuck was up against the Senate earlier this year, Senator Orrin Hatch famously asked how Facebook makes money, given it’s free to use. ‘Senator, we run ads’, Zuck replied. At that point I’d almost given up. But that verdict would have been premature. This DCMS Committee, while not perfect, is showing that it can be done. It just takes determination, time, flexibility, international co-ordination – and MPs willing to learn.

Technically, the inquiry is titled ‘Disinformation and Fake News’. That was the original plan, but it got sucked into the Cambridge Analytica vortex and decided to keep going. The most glaring issue it’s grappling with is this one: a company called GSR had lots of people fill in personality surveys on Facebook. When those people did, GSR collected Facebook data about them. This is pretty normal, except it collected data about their friends too. Tens of millions of them, in fact, which it then sold to Cambridge Analytica, who used it to build a model about personality types. (There have been some wild and unproven claims about the effectiveness of this model, and none of this data was used in the Brexit referendum, although some people mistakenly think it was.) Even though this was in breach of Facebook’s terms, did Facebook do enough to stop it? What did the company know, and when? What other instances are there like this? Might there be more in future?

I realise this sounds like tedious technicality. But as the world becomes datafied, as elections are won and lost using modelling and micro-targeting, these are the issues on which the democratic health of the nation will depend. Plus, the Cambridge Analytica/GSR issue is just one of several related questions about Facebook and democracy: the emails snatched from the bikini hunter suggest a Facebook engineer warned the company about Russian efforts to steal data years back. Then there’s the issue of whether they’re doing enough about fake news. And on, and on.

There’s an interesting cultural divide in tech firms between the engineers – the real bosses – and the policy people who have to go out to defend them. In the stocks on Tuesday was Richard Allan, Facebook’s ‘head of policy solutions’. He’s generally thought of as a thoroughly decent chap and also happens to be a member of the House of Lords. He’s a wily performer – he acknowledges problems, he accepts criticisms, keeps his cool, and says they’re working on it. And to be fair, over the last few months Facebook has dramatically improved its efforts on these issues, including cracking down on monetised fake news. But the Committee weren’t having it:

Charlie Angus (Canadian MP): When we talk about regulation, would you be interested in asking your friend Mr Zuckerberg if we should have a discussion about antitrust?

Richard Allan: It depends on the problem we are trying to solve. If the challenge is around—

Charlie Angus: What if the problem is Facebook? That is the problem.

Ouch! Parliamentarians all over the world are a bit fed up with the brash, striding Silicon Valley colossus that’s disrupting democracy. Some of the Committee’s work is a showy muscle flex. You listen to me, Mr Facebook, I represent people! Why isn’t Zuckerberg here himself – does he not care about democracy? They ‘empty chaired’ Zuck and tweeted a photograph of it.

What impressed me wasn’t the chest beating, though, but how much they’d swotted up. An ordinary punter listening in on Tuesday (which, I admit, is unlikely) would have been petrified by the tedium. Allan explained that, well, in version one of Facebook’s API, developers could get data from users, plus certain data from friends depending on permissions granted, but (and I want to be clear about this) following a rule change in 2014 it was only users themselves, although ‘whitelisted’ accounts had… You see what I mean? But the MPs were on to it. Give us an example, persisted Ian Lucas, of when you’ve taken action against a developer who breached these rules. Allan prevaricated. ‘Can you name one case?’ Lucas went on. And so on – for hours and hours.
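For readers who want the tedium made concrete: the permission change Allan was describing can be sketched in a few lines of code. This is purely a hypothetical illustration – none of these names are Facebook’s real API, and the ‘whitelisting’ detail is simplified – but it captures the logic at issue: under version one, a single user’s consent could sweep in their friends’ data too; after the 2014 change, an ordinary app got only the consenting user.

```python
# Hypothetical sketch of the permission logic the Committee probed.
# These names are illustrative, not Facebook's actual API.

def readable_profiles(app, user, friends, api_version):
    """Which profiles can `app` read when `user` consents?"""
    profiles = [user]  # the consenting user's own data is always in scope
    if api_version == 1 and "friends_data" in app["granted_permissions"]:
        profiles += friends  # v1: the user's consent exposed their friends too
    elif api_version >= 2 and app.get("whitelisted"):
        profiles += friends  # post-2014: only specially whitelisted apps
    return profiles

survey_app = {"granted_permissions": {"friends_data"}, "whitelisted": False}
alice, her_friends = "alice", ["bob", "carol"]

# Under v1, one survey-taker exposes the friends who never consented:
assert readable_profiles(survey_app, alice, her_friends, 1) == ["alice", "bob", "carol"]
# After the 2014 rule change, the same app gets only the consenting user:
assert readable_profiles(survey_app, alice, her_friends, 2) == ["alice"]
```

This is why the GSR surveys mattered: a few hundred thousand consenting survey-takers could, under the old rules, yield data on tens of millions of their friends.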

This is exactly what’s needed. Most of us don’t care about these finicky details, and never will. But these are now the things – who had the data, when, who knew, on what terms – on which the integrity of our democracy will hinge. Facebook and Allan must now go back and dig up more answers, including rebutting a claim from one witness – Ashkan Soltani, former head of technology at the Federal Trade Commission – that Allan misled the Committee. They’re damn persistent, this lot.

In future, Committee hearings like this will be far more common. We’ll be worrying about internet-enabled devices in the home, about biased machine-learning algorithms, about cryptocurrency fraud, and complex data-sharing scams. It’s not technical savvy that’s needed – it’s determination, patience, and lots of churning through boring files. Future committees might have to creatively get hold of USB sticks from strange people, and work together with international colleagues. They might even need to allow people to wear hoodies and trainers when giving evidence. And with enough time and doggedness, they might actually work.