Bob talks with Princeton scholar Orestis Papakyriakopoulos about the social media titan’s latest assault on transparency, and the all-too-familiar blame-shifting that followed it. That has become standard operating procedure for a company Bob describes as “amoral, except when it’s immoral.”

TEDDY ROOSEVELT: Surely there never was a fight better worth making than the one which we are in.

BOB GARFIELD: Welcome to Bully Pulpit. That was Teddy Roosevelt, I'm Bob Garfield. Episode 4: It Wasn't Me, It Was My Dog. Last week, Facebook abruptly shut down a research program by scholars at New York University's Ad Observatory who had been monitoring the company's political advertising inventory.

NEWSCASTER: Now, this whole battle started on Tuesday when Facebook disabled the accounts of researchers at the NYU Ad Observatory, Facebook explaining, quote, “NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook in violation of our terms of service. We took these actions to stop unauthorized scraping and protect people's privacy in line with our privacy program under the FTC order.”

BG: Yes, Facebook's product management director, Mike Clark, claimed in a blog post that the company's hands were tied by the government. You know, just like Son of Sam claimed it was his dog who ordered him to kill. Within 24 hours, Wired magazine and others revealed that the FTC consent order provided no such thing. Even the agency's Bureau of Consumer Protection weighed in, with acting director Samuel Levine writing to Facebook founder Mark Zuckerberg, quote, “I am disappointed by how your company has conducted itself in this matter.” Please note that Levine didn't say surprised, just disappointed, because the history of Facebook is the history of Facebook conducting itself in disappointing ways, voicing shame and regret from the bottom of its heart, and then returning to deceptive and greedy business as usual.

MARK ZUCKERBERG (MONTAGE): We didn't take a broad enough view of our responsibility, and that was a big mistake and it was my mistake. This was a major breach of trust and, and I'm really sorry that this happened. We have a basic responsibility to protect people's data. And if we can't do that, then we don't deserve to have the opportunity to serve people.

NEWSCASTER: In 2003, Zuckerberg apologized in the Harvard Crimson for any harm done after his website FaceMash asked users to rate people's hotness. Three years later, Zuckerberg said Facebook, quote, “really messed this one up,” following user complaints that the newly launched News Feed invaded their privacy.

NEWSCASTER: Zuckerberg apologized once again in 2007 for an uproar over the company's Beacon advertising system, saying, “I know we can do better.”

BG: That last part courtesy of CBS News. So the FTC wasn't surprised about the latest phony excuse for systematic opacity, and neither was Orestis Papakyriakopoulos, a postdoctoral research director at Princeton University's Center for Information Technology Policy. He's speaking to me from Athens, Greece. Orestis, welcome to Bully Pulpit.

ORESTIS PAPAKYRIAKOPOULOS: Glad to be here, Bob.

BG: All right, we'll get to your work shortly. But I want to begin with the NYU project. What were they studying?
OP: So, the NYU researchers had an Ad Observatory project. They were trying to monitor what ads are placed on Facebook and who sees them, like which demographics are targeted and so on, in order to provide additional transparency into how online advertising takes place.

BG: And what was the method? Were they, in fact, scraping content or metadata from the site in some clandestine fashion, as Facebook alleged?

OP: No, actually, they developed a plugin that you put on your browser, the Ad Observer, and they asked users all over the world to use it. Practically, the plugin was recording what the users saw, so in this way they could see which ads a user was targeted with.

BG: Wait, so when Facebook invoked protecting user privacy, all of the users had proactively downloaded the browser extension and were giving explicit permission to the NYU people to see what ads they were being served.

OP: Exactly, but when Facebook uses the term users, they mean the advertisers who placed the ads. The advertisers did not give their permission to NYU to collect the information about the targeted ads.

BG: [chuckling]

OP: Yeah, exactly.

BG: I see, so the advertisers who pay money to have their ads seen were skittish about having their ads seen.

OP: Exactly.

BG: Now, the whole point of the Facebook algorithm is that consumers get more and more content they have demonstrated interest in by clicking on it or commenting or sharing. That very same algorithm, though, takes the same user behavior data and allows advertisers to micro-target exactly the consumer profile they're most interested in, whether to buy a car or toothpaste or a political worldview.

OP: Yeah, so Facebook's business model to this day is to use the data they collect to place personalized advertisements. They sell the space and they sell the tools they've developed so advertisers can place their ads.

BG: Selling the tools they've developed. This gets to the next sensitive area of privacy, because the FTC order that the company invoked last week came with a five billion dollar fine for violating an earlier 2012 consent decree, after Facebook was caught being not only careless but mercenary with users' personal data. Can you remind me what the specifics were of the original complaint?

OP: Sure. So back in 2012, the FTC claimed that Facebook was violating numerous privacy rules. More specifically, for example, users believed that they had set their accounts to private, or that some information on their profiles was not public, but advertisers still had the opportunity to collect this data. Another example of what was violated back then is that although users were deleting their profiles or taking their information down, third-party entities were still able to collect this data, even though the users had revoked their consent on the platform.

BG: So then came the new order in 2019, in which the FTC found Facebook to be, quote, “deceiving users about their ability to control the privacy of their personal information.” Can you summarize the 2019 case?
OP: Sure. So going back to 2012: because Facebook violated specific rules, the FTC said that Facebook needed to change how it functions, to make clearer representations of what holds in privacy terms and what does not, to inform users, and to switch off all the back doors that gave data about users to third parties. And although Facebook started doing that, what happened, for example, is that although new apps were not able to get this data, if you had an older app, you could still collect information. And this is the window that was also exploited by Cambridge Analytica: the company used an app that had been created in the past for a different purpose and started collecting data about users, data the users had not given their consent to share with the company.

BG: And this wasn't like, oops, careless of me. This had to have been done with malice aforethought.

OP: Yeah. So definitely Cambridge Analytica did it because they found an opportunity there to collect all this data. I don't know if Facebook knew about the backdoor or not, but definitely they did not do their job right.

BG: And then sat on the information for two years before the story finally blew up in the media.

OP: And going back now to 2019, the FTC said, hey, Facebook did not conform to our demands. There are still issues with data privacy, and Facebook needs to conform to the older rules. Plus, there were some new issues that appeared. For example, Facebook needed to provide more transparency into how it uses face recognition technology on its platform. The FTC also implemented stronger accountability mechanisms for cases in which Facebook violates the rules, and so on.

BG: So once again, disappointing but unsurprising. And just as was the case with Cambridge Analytica, simply astonishing indifference to the abuse of its targeting algorithm, whether permitting Trump-friendly or Boris Johnson-friendly foreign agents to spread toxic lies in a political campaign, or the Myanmar Buddhist military to incite pogroms with false accusations against the Muslim Rohingya minority. I've often described the company as amoral, except when it is immoral. Would you care to argue against that proposition?

OP: So definitely Facebook, like every company, looks after its self-interest. That is what they were doing in the past and what they keep doing now. Their model is to collect as much data as they can and find ways to sell it to get the most profit out of it. That also means not disclosing a lot of the things that are going on on the platform, because these might make them accountable and also impose restrictions on their business model.

BG: And in fact, in the Cambridge Analytica affair, there were a number of universities and the United States Senate trying to look into how it could all have taken place. Facebook vowed transparency, but instead actually tried to stymie some researchers by failing to make its API fully available, and so on. How cooperative were they even when they were most in the crucible following Cambridge Analytica?

OP: Generally, I think that the transparency efforts of Facebook belong more to the marketing part of the company rather than an actual effort to be more open with scientists and policymakers and so on. So they always try to give minimal data, under rules that protect them 100 percent. And also, the quality of the data and information they provide usually is not able to answer key questions about the nature of the platform, how does it…