In which I come out: Notes from the FTC Privacy Roundtable

January 31, 2010 at 3:49 am

I was on a panel at the second FTC privacy roundtable in Berkeley on Thursday. Meeting a new community of people is always a fascinating experience. As a computer scientist, I’m used to showing up to conferences in jeans and a T-shirt; instead I found myself dressing formally and saying things like “oh, not at all, the honor is all mine!”

This post will also be the start of a new direction for this blog. So far, I’ve mostly confined myself to “doing the math” and limiting myself to factual exposition. That’s going to change, for two reasons:

  • The central theme of this blog and of my Ph.D. dissertation — the failure of data anonymization — now seems to be widely accepted in policy circles. This is due in large part to Paul Ohm’s excellent paper, which is a must-read for anyone interested in this topic. I no longer have to worry about the acceptance of the technical idea being “tainted” by my opinions.
  • I’ve been learning about the various facets of privacy — legal, economic, etc. — for long enough to feel confident in my views. I have something to contribute to the larger discussion of where technological society is heading with respect to privacy.

Underrepresentation of scientists

[Photo caption: Living up to the stereotype]

As it turned out, I was the only academic computer scientist among the 35 panelists. I found this very surprising. The underrepresentation is not because computer scientists have nothing to contribute — after all, there were other CS Ph.Ds from industry groups like Mozilla. Rather, I believe it is a consequence of the general attitude of academic scientists towards policy issues. Most researchers consider it not worth their time, and a few actively disdain it.

The problem is even deeper: academics have the same disdainful attitude towards the popular exposition of science. The underlying reason is that the goal in academia is to impress one’s peers; making the world better is merely a side-effect, albeit a common one. The incentive structure in academia needs to change. I will pick up this topic in future posts.

The FTC has an admirable approach to regulation

As I found out in the course of the day’s panels, the FTC is not about prescribing or mandating what to do. Pushing a specific privacy-enhancing technology isn’t the kind of thing they are interested in doing at all. Rather, they see their role as getting the market to function better and the industry to self-regulate. The need to avoid harming innovation was repeatedly emphasized, and there was a lot of talk about not throwing the baby out with the bathwater.

The following potential (non-baby-hurting) initiatives were the most talked about:

  • Market transparency. Markets can only work well when there is full information, and when it comes to privacy the market has failed horribly. Users have no idea what happens to their data once it’s collected, and no one reads privacy policies. Regulation that promotes transparency can help the market fix itself.
  • Consumer education. This is a counterpart to the previous point. Education about privacy dangers as well as privacy technologies can help.
  • Enforcement. A few bad apples have been responsible for the most egregious privacy SNAFUs. The larger players are by and large self-regulating. The FTC needs to work with law enforcement to punish the offenders.
  • Carrots and sticks. Even the specter of regulation, corporate representatives said, is enough to get the industry to self-regulate. Many would disagree, but I think a carrots-and-sticks approach can be made to work.
  • Incentivizing adoption of PETs (privacy enhancing technologies) in general. The question of how the FTC can spur the adoption of PETs was brought up on almost every panel, but I don’t think there were any halfway convincing answers. Someone mentioned that the government in general could go into the market for PETs, which seems reasonable.

As a libertarian, I think the overall non-interventionist approach here is exactly right. I’m told that the FTC is rather unusual among US regulatory agencies in this regard (which makes sense, considering that the FCC, for example, spends its time protecting children from breasts when it is not making up lists of words.)

Facebook’s two faces

Facebook public policy director Tim Sparapani, who was previously with the ACLU, made a variety of comments on the second panel that were bizarre, to put it mildly. Take a look (my comments are in sub-bullets):

  • “We absolutely compete on privacy.”
    • That’s a weird definition of “compete.” Facebook has a history of rolling out privacy-infringing updates, such as Beacon, the ToS changes, and the recent update that made the graph public. Then they wait to see if there’s an outcry and roll back some of the changes. It is hard to think of another company that has had such a cavalier approach.
  • “There are absolutely no barriers to entry to create a new social network.”
    • Except for that little thing called the network effect, which is the mother of all barriers to entry. In a later post I will analyze why Facebook has reached a critical level of penetration in most markets, which makes it nearly unassailable as a general-purpose social network.
  • “Our users have learned to trust us.”
    • I don’t even know what to say about this one.
  • “We are a walled garden.”
    • Sparapani is confusing two different senses of “walled garden” here. This was said in response to a statement by the Google rep about Google’s features to let users migrate their data to other services (which I find very commendable). In this sense, Facebook is indeed a walled garden, and doesn’t allow migration, which is a bad thing. But Sparapani said he meant it in the sense that Facebook doesn’t sell user data wholesale to other companies. That sounds like good news, except that third-party app developers end up sharing user data with other entities, because enforcement of the application developer Terms of Service is virtually non-existent.
  • “If you delete the data it’s gone.” (in the context of deleting your account)
    • That might be true in a strict sense, but it is misleading. Deleting all your data is actually impossible to achieve because most pieces of data belong to more than one user. Each of your messages will live on in the other person’s inbox (and it would be improper to delete it from theirs). Similarly, photos in which you appear, which you would probably like gone when you delete your account, still live on in the album of whoever took the picture. The same goes for your pokes, likes and other multi-user interactions. These are the very things that make a social network social.
  • “We now have controls on privacy at the moment you share data. This is an extraordinary innovation and our engineers are really proud of it.”
    • The first part of that statement is true: you can now change the privacy controls on each of your Facebook status messages independently. The second part is downright absurd. It is completely trivial to implement from an engineering perspective (and LiveJournal for instance has had it for a decade).
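The shared-ownership problem behind the deletion claim can be made concrete with a toy model. This is a hypothetical sketch, not Facebook’s actual data model; the `Network` and `Message` classes and all names here are my own illustration of why deleting one account cannot erase co-owned objects:

```python
# Toy model of co-owned data in a social network (hypothetical; not
# Facebook's real schema). A message is owned by both sender and
# recipient, so deleting one account severs only one of its owners.

class Message:
    def __init__(self, sender, recipient, text):
        self.owners = {sender, recipient}  # co-owned by both parties
        self.text = text

class Network:
    def __init__(self):
        self.users = set()
        self.messages = []

    def send(self, sender, recipient, text):
        msg = Message(sender, recipient, text)
        self.messages.append(msg)
        return msg

    def delete_account(self, user):
        """Delete a user and every piece of data they *solely* own."""
        self.users.discard(user)
        for msg in self.messages:
            msg.owners.discard(user)  # sever this owner only
        # A message can disappear only once no owner remains.
        self.messages = [m for m in self.messages if m.owners]

net = Network()
net.users.update({"alice", "bob"})
net.send("alice", "bob", "hello")
net.delete_account("alice")
print(len(net.messages))  # 1: the message survives in Bob's inbox
```

The same structure applies to photos, pokes, and likes: each object has more than one owner, so account deletion can at best remove one entry from the ownership set, never the object itself.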

There were more absurd statements, but you get the picture. It’s not just that Sparapani’s comments were unhinged from reality; the general tone was belligerent and disturbing. I missed a few minutes of the panel, during which he apparently responded to a criticism from Chris Conley of the ACLU by saying “I was at the ACLU longer than you’ve been there.” This is unprofessional, undignified, and a non-answer. Amusingly, he claimed that Facebook was “very proud” of various aspects of its privacy track record at least half a dozen times in the course of the panel.

Contrast all this with Mark Zuckerberg’s comments in an interview with Michael Arrington, which can be summed up as “the age of privacy is over.” That article goes on to say that Facebook’s actions caused the shift in social norms (to the extent that they have shifted at all) rather than merely responding to them. Either way, it is unquestionable that Facebook’s present conduct pays no more than lip service to privacy, and Zuckerberg’s statement is a more-or-less honest reflection of that. On the other hand, as I have shown, the company sings a completely different tune when the FTC is listening.

Engaging privacy skeptics

Aside from Facebook’s shenanigans, I feel that there are two groups in the privacy debate who are talking past each other. One side is represented by consumer advocates, and is largely echoed by the official position of the FTC. The other side’s position can be summed up as “yeah, whatever.” When expressed coherently, there are three tenets of this position (with the caveat that not all privacy skeptics adhere to all three):

  • Users don’t care about privacy any more
  • Even if they do, privacy is impossible to achieve in the digital age, so get over it
  • There are no real harms arising from privacy breaches

[Tweet screenshot; click image to embiggen]

Above is an illustrative example of a mainstream-media representative who was at the workshop covering it on Twitter through the lens of his preconceived prejudices.

Privacy scholars never engage with the skeptics because the skeptical viewpoint appears obviously false to anyone who has done some serious thinking about privacy. However, it is crucial to engage the opponents, because (1) the skeptical view is extremely common, and (2) many of the startups coming out of the valley fall into this group, and they are going to have control over increasing amounts of user data in the years to come.

The “privacy is dead” view was most famously voiced by Scott McNealy. In its extreme form it is easy to argue against: “start streaming yourself live on the Internet 24/7, and then we’ll talk.” (To be sure, a few people did this 10 years ago as a publicity stunt, but it is obvious that the vast majority of people aren’t ready for this level of invasiveness of monitoring/data collection.) But engaging with skeptics isn’t about refutation; it’s about dealing with a different way of thinking and getting the message across to the other side. Unfortunately, real engagement hasn’t been happening.

I have a double life in academia and the startup world, and I think this puts me in a somewhat unusual position of being able to appreciate both sides of the argument. My own viewpoint is somewhere in the middle; I will expand on this theme in future blog posts.



13 Comments

  • 1. Biweekly Links – 02-01-2010 « God, Your Book Is Great !!  |  February 1, 2010 at 10:27 pm

    [...] In which I come out: Notes from the FTC Privacy Roundtable Arvind Narayanan is an expert in privacy and data anonymization . His work on de-anonymizing social [...]

  • 2. Tech and Law  |  February 3, 2010 at 10:32 am

    2nd FTC Privacy Roundtable 2010…

    Those who wish may compare Facebook’s statements on the FTC panel with the comments made by Richard Allan on behalf of Facebook in the recent panel session on internet rights with Google, Vodafone and Open Insights at the London School of Economics.

  • 3. WH  |  February 3, 2010 at 10:39 am

    Thanks for this interesting report. You may perhaps be interested in statements by Facebook’s Director of European Public Policy made at a recent panel with Google etc on internet rights at the London School of Economics.

    • 4. Arvind  |  February 3, 2010 at 5:35 pm

      Thanks for the pointer! Always interesting to compare things across the pond. I’m surprised I hadn’t come across your blog before.. subscribed.

  • 5. Jonathan Katz  |  February 3, 2010 at 11:34 pm

    Sure the general public cares about privacy…until they have to “pay” for it (whether monetarily, or by spending time, or losing out on the coolest new social medium, or whatever). People care about anonymous e-cash, too, and we know how far *that* business model went.

    Now it could be that the government will impose privacy by default, without relying on the market. And maybe this would even be a good idea. But I don’t see it happening.

    (These points are meant to be provocative, I look forward to your replies.)

    • 6. Arvind  |  February 4, 2010 at 1:51 am

      As for your first point, I agree, this is something I’ve been planning to address in a future post: statistics like “66% of users care about privacy” are meaningless without specifying exactly what they will give up to protect that privacy.

      As for the government mandating privacy, I totally disagree. Firstly I think it is unenforceable (as long as we are to have anything remotely resembling a free Internet). Secondly I don’t think it would be a good idea. If a free and efficient market for privacy decides that privacy has little value — say because social norms are indeed changing — then having the government mandate privacy goes against the basic tenet of self-determination in a free society.

  • 7. Mike  |  February 4, 2010 at 5:04 am

    Regarding whether pictures are deleted, Ars Technica has looked into this, as have I, and the answer is that for pictures that are public for even a minute, they don’t ever seem to be deleted from Facebook.

    After the roundtable, I spoke with Tim Sparapani about his error, and pointed him to my research regarding deletion speeds of online pictures: http://michaeljaylissner.com/blog/testing-deletion-speed-of-online-photo-sites

    So far, no response from him or Facebook.

    • 8. Arvind  |  February 4, 2010 at 5:18 am

      That’s interesting. Thanks for letting me know.

  • 9. Lars  |  February 5, 2010 at 9:06 am

    Thanks for the interesting coverage.

    I think you are mostly right about the gap between privacy advocates and privacy skeptics. Statements like “You have zero privacy anyway” are just a show of cluelessness. Nonetheless it is important to argue, and argue well, with the no-privacy people, because it is foremost a problem of social agreement. (And many privacy enhancing techniques only work if a large part of society uses them.)

    Resignation is to be avoided. Even if privacy were nonexistent, wouldn’t it be better to work toward it?

  • 10. The Secret Life of Data « 33 Bits of Entropy  |  February 6, 2010 at 8:48 pm

    [...] the recent FTC privacy roundtable, Scott Taylor of Hewlett Packard said his company regularly had the problem of not being able to [...]

  • 11. ginsu  |  February 9, 2010 at 12:55 am

    Hey Arvind, sorry for the late response – nice to see you “out” -

    One possible correction to your notes: I seem to recall it was Nicole Wong at Google who said “We absolutely compete on privacy.” That was a bit of a contrast to the prior speaker, from LinkedIn, who said that their view of privacy was that it was a retention issue rather than a customer-acquisition factor (i.e. bad privacy practices might cause them to lose customers).

    Then the Facebook guy straddled, saying basically, “I agree with both of them.”

    • 12. Arvind  |  February 9, 2010 at 3:58 am

      Thanks for the note. You’re right that the Facebook guy did straddle at one point, saying he wanted to agree with the Google and the LinkedIn representatives. However, I was referring to a different comment a little bit before that, where the Facebook guy did indeed say “we absolutely compete on privacy.” Of course, Wong may have also made a very similar comment; I don’t remember one way or the other.

  • 13. Google Buzz, Social Norms and Privacy « 33 Bits of Entropy  |  February 11, 2010 at 10:24 pm

    [...] norms in a detrimental way in order to meet their business objectives. This has become a recurring theme (c.f. the section on Facebook in that article). I don’t think there is any possibility of [...]




About 33bits.org

I'm an assistant professor of computer science at Princeton. I research (and teach) information privacy and security, and moonlight in technology policy.

This is a blog about my research on breaking data anonymization, and more broadly about information privacy, law and policy.

For an explanation of the blog title and more info, see the About page.
