The brilliant New York University philosopher Helen Nissenbaum has put her approach to privacy at the center of the national agenda.

[Image: Nissenbaum's approach to privacy focuses on how information flows, not just the information itself.]

PALO ALTO -- A mile or two away from Facebook's headquarters in Silicon Valley, Helen Nissenbaum of New York University was standing in a basement on Stanford's campus explaining that the entire way that we've thought about privacy on the Internet is wrong.

It was not a glorious setting. The lighting was bad. The room was half empty. Evgeny Morozov was challenging her from the back of the room with strings of tough, almost ludicrously detailed questions.

Nissenbaum's March presentation was part of Stanford's Program on Liberation Technology and relied heavily on her influential recent research, which culminated in the 2010 book, Privacy in Context, and subsequent papers like "A Contextual Approach to Privacy Online."

But the most important product of Nissenbaum's work does not have her byline. She's played a vital role in reshaping the way our country's top regulators think about consumer data. As one measure of her success, the recent Federal Trade Commission report, "Protecting Consumer Privacy in an Era of Rapid Change," which purports to lay out a long-term privacy framework for legislators, businesses, and citizens, uses the word context an astounding 85 times!

Given the intellectual influence she's had, it's important to understand how what she's saying differs from other privacy theorists. The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles changes. The key academic term is "context-relative informational norms." Bust a norm and people get upset.
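For the programmers in the audience, here's one way to picture her scheme. This is my own toy sketch, not Nissenbaum's formalism -- the names and structure are mine -- but it captures the basic move: a flow of information is judged against the entrenched norms of its context, parameter by parameter.

```python
# A toy sketch of "context-relative informational norms."
# Illustrative only: the field names are mine, not Nissenbaum's formalism.
from dataclasses import dataclass, astuple

@dataclass(frozen=True)
class Norm:
    context: str                 # the social situation, e.g. "banking"
    sender: str                  # who discloses the information
    receiver: str                # who may receive it
    info_type: str               # what kind of information flows
    transmission_principle: str  # the constraint on the flow

@dataclass(frozen=True)
class Flow:
    context: str
    sender: str
    receiver: str
    info_type: str
    transmission_principle: str

def violates(flow: Flow, norms: list[Norm]) -> bool:
    """A flow busts a norm if no entrenched norm matches all five parameters."""
    return not any(astuple(n) == astuple(flow) for n in norms)

norms = [Norm("banking", "customer", "bank", "account balance", "confidentiality")]

# Same context, same sender, same data -- but a new receiver and a new
# transmission principle. Norm busted; people get upset.
resale = Flow("banking", "customer", "car dealer", "account balance",
              "sold for marketing")
print(violates(resale, norms))  # True
```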

This may sound simple, but it actually leads to different analyses of current privacy dilemmas and may suggest better ways of dealing with data on the Internet. A quick example: remember the hubbub over Google Street View in Europe? Germans, in particular, objected to the photo-taking cars. Many people, using the standard privacy paradigm, were like, "What's the problem? You're standing out in the street. It's public!" But Nissenbaum argues that the reason some people were upset is that reciprocity was a key part of the informational arrangement. If I'm out in the street, I can see who can see me, and know what's happening. If Google's car buzzes by, camera rolling, I can't see the people who will eventually see me, and I haven't agreed to that encounter. Ergo, privacy violation.
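Run the Street View case through the toy model above and the objection falls right out. Again, the labels are my own shorthand, not anything from Nissenbaum's book:

```python
# Continuing the sketch above: on the street, visibility is reciprocal.
street_norm = Norm("public street", "passerby", "other passersby",
                   "appearance", "reciprocal observation")
street_view = Flow("public street", "passerby", "Google Street View",
                   "appearance", "recorded, viewer unseen")
print(violates(street_view, [street_norm]))  # True: same place, same data, busted principle
```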

Nissenbaum gets us past thinking about privacy as a binary: either something is private or something is public. Nissenbaum puts the context -- or social situation -- back into the equation. What you tell your bank, you might not tell your doctor. What you tell your friend, you might not tell your father-in-law. What you allow a next-door neighbor to know, you might not allow Google's Street View car to know. Furthermore, these differences in information sharing are not bad or good; they are just the norms.

Perhaps most importantly, Nissenbaum's paradigm lays out ways in which sharing can be a good thing. That is to say, more information becoming available about you may not automatically be a bad thing. In fact, laying out the transmission principles for given situations may encourage people, both as individuals and collectively, to share more and attain greater good. On a day when a House of Representatives committee is holding a hearing on privacy titled "Balancing Privacy and Innovation" -- which really should be titled "Balancing Privacy and Corporate Interests" -- it's clear that any privacy regulation that's going to make it through Congress has to provide clear ways for companies to continue profiting from data tracking. The key is coming up with an ethical framework in which they can do so, and Nissenbaum may have done just that.

Right now, people are willing to share data for the free stuff they get on the web. Partly, that's because the stuff on the web is awesome. And partly, that's because people don't know what's happening on the web. When they visit a website, they don't really understand that a few dozen companies may collect data on that visit.

The traditional model of how this works says that your information is something like a currency and when you visit a website that collects data on you for one reason or another, you enter into a contract with that site. As long as the site gives you "notice" that data collection occurs -- usually via a privacy policy located through a link at the bottom of the page -- and you give "consent" by continuing to use the site, then no harm has been done. No matter how much data a site collects, if all they do is use it to show you advertising they hope is more relevant to you, then they've done nothing wrong.

It's a free market kind of thing. You are a consumer of Internet pages and you are free to go from one place to another, picking and choosing among the purveyors of information. Never mind that if you actually read all the privacy policies you encounter in a year, it would take 76 work days. And that calculation doesn't even account for all the third parties that drain data from your visits to other websites.

Even more to the point: there is no obvious way to discriminate between two separate webpages on the basis of their data collection policies. While tools have emerged to tell you how many data trackers are being deployed at any site at a given moment, the dynamic nature of Internet advertising means that it is nearly impossible to know the story through time. As I explained in a previous post, advertising space can be sold and resold many times. At each juncture, the new buyer has to have some information about the visit. Ads can be sold by geography or probable demographic indicators, too, so there may be many, many companies that are involved with some of the data on an individual site.

I asked Evidon, the makers of a track-the-trackers tool called Ghostery, to see how many data trackers ran during the past month on four news websites and my home here, The Atlantic. The numbers were astonishing. The Drudge Report and Huffington Post both ran over 200 trackers. The New York Times ran 146 and The Wall Street Journal 99. We deployed 48. Of course, these are just the numbers: data tracking firms are invasive in different ways, so it's possible that our 48 tracking tools collect just as much data as Drudge's 205. Even if the sheer numbers seem to indicate that something different in degree is happening at Drudge and Huffington Post than at our site, I couldn't tell you for sure that was the case.

How can anyone make a reasonable determination of how their information might be used when there are more than 50 or 100 or 200 tools in play on a single website in a single month? "I think the biggest challenge we have right now is figuring out a way to educate the average user in a way that's reasonable," Evidon's Andy Kahl told me. Some people talk about something like a nutritional label for data policies. Others, like Stanford's Ryan Calo, talk about "visceral notice."

Nissenbaum doesn't think it's possible to explain the current online advertising ecosystem in a useful way without resorting to a lot of detail. She calls this the "transparency paradox," and considers it insoluble. What, then, is her answer, if she's thinking about chucking basically the only privacy protections that we have on the Internet?

Well, she wants to import the norms from the offline world into the online world. When you go to a bank, she says, you have expectations of what might happen to your communications with that bank. That should be true whether you're online, on the phone, or at the teller. Companies can use your data to do bank stuff, but they can't sell your data to car dealers looking for people with a lot of cash on hand.

The answer, as applied by the FTC in their new framework, is to let companies do standard data collection but require them to tell people when they are doing things with data that are inconsistent with the "context of the interaction" between a company and a person.
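In code terms, that rule looks something like the snippet below. To be clear, the FTC report states a principle, not an algorithm; the contexts and "expected uses" here are placeholders I made up to show the shape of the test.

```python
# Illustrative gloss of the FTC's rule, not anything from the report itself.
# The contexts and expected uses below are invented placeholders.
EXPECTED_USES = {
    "banking": {"process transactions", "detect fraud"},
    "retail": {"fulfill orders", "handle returns"},
}

def requires_notice(context: str, proposed_use: str) -> bool:
    """Uses consistent with the context of the interaction proceed quietly;
    anything else triggers notice to the person whose data it is."""
    return proposed_use not in EXPECTED_USES.get(context, set())

print(requires_notice("banking", "detect fraud"))               # False: standard bank stuff
print(requires_notice("banking", "sell leads to car dealers"))  # True: tell the customer first
```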

I've spent this entire story extolling the virtues of Nissenbaum's privacy paradigm. But here's the big downside: it rests on the "norms" that people expect. While that may be socially optimal, it's actually quite difficult to figure out what the norms for a given situation might be. After all, there is someone else who depends on norms for his thinking about privacy.

"People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people," Mark Zuckerberg told an audience in 2010. "That social norm is just something that has evolved over time."

