Q.&A.: Guarding Personal Data From Abuse by Insiders

[Photo: Courtney Bowman, a member of the privacy and civil liberties group at Palantir Technologies, at the company’s office in New York. Credit: Michael Appleton for The New York Times]

Courtney Bowman is a member of the privacy and civil liberties group at Palantir Technologies. Palantir, a privately held tech company in Palo Alto, Calif., first became known through its work for many military, police and intelligence services in the United States and overseas.

Today, more than half of Palantir’s business is with private sector companies, which use its software for activities as diverse as improving the efficiency of oil exploration and figuring out where to put the gum at a checkout stand. It also works with disaster relief agencies.

The software comes with an auditing capability, so it is possible to see who looked at what. It is not clear that this capability is always used, particularly by companies, but Palantir says that it is trying to figure out ways of preserving individual civil liberties in an age when computers are tracking everything.

Mr. Bowman, who has degrees in physics and philosophy from Stanford, is one of the authors of a book on designing computer systems to ensure privacy, and he spoke recently with The New York Times. The conversation has been condensed and edited.

Q.

What is the relationship between security and privacy?

A.

Information security is a key component of privacy. Information must be held securely. Security alone won’t get you all the way to privacy, but there is a lot of continuity between the two.

Q.

And hackers are often looking for personal information?

A.

Privacy and security use many of the same technologies: encryption, database auditing, logging of what was accessed, controls on who can have access to information, and alerts if there is a breach.

For privacy, the main worry may not be hackers so much as bad actions by authorized users. A useful concept in information system architecture is accountability oversight: flagging people who misuse data, revealing private information only by degree, and enforcing access controls.
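Palantir has not published its implementation, but the pattern Mr. Bowman describes, in which every read passes through a permission check that also writes an audit entry, can be sketched in a few lines. Here is a minimal Python illustration; the user, role and record names are invented:

```python
import logging
from datetime import datetime, timezone

# A minimal sketch: every read goes through one function that checks an
# access-control list and writes an audit entry, allowed or not.
# All names here are illustrative; none of this is Palantir's API.

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

DATA = {"case-1142": "witness statement ..."}          # hypothetical records
RECORD_ACL = {"case-1142": {"analyst", "supervisor"}}  # roles allowed to read

def read_record(user: str, role: str, record_id: str) -> str:
    """Return a record only if the role is authorized; log every attempt."""
    allowed = role in RECORD_ACL.get(record_id, set())
    audit_log.info("ts=%s user=%s role=%s record=%s allowed=%s",
                   datetime.now(timezone.utc).isoformat(),
                   user, role, record_id, allowed)
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read {record_id}")
    return DATA[record_id]

read_record("c.smith", "analyst", "case-1142")   # succeeds, and is logged
# read_record("j.doe", "intern", "case-1142")    # raises, and is still logged
```

Because denied attempts are logged as well as successful ones, an overseer can flag an authorized user who keeps probing records outside his or her remit.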

Q.

That sounds like a pretty basic system of rules put on the data.

A.

On the surface, it’s straightforward, but it can be complex. What a data usage means, and what you can learn from the data, can be contingent on context. And it doesn’t take much to re-identify personal information that has supposedly been made anonymous.
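A classic illustration of that last point is linking quasi-identifiers, such as ZIP code, birth date and sex, between an “anonymized” dataset and a public one; Latanya Sweeney famously re-identified medical records this way. A toy Python sketch with invented data:

```python
# Toy re-identification: join an "anonymized" medical file against a public
# voter roll on shared quasi-identifiers. Data and field names are invented.

medical = [  # names removed, so supposedly anonymous
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
]
voters = [  # public record, names included
    {"name": "Jane Roe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
]

quasi = ("zip", "dob", "sex")
index = {tuple(v[k] for k in quasi): v["name"] for v in voters}

for row in medical:
    name = index.get(tuple(row[k] for k in quasi))
    if name:
        # The "anonymous" row has been re-identified by a simple join.
        print(f"{name} -> {row['diagnosis']}")
```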

Q.

Can you retrofit new rules on older computer systems?

A.

People build systems first and deal with data protection later. They think measures like content access restrictions, user auditing and data retention practices can be configured after a system is built. That approach hurts the ability to put in good safeguards.

Good privacy systems anticipate and fight risks like hacking or the leakage of private information, but they need to be built from the ground up with privacy considerations in mind, as seamless extensions of core operating functionality.

It’s better to build something flexible from the start. Figure out the key risks to sensitive information and try to account for all the related vulnerabilities. Decide how to overcome any single point of failure and anticipate security failures, so you can limit damage.

Q.

As more types of data come online, do you worry about people combining several types of available data to figure out private information?

A.

It’s a huge challenge. But there are ways of dealing with it. Giving approved people proportional access to data gives some control over how the data is going to get used. You can have rules that temporarily encrypt aspects of the data. You can build a risk profile of what you are disclosing; sometimes that’s possible, too, but it’s not easy. You can also constrain where data will go.

The problem is, if you fully de-identify personal data, you also lose its utility.
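One common way to strike that balance is generalization in the style of k-anonymity: coarsen the quasi-identifiers until each combination matches at least k people, accepting the loss of detail that follows. A minimal Python sketch with invented data:

```python
from collections import Counter

# Generalize quasi-identifiers so every combination is shared by at least
# k rows (the idea behind k-anonymity). Example data is invented.

rows = [
    {"zip": "02138", "age": 34, "diagnosis": "flu"},
    {"zip": "02139", "age": 36, "diagnosis": "asthma"},
    {"zip": "02141", "age": 52, "diagnosis": "flu"},
    {"zip": "02142", "age": 57, "diagnosis": "diabetes"},
]

def generalize(row):
    # Coarsen: keep only a ZIP prefix and a ten-year age band.
    return {"zip": row["zip"][:3] + "**",
            "age_band": f"{row['age'] // 10 * 10}s",
            "diagnosis": row["diagnosis"]}

released = [generalize(r) for r in rows]
groups = Counter((r["zip"], r["age_band"]) for r in released)
print(released)
print("smallest group size (k):", min(groups.values()))
# Each person now hides in a group of 2, but the released data can no
# longer answer questions about a single ZIP code or an exact age:
# that lost precision is the utility cost Mr. Bowman describes.
```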

Q.

Should people worry more about companies or about governments?

A.

Government entities operate under a lot of regulation. Many of those rules were written in the 1980s, and they need to be adapted; people are working on this. You hear a lot about the need to know, meaning what you could see under some circumstance, versus the right to know.

Q.

Is there a danger governments will circumvent their own rules by hiring private sector mercenaries who work offshore, where the rules are different?

A.

Data analysis arbitrage? We’re already in that world. You’ll see third parties doing work that governments themselves are prohibited from doing, then vending the results. You hope people deal in the spirit as well as the letter of the law.

Q.

What about the private sector?

A.

There isn’t a lot of regulation that says what companies should and shouldn’t do, and they are iterating the technology faster than the policies.

The decisions around how we build a system have a significant effect on privacy mores. For example, there is no shortage of issues around privacy and how Facebook is designed.

The existing framework for personalization is permissions. You agree to it. But the traditional method is a notice and consent form that’s 50 pages long, and nobody reads it.

Q.

Which should scare me more: a big Chinese hacking or companies constantly trying to look at me and figure out my online behavior?

A.

When a data breach is exposed, it’s a discrete event. You know, for the most part, what will happen. Marketing, by contrast, is directed at a lifestyle.

The private sphere is where we develop character. There is a risk of compounding pressures, brought on by a barrage of tailored advertisements and “personalizations,” eroding the private sphere. Over the long run, that’s more insidious than a hack.

Q.

And we’re probably going to get looked at even more. Is that another kind of security concern?

A.

The concern is predictive data analytics: capabilities that run the risk of rolling back years of civil rights progress by influencing economic outcomes and targeting people for specific reasons.

The risk is subtle, but these algorithms have compounding influences. They may list certain job opportunities for you, and not others, or you may not see an interest rate offer on a loan, because of how they profiled you. It won’t just shift our behaviors. It will shift outcomes.

Q.

How does this sort of work affect you?

A.

I don’t engage in social media. I limit my online presence. My co-authors and I are the kind of people who read the 50 pages of the terms of service before we sign up for something. I’m continually turning off tracking features on my devices.