Patrolling the Intersection of Computers and People

By Doug Hulette
Photo by David Kelly Crow

Arvind Narayanan stands at the crossroads of technology and the public interest. You might call him a policeman, but he’d probably prefer to be considered a detective or maybe an auditor in the complex, fast-changing, and often fragile relationship between computers and people.

An assistant professor in the Computer Science Department since 2012, Narayanan delves into a broad spectrum of issues involving information privacy and security. But a common thread runs through his work, in the form of a two-pronged question: How can digital technology go wrong, and what can we do about it?

“I’m particularly interested in exposing problems that cannot be solved by technical measures alone and require policy interventions or societal adjustments,” says Narayanan, who earned his Ph.D. from the University of Texas at Austin and did postdoctoral work at Stanford. He leads the Princeton Web Transparency and Accountability Project, which explores the ways that companies collect and use personal information. He is an affiliated faculty member at the Center for Information Technology Policy at Princeton and an affiliate scholar at Stanford Law School’s Center for Internet and Society.

“Throughout my career, my work has served as a key layer of oversight whenever the commercial incentives of the rapidly advancing technology industry diverge from the interests of the public,” Narayanan says. For example, he co-authored a paper published in the April 14 issue of the journal Science that laid out concerns that machine-learning programs can acquire — and, by doing so, extend and possibly augment — the cultural biases inherent in human language. (See Princeton’s article about the paper.)

In an email exchange, Narayanan offered his thoughts on the changing dynamics of computer science.

It seems some computer scientists these days take an increasingly broad view of what they do and how they do it.

“The role of computers in society has changed. Algorithms started out as recipes for automating tasks like sorting numbers. Today algorithmic systems make decisions about people's lives — screening job applicants, personalizing health care, scoring loan applications, predicting criminal risk, and many others — using deeply personal data.

“So it's not enough to ask if code executes correctly. We also need to ask if it makes society better or worse. This of course involves judgment and isn’t a pure computer science problem.”

So that’s where computer science meets policy fields?

“Technologists can’t do this work by themselves, but at the same time can't leave it to someone else. Studying the societal impact and risks of digital technology requires rigorous computer science research and leads to new scientific insights. This is the core premise of my research.

“For example, my research group builds tools and systems to reverse engineer the tracking of your clicks, searches, and behavior that happens invisibly when you browse the web. We've repeatedly uncovered how newly introduced features in the HTML standard are being abused to harvest personal information.”

“The deeper question is this: What is the balance of power between the consumer, websites, and advertisers online? What technical interventions can change this power balance to favor the consumer?”
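
One of the best-documented abuses in that line of work is canvas fingerprinting: a script draws text and shapes to an invisible HTML5 canvas, then hashes the resulting pixels. Because rendering varies subtly across GPUs, drivers, and font stacks, the hash acts as a quasi-identifier that can follow a user across sites. The TypeScript sketch below illustrates the general technique, assuming a browser environment; it is an illustrative toy, not the trackers' scripts or the group's measurement code.

```typescript
// Illustrative sketch of canvas fingerprinting (browser-side, toy code).
// The same drawing renders slightly differently on different machines,
// so a hash of the pixels serves as a quasi-identifier.
async function canvasFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas");
  canvas.width = 200;
  canvas.height = 50;
  const ctx = canvas.getContext("2d");
  if (!ctx) throw new Error("2D canvas not supported");

  // Text with an emoji and an overlapping rectangle maximize
  // rendering differences between font and graphics stacks.
  ctx.textBaseline = "top";
  ctx.font = "14px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(0, 0, 100, 30);
  ctx.fillStyle = "#069";
  ctx.fillText("fingerprint \u{1F600}", 2, 15);

  // Serialize the pixels and hash them into a compact identifier.
  const pixels = new TextEncoder().encode(canvas.toDataURL());
  const digest = await crypto.subtle.digest("SHA-256", pixels);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

canvasFingerprint().then((id) => console.log("canvas hash:", id));
```

Note that the user never sees the canvas and never clicks anything; the identifier is harvested silently, which is why measurement studies like the Web Transparency and Accountability Project are needed to detect it.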

You also do research on Bitcoin and other cryptocurrencies. What’s the connection?

“The web and cryptocurrencies are both decentralized technologies, and that's why I find them both so interesting to study. Unlike, say, Apple's App Store, anyone can create a website without being screened by a central authority. And unlike credit cards, anyone can send or receive bitcoins with no intermediary. When there is no central authority, it encourages tinkering and entrepreneurship — it's been called ‘generativity’ and ‘permissionless innovation’ — and this can be great for progress.

“The flip side is that decentralized platforms can be a bit like the Wild West, with no single entity having enough incentive or authority to find and fix privacy and security problems.”

And that’s a role you’re trying to fill?

“Academic research is crucial. In the realm of cryptocurrencies, my group has developed ways to make digital coins more resistant to theft and discovered surprising ways in which people's financial privacy can be compromised. We’ve also studied the systemic risks — that is, could a cryptocurrency like Bitcoin unravel and collapse because the incentives of its participants aren’t aligned with the overall goals of the system?

“The field of Algorithmic Game Theory studies questions of incentives in algorithm design, but not many AGT experts study cryptocurrencies, and vice versa. Luckily we have an AGT expert here at Princeton CS, Matt Weinberg. We've set up a collaboration between our groups, and our initial paper revealed lurking problems in the way that Bitcoin rewards miners.”
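
The specific analysis in that paper is not reproduced here, but a classic example of the same genre of incentive problem is “selfish mining” (Eyal and Sirer, 2014), in which a pool that strategically withholds blocks can earn more than its fair share of rewards. The TypeScript sketch below is a toy Monte Carlo simulation of that strategy; the state machine is a simplification and the numbers are illustrative:

```typescript
// Toy Monte Carlo simulation of selfish mining (after Eyal and Sirer).
// A pool with hash-power share `alpha` withholds blocks; `gamma` is the
// fraction of honest miners that build on the pool's block in a tie.
function selfishMiningShare(alpha: number, gamma: number, rounds: number): number {
  let lead = 0;     // length of the pool's private chain beyond the public tip
  let race = false; // two equal-length branches are competing
  let pool = 0;     // blocks ultimately credited to the selfish pool
  let honest = 0;   // blocks ultimately credited to honest miners

  for (let i = 0; i < rounds; i++) {
    const poolFinds = Math.random() < alpha;
    if (race) {
      if (poolFinds) {
        pool += 2;                      // pool extends its branch and wins the race
      } else if (Math.random() < gamma) {
        pool += 1; honest += 1;         // an honest miner extends the pool's branch
      } else {
        honest += 2;                    // the honest branch wins outright
      }
      race = false;
    } else if (poolFinds) {
      lead += 1;                        // withhold the newly found block
    } else if (lead === 0) {
      honest += 1;                      // ordinary honest block
    } else if (lead === 1) {
      lead = 0; race = true;            // pool publishes, creating a fork race
    } else if (lead === 2) {
      pool += 2; lead = 0;              // pool publishes both blocks and wins
    } else {
      pool += 1; lead -= 1;             // pool releases one block, keeps its lead
    }
  }
  return pool / (pool + honest);
}

// With a third of the hash power and even tie-breaking, the pool collects
// roughly 38% of the rewards — more than its fair share of one third.
console.log(selfishMiningShare(1 / 3, 0.5, 1_000_000).toFixed(3));
```

This is exactly the kind of gap between individual incentives and the goals of the overall system that research at the intersection of AGT and cryptocurrencies tries to find and close.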

How can students get started with cryptocurrencies?

“The software is open source and easy to tinker with. There are many tutorials and forums online.”

“For a more rigorous introduction, together with colleagues I’ve released an online course and a textbook that teach the computer science behind cryptocurrencies but also touch on topics like how cryptocurrencies can be regulated and their place in society.”
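
For a feel of what that tinkering looks like, here is a minimal TypeScript sketch (for Node.js; the block contents and difficulty are invented for illustration) of the proof-of-work puzzle at the core of Bitcoin mining: finding a nonce whose double-SHA-256 hash clears a difficulty threshold.

```typescript
// Toy proof-of-work in the style of Bitcoin mining. Real Bitcoin hashes an
// 80-byte block header with double SHA-256 against a dynamic 256-bit
// target; here we just search for a nonce whose hash has enough leading
// zero bits.
import { createHash } from "node:crypto";

function sha256d(data: Buffer): Buffer {
  const once = createHash("sha256").update(data).digest();
  return createHash("sha256").update(once).digest();
}

function mine(header: string, difficultyBits: number): number {
  for (let nonce = 0; ; nonce++) {
    const hash = sha256d(Buffer.from(header + nonce));
    // Count leading zero bits; a real node compares against a full target.
    let zeros = 0;
    for (const byte of hash) {
      if (byte === 0) { zeros += 8; continue; }
      zeros += Math.clz32(byte) - 24; // leading zeros within this byte
      break;
    }
    if (zeros >= difficultyBits) return nonce;
  }
}

// 16 zero bits means ~65,000 attempts on average; real difficulty is
// astronomically higher, which is why mining consumes so much hardware.
const nonce = mine("toy-block: alice pays bob 1 coin", 16);
console.log("found nonce:", nonce);
```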

At the bottom line, is computer science becoming more humanistic?

“I would say that humanistic considerations are becoming more important. In the long run, I think digital technology will start to disappear into the social fabric, with computers and algorithms almost invisibly mediating our interactions with our peers, with institutions, and with the things around us. Computer scientists will need to come together with experts in sociology, law, and policy to understand and shape our ‘algorithmic society.’”
