CITP Seminar: The Princeton Digital Ad Observatory

Date and Time
Tuesday, April 27, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Arunesh Mathur, from CITP

It is well known that digital ads violate privacy, yet we know little about their content. Digital ads are of increasingly low quality and often contain manipulative and deceptive components that lure users into viewing potentially harmful content. Such content ranges from disinformation and hyper-partisan websites to products of questionable value or with dubious claims, such as health products and payday loans. Further, many such ads appeal to vulnerable populations, including children, older adults, and low-income individuals—all of which calls for more public scrutiny of their content. However, owing to their fleeting nature, digital ads are neither archived nor systematically studied. Previous studies of digital ads have been small in scale and limited to a single point in time.

This project proposes creating a large-scale repository of digital ads crawled and continually updated automatically from around the web. The aim is to use this repository to study the characteristics of digital ads, draw public attention towards problematic ads, and hold the publishers/platforms/networks that host these ads accountable. This talk will highlight some preliminary findings and outline an agenda to advance our understanding of the digital advertising ecosystem.
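
A minimal, hypothetical sketch of the kind of recurring crawl such a repository might build on is below; the seed pages, URL patterns, and file format are illustrative assumptions, not details of the project described above (which would more likely rely on instrumented browsers and full ad-blocker filter lists).

```python
# Hypothetical sketch of a recurring ad-collection crawl; not the project's actual code.
import json
import re
import time
from datetime import datetime, timezone

import requests

SEED_PAGES = ["https://example.com/news"]                        # placeholder seed list
AD_URL_PATTERNS = [r"doubleclick\.net", r"adsystem", r"/ads?/"]  # EasyList-style substrings


def collect_ad_urls(page_url: str) -> list[str]:
    """Fetch a page and return embedded URLs that match ad-like patterns."""
    html = requests.get(page_url, timeout=10).text
    embedded = re.findall(r'(?:src|href)="([^"]+)"', html)
    return [u for u in embedded if any(re.search(p, u) for p in AD_URL_PATTERNS)]


def crawl_once(archive_path: str = "ad_archive.jsonl") -> None:
    """Append one timestamped snapshot of observed ad URLs to a JSONL archive."""
    with open(archive_path, "a") as f:
        for page in SEED_PAGES:
            record = {
                "page": page,
                "observed_at": datetime.now(timezone.utc).isoformat(),
                "ad_urls": collect_ad_urls(page),
            }
            f.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    while True:                    # "continually updated": re-crawl on a fixed interval
        crawl_once()
        time.sleep(6 * 60 * 60)    # arbitrary six-hour interval
```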

Bio:
Arunesh Mathur is a postdoctoral research fellow at the Center for Information Technology Policy. His research examines the societal impacts of technical systems through an empirical lens. His dissertation research showed how commercial, political, and other powerful actors employ dark patterns to exploit individuals and society. His research has received two best paper awards (ACM CSCW and USENIX SOUPS) and the Privacy Papers for Policy Makers Award.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Beyond Algorithmic Bias: An Interrogation of the Google Search by Image Algorithm

Date and Time
Tuesday, February 23, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Orestis Papakyriakopoulos, from CITP

We perform a socio-computational interrogation of the Google Search by Image algorithm, a core component of the Google search engine. Drawing on Bourdieu’s theory of cultural reproduction, we treat the algorithm as a subject that was shaped (trained) by a specific culture and hence produces and reproduces the conditions of its creation. In this way, the algorithm functions as a window onto the attitudes of its designers and owners, and onto those embedded in the dataset it was trained on.

We learn from the algorithm by presenting it with more than 40,000 faces of all ages and more than four races (prompts) and by collecting and analyzing the assigned labels (responses) with appropriate statistical tools. In this way, we uncover a small portion of the algorithm’s beliefs, values, knowledge, and skills, and use its visual understanding of humans and their appearance to learn how the algorithm reflects society. We find that the algorithm reproduces structures existing in the white male patriarchy, often simplifying, stereotyping, and discriminating against women and non-white individuals, while providing more diverse and positive descriptions of white men.

Since we are able to locate structural elements of the algorithm’s culture, we study up: we provoke individuals at the top of the techno-hierarchical ladder by demonstrating how the algorithm places them within the socio-cultural reality that they consciously or unconsciously shaped, often creating biased representations of them. Based on the analysis, we discuss the scientific and design implications of the study and provide suggestions for alternative ways to design just socio-algorithmic systems.

Article co-author: Arwa Michelle Mboya (MIT Media Lab)
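
The "appropriate statistical tools" can be as simple as comparing label frequencies across demographic groups. The sketch below illustrates one such comparison with a chi-squared test of independence; the file name, column names, and the group label are invented for illustration and are not the study's actual data or analysis code.

```python
# Illustrative comparison of label frequencies across demographic groups;
# file name, column names, and the "white_male" group label are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

# One row per (face, returned label) pair, with the face's demographic group.
df = pd.read_csv("labels.csv")                     # assumed columns: group, label

# Contingency table: how often each label is assigned to each group.
table = pd.crosstab(df["group"], df["label"])

# Chi-squared test of independence: do label distributions differ by group?
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p_value:.3g}")

# Labels most over-represented for one (hypothetical) group, relative to overall use.
rates = table.div(table.sum(axis=1), axis=0)       # per-group label proportions
overall = table.sum(axis=0) / table.values.sum()   # overall label proportions
print((rates - overall).T.sort_values("white_male", ascending=False).head(10))
```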

Bio:
Orestis Papakyriakopoulos is a postdoctoral research associate at CITP. His research showcases political issues and provides ideas, frameworks, and practical solutions towards just, inclusive and participatory socio-algorithmic ecosystems through the application of data-intensive algorithms and social theories.

Orestis has a master’s degree in civil engineering from the National Technical University of Athens and a master’s degree in philosophy of science and technology from the Technical University of Munich. He received a Ph.D. in computer science from the Technical University of Munich. In 2019-20 he was a visiting scholar at the Massachusetts Institute of Technology’s Center for Civic Media.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Privacy and Disclosure in the Digital Age

Date and Time
Tuesday, April 13, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Leslie John, from Harvard Business School

Why do people post salacious photos or incendiary comments on social media when the damage to their relationships, reputation, and careers could be permanent? Why do we prefer to hire people who reveal unsavory information about themselves over those who simply choose not to disclose? Why are people more likely to admit that they cheated on their taxes on a website called “How BAD r U?”, which clearly offers no privacy protection, than on a sober-looking site with privacy safeguards? And why did Target face consumer ire for sending pregnancy-related coupons to a teenage customer it (correctly) inferred to be pregnant, while Amazon can conspicuously base product recommendations on users’ behavior without seemingly provoking a privacy backlash? These questions are all manifestations of the privacy paradox: the apparent disconnect between people’s privacy preferences and behavior.

This talk will describe recent research investigating answers to these questions, with a particular focus on a paper exploring the effect of temporary sharing on impression formation.

Relative to traditional, offline forms of communication, there is an enhanced permanence to digital sharing. Digital disclosures can come back to haunt, making it challenging for people to manage the impressions they make upon others. Nine studies show that these challenges can be exacerbated by temporary-sharing technologies. Temporary sharing reduces privacy concerns, in turn increasing disclosure of potentially compromising information (in the form of uninhibited selfies). Recipients attribute these indiscretions to sharers’ bad judgment, failing to appreciate the situational influence—the temporariness of the sharing platform—on sharers’ disclosures. Sharers do not anticipate this consequence, mistakenly believing that recipients will attribute their disclosure decisions to the (temporary) platform on which they chose to send the photographs.

Bio:
Leslie K. John is a Marvin Bower Associate Professor of Business Administration at the Harvard Business School. Currently, she teaches on the topics of Negotiation, Marketing and Behavioral Economics in various Executive Education courses, including in the Program for Leadership Development. She has also taught extensively in both the required and elective MBA curricula.

Leslie is a behavioral scientist who studies how people make decisions, and the wisdom or error of those decisions. In her primary line of research, she studies privacy decision-making, identifying what drives people to share or withhold personal information, as well as their reactions to firms’ and employers’ use of their personal data. In another line of research, Leslie studies health decision-making, devising psychologically-informed interventions to help people make healthier choices.

Her work has been published in leading academic journals including the Proceedings of the National Academy of Sciences, Psychological Science, Management Science, The Journal of Marketing Research, and the Journal of the American Medical Association. It has received media coverage in outlets including the New York Times, The Wall Street Journal, Financial Times, and Time Magazine. She has received numerous awards, including from the Association for Psychological Science and the Marketing Science Institute; and was named a Wired Innovation Fellow.

Leslie holds a Ph.D. in behavioral decision research from Carnegie Mellon University, where she also earned an M.Sc. in psychology and behavioral decision research. She completed her bachelor’s degree in psychology at the University of Waterloo.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Can Voters Detect Ballot Manipulations with a Transparent Voting Machine?

Date and Time
Tuesday, March 30, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Juan Gilbert, from University of Florida

Touch-screen ballot-marking devices (BMDs) produce paper ballots that are counted by optical-scan voting machines and can be recounted by hand. If the BMD is hacked or misprogrammed so that it prints a different candidate selection than the voter indicated on the touchscreen, the voter is supposed to notice; this is an essential protection against hacking and programming bugs. Unfortunately, recent studies have shown that only a small fraction of voters read their paper ballots carefully enough to catch errors. That is a problem, because errors in printing the paper ballots cannot be caught by recounts. This talk will present a new design for ballot-marking devices, the Transparent BMD, which uses a novel human-computer interaction (HCI) mode: it prints the ballot just where the voter is already looking, behind a transparent touchscreen, and deliberately directs the voter’s attention to the paper printout in just the right place. User studies of the prototype show a dramatically higher rate of voter detection of errors.

Bio:
Dr. Juan E. Gilbert is the Andrew Banks Family Preeminence Endowed Professor and Chair of the Computer & Information Science & Engineering Department at the University of Florida, where he leads the Human Experience Research Lab. He is also a fellow of the Association for Computing Machinery (ACM), a fellow of the American Association for the Advancement of Science (AAAS), and a fellow of the National Academy of Inventors (NAI). Juan is the inventor of Prime III, an open-source, secure, and accessible voting technology that has been used in numerous organizational elections and recently in statewide elections in New Hampshire and in Butler County, Ohio. Prime III is the only open-source voting system to have been used in state, local, and federal elections. He was a member of the National Academies Committee on the Future of Voting: Accessible, Reliable, Verifiable Technology, which produced the report “Securing the Vote: Protecting American Democracy.”


To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: A Duty of Loyalty for Privacy Law

Date and Time
Tuesday, March 23, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Woodrow Hartzog, from Northeastern University

Data privacy law fails to stop companies from engaging in self-serving, opportunistic behavior at the expense of those who trust them with their data. Academics and policymakers have recently proposed a possible solution: require those entrusted with people’s data and online experiences to be loyal to those who trust them. But critics and companies have concerns about a duty of loyalty. What, exactly, would such a duty of loyalty require? What are the goals and limits of such a duty? Should loyalty mean obedience, or a pledge to make decisions in people’s best interests? What would the substance of the rules implementing the duty look like?

A duty of loyalty should be based upon the risks of digital opportunism in information relationships. Data collectors bound by this duty would be obligated to act in the best interests of people exposing their data and online experiences, up to the extent of their exposure. They would be prohibited from designing digital tools and processing data in a way that conflicts with trusting parties’ best interests. This duty could also be used to set rebuttable presumptions of disloyal activity and to act as an interpretive guide for other duties. A duty of loyalty would be a revolution in data privacy law. That is exactly what is needed to break the cycle of self-dealing ingrained in the current Internet.

Bio:
Woodrow Hartzog is a professor of law and computer science at Northeastern University School of Law and the Khoury College of Computer Sciences. He is also a resident fellow at the Center for Law, Innovation and Creativity (CLIC) at Northeastern University, a faculty associate at the Berkman Klein Center for Internet & Society at Harvard University, a non-resident fellow at The Cordell Institute for Policy in Medicine & Law at Washington University, and an affiliate scholar at the Center for Internet and Society at Stanford Law School.

His research on privacy, media, and robotics has been published in scholarly publications such as the Yale Law Journal, Columbia Law Review, and California Law Review and popular publications such as The New York Times, The Washington Post, and The Guardian. He has testified multiple times before Congress and has been quoted or referenced by numerous media outlets, including NPR, BBC, and The Wall Street Journal. He is the author of Privacy’s Blueprint: The Battle to Control the Design of New Technologies, published in 2018 by Harvard University Press.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Predict and Surveil: Data, Discretion, and the Future of Policing

Date and Time
Tuesday, February 16, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Sarah Brayne, from University of Texas at Austin

Computational procedures increasingly inform how we work, communicate, and make decisions. This talk will draw on interviews and ethnographic observations conducted within the Los Angeles Police Department to analyze the organizational and institutional forces shaping the use of information in policing. It will reveal how law enforcement leverages big data and new surveillance technologies to allocate resources, classify risk, and conduct investigations. It will argue that big data does not eliminate discretion but rather displaces discretionary power to earlier, less visible parts of the policing process, with implications for organizational practice and social inequality.

Bio:
Sarah Brayne is an assistant professor of sociology at The University of Texas at Austin. In her research, Sarah uses qualitative and quantitative methods to examine the social consequences of data-intensive surveillance practices. Her book, Predict and Surveil: Data, Discretion, and the Future of Policing (Oxford University Press), draws on ethnographic research with the Los Angeles Police Department to understand how law enforcement uses predictive analytics and new surveillance technologies. In previous research, she analyzed the relationship between criminal justice contact and involvement in medical, financial, labor market, and educational institutions. Sarah’s research has appeared in the American Sociological Review, Social Problems, Law and Social Inquiry, and the Annual Review of Law and Social Science and has received awards from the American Sociological Association, the Law and Society Association, and the American Society of Criminology.

Prior to joining the faculty at UT-Austin, Sarah was a postdoctoral researcher at Microsoft Research. She received her Ph.D. in Sociology and Social Policy from Princeton University.

Sarah has volunteer-taught college-credit sociology classes in prisons since 2012. In 2017, she founded the Texas Prison Education Initiative.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

This talk will not be recorded.

CITP Seminar: One Person, One Vote

Date and Time
Tuesday, February 9, 2021 - 12:30pm to 1:25pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Sharad Goel, from Stanford University

About a quarter of Americans report believing that double voting is a relatively common occurrence, casting doubt on the integrity of elections. But, despite a dearth of documented instances of double voting, it’s hard to know how often such fraud really occurs (people might just be good at covering it up!). This talk will describe a simple statistical trick to estimate the rate of double voting — one that builds off the classic birthday paradox — and show that such behavior is exceedingly rare. It will be further argued that current efforts to prevent double voting can in fact disenfranchise many legitimate voters.
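
The birthday-paradox intuition behind the estimate can be illustrated in a few lines: among voter records that share, say, a first and last name, some will also share a date of birth purely by chance. The sketch below is a deliberate simplification that assumes birthdates are uniform over a fixed window; the actual analysis presented in the talk is more careful.

```python
# Simplified birthday-paradox illustration; not the talk's actual model.
from math import comb

def expected_chance_birthdate_matches(n_same_name: int, n_possible_birthdates: int) -> float:
    """Expected number of record pairs sharing a birthdate purely by chance,
    assuming (unrealistically) uniform birthdates over the window."""
    # Each of the C(n, 2) pairs collides with probability 1 / n_possible_birthdates.
    return comb(n_same_name, 2) / n_possible_birthdates

# Example: 1,000 registered voters sharing a common full name, birthdates spread
# over roughly 80 years. More than a dozen name-and-birthdate "matches" are
# expected even if no one voted twice.
print(expected_chance_birthdate_matches(1_000, 80 * 365))   # ≈ 17.1
```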

Bio:
Sharad Goel is an assistant professor at Stanford University in the Department of Management Science & Engineering, in the School of Engineering. He has courtesy appointments in Computer Science, Sociology, and the Law School.

Sharad looks at public policy through the lens of computer science, bringing a computational perspective to a diverse range of contemporary social issues. Some topics he has recently worked on are: policing practices, including statistical tests for discrimination; fair machine learning, including in automated speech recognition; and U.S. elections, including swing voting, polling errors, voter fraud, and political polarization.

He founded and directs the Stanford Computational Policy Lab. The Lab is a team of researchers, data scientists, and journalists that address policy problems through technical innovation. For example, they recently deployed a “blind charging” platform in San Francisco to mitigate racial bias in prosecutorial decisions.

Sharad also writes essays about policy issues from a statistical perspective. These include discussions of algorithms in the courts (in the New York Times, the Washington Post, and the Boston Globe); policing (in Slate and the Huffington Post); mass incarceration (in the Washington Post); election polls (in the New York Times); claims of voter fraud (in Slate, and also an extended interview with This American Life); and affirmative action (in Boston Review).

Sharad received a bachelor’s degree in mathematics from the University of Chicago, and a master’s degree in computer science and a doctoral degree in applied mathematics from Cornell University. Before joining the Stanford faculty, Sharad worked at Microsoft Research in New York City.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Where Human-Centered Design Meets the Humanitarian Mandate

Date and Time
Tuesday, February 2, 2021 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Jennifer Hauseman, from the International Committee of the Red Cross

Since its creation in 1863 in Geneva, Switzerland, the International Committee of the Red Cross (ICRC) has had one objective: to ensure protection and assistance for victims of armed conflict and other situations of violence. Its story is one of impartiality and neutrality in the development of humanitarian action. As the guardian of the Geneva Conventions, and in partnership with the Red Cross and Red Crescent Movement, the ICRC provides humanitarian support in more than 80 countries around the globe. Since its founding more than 150 years ago, the ICRC has moved from keeping records of prisoners of war on carefully catalogued, handwritten index cards in order to provide answers to family members, to being an organization in the midst of a technology transformation.

What are the ICRC’s considerations when it comes to emerging information technology policies and legal requirements meant to protect vulnerable communities from misuse of data and cyber security issues? How is its technology impacted by the limitations of low-bandwidth environments while managing systems to support thousands of professional staff? How does a global organization–in this case, one that assists those affected by war and other situations of violence–provide digital engagement as well as its traditional, hands-on, frontline humanitarian response? The ICRC takes a future-focused and outward-looking approach, which will be the focus of this talk.

As the director of communication and information management for ICRC, Jennifer Hauseman is responsible for the organization’s technology, including everything from software development to online campaigns to the digitization of its storied archives. The talk will also discuss the realities of creating effective technologies in some of the most challenging environments in the world, and what considerations should come into play for any policymaker wanting to create a service or system designed to protect the most vulnerable.

Bio:
Jennifer Hauseman is the director of communication and information management for the International Committee of the Red Cross (ICRC). She first joined the ICRC as head of digital communications and has served in her current role since July 2018. Prior to joining the ICRC, she was a deputy director for communications at the Bill & Melinda Gates Foundation and was a manager, programmer, and senior editor at Amazon.com.

To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: Computer Crime

Date and Time
Tuesday, November 24, 2020 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Josh Goldfoot, from the Department of Justice

Criminal computer intrusions can endanger privacy, safety, financial security, and more. The problem of computer crime has grown so that it threatens not only businesses and government agencies, but potentially every Internet user. Current threats include malicious software, ransomware, denial of service attacks, and data breaches. An underground economy has grown to allow criminals to more easily obtain the tools necessary to commit, and profit from, criminal computer intrusions. Drawing on the public record, the talk will discuss how the Department of Justice has employed criminal investigation and prosecution to respond to these threats.

Bio:
Josh Goldfoot is principal deputy chief of the Computer Crime and Intellectual Property Section (CCIPS) in the Department of Justice’s Criminal Division, where he helps supervise a group of 40 attorneys who investigate and prosecute computer crimes and criminal intellectual property offenses.  Beginning in 2005, Josh worked at CCIPS prosecuting computer intrusions and wiretap offenses, as well as serving as an expert on the law of electronic evidence and online investigations.  In 2013, Josh became deputy chief for cyber policy in the Department’s National Security Division, and then returned to CCIPS in 2016, becoming principal deputy chief in 2019.  Josh has received the Attorney General’s John Marshall Award for his work on remote computer searches, the Attorney General’s Distinguished Service Award for his work on a botnet takedown, the FBI Director’s Award for an international hacking case, and also four different Assistant Attorney General awards.

Josh has trained hundreds of AUSAs and DOJ attorneys in electronic evidence, online investigations, computer intrusions, cybersecurity, and prosecuting cybercrime.  He has authored or co-authored five law review articles about law and technology: The Pen-Trap Statute and the Internet (2018); A Trespass Framework for the Crime of Hacking (2016); The Physical Computer and the Fourth Amendment (2011); A Declaration of the Dependence of Cyberspace (2009), and Antitrust Implications of Internet Administration (1998). He received a United States patent in 2008 for shape recognition technology.  He is a graduate of Yale University and earned his law degree from the University of Virginia School of Law in 1999. He has worked in technology law since 1999, when he advised Internet startups in Silicon Valley on intellectual property issues. Prior to joining the Department of Justice in 2005, he litigated civil cases, and clerked for Judge Alex Kozinski on the Ninth Circuit U.S. Court of Appeals. Josh authored and operates the web site “sentencing.us,” which calculates U.S. federal sentencing guidelines.


To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.

CITP Seminar: How Privacy Got Its Race

Date and Time
Tuesday, November 17, 2020 - 12:30pm to 1:30pm
Location
Zoom Webinar (off campus)
Type
CITP
Speaker
Anita Allen, from University of Pennsylvania

There is increasing interest in understanding the difference race makes for the enjoyment of privacy and the protection of privacy rights. This talk surveys issues and concerns at the intersection of race relations and privacy values and rights. Who gets to be shielded or secluded? Who gets watched, and who gets to observe? Who gets profiled, and who is ignored? Who gets to be invisible, or is forced into invisibility? The focus will be on the United States and Black Americans, but parallel structures of power and domination can be seen in China with respect to its minorities.

Bio:
Anita L. Allen is an internationally renowned expert on privacy law and ethics, and is recognized for contributions to legal philosophy, women’s rights, and diversity in higher education. In July 2013, Allen was appointed Penn’s Vice Provost for Faculty, and in 2015, Chair of the Penn Provost’s Advisory Council on Arts, Culture and the Humanities. From 2010 to 2017, she served on President Obama’s Presidential Commission for the Study of Bioethical Issues. She was presented the Lifetime Achievement Award of the Electronic Privacy Information Center in 2015 and elected to the National Academy of Medicine in 2016. In 2017 Allen was elected Vice-President/President Elect of the Eastern Division of the American Philosophical Association. In 2015 Allen was on the summer faculty of the School of Criticism and Theory at Cornell. A two-year term as an Associate of the Johns Hopkins Humanities Center concluded in 2018.

Her books include Unpopular Privacy: What Must We Hide (Oxford, 2011); Privacy Law and Society (Thomson/West, 2017); The New Ethics: A Guided Tour of the 21st Century Moral Landscape (Miramax/Hyperion, 2004); and Why Privacy Isn’t Everything: Feminist Reflections on Personal Accountability (Rowman and Littlefield, 2003).


To request accommodations for a disability please contact Jean Butcher, butcher@princeton.edu, at least one week prior to the event.
