Ben Jones

CS PhD Candidate at Princeton University

Email: b...@princeton.edu (PGP Key)
Office: 320 Sherrerd Hall
Research Interests: Networking, Security, Internet Censorship
CV
GitHub
Group Webpage


About Me

I am a Computer Science PhD candidate at Princeton University working in the area of Networks and Security. I work with Nick Feamster on measuring and circumventing Internet censorship. I joined the NOISE lab after obtaining a BS in Computer Engineering from Clemson University in Clemson, SC. I attended Georgia Tech from August 2013 to December 2014 before moving with Nick to Princeton.


Research Interests

My research focuses on Internet interference problems, with a dual emphasis on measurement and intervention. I broadly define Internet interference as any manipulation of traffic for political or fiscal gain. Though there are many problems in this space, my work primarily concerns Internet censorship. After identifying an Internet interference problem, I seek to understand it through measurement and eventually mitigate it through intervention. At the measurement stage, I establish when, where, how, and what is censored or manipulated through studies at the scale of ISPs, countries, and the world. At the intervention stage, I apply the understanding gained through measurement to offer solutions. In the case of Internet censorship, I develop circumvention tools that enable users to communicate freely.


Honors and Awards

  • NSF GRFP Honorable Mention
    I received an honorable mention from the NSF Graduate Research Fellowship Program in the 2014-2015 academic year. In the Computer Security sub-area, one student received the fellowship and two others received honorable mentions that cycle.
  • OTF Senior Fellow in Information Controls
    For the 2015-2016 academic year, I am supported by an OTF Senior Fellowship in Information Controls.
  • OTF Seasonal Fellow in Information Controls
    For the Spring 2015 semester, I received an OTF Seasonal Fellowship in Information Controls.


Recent Publications
Projects
  • Automated Detection and Fingerprinting of Censorship Block Pages

    In some countries, the censor replaces the content citizens try to access with a message telling them that the content is blocked. Normal HTTP content varies between locations, people, and visits to a site, making it difficult to distinguish block pages from the intended content and making censorship measurement challenging. In collaboration with Tzu-Wen Lee and Phillipa Gill from Stony Brook University, my advisor and I developed techniques to detect block pages with a 95% true positive rate and 80% precision. We also extended these detection techniques to identify the filtering tools that created the block pages, identifying a new filtering tool in Saudi Arabia.
    This work has been accepted to the Internet Measurement Conference (IMC) 2014.
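The core intuition is that an injected block page is typically far shorter than the article or site it replaces, even though both vary across fetches. A minimal sketch of one such length-based check (illustrative only; the published classifier and its thresholds differ):

```python
def length_similarity(page_a: str, page_b: str) -> float:
    """Similarity in [0, 1] based on relative difference in page length."""
    la, lb = len(page_a), len(page_b)
    if max(la, lb) == 0:
        return 1.0
    return 1.0 - abs(la - lb) / max(la, lb)

def looks_like_block_page(candidate: str, known_good: str,
                          threshold: float = 0.3) -> bool:
    # A fetched page whose length differs sharply from a known-good copy
    # is flagged as a potential block page. The threshold is a toy value.
    return length_similarity(candidate, known_good) < threshold

# Example: a short injected notice vs. a full article page
blocked = "<html><body>This site is blocked.</body></html>"
normal = "<html><body>" + "lorem ipsum " * 500 + "</body></html>"
```

Comparing lengths rather than full content sidesteps the variation that legitimate pages show across locations and visits.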

  • Facade: High-Throughput, Deniable Censorship Circumvention Using Web Search

    Along with several other students at Georgia Tech, I wrote a censorship circumvention system that hides arbitrary traffic within search queries. By analyzing the AOL search corpus, we found that typical search queries carry around 40 bits of entropy, so we limit each URL to 40 bits of hidden data. Though this offers very low throughput, the scheme is theoretically robust to most forms of statistical analysis.
    This work has been accepted to USENIX FOCI '14.
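The entropy budget can be illustrated with a toy encoder that packs bits into dictionary words: with a 1024-word list, each word carries 10 bits, so four words approximate the 40-bit query budget. The wordlist and framing here are stand-ins, not Facade's actual codebook:

```python
# Toy stand-in wordlist: 2**10 entries, so each word encodes 10 bits.
WORDLIST = [f"w{i:04d}" for i in range(1024)]
WORD_BITS = 10

def encode_query(payload: int, n_words: int = 4) -> str:
    """Pack up to n_words * 10 bits of payload into a search-query string."""
    words = []
    for _ in range(n_words):
        words.append(WORDLIST[payload & (2**WORD_BITS - 1)])
        payload >>= WORD_BITS
    return " ".join(words)

def decode_query(query: str) -> int:
    """Recover the payload from a query built by encode_query."""
    payload = 0
    for word in reversed(query.split()):
        payload = (payload << WORD_BITS) | WORDLIST.index(word)
    return payload
```

A real system would draw words from a distribution resembling genuine queries so the encoded queries blend in statistically; the uniform toy list above ignores that requirement.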

  • Detecting Performance Degradation as a Form of Censorship

    As my main project with my advisor at Georgia Tech, I am developing techniques to identify performance degradation as a form of censorship. Performance degradation is incredibly difficult to differentiate from network failures, but we hope to build a classifier that leverages other features to determine whether poor performance may be due to censorship. Because censorship is often motivated by politically sensitive events, we can correlate periods of poor performance with such events using resources like the GDELT database. While correlation with an important event is a strong feature, we will also leverage measurements to multiple sites and from different vantage points as additional features.
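The event-correlation feature can be sketched as a toy filter that flags only the days showing both a sharp throughput drop and a sensitive event; the real classifier would combine many more features, sites, and vantage points:

```python
def censorship_candidates(daily_throughput: dict, event_dates: set,
                          drop_factor: float = 0.5) -> list:
    """Flag days where throughput falls below drop_factor * median AND a
    politically sensitive event occurred (e.g., drawn from GDELT).
    Illustrative sketch only; names and the threshold are hypothetical."""
    values = sorted(daily_throughput.values())
    median = values[len(values) // 2]
    return [day for day, tput in daily_throughput.items()
            if tput < drop_factor * median and day in event_dates]

# A drop without an event (or an event without a drop) is not flagged,
# which is the point: either signal alone is too noisy.
throughput = {"d1": 10.0, "d2": 10.0, "d3": 10.0,
              "d4": 10.0, "d5": 10.0, "d6": 2.0}
```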

  • Detecting Manipulation of the DNS Root

    At ICSI, I am working with Vern Paxson and Nick Weaver to discover manipulation of the DNS root. While we are interested in any replica of the DNS root, we are especially interested in replicas that maliciously change DNS responses. To detect these modifications, we scanned all open DNS resolvers in the IPv4 address space and verified that the resolved IPs were consistent and that the DNSSEC information validated correctly. Because many of the sites that we scanned were hosted on CDNs, we also fetched HTTP content from each resolved IP and checked that the HTTP headers were consistent, establishing that the DNS responses were not manipulated. We also used a home measurement platform, RIPE Atlas, to explore DNS proxies, anomalous timing to root anycast instances, and DNSSEC availability.
    For better or worse, this project yielded only negative results.
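The first of these consistency checks can be sketched as comparing the answer sets that different resolvers return for the same name and flagging resolvers that share no IPs with the majority answer. The function and data representation here are hypothetical, and the real validation also relied on DNSSEC and HTTP header comparison:

```python
from collections import Counter

def flag_inconsistent_resolvers(answers: dict) -> set:
    """answers maps resolver IP -> frozenset of A records returned for one
    domain. Flag resolvers whose answers share no IP with the most common
    answer set. CDN-hosted domains need the follow-up HTTP check, since
    different-but-legitimate CDN IPs would be flagged here."""
    majority, _ = Counter(answers.values()).most_common(1)[0]
    return {resolver for resolver, ips in answers.items()
            if not (ips & majority)}
```

Using set intersection rather than strict equality tolerates resolvers that return overlapping but differently sized record sets.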

  • Measuring Global Internet Censorship

    At ICSI, I am also performing a global scan for Internet censorship with Roya Ensafi, Vern Paxson, and Nick Weaver. To address gaps in knowledge about censorship, we are measuring DNS censorship from all open resolvers in the IPv4 address space and from a home network measurement platform, RIPE Atlas, and we are measuring IP connectivity to web sites with a hybrid idle scan. For my portion of the project, I am writing the code to scan all open resolvers and analyze the results.

  • Internet Freedom: Empowering Journalists in West Africa

    For my undergraduate research, I worked with several other students to develop a censorship circumvention system for journalists in West Africa. In addition to implementing several components of the design, I also explored the problem of gathering aggregate usage statistics without collecting or storing identifying information from users. I adapted techniques from other research areas to suit our needs and analyzed the accuracy and anonymity tradeoffs of my solutions.
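One standard technique for collecting aggregate statistics without identifying individual users is randomized response, where each client reports its true bit only with some probability; it is shown here purely to illustrate the accuracy/anonymity tradeoff, not as the system's actual mechanism:

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth, otherwise a coin flip.
    Any single report is deniable, yet aggregates remain estimable."""
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_rate(reports: list, p_truth: float = 0.75) -> float:
    """Unbiased estimate of the true fraction of 'yes' values.
    E[observed] = p_truth * rate + (1 - p_truth) * 0.5, solved for rate."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

Raising p_truth tightens the estimate but weakens each user's deniability, which is exactly the tradeoff analyzed in the project above.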