Computing Facilities

Last updated: December 12, 2013

The Department of Computer Science at Princeton University maintains a dedicated computing and network infrastructure to support its academic and research mission. In addition to these resources, members of the CS department have access to shared campus computing resources.

The CS Network has a backbone operating at 10 Gb/sec and spans several buildings: the Computer Science Building; portions of Sherrerd Hall (the Center for Information Technology Policy); 221 Nassau Street (PlanetLab research staff); the Friend Center (two research labs and a workroom containing 34 Linux computers for academic use); and a dedicated "island" of 20 racks within the main campus data center. The CS Network has a 10 Gb/sec uplink to the main campus network, which in turn has two 2 Gb/sec uplinks to the commodity Internet and a 1 Gb/sec uplink to Internet2. Within the Computer Science Building, each room is wired with multiple Category 5/5e UTP ports, single-mode fiber, multimode fiber, and RG6 coaxial cable. The entire building is covered by an 802.11b/g/n wireless network.

The department maintains a central, high-performance modular file system with an InfiniBand interconnect. In its current configuration, it provides approximately 95 TB of usable storage. The file system connects to the CS Network with a 10 Gb/sec uplink. All research data are protected by the equivalent of RAID-6 or better.

At any given time, there are hundreds of personal computers connected to the network running Windows, MacOS, or Linux. Personal computers can connect to the central file system using CIFS. For centralized, general-purpose computing, the department has three clusters of machines running the Springdale Linux distribution (a custom Red Hat distribution maintained by members of the computing staff of Princeton University and the Institute for Advanced Study). The first cluster, penguins, is used for lightweight interactive computing and consists of two virtual machines. The second cluster, cycles, consists of four servers; each server has dual eight-core processors, 128 GB of RAM, and 1 Gb/sec Ethernet connections to the network and file system. The third cluster, ionic, is a traditional Beowulf cluster with an aggregate of 544 cores, 1 TB of RAM, and 38 TB of local disk. Each node in the ionic cluster has a 1 Gb/sec Ethernet connection to a dedicated switch which, in turn, has a 10 Gb/sec uplink to the department backbone and file system. All three clusters are remotely accessible via SSH.
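As a sketch of how these resources are typically reached, the commands below show an SSH login to the cycles cluster and a CIFS mount of the central file system from a Linux machine. The hostnames (cycles.cs.princeton.edu, fs.cs.princeton.edu), share path, and domain are illustrative assumptions, not documented values; consult CS staff for the actual addresses.

```shell
# Hypothetical example: log in to the cycles cluster over SSH
# (hostname is an assumption, not taken from this document).
ssh netid@cycles.cs.princeton.edu

# Hypothetical example: mount a home directory from the central
# file system over CIFS (share path and domain are assumptions).
sudo mount -t cifs //fs.cs.princeton.edu/netid /mnt/cs-home \
    -o user=netid,domain=CS
```

The same CIFS share can be mapped as a network drive from Windows or connected to from MacOS via "Connect to Server", since CIFS/SMB clients ship with all three operating systems named above.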

The Computer Science Building and the attached Friend Center building house several laboratory spaces (Graphics Lab, Systems Lab, Theory Lab, etc.) as well as our local co-location facility. The co-location facility contains 10 racks that are available for department infrastructure and specialized research equipment. In addition, the CS department is currently assigned 20 racks at the University's main data center located approximately 2 miles from the Computer Science Building. These racks are also available for both department infrastructure and specialized research equipment.

In addition to the resources dedicated to the CS department, researchers also have access to campus resources such as the library system (over 8 million volumes), the DataSpace Repository (for long-term archiving and dissemination of research data), and shared computational clusters. The largest cluster, orbital, consists of 268 nodes with an aggregate of 3216 cores, 12 TB of RAM, and 335 TB of local disk, all interconnected with quad-data-rate (QDR) InfiniBand.