2019-09-06 CCS services restored

As of 12:00 P.M., all CCS storage and systems are operational again. Thank you for your patience.

2019-04-10 Projects filesystem expansion complete

The expansion and maintenance of the /projects2 filesystem has been completed, and all access has been restored.


2019-02-05 CG Campus network upgrades complete

CCS network upgrades have been completed and access has been restored.


2018-10-13 Network maintenance complete

CCS network maintenance has been completed and access has been restored.


2018-09-21 File system access restored

/scratch and /projects file system access has been restored.



CCS systems are colocated at the Verizon Terremark NAP of the Americas (NOTA or NAP). The NAP datacenter in Miami is a 750,000-square-foot, purpose-built, Tier IV facility with N+2, 14-megawatt power and cooling infrastructure.

At the NAP, equipment floors start at 32 feet above sea level. The sloped roof, served by 18 rooftop drains, is designed to shed floodwater in excess of 100-year storm intensity. The building is engineered to withstand a Category 5 hurricane, with approximately 19 million pounds of concrete roof ballast and 7-inch-thick, steel-reinforced concrete exterior panels, and it sits outside FEMA's 500-year designated flood zone. The NAP uses a dry-pipe fire-suppression system to minimize the risk of damage from leaks.

The NAP has a centrally located Command Center monitored around the clock (24×7) by security staff and security sensors. To connect the University of Miami with the NOTA datacenter, UM has invested in a Dense Wavelength Division Multiplexing (DWDM) optical ring serving all of its campuses. The CCS Advanced Computing resources occupy a discrete, secure wavelength on the ring, which provides a dedicated 10-gigabit HPC network to all UM campuses and facilities.

The NAP was designed and constructed for resilient operations, and given the University of Miami's past experience, we anticipate no service interruptions due to facilities issues: UM has gone through several hurricanes, power outages, and other severe weather crises without any loss of power or connectivity to the NAP. The NAP maintains its own generators with a flywheel power crossover system, which ensures that power is not interrupted when the switch is made to auxiliary power. The NAP maintains a two-week fuel supply (at 100% utilization) and is on the primary list for fuel replacement due to its importance as a data-serving facility.

In addition to hosting the University of Miami's computing infrastructure, the NAP of the Americas is home to assets of US SOUTHCOM, Amazon, eBay, and several telecommunications companies. The NAP in Miami carries 97% of the network traffic between the US and Central/South America. The NAP is also the local access point for Florida LambdaRail (FLR), which is gated to Internet2 (I2) to provide full support for the I2 Innovation Platform. The NAP also provides TLD information to the DNS infrastructure and is the local peering point for all networks in the area.

The University of Miami has made the NAP its primary data center, occupying a very significant footprint on the third floor. Currently, all UM-CCS resources, clusters, storage, and backup systems run from this facility and serve all major campuses of UM.


The Center for Computational Science (CCS) maintains offices on all three University of Miami campuses: the Miller School of Medicine, the Rosenstiel School of Marine and Atmospheric Science, and its main operations on the Coral Gables campus in the Gables One Tower and the Ungar Building.

Miller School of Medicine
Rosenstiel Medical Science Building

Rosenstiel School of Marine and Atmospheric Science

Coral Gables Campus
Gables One Tower

Coral Gables Campus
Ungar Building

Each location is equipped with dual-processor workstations and essential software applications. CCS has three dedicated conference rooms and communication technology for interacting with advisors (phone, web, and video conferencing), plus a Visualization Lab with 2D and 3D displays (located in the Ungar Building).