The WVU High-Performance Computing (HPC) facilities support computationally intensive research that requires especially powerful computing resources.
HPC resources help research teams at WVU greatly reduce their computational
analysis times. This includes free access to community nodes, with both CPU and
GPU resources, for researchers at all institutions of higher education in West
Virginia. Researchers can also purchase individual nodes for their own use on a
first-come, first-served basis; these nodes are otherwise shared with the community.
Research Data Storage
HPC users have access to more than 500 TB of data storage for processing within
the HPC clusters. Researchers can also purchase group storage on the cluster,
which allows data to be shared easily among collaborating researchers. The
Research Office also offers storage in the centrally managed, secure Research
Data Depot, where storage can be purchased at a cost-effective rate for a
five-year term. This data storage is not intended for protected or regulated data.
HPC Clusters
Our current HPC cluster contains 178 compute nodes with a total of 8,344 CPU cores. Of those, 79 are community nodes with a total of 4,824 CPU cores, 9.3 TB of memory, and 18 NVIDIA P6000 GPUs. The remaining 99 nodes were purchased by faculty members and departments and contribute an additional 3,520 CPU cores and 29 NVIDIA GPUs ranging from the RTX 6000 to the A100. These purchased nodes are available to community members in four-hour increments to increase the utilization of the system.
A new cluster focused on artificial intelligence and machine learning is set for deployment in 2023. It will consist of 30 nodes with 32 CPU cores and four A30 GPUs each, four nodes with 32 CPU cores and four A40 GPUs each, two nodes with 64 CPU cores and eight SXM A100 GPUs each, and one node with three A40 GPUs for visualization and testing. All nodes are connected to a high-speed, low-latency HDR100 InfiniBand fabric to support tightly coupled multi-GPU and multi-node work.
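As an illustration of the tightly coupled multi-GPU and multi-node work such a fabric supports, the sketch below shows a minimal distributed training loop. PyTorch, the NCCL backend, and a launcher such as torchrun (which sets LOCAL_RANK and related environment variables) are illustrative assumptions only; they are not named in the facility description and do not describe WVU's supported software stack.

```python
# Minimal sketch of multi-GPU, multi-node training of the kind an HDR100
# InfiniBand fabric is designed to carry. Assumes PyTorch with the NCCL
# backend and a launcher (e.g. torchrun) that sets RANK, WORLD_SIZE, and
# LOCAL_RANK on every participating node; these are illustrative assumptions.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Join the process group; connection details come from the launcher.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # A toy model; DDP synchronizes gradients across all GPUs and nodes,
    # with inter-node traffic carried over the high-speed interconnect.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for step in range(10):
        x = torch.randn(32, 1024, device=f"cuda:{local_rank}")
        loss = model(x).square().mean()
        optimizer.zero_grad()
        loss.backward()   # gradient all-reduce across ranks happens here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```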
Contacts
Contact the Research Computing team by submitting a help desk ticket or find more information in the HPC documentation.