A three-year National Science Foundation grant totaling nearly $1 million will let West Virginia University develop its next-generation High Performance Computing, or HPC, cluster to advance computationally intensive research in a wide array of fields, from drug delivery to genomics and astrophysics.
The design and implementation of the new computing cluster will be funded by a $990,000 grant from the NSF Major Research Instrumentation Program, known as MRI, awarded to a group of 22 faculty led by Blake Mertz, assistant professor of chemistry in the Eberly College of Arts and Sciences.
The Pittsburgh Supercomputing Center will host and operate the cluster at its machine room in Monroeville, Pennsylvania. PSC will provide ongoing support, including hardware troubleshooting, on-site technical support and managing WVU’s network connection with the cluster. PSC is a joint program of Carnegie Mellon University and the University of Pittsburgh.
High performance computing packs many central processing units into a dense, relatively small footprint called a cluster. The cluster can then take on computational jobs far too large for a traditional desktop computer. This capability lets dozens of faculty members carry out groundbreaking research in all major areas of science and engineering.
Mertz’s research uses a computational approach called molecular dynamics simulations to develop applications in targeted drug delivery and solar energy harvesting. For the last three years, he and his team have been focused on a peptide that binds to cancer cells, creating potential for highly effective cancer drug delivery to patients.
These simulations provide a bird's-eye view of how their peptide interacts with the cell surface, essentially acting as a “computational microscope.” With this extremely detailed picture, they can make informed decisions to design peptides that are more effective delivery systems. None of these scientific insights would be possible without WVU’s current HPC cluster, Spruce Knob.
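At its core, a molecular dynamics simulation repeatedly computes forces on atoms and advances their positions in tiny timesteps. The sketch below shows that core loop (velocity Verlet integration) for a single particle on a harmonic spring; it is an illustrative toy under simplified assumptions, not the research code described above, which relies on full force fields and many HPC nodes.

```python
# Minimal sketch of the molecular dynamics inner loop:
# velocity Verlet integration of one particle in a harmonic well.
def velocity_verlet(x, v, force, mass, dt, steps):
    """Advance position x and velocity v through `steps` timesteps."""
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt  # update position
        a_new = force(x) / mass             # force at new position
        v = v + 0.5 * (a + a_new) * dt      # update velocity
        a = a_new
    return x, v

# Hypothetical "bond": Hooke's-law force F = -k*x, with k = mass = 1.
k = 1.0
x, v = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x,
                       mass=1.0, dt=0.01, steps=1000)
```

A real simulation repeats this update for hundreds of thousands of atoms over millions of timesteps, which is why the work demands a cluster rather than a desktop.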
Mertz and his team are among the major users of the cluster on campus. “We could use twice as much computing power as we presently have,” he said. “The user base of HPC resources at WVU has grown significantly since I started here in 2012.”
“The Research Office is very excited about this award,” said Vice President for Research Fred King. “This new computational resource is key to our continued success in computational chemistry, materials design, bioinformatics, neurosciences and astrophysics, but we also anticipate expanded use by social scientists and humanities faculty as these computational tools are applied to natural language processing and other cutting-edge applications. We are working with Information Technology Services to support additional staffing that will help new users take full advantage of the new instruments.”
“PSC is pleased to continue our long-standing relationship with WVU by sharing our data center and operations experience,” said J. Ray Scott, PSC’s Senior Director of Facilities Technology. “Both schools will benefit from this collaboration.”
In addition to the grant from NSF, the WVU Research Office and the Deans’ offices of the Statler College of Engineering and Mineral Resources and the Eberly College of Arts and Sciences are providing supplemental project funding. That money will support both hardware and the hiring of additional staff for the Information Technology Services Research Computing team that manages WVU’s HPC resources.
Research Computing will design a new HPC cluster, called “Thorny Flat,” named after one of West Virginia’s peaks, to supplement Spruce Knob. Thorny Flat will not only better support current HPC users who pay for access, but also dramatically expand the availability of free resources for other prospective researchers at WVU and across West Virginia.
Thorny Flat also will feature the latest graphics processing units, allowing researchers to develop code that utilizes the latest in computer hardware acceleration.
With Centers of Research Excellence in gravitational waves and cosmology, STEM education, water security, regional health disparities, and responsible natural gas use, WVU is devoting new resources to leverage existing strengths in neuroscience and cybersecurity.
Co-investigators on this project represent the diversity of uses for HPC at WVU:
• George Spirou, the John W. and Jeannette S. Straton research chair in neurosciences and co-director of the WVU Blanchette Rockefeller Neurosciences Institute, and his team create high-resolution maps and 3D models of the brain. Using Spruce Knob, he recently cut the time his computational jobs take to analyze his data from 80 hours to two hours for a single run.
• Zachariah Etienne, assistant professor of mathematics and senior member of the WVU Center for Gravitational Waves and Cosmology, uses HPC to pursue state-of-the-art Laser Interferometer Gravitational-Wave Observatory, called LIGO, research.
A primary objective of the MRI grant is to grow the community of HPC users in West Virginia, a state challenged by lack of economic diversity. WVU has publicly committed to supporting prosperity and economic development, recently joining the state Department of Commerce and Marshall University in launching a comprehensive study on expanding a select group of industries, dubbed the West Virginia Forward initiative.
“New sectors the state can capture that promise high growth are cybersecurity, cloud services and data centers, and higher-end tourism,” said WVU President E. Gordon Gee at the most recent meeting of the West Virginia Chamber’s annual Business Summit.
The Thorny Flat HPC cluster will help accelerate research projects in state-of-the-art code development for materials optimization and stimulate new efforts in Appalachian watershed dynamics, business analytics and quantitative neuroscience. The power of HPC also will benefit less traditionally computational areas, helping WVU researchers to tackle water security, opiate addiction and health care disparities.
Researchers can purchase their own dedicated nodes on WVU’s HPC systems using start-up funds or sponsored research funding. Free, shared community nodes are also currently available for those who want to learn how HPC can optimize their computational projects and will be a major part of the Thorny Flat cluster.