About

The Arkansas High Performance Computing Center (AHPCC) is a core research facility at the University of Arkansas. It provides high performance computing hardware, storage, and support services, including training and education, to enable computationally intensive research at the university, across the state, and among collaborators elsewhere. Through its association with the National Science Foundation’s Extreme Science and Engineering Discovery Environment (XSEDE) and the XSEDE Campus Champions program, AHPCC also assists researchers in acquiring and using resources at national supercomputer centers.

Mission, Vision and Goals

Mission

AHPCC strives to substantially enhance the productivity of a growing community of researchers, engineers, and scholars through seamless access to and integration of computational resources that support open research, and to coordinate and add significant value to the research and discovery effort.

Vision

The AHPCC envisions an academic community of digitally enabled researchers, engineers, and scholars participating in multidisciplinary collaborations to more effectively and efficiently understand and address societal challenges.

Goals

Deepen and extend the use of the computational research services ecosystem by further increasing use among existing researchers, engineers, and scholars; extending use to new communities; preparing the current and next generation through education, training, and outreach; and raising general awareness of the value of computational resources for research.

Advance the computational research services ecosystem by creating an open and evolving e-infrastructure, enhancing the array of tools, lowering the barrier to efficient use for new users, and extending the technical expertise and support services offered, including close collaboration with researchers on key projects.

Sustain the computational research services ecosystem by assuring and maintaining a reliable and secure infrastructure, providing excellent user support services, and operating an effective and innovative core research facility.

History

Peter Pulay with Red Diamond in 2005.

Red Diamond, retired.

Star of Arkansas (retired) and Jeff Pummill.

Star of Arkansas disassembly by AHPCC staff David Chaffin (left) and Pawel Wolinski (right).

AHPCC was designated by the Arkansas Department of Higher Education as a University of Arkansas Research Center in 2008. However, high performance computing began to play a central role in research at the university in 2004, with a successful MRI grant (NSF Award Number: 0421099) for a large-scale computing resource. PI and CSCE Professor Amy Apon collaborated with Professors Huaxiang Fu, Panneer Selvam, Peter Pulay, and Russell Deaton on the grant, which led to the deployment in 2005 of Red Diamond and its 256 compute cores. When operational, Red Diamond made the June 2005 Top 500 list as the 379th fastest system in the world.


The machine immediately impacted research in computational chemistry and materials science focusing on density-functional theory (DFT) approaches; DNA sequence design for biotechnology and nanotechnology, accelerating the search for large sets of non-cross-hybridizing DNA sequences; multi-scale modeling and computation of the electronic and optical properties of nano-devices and of crack propagation in alloys and metals; finite element modeling of volcano and tectonic deformation; next generation networking; geospatial databases; and early big data mining.


AHPCC added a second large-scale system in 2008 with support from an NSF MRI grant (NSF Award Number: 0722625). The 1,256-core Star of Arkansas debuted on the Top 500 at 339th in the world. Additional NSF support since 2010 has added new resources: Razor I, II, and III. With Star of Arkansas and Red Diamond both retired from service, the Center continues to grow with the addition of Trestles from the San Diego Supercomputer Center, the addition of a 100Gb research network, and plans for a new hybrid cluster in 2017.