What is supercomputing?
The term “supercomputing” refers to processing massively complex or data-intensive problems using the combined resources of many computer systems working in parallel (i.e., a supercomputer). The term also describes systems operating at or near the maximum performance achievable by any computer. That power can be applied to weather forecasting, energy, the life sciences and manufacturing.
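The core idea of working in parallel can be sketched in a few lines. This is an illustrative toy only, not TACC code: `simulate_cell` is a hypothetical stand-in for an expensive per-piece computation (say, one grid cell of a weather model), and Python's standard-library `multiprocessing.Pool` plays the role of the many cooperating processors.

```python
from multiprocessing import Pool

def simulate_cell(cell_id):
    """Stand-in for an expensive per-cell computation; here just a
    toy arithmetic workload over a range determined by cell_id."""
    return sum(i * i for i in range(cell_id * 1000, (cell_id + 1) * 1000))

def run_parallel(n_cells, n_workers=4):
    """Split the problem into independent pieces and farm them out to
    worker processes: the same divide-and-conquer idea a supercomputer
    applies across thousands of nodes."""
    with Pool(processes=n_workers) as pool:
        return pool.map(simulate_cell, range(n_cells))

if __name__ == "__main__":
    results = run_parallel(8)
    print(f"computed {len(results)} cells in parallel")
```

On a real supercomputer the same pattern is scaled up with technologies such as MPI, with pieces distributed across many networked machines rather than processes on one.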
Despite UT’s standing as a top-tier university, enthusiasm for supercomputing there ebbed and flowed over the years, and the HPC headquarters shifted locations across campus (including a brief stint where operations were conducted out of a stairwell in the UT Tower). “It’s remarkable how far things have come since then,” Oden says.
Calls from HPC advocates — Oden and many others — had grown so loud during the 1980s and 1990s that university leaders couldn’t ignore them any longer.
“When I became president of UT in 1998, there was already a lot of discussion around the need to prioritize advanced computing — or ‘big iron’ as we called it back then — at the university,” says former UT President Larry Faulkner.
In 1999, Faulkner named Juan M. Sanchez as vice president for research; that appointment proved central to the TACC story. Instrumental in a variety of university initiatives, Sanchez echoed the calls from HPC advocates to build a home for advanced computing at UT.
In 2001, a dedicated facility was established at UT’s J.J. Pickle Research Campus in North Austin with a small staff led by Jay Boisseau, who influenced the future direction of TACC and the wider HPC community. TACC grew rapidly thanks in part to Boisseau’s willingness to accept hand-me-down hardware, his aggressive pursuit of external funding, and his success in forging strong collaborations with technology partners, including notable hometown success story Dell Technologies. The center also enjoyed access to a rich pipeline of scientific and engineering expertise at UT.
One entity in particular became a key partner — the Institute for Computational Engineering and Sciences (ICES, established in 2003). Renamed the Oden Institute for Computational Engineering and Sciences in recognition of its founder, J. Tinsley Oden, the institute quickly came to be regarded as one of the world’s leading computational science and engineering (CSE) institutes.
Thanks to the unwavering support of Texan educational philanthropists Peter and Edith O’Donnell, Oden was able to recruit top computational scientists and build a team that could not only advance the mathematical breadth of CSE as a discipline but also grow its range of real-world applications.
“ICES was such a successful enterprise, it produced great global credibility for Texas as a new center for HPC,” Faulkner says.
Computational science and advanced computing tend to move forward symbiotically, which is why Oden Institute faculty have been instrumental in planning TACC’s largest supercomputers, providing insights into the types of computing environments that researchers require to deliver impactful research outcomes.
“Computational science and engineering today is foundational to scientific and technological progress — and HPC is in turn a critical enabler for modern CSE,” says Omar Ghattas, who holds the John A. and Katherine G. Jackson Chair in Computational Geosciences with additional appointments in the Jackson School of Geosciences and the Department of Mechanical Engineering.
Ghattas has served as co-principal investigator on the Ranger and Frontera supercomputers and is a member of the team planning the next-generation system at TACC.
“Every field of science and engineering, and increasingly medicine and the social sciences, relies on advanced computing for modeling, simulation, prediction, inference, design and control,” he says. “The partnership between the Oden Institute and TACC has made it possible to anticipate future directions in CSE. Having that head start has allowed us to deploy systems and services that empower researchers to define that future.”