High Performance Computing
Research Computing and Data staff manage the campus HPC facility, and we facilitate access to federally-funded HPC and private cloud resources, including the National Science Foundation’s ACCESS and NAIRR Pilot programs and the Department of Energy’s ASCR program. You can also watch our overview video of federally-funded resources, presented at the University of Oklahoma’s 2025 Virtual Residency Workshop.
Select an item from the table of contents, or read through the rest of this page for more information.
Table of Contents
On Campus
Tennessee Tech University’s high performance computing (HPC) facility is located in Clement Hall 226. This facility includes the Impulse cluster, launched in 2017, and the Warp 1 cluster, launched in 2022 (NSF Award: 2127188). Both clusters are managed by Information Technology Services (ITS) and are available to all students, faculty, and staff for computationally intensive research and classes.
The Impulse cluster includes:
- 42 CPU compute nodes, each with 28 CPU cores (Intel Xeon E5-2680v4, 2.4 GHz) and 64–896 GB RAM.
- 2 GPU compute nodes, each with 28 CPU cores (Intel Xeon E5-2680v4, 2.4 GHz), 384 GB RAM, and two NVIDIA K80 GPUs (presented as four GPU devices).
- 56 Gb/s non-blocking InfiniBand network.
The Warp 1 cluster includes:
- 10 GPU compute nodes, each with 128 CPU cores (AMD Epyc 7713, 2.0 GHz), 512 GB RAM, and two 40 GB NVIDIA A100 GPUs.
- 100 Gb/s non-blocking InfiniBand network.
Both clusters share 300 TB of NFS all-flash file storage. Both clusters include a variety of open-source and commercial applications supporting research and education in areas including artificial intelligence/machine learning, bioinformatics, computational fluid dynamics, finite element analysis, and molecular dynamics. They also support secure container-based applications, promoting reproducible research, computational mobility, and increased compatibility.
All HPC resources are available to all Tennessee Tech researchers after a brief consultation. Researchers can get help with software installation and licensing, using the queueing system, developing workflows, and end-user training.
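As a sketch of what working with the queueing system can look like, assuming the clusters use Slurm (the scheduler is not named on this page, and the resource values and program name below are placeholders, not actual cluster settings), a minimal batch job script might be:

```shell
#!/bin/bash
# Illustrative Slurm batch script; all names and values are hypothetical.
#SBATCH --job-name=example        # name shown in the queue
#SBATCH --nodes=1                 # request one compute node
#SBATCH --ntasks=28               # one task per CPU core (e.g., an Impulse node)
#SBATCH --time=01:00:00           # one-hour wall-clock limit
#SBATCH --output=example-%j.out   # %j expands to the job ID

srun ./my_program                 # launch the program under the scheduler
```

Such a script would typically be submitted with `sbatch` and monitored with `squeue`; a consultation with Research Computing and Data staff can confirm the actual scheduler, partitions, and available software modules.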
Off Campus
NSF-funded
Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS)
NSF’s ACCESS program offers over 30 computational, storage, and novel-architecture computing resources at no cost to the researcher. Research Computing and Data staff already have access to many of these resources, making it easier to find the right resource for your particular research or academic need. Click the “Details” item below to see a full list of the resources, or book an appointment with us to discuss your options. Once one or more resources have been selected for your project, an ACCESS project can often be up and running within a matter of days.
National Artificial Intelligence Research Resource (NAIRR) Pilot
The NAIRR Pilot program is an NSF-led partnership of over 10 federal agencies and over 25 non-governmental entities, including Anthropic, Google, Meta, Microsoft, and OpenAI. It provides a shared national research infrastructure connecting US researchers to computational, data, software, AI model, training, and educational resources. NAIRR is unique in partnering with non-governmental companies in the AI and cloud hyperscaler markets to provide no-cost access to commercial resources including AI models, inference services, and software-as-a-service offerings.
DOE-funded: Advanced Scientific Computing Research (ASCR)
If ACCESS isn’t the best fit for what you’re doing, or especially if there’s DOE interest in your research, ASCR is another zero-cost option. There are four main avenues into ASCR resources, each offering access to different DOE facilities and with their own requirements:
- Innovative and Novel Computational Impact on Theory and Experiment (INCITE): multi-year awards for open science using a majority of a machine at Oak Ridge or Argonne
- ASCR Leadership Computing Challenge (ALCC): 1-year awards for advancing DOE mission or broadening the community capable of using large computing resources at Oak Ridge, Argonne, or NERSC
- Energy Research Computing Allocations Process (ERCAP): 1-year awards for advancing DOE Office of Science and SBIR/STTR mission at NERSC
- Center Reserves: 1-year awards for advancing science and engineering fields at Oak Ridge, Argonne, or NERSC