Arctic Region Supercomputing Center

Arctic Region Supercomputing Center
Established 1993
Director Dr. Gregory Newby
Location Fairbanks, Alaska, United States
Affiliations University of Alaska Fairbanks
Website arsc.edu

The Arctic Region Supercomputing Center (ARSC) is a research facility organized under the University of Alaska Fairbanks (UAF). Located on the UAF campus, ARSC offers high-performance computing (HPC) and mass storage to the UAF and State of Alaska research communities. Funding for ARSC operations is supplied primarily by UAF, augmented by external grants and contracts from sources such as the National Science Foundation[1] and Lockheed Martin[2] (through the Department of Defense High Performance Computing Modernization Program).

In general, the research supported with ARSC resources focuses on the Earth's arctic region. Common projects include arctic weather modeling, Alaskan summer smoke forecasting, arctic sea ice analysis and tracking, arctic ocean systems, volcanic ash plume prediction, and tsunami forecasting and modeling.

History

Since its founding in 1993, ARSC has hosted a variety of HPC systems. The following is a chronological list of the HPC systems ARSC has acquired:

  • 1993 - Cray Y-MP named Denali with 4 CPUs and 1.3 GFLOPS, StorageTek 1.1 TB Silo.
  • 1994 - Cray T3D named Yukon with 128 CPUs and 19.2 GFLOPS.
  • 1997 - Updated Yukon to a Cray T3E 600 with 88 CPUs and 50 GFLOPS.
  • 1998 - Cray J90 named Chilkoot with 12 CPUs and 2.4 GFLOPS, Updated Yukon to a Cray T3E 900 with 104 CPUs, Expanded StorageTek to 330+ TB.
  • 2000 - Expanded Yukon to 272 CPUs and 230 GFLOPS, Updated Chilkoot to a Cray SV1 with 32 CPUs and 38.4 GFLOPS, Doubled StorageTek Hardware.
  • 2001 - IBM SP named Icehawk with 200 CPUs and 276 GFLOPS.
  • 2002 - Cray SX-6 named Rime with 8 CPUs and 64 GFLOPS, IBM P690 Regatta named Iceflyer with 32 POWER4 CPUs and 166.4 GFLOPS.
  • 2004 - IBM P690+/P655+ named Iceberg with 800 CPUs and 5 TFLOPS, Cray X1 named Klondike with 128 CPUs and 1.6 TFLOPS, Mechdyne MD Flying Flex four-projector virtual environment, two Sun Fire 6800 storage servers.
  • 2005 - Cray XD1 named Nelchina with 36 CPUs.
  • 2007 - Sun Opteron Cluster named Midnight with 2312 CPUs and 12.02 TFLOPS, StorageTek SL8500 Robotic Tape Library with 3+ petabyte capacity.
  • 2009 - Cray XT5 named Pingo with 3456 CPUs, BladeCenter H QS22 Cluster with 5.5 TFLOPS and 12 TB Filesystem.
  • 2010 - Penguin Computing Cluster named Pacman with 2080 CPUs and 89 TB Filesystem, Sun SPARC Enterprise T5440 Server named Bigdipper with 7 petabyte storage capacity, Cray XE6 named Chugach with 11648 CPUs and 330 TB Filesystem, Sun SPARC Enterprise T5440 Server named Wiseman with 7 petabyte storage capacity, Cray XE6 named Tana with 256 CPUs and 2.36 TFLOPS.
  • 2011 - Expanded Pacman to 3256 CPUs and 200 TB Filesystem.

Current ARSC hardware and projects

Academic resources

  • Bigdipper - A Sun SPARC Enterprise T5440 server connected to a Sun StorageTek SL8500 robotic tape library, with a potential long-term storage capacity of seven petabytes.[4]

Research systems

  • Quasar - IBM QS22 system[5]
  • Various additional research systems, including general-purpose GPU computing systems

HPCMP Enhanced User Environment

ARSC is under contract to operate the High Performance Computing Modernization Program (HPCMP) Enhanced User Environment Test Lab for the Department of Defense. The lab consists of a small number of test systems that hold no operational data. ARSC also hosts Tana, a Cray XE6 with 256 CPUs and 2.36 TFLOPS, which serves as the test and development system for Chugach, the HPCMP Open Research System's Cray.

References

  1. http://www.nsf.gov/od/opp/wwwsites.jsp
  2. http://www.appro.com/press/view.asp?Num=202
  3.
  4.
  5.
