| MPI2007 license: | 4 | Test date: | May-2007 |
|---|---|---|---|
| Test sponsor: | SGI | Hardware Availability: | Jul-2006 |
| Tested by: | SGI | Software Availability: | Apr-2007 |
Results appear in the order in which they were run; bold underlined text in the original report indicates a median measurement. Only base results were reported (the peak columns of the original table are empty).

Base results, two runs per benchmark:

| Benchmark | Ranks | Seconds | Ratio | Seconds | Ratio |
|---|---|---|---|---|---|
| 104.milc | 128 | 149 | 10.5 | 147 | 10.7 |
| 107.leslie3d | 128 | 234 | 22.3 | 232 | 22.5 |
| 113.GemsFDTD | 128 | 438 | 14.4 | 442 | 14.3 |
| 115.fds4 | 128 | 211 | 9.22 | 202 | 9.67 |
| 121.pop2 | 128 | 567 | 7.27 | 567 | 7.28 |
| 122.tachyon | 128 | 489 | 5.72 | 489 | 5.72 |
| 126.lammps | 128 | 395 | 7.37 | 396 | 7.37 |
| 127.wrf2 | 128 | 376 | 20.7 | 375 | 20.8 |
| 128.GAPgeofem | 128 | 210 | 9.83 | 209 | 9.88 |
| 129.tera_tf | 128 | 334 | 8.28 | 335 | 8.27 |
| 130.socorro | 128 | 176 | 21.7 | 179 | 21.3 |
| 132.zeusmp2 | 128 | 282 | 11.0 | 280 | 11.1 |
| 137.lu | 128 | 139 | 26.4 | 138 | 26.6 |
Hardware Summary | |
---|---|
Type of System: | SMP |
Compute Node: | SMP |
File Server Node: | SMP |
Total Compute Nodes: | 1 |
Total Chips: | 64 |
Total Cores: | 128 |
Total Threads: | 128 |
Total Memory: | 512 GB |
Base Ranks Run: | 128 |
Minimum Peak Ranks: | -- |
Maximum Peak Ranks: | -- |
Software Summary | |
---|---|
C Compiler: | Intel C Itanium Compiler for Itanium-based Applications Version 9.1 (Build 20070320) |
C++ Compiler: | Intel C++ Itanium Compiler for Itanium-based Applications Version 9.1 (Build 20070320) |
Fortran Compiler: | Intel Fortran Itanium Compiler for Itanium-based Applications Version 9.1 (Build 20070320) |
Base Pointers: | 64-bit |
Peak Pointers: | 64-bit |
MPI Library: | SGI Message Passing Toolkit (MPT) Version 1.15 |
Other MPI Info: | None |
Pre-processors: | None |
Other Software: | None |
Hardware | |
---|---|
Number of nodes: | 1 |
Uses of the node: | compute, file server |
Vendor: | SGI |
Model: | SGI Altix 4700 Bandwidth System (Itanium 2 Processor 9040 1.6GHz/18M) |
CPU Name: | Dual-Core Intel Itanium 2 9040 |
CPU(s) orderable: | 1-512 chips |
Chips enabled: | 64 |
Cores enabled: | 128 |
Cores per chip: | 2 |
Threads per core: | 1 |
CPU Characteristics: | 533 MHz FSB |
CPU MHz: | 1600 |
Primary Cache: | 16 KB I + 16 KB D on chip per core |
Secondary Cache: | 1 MB I + 256 KB D on chip per core |
L3 Cache: | 9 MB I+D on chip per core |
Other Cache: | None |
Memory: | 512 GB (8 * 1 GB DDR2-400 DIMMs per 2-core module) |
Disk Subsystem: | 36 x 73 GB FibreChannel (Seagate Cheetah 15k rpm) |
Other Hardware: | None |
Adapter: | None |
Number of Adapters: | 0 |
Slot Type: | Not applicable |
Data Rate: | Not applicable |
Ports Used: | 0 |
Interconnect Type: | None |
Software | |
---|---|
Adapter: | None |
Adapter Driver: | Not applicable |
Adapter Firmware: | Not applicable |
Operating System: | SUSE Linux Enterprise Server 10 + SGI ProPack 5 Service Pack 1 |
Local File System: | 36 x 73 GB FibreChannel (Seagate Cheetah 15k rpm) |
Shared File System: | None |
System State: | Multi-user |
Other Software: | None |
The following environment settings were used (csh syntax):

- `setenv MPI_DSM_DISTRIBUTE 1`: Ensures that each MPI process gets a unique CPU and physical memory on the node with which that CPU is associated. The CPUs are chosen by simply starting at cpuset-relative CPU 0 and incrementing until all MPI processes have been forked.
- `setenv MPI_REQUEST_MAX 65536`: Determines the maximum number of nonblocking sends and receives that can simultaneously exist for any single MPI process. MPI generates an error message if this limit (or the default, if not set) is exceeded. Default: 16384.
- `limit stacksize unlimited`: Removes limits on the maximum size of the automatically extended stack region of the current process and each process it creates.
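For reference, the same settings can be sketched in bash (the submitted notes use csh syntax; the variable semantics are those documented for SGI MPT, and everything else here is illustrative):

```shell
#!/bin/bash
# Bash rendering of the csh settings from the notes above.

# MPI_DSM_DISTRIBUTE=1: give each MPI process its own CPU, with memory
# local to that CPU, assigning cpuset-relative CPUs 0, 1, 2, ... in order.
export MPI_DSM_DISTRIBUTE=1

# MPI_REQUEST_MAX: cap on nonblocking sends and receives that may exist
# simultaneously for any single MPI process (SGI MPT default: 16384).
export MPI_REQUEST_MAX=65536

# csh "limit stacksize unlimited": remove the size limit on the
# automatically extended stack of this process and its children.
ulimit -s unlimited

echo "MPI_DSM_DISTRIBUTE=$MPI_DSM_DISTRIBUTE, MPI_REQUEST_MAX=$MPI_REQUEST_MAX"
```

These exports would need to appear in the shell (or job script) that launches `mpirun` so the MPI library inherits them.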
| Base Compiler Invocation | |
|---|---|
| C benchmarks: | icc |
| C++ benchmarks, 126.lammps: | icpc |
| Fortran benchmarks: | ifort |
| Benchmarks using both C and Fortran: | icc ifort |

| Base Portability Flags | |
|---|---|
| 121.pop2: | -DSPEC_MPI_CASE_FLAG |
| 127.wrf2: | -DSPEC_MPI_LINUX -DSPEC_MPI_CASE_FLAG |