Standard Performance Evaluation Corporation


The Standard Performance Evaluation Corporation is an American non-profit corporation that aims to "produce, establish, maintain and endorse a standardized set" of performance benchmarks for computers.
SPEC was founded in 1988. SPEC benchmarks are widely used to evaluate the performance of computer systems; the test results are published on the SPEC website.
SPEC evolved into an umbrella organization encompassing four diverse groups: the Graphics and Workstation Performance Group, the High-Performance Group, the Open Systems Group, and the newest, the Research Group.

Structure

The Open Systems Group (OSG)

The High-Performance Group (HPG)

The Graphics and Workstation Performance Group (GWPG)

SPEC Research Group (RG)

Membership

Membership in SPEC is open to any interested company or entity that is willing to commit to SPEC's standards. The list of members is available on SPEC's membership page.

Membership Levels

Benchmarks

The benchmarks aim to test "real-life" situations. There are several benchmarks testing Java scenarios, ranging from simple computation to a full system with Java EE, a database, disk, and network.
The SPEC CPU suites measure the combined performance of CPU, memory, and compiler by timing the runs of several programs, such as the compiler GCC, the chemistry program gamess, and the weather program WRF. The various tasks are equally weighted; no attempt is made to weight them according to their perceived importance. An overall score is based on the geometric mean of the individual results.
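
As an illustrative sketch (the normalization of each run time against a reference machine is assumed here rather than quoted from SPEC's run rules): if program $i$'s result is reduced to a performance ratio $r_i$, the overall score for $n$ programs is

$$\text{score} = \left(\prod_{i=1}^{n} r_i\right)^{1/n} = \sqrt[n]{r_1 \, r_2 \cdots r_n},$$

so ratios of 2, 4, and 8, for example, combine to a score of $\sqrt[3]{64} = 4$, regardless of which program produced which ratio.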

Cloud

Measuring and comparing the provisioning, compute, storage, and network resources of IaaS cloud platforms.

Graphics and Workstation Performance

Measuring the performance of an OpenGL 3D graphics system on a given system, using various rendering tasks taken from several popular 3D-intensive real-world applications.

SPECwpc

High Performance Computing, OpenMP, MPI, OpenACC, OpenCL

OMP
SPEC OMP was the first suite for evaluating performance based on OpenMP applications; it measures the performance of shared-memory multiprocessor (SMP) systems.
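
As a minimal, hypothetical sketch of the kind of code such applications contain (this fragment is not taken from the SPEC OMP suite), an OpenMP program divides loop iterations among the threads of a shared-memory system:

#include <omp.h>
#include <stdio.h>

#define N 10000000

/* Hypothetical example, not part of SPEC OMP: sum an array in parallel.
   The suite's workloads are full scientific applications, but they rely
   on OpenMP work-sharing constructs like the one shown here. */
int main(void) {
    static double a[N];
    double sum = 0.0;

    /* Iterations are split across the available threads; the reduction
       clause combines each thread's partial sum safely. */
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < N; i++) {
        a[i] = 0.5 * (double)i;
        sum += a[i];
    }

    printf("threads available: %d, sum = %f\n", omp_get_max_threads(), sum);
    return 0;
}

Compiled with an OpenMP-capable compiler (for example, gcc -fopenmp), the same source scales across the processors of an SMP machine, which is the behavior the benchmark is designed to measure.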

Java Client/Server

JBB

Evaluates the performance of server-side Java by emulating a three-tier client/server system.

jEnterprise

A multi-tier benchmark for measuring the performance of Java 2 Enterprise Edition technology-based application servers.

Mail Servers

Storage

SPEC SFS measures file server throughput and response time, supporting both NFS and SMB protocol access.

Power

Virtualization

Web Servers

SPEC Tools

SPEC benchmarks are written in a portable programming language; interested parties may compile the code with whatever compiler they prefer for their platform, but they may not change the code. Manufacturers have been known to optimize their compilers to improve the performance of the various SPEC benchmarks, and SPEC has rules that attempt to limit such optimizations.

Licensing

In order to use a benchmark, a license has to be purchased from SPEC; costs vary from test to test, typically ranging from several hundred to several thousand dollars. This pay-for-license model might seem to violate the GPL, since the benchmarks include software such as GCC that is licensed under the GPL. However, the GPL does not require software to be distributed free of charge, only that recipients be allowed to redistribute any GPL-licensed software they receive; SPEC's license agreement specifically exempts items under "licenses that require free distribution", and those files are placed in a separate part of the overall software package.

Culture

SPEC attempts to create an environment where arguments are settled by appeal to notions of technical credibility, representativeness, or the "level playing field". SPEC representatives are typically engineers with expertise in the areas being benchmarked. Benchmarks include "run rules", which describe the conditions of measurement and documentation requirements. Results that are published on SPEC's website undergo a peer review by members' performance engineers.