5-BRF - Benchmark report format

This document describes a format for systematic reporting of benchmarks. The goal of this format is to make it possible for third parties to independently reproduce benchmarks.

  • Name: 5-BRF (wiki.amqp.org)
  • Editor: Pieter Hintjens <ph@imatix.com>
  • Contributors: none.
  • State: raw


Copyright (c) 2008 iMatix Corporation

This Specification is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 3 of the License, or (at your option) any later version.

This Specification is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program; if not, see <http://www.gnu.org/licenses>.

Change Process

This document is governed by the 1-COSS specification (wiki.amqp.org).


Goals

The goals of this specification are to:

  • Define a systematic reporting format for benchmark test results
  • Improve the quality of information communicated to users
  • Improve the transparency and reproducibility of benchmarks

Reporting format

Benchmark reports follow this format:

  1. Software definition: the name and version of the software being tested, including a complete URI for downloading the product.
  2. Operating system specification: distribution, kernel, and any relevant packages used.
  3. Hardware specification: processor, RAM, storage system (if relevant).
  4. Network specification: NICs, physical network type.
  5. Architecture specification: layout of all participant components on the network.
  6. Configuration specification: full list of all configuration and tuning done on the software, operating system, hardware, and network.
  7. Benchmark specification: a reference to the benchmark being tested.
  8. Application specification: the full source code of the test application.
  9. Benchmark results: the results of the benchmark, in the format specified by the benchmark specification.
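The nine sections above can be modeled as a simple record, so that tooling can check a report for completeness before publication. This is a minimal sketch and not part of 5-BRF itself; the `BenchmarkReport` type and its field names are illustrative assumptions.

```python
from dataclasses import dataclass, fields

# Hypothetical record mirroring the nine sections of a 5-BRF report.
# 5-BRF does not mandate a machine-readable format; field names are illustrative.
@dataclass
class BenchmarkReport:
    software: str          # 1. name, version, and download URI
    operating_system: str  # 2. distribution, kernel, relevant packages
    hardware: str          # 3. processor, RAM, storage (if relevant)
    network: str           # 4. NICs, physical network type
    architecture: str      # 5. layout of participant components
    configuration: str     # 6. all software/OS/hardware/network tuning
    benchmark: str         # 7. reference to the benchmark specification
    application: str       # 8. full source code of the test application
    results: str           # 9. results, in the benchmark's specified format

def missing_sections(report: BenchmarkReport) -> list[str]:
    """Return the names of sections left empty, for a completeness check."""
    return [f.name for f in fields(report)
            if not getattr(report, f.name).strip()]
```

A report with any blank section would be flagged by `missing_sections`, which a reviewer could use when rating reports for completeness.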

Members of the community may choose to rate benchmark reports on their clarity, completeness, and reproducibility.

Benchmark specifications

Each benchmark will be separately documented as a specification following the COSS process.



Specification states

A specification is in one of these states:

  • raw - new specification.
  • draft - has at least one implementation.
  • stable - has been deployed to real users.
  • legacy - is being replaced by newer specifications.
  • retired - has been replaced and is no longer used.
  • deleted - abandoned before becoming stable.