This document describes a format for systematic reporting of benchmarks. The goal of this format is to enable third parties to reproduce benchmarks independently.
- Name: 5-BRF@wiki.amqp.org
- Editor: Pieter Hintjens <ph@imatix.com>
- Contributors: none.
- State: raw
Copyright (c) 2008 iMatix Corporation
This Specification is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 3 of the License, or (at your option) any later version.
This Specification is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program; if not, see <http://www.gnu.org/licenses>.
This document is governed by the 1-COSS@wiki.amqp.org specification.
The goals of this specification are to:
- Define a systematic reporting format for benchmark test results
- Improve the quality of information communicated to users
- Improve the transparency and reproducibility of benchmarks
Benchmark reports follow this format:
- Software definition: the name and version of the software being tested, including the complete URI of the product download.
- Operating system specification: distribution, kernel, and any relevant packages used.
- Hardware specification: processor, RAM, storage system (if relevant).
- Network specification: NICs, physical network type.
- Architecture specification: layout of all participant components on the network.
- Configuration specification: full list of all configuration and tuning done on the software, operating system, hardware, and network.
- Benchmark specification: a reference to the benchmark being tested.
- Application specification: the full source code of the test application.
- Benchmark results: the results of the benchmark, in the format specified by the benchmark specification.
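The section list above is a checklist, and an incomplete report is not reproducible. As an illustrative sketch only (not part of this specification), the sections can be represented as named fields and checked for completeness before publication; the field names and dict-based representation below are assumptions made for the example.

```python
# Illustrative sketch: a completeness check for a benchmark report.
# The section names mirror the list in this specification; the
# dict representation and field names are assumptions for this example.

REQUIRED_SECTIONS = [
    "software",          # name, version, download URI
    "operating_system",  # distribution, kernel, relevant packages
    "hardware",          # processor, RAM, storage (if relevant)
    "network",           # NICs, physical network type
    "architecture",      # layout of participant components
    "configuration",     # all tuning of software, OS, hardware, network
    "benchmark",         # reference to the benchmark specification
    "application",       # full source code of the test application
    "results",           # results, in the benchmark's specified format
]

def missing_sections(report: dict) -> list:
    """Return the names of required sections absent or empty in a report."""
    return [s for s in REQUIRED_SECTIONS if not report.get(s)]

# A partial report is easy to flag before it is published.
report = {
    "software": "ExampleMQ 1.0 (hypothetical product and URI)",
    "operating_system": "Debian 4.0, kernel 2.6.24",
}
print(missing_sections(report))
```

A report that passes this kind of check still needs human review for clarity, but mechanical completeness checking catches the most common omissions (configuration tuning and test-application source are easy to forget).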
Members of the community may choose to rate benchmark reports on their clarity, completeness, and reproducibility.
Each benchmark will be separately documented as a specification following the COSS process.