Testing Dashboard

Admin

It was decided to adopt the MongoDB/API/JSON approach (see the dashboard meeting minutes).

Dashboard 1st Demo support

etherpad

Introduction

All the test projects generate results in different formats. The goal of a testing dashboard is to provide a consistent view of the different tests from the different projects.

Overview

We may describe the overall system dealing with tests and testbeds as follows:

We may distinguish:

Test collection

The test collection API is under review; it consists of a simple REST API associated with a MongoDB back end.

Most of the tests will be run from the CI, but test collection shall not affect the CI in any way; that is one of the reasons to externalize data collection rather than use a Jenkins plugin.
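As an illustration, a test job could push its raw result as JSON to the collection API once a run completes, keeping the CI itself untouched. This is a minimal sketch only: the endpoint URL and the payload fields are assumptions, since the actual API is still under review.

    # Minimal sketch: push one test result to the collection REST API.
    # The endpoint URL and payload fields are assumptions (API under review).
    import json
    import urllib.request

    result = {
        "project": "functest",   # test project pushing the result
        "case": "vPing",         # test case name
        "installer": "fuel",     # installer used on the POD
        "pod": "pod-1",          # testbed identifier
        "duration": 42.3,        # seconds
        "status": "OK",          # OK / NOK
    }

    req = urllib.request.Request(
        "http://testresults.example.com/results",  # hypothetical endpoint
        data=json.dumps(result).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)  # expect 200/201 on success

Because the collection side is just a REST call, a test project only needs an HTTP client; no Jenkins plugin or CI-side change is required.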

Dashboard

A module shall extract the raw results pushed by the test projects to create a testing dashboard. This dashboard shall be able to show:

It shall be possible to filter per:

For each test case we shall be able to see:

And also the severity (to be commonly agreed) of the errors:

severity    description                     comment
critical    not acceptable for release
major       failure rate?                   failed on N% of the installers?
minor
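To make the severity levels concrete, a helper like the following could map an aggregated failure rate across installers to one of the levels above. The thresholds are illustrative placeholders (the N% for "major" is still to be commonly agreed), so this is a sketch of the idea, not an agreed rule.

    # Sketch: map a test case's failure rate across installers to a severity.
    # Thresholds are placeholders; the N% for "major" is still to be agreed.
    def severity(failed_installers, total_installers, major_threshold=0.5):
        rate = failed_installers / total_installers
        if rate == 1.0:
            return "critical"  # fails everywhere: not acceptable for release
        if rate >= major_threshold:
            return "major"     # failed on N% (here 50%) of the installers
        return "minor"

    print(severity(4, 4))  # critical
    print(severity(2, 4))  # major
    print(severity(1, 4))  # minor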

The expected load will not be a real constraint.

In fact the tests will be run fewer than 5 times a day, and we do not need real time (an update every hour would be enough).

Role of the test projects

Each test project shall:

Project     Test case   Dashboard ready   Description
Functest    vPing                         graph 1: duration = f(time)
                                          graph 2: bar graph (tests run / tests OK)
            Tempest                       graph 1: duration = f(time)
                                          graph 2: (nb tests run, nb tests failed) = f(time)
                                          graph 3: bar graph (nb tests run, nb tests failed)
            odl
            rally-*
yardstick   Ping                          graph 1: duration = f(time)
                                          graph 2: bar graph (tests run / tests OK)
VSPERF
QTip
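For instance, the "duration = f(time)" graphs could be fed by a small extraction step that queries the collection API for a given project and test case and turns the stored results into a time series. The endpoint, query parameters, and field names below are assumptions carried over from the earlier sketch.

    # Sketch: build the "duration = f(time)" series for one test case by
    # querying the collection API. Endpoint and field names are assumptions.
    import json
    import urllib.request

    def duration_series(project, case):
        url = ("http://testresults.example.com/results"
               "?project=%s&case=%s" % (project, case))  # hypothetical query
        with urllib.request.urlopen(url) as resp:
            results = json.load(resp)
        # One (timestamp, duration) point per stored test run.
        return [(r["creation_date"], r["duration"]) for r in results]

    # e.g. data for the vPing graph 1:
    # points = duration_series("functest", "vPing")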

First studies for dashboarding

Visualization examples

Test results server

Test results server:

Port assignment (for the firewall):

Port assignment (local)