It was decided to adopt the MongoDB/API/JSON approach (see the dashboard meeting minutes).
All the test projects generate results in different formats. The goal of a testing dashboard is to provide a consistent view of the different tests from the different projects.
We may describe the overall system dealing with tests and testbeds as follows:
We may distinguish:
The test collection API is under review; it consists of a simple REST API backed by a MongoDB database.
Most of the tests will be run from the CI, but test collection shall not affect the CI in any way; that is one of the reasons to externalize data collection rather than use a Jenkins plugin.
A module shall extract the raw results pushed by the test projects to create a testing dashboard. This dashboard shall be able to show:
It shall be possible to filter per:
For each test case we shall be able to see:
And also the severity (to be commonly agreed) of the errors…
| severity | description | comment |
|---|---|---|
| critical | test case failed on 80% of the installers? | |
| major | | |
| minor | | |
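The severity levels above are still to be agreed; as a purely hypothetical sketch, a classification based on the failure rate across installers could look like this (the 80% threshold for "critical" comes from the table above, the 50% threshold for "major" is an assumption for illustration):

```python
def classify_severity(failed_installers, total_installers, major_threshold=0.5):
    """Return 'critical', 'major' or 'minor' for a test case,
    based on the fraction of installers on which it failed.
    Thresholds are illustrative, not agreed values."""
    if total_installers == 0:
        return "minor"
    failure_rate = float(failed_installers) / total_installers
    if failure_rate >= 0.8:          # table above: failed on 80% of installers
        return "critical"
    if failure_rate >= major_threshold:
        return "major"
    return "minor"

print(classify_severity(4, 5))  # failed on 4 of 5 installers -> critical
```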
The expected load will not be a real constraint. In fact the tests will be run fewer than 5 times a day and we do not need real time (an update every hour would be enough).
Each test project shall:
The test projects shall agree on
Based on Arno, the result collection API could look like:
The projects are described as follows in the database: http://213.77.62.197:80/test_projects

{
  "meta": { "total": 1, "success": true },
  "test_projects": [
    {
      "test_cases": 0,
      "_id": "55e05b08514bc52914149f2d",
      "creation_date": "2015-08-28 12:58:48.602898",
      "name": "functest",
      "description": "OPNFV Functional testing project"
    }
  ]
}
If you want to know the test cases of this project: http://213.77.62.197:80/test_projects/functest/cases
{
  "meta": { "total": 1, "success": true },
  "test_cases": [
    {
      "_id": "55e05dba514bc52914149f2e",
      "creation_date": "2015-08-28 13:10:18.478986",
      "name": "vPing",
      "description": "This test consist in send exchange PING requests between two created VMs"
    }
  ]
}
Functest used to run automatically 4 suites:
The results are available on Jenkins. Functest calls the API to store the raw results of each test.
curl -X POST -H "Content-Type: application/json" -d '{"details": {"timestart": 123456, "duration": 66, "status": "OK"}, "project_name": "functest", "pod_id": 1, "case_name": "vPing"}' http://213.77.62.197:80/results
From a Python client the API can be invoked as follows:
import json
import requests

def push_results_to_db(payload):
    url = TEST_DB  # base URL of the result collection API, defined elsewhere
    params = {"project_name": "functest", "case_name": "vPing",
              "pod_id": 1, "details": payload}
    headers = {'Content-Type': 'application/json'}
    r = requests.post(url, data=json.dumps(params), headers=headers)
    logger.debug(r)
with the payload variable:

{'timestart': start_time_ts, 'duration': duration, 'status': test_status}
Please note that you can put whatever you want in your payload. For vPing we put the useful information; for Rally we can directly put the whole JSON report already built by Rally (JSON result for the Rally opnfv-authenticate scenario).
In the DB you can see the results
http://213.77.62.197:80/results?projects=functest&case=vPing
{
  "meta": { "total": 17 },
  "test_results": [
    {
      "project_name": "functest",
      "description": null,
      "creation_date": "2015-08-28 16:43:00.965000",
      "case_name": "vPing",
      "details": { "timestart": 123456, "duration": 66, "status": "OK" },
      "_id": "55e08f94514bc52b1791f949",
      "pod_id": 1
    },
    ..........
    {
      "project_name": "functest",
      "description": null,
      "creation_date": "2015-09-04 07:17:51.239000",
      "case_name": "vPing",
      "details": { "timestart": 1441350977.858555, "duration": 84.5, "status": "OK" },
      "_id": "55e9459f514bc52b1791f959",
      "pod_id": 1
    }
  ]
}
The raw results will be stored in the MongoDB (as JSON documents).
A module will perform post-processing (via cron) to generate JSON files usable for dashboarding (as in Bitergia). In the example we build JSON files for the LF2 POD (we could do it for the last 30 days, the last 365 days, or since the beginning of the project).
Several files shall be produced according to the filters. For vPing we shall produce:
with
The result-functest-vPing.json file (all installers/PODs/versions since the first day of the project) could look like:
{ { "description": "vPing results for Dashboard" }, { "info": { "xlabel": "time", "type": "graph", "ylabel": "duration" }, "data_set": [ { "y": 66, "x": "2015-08-28 16:43:00.965000" }, ................. { "y": 84.5, "x": "2015-09-04 07:17:51.239000" } ], "name": "vPing duration" }, { "info": { "type": "bar" }, "data_set": [ { "Nb Success": 17, "Nb tests": 17 } ], "name": "vPing status" }}