pharos_testing — revision of 2015/03/17 17:36 (Morgan Richomme) vs. revision of 2015/11/05 22:16 (Trevor Cooper)
**Current revision:**

Pharos is dedicated to test infrastructure (specification, process, ...)

Functional tests can be found on the functest project page: [[opnfv_functional_testing|Project: OPNFV - Base system functionality testing]]

Test strategy from Arno to B-Release: {{:test_strategy.pptx|}}

**Previous revision:**

====== Base principles of the testing ======

Key objectives:
  * Define tooling for tests
  * Define test suites (SLA)
  * Installation and configuration of the tools
  * Automate tests with CI

The procedure shall be independent from the installer and shall be able to run on any of the installers.

===== Test tools =====

  * [[https://wiki.openstack.org/wiki/Rally|Rally]] (benchmarking, Tempest)
  * [[http://robotframework.org/|Robot Framework]]
  * [[http://sipp.sourceforge.net/|SIPp]] (for SIP-related tests, e.g. vIMS)
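
As a sketch of how Rally would typically be driven against an already-installed platform (the deployment name and task filename below are placeholders, not part of this page):

```shell
# Minimal sketch: register an existing OpenStack cloud with Rally and run a task.
# Assumes the usual OS_* credentials (OS_AUTH_URL, OS_USERNAME, OS_PASSWORD, ...)
# have been exported, e.g. by sourcing the platform's openrc file.
rally deployment create --fromenv --name=opnfv-pod   # "opnfv-pod" is a placeholder name
rally deployment check                               # verify Rally can reach the cloud
rally task start my_task.json                        # my_task.json: a Rally task file
rally task report --out=report.html                  # generate an HTML report of the run
```

The `--fromenv` form avoids writing a deployment config file by hand, which keeps the procedure the same whichever installer set up the platform.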

TODO: shall we be more prescriptive on the tooling environment (creation of the VM, installation of the tools)?

===== Test scenarios =====

=== Rally ===

The default scenarios are:
  * authenticate
  * nova
  * cinder
  * glance
  * keystone
  * neutron
  * quotas
  * requests
  * tempest-do-not-run-against-production
  * heat
  * mistral
  * sahara
  * vm
  * ceilometer
  * designate
  * dummy
  * zaqar

The first ones (authenticate, nova, cinder, glance, keystone, neutron, quotas, requests, tempest-do-not-run-against-production) can be re-used.
However, the scenarios shall be tuned, especially for the benchmark:
  * which image (size, OS, ...) TODO: check for recommendations in the Spirent ETSI NFV doc on testing
  * which SLA (booting time, error rate, ...)
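
As an illustration of the two tuning points above, a Rally task file pins both the image and the SLA per scenario. A minimal sketch — the image name, flavor, iteration counts, and thresholds below are illustrative assumptions, not project decisions:

```shell
# Hypothetical Rally task file: the "image" argument and the "sla" block are
# exactly the knobs discussed above (booting-time budget, tolerated error rate).
cat > boot_sla_task.json <<'EOF'
{
  "NovaServers.boot_and_delete_server": [
    {
      "args": {
        "flavor": {"name": "m1.small"},
        "image": {"name": "cirros-0.3.4-x86_64"}
      },
      "runner": {"type": "constant", "times": 10, "concurrency": 2},
      "sla": {
        "max_seconds_per_iteration": 60,
        "failure_rate": {"max": 0}
      }
    }
  ]
}
EOF
rally task start boot_sla_task.json
```

With an SLA section present, Rally marks the task as failed when any iteration exceeds the time budget or the error-rate limit, which is what makes the scenarios usable as pass/fail gates rather than only as benchmarks.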

Note: during the first manual runs on the alpha Orange platform installed with the OpenSteak installer, there were many errors (196) when running the Tempest scenario, and some in the Rally scenarios (results to be analyzed).

Studies on the test cases shall be done.

=== Open questions ===
  * Shall we create new scenarios (to remove tests or add new ones)?
  * Not sure we are able to run the default scenario (ping VM) => which tool?

===== Test automation =====

To be discussed with the CI team as soon as possible.

===== Status regarding installers =====

^ "Experiment" ^ #1 "Foreman-Quickstack" ^ #2 "Fuel" ^ #3 "OpenSteak" ^ #4 "[[slapstick|SlapStick]]" ^ #5 "Juju" ^
| Installation of tools | | Rally natively integrated in Fuel | Rally installed manually on a dedicated tooling VM | | |
| Tests | | | Rally scenarios including Tempest launched manually from the VM, see [[https://github.com/Orange-OpenSource/opnfv/blob/master/docs/TEST.md]] | | |
| Automation of scripts | | | | | |