
Project: OPNFV - Base system functionality testing

Project description

The project “OPNFV – Base system functionality testing” will provide a comprehensive testing methodology, test suites and test cases to test and verify OPNFV platform functionality covering the VIM and NFVI components.

This project uses a "top-down" approach that starts from chosen ETSI NFV use case(s) and open source VNFs for the functional testing. The approach taken will be to:

  • break down the use case into the simple operations and functions required
  • specify the necessary network topologies
  • develop the necessary test traffic and traffic profiles. Ideally the VNFs will be open source; however, proprietary VNFs may also be used as needed.

This project will develop test suites that cover detailed functional test cases, test methodologies and platform configurations, which will be documented and maintained in a repository for use by other OPNFV testing projects and the community in general. Developing these test suites will also help lay the foundation for a test automation framework that can in future be used by the continuous integration (CI) project (Octopus). We envisage that certain VNF deployment use cases could be automatically tested as an optional step of the CI process.

The project targets testing of the OPNFV platform in a hosted test-bed environment (i.e. using the OPNFV test labs worldwide). It will leverage the output of the "BGS" project.

The key objectives are:

  • Define the tooling for the tests
  • Define the test suites (including SLAs)
  • Install and configure the tools
  • Automate the tests with CI

Scope

“OPNFV – Base system functionality testing” will deliver a functional testing framework along with a set of test suites and test cases to test and verify the functionality of the OPNFV platform. The testing framework (tools, test cases, etc.) is also intended to be used by the CI framework for the purpose of qualifying the OPNFV platform on bare-metal servers. In this context, OPNFV Tester will use open source VNF components. Functional testing includes:

  • Testing the basic VIM functionality, which includes tenant and user CRUD operations, VNF image CRUD operations, etc. (a minimal sketch of such a check follows this list)
  • Testing the VIM functionality to support VNF operations (create, modify, grow, shrink, destroy)
  • Testing the VIM functionality to support basic VNF network connectivity
  • Testing the interworking between the VIM and the SDN controller
  • Testing the NFVI functionality as a black box to ensure that it meets the VIM requirements
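
As an illustration of the first bullet above, the sketch below exercises tenant/user CRUD through Keystone and image CRUD through Glance. It is a minimal sketch only: it assumes the standard OS_* credential variables are exported, uses the Python clients of the OpenStack Juno/Kilo era, and the resource names (functest-tenant, functest-user, cirros.img) are purely illustrative.

  # Minimal VIM CRUD check: tenant/user lifecycle (Keystone v2) and image
  # lifecycle (Glance v2). Assumes admin credentials in the OS_* variables;
  # all names below are illustrative, not project-defined.
  import os

  from keystoneclient.auth.identity import v2
  from keystoneclient import session
  from keystoneclient.v2_0 import client as keystone_client
  from glanceclient import client as glance_client

  auth = v2.Password(auth_url=os.environ['OS_AUTH_URL'],
                     username=os.environ['OS_USERNAME'],
                     password=os.environ['OS_PASSWORD'],
                     tenant_name=os.environ['OS_TENANT_NAME'])
  sess = session.Session(auth=auth)
  keystone = keystone_client.Client(session=sess)
  glance = glance_client.Client('2', session=sess)

  # Tenant and user CRUD
  tenant = keystone.tenants.create(tenant_name='functest-tenant',
                                   description='temporary functest tenant')
  user = keystone.users.create(name='functest-user', password='functest',
                               tenant_id=tenant.id)
  assert any(t.id == tenant.id for t in keystone.tenants.list())
  keystone.users.delete(user)
  keystone.tenants.delete(tenant)

  # VNF image CRUD
  image = glance.images.create(name='functest-img', disk_format='qcow2',
                               container_format='bare')
  with open('cirros.img', 'rb') as data:   # hypothetical local image file
      glance.images.upload(image.id, data)
  glance.images.delete(image.id)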

The project requires the following components:

  • OPNFV lab setup with complete access to a set of bare-metal servers for controller and compute nodes (as defined by the BGS project (OPNFV stack) and the Pharos project (hardware)), plus the associated switches and routers.
  • OPNFV platform software bundle from the repository that includes several upstream software components.
  • OPNFV "Bootstrap/Get Started" software bundle that includes the installer

Tooling

Intel POD2 (contact Trevor Cooper) is dedicated to functional testing.

Functional tests shall be

  • independent from the installer (Fuel, Foreman/Puppet, Juju, ...)
  • automated and integrated in CI

TODO: shall we be more prescriptive on the tooling environment (creation of the VM, installation of the tools)?

Release 1

R1 follow up

For release 1 we target the automation of the following tests:

  • Rally bench scenarios
  • Tempest (run through Rally)
  • vPing (boot 2 VMs, VM1 pings VM2, delete the VMs; a minimal sketch follows this list)
  • vIMS (a SIPp VM triggering basic calls (REGISTER, INVITE) towards a Clearwater compact vIMS VM)
  • ODL scenarios / Robot Framework
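
A minimal vPing sketch is given below: it boots two VMs on an existing network, has VM1 ping VM2 from a user-data script, polls the console log for the result, then deletes both VMs. The image, flavor and network names (cirros, m1.tiny, functest-net) are assumptions about the test bed, not values fixed by the project, and timeout handling is deliberately simplistic.

  # vPing sketch: boot 2 VMs, VM1 pings VM2 via a user-data script, delete.
  # Assumes OS_* credentials and pre-existing cirros image, m1.tiny flavor
  # and functest-net network (illustrative names).
  import os
  import time

  from keystoneclient.auth.identity import v2
  from keystoneclient import session
  from novaclient import client as nova_client
  from neutronclient.v2_0 import client as neutron_client

  auth = v2.Password(auth_url=os.environ['OS_AUTH_URL'],
                     username=os.environ['OS_USERNAME'],
                     password=os.environ['OS_PASSWORD'],
                     tenant_name=os.environ['OS_TENANT_NAME'])
  sess = session.Session(auth=auth)
  nova = nova_client.Client('2', session=sess)
  neutron = neutron_client.Client(session=sess)

  image = nova.images.find(name='cirros')
  flavor = nova.flavors.find(name='m1.tiny')
  net = neutron.list_networks(name='functest-net')['networks'][0]
  nics = [{'net-id': net['id']}]

  # Boot VM2 first and wait for its fixed IP (no timeout in this sketch)
  vm2 = nova.servers.create('vping-vm2', image, flavor, nics=nics)
  while nova.servers.get(vm2.id).status != 'ACTIVE':
      time.sleep(3)
  vm2_ip = nova.servers.get(vm2.id).networks['functest-net'][0]

  # Boot VM1 with a user-data script that pings VM2 and prints a marker
  userdata = ('#!/bin/sh\n'
              'ping -c 5 %s && echo "vPing OK" || echo "vPing KO"\n' % vm2_ip)
  vm1 = nova.servers.create('vping-vm1', image, flavor, nics=nics,
                            userdata=userdata)

  # Poll VM1's console log for the marker, then clean up
  result = 'KO'
  for _ in range(60):
      log = nova.servers.get(vm1.id).get_console_output(length=40)
      if 'vPing OK' in log:
          result = 'OK'
          break
      time.sleep(5)
  nova.servers.delete(vm1)
  nova.servers.delete(vm2)
  print('vPing result: %s' % result)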

The list of test cases can be found here.

At the end of a fresh install, the status of the OPNFV solution according to the selected installer (Fuel or Foreman) can be summarized as follows:

  • Images: none
  • Networks: none
  • Flavors:
  • OpenStack creds: admin/octopus
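
Since a fresh install leaves no images, networks or flavors, these resources have to be created before the suites can run. Below is a minimal pre-provisioning sketch, assuming python-novaclient/python-neutronclient and the standard OS_* credentials; the names, sizes and CIDR are illustrative, and the image itself can be uploaded as in the Glance sketch in the Scope section.

  # Pre-provision the resources missing after a fresh install (flavor,
  # network, subnet) so that the bench/Tempest/vPing suites can run.
  # Names, sizes and CIDR below are illustrative only.
  import os

  from keystoneclient.auth.identity import v2
  from keystoneclient import session
  from novaclient import client as nova_client
  from neutronclient.v2_0 import client as neutron_client

  auth = v2.Password(auth_url=os.environ['OS_AUTH_URL'],
                     username=os.environ['OS_USERNAME'],
                     password=os.environ['OS_PASSWORD'],
                     tenant_name=os.environ['OS_TENANT_NAME'])
  sess = session.Session(auth=auth)
  nova = nova_client.Client('2', session=sess)
  neutron = neutron_client.Client(session=sess)

  # Small flavor for the test VMs (512 MB RAM, 1 vCPU, 1 GB disk)
  nova.flavors.create('m1.tiny', 512, 1, 1)

  # Tenant network + subnet used by vPing and the Rally VM scenarios
  net = neutron.create_network({'network': {'name': 'functest-net'}})
  neutron.create_subnet({'subnet': {'network_id': net['network']['id'],
                                    'name': 'functest-subnet',
                                    'cidr': '192.168.120.0/24',
                                    'ip_version': 4}})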

Testcases

VIM test suites

Bench scenarios

We re-use the Rally scenarios to test OpenStack (bench + Tempest). The default scenarios cover the following modules: authenticate, nova, cinder, glance, keystone, neutron, quotas, requests, tempest-do-not-run-against-production, heat, mistral, sahara, vm, ceilometer, designate, dummy, zaqar.

The first ones (authenticate, nova, cinder, glance, keystone, neutron, quotas, requests, tempest-do-not-run-against-production) can be re-used "as provided".
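
As an illustration of how these stock scenarios can be driven (and later plugged into CI), the sketch below shells out to the Rally CLI. It assumes Rally is installed and a deployment has been registered with "rally deployment create"; the sample task paths follow the layout of the Rally source tree and are illustrative, as is the assumption that "rally task report" defaults to the last task.

  # Run a few stock Rally scenarios and export HTML reports for CI artefacts.
  # Assumes Rally is installed and an OpenStack deployment is registered.
  import subprocess

  TASKS = [
      'samples/tasks/scenarios/authenticate/keystone.json',
      'samples/tasks/scenarios/keystone/create-and-delete-user.json',
      'samples/tasks/scenarios/nova/boot-and-delete.json',
  ]

  for task in TASKS:
      # Run the scenario; fail loudly if Rally itself errors out
      subprocess.check_call(['rally', 'task', 'start', task])
      # Export an HTML report of the task that just ran
      subprocess.check_call(['rally', 'task', 'report', '--out',
                             task.split('/')[-1] + '.html'])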

However, the scenarios shall be tuned, especially for the bench (an example follows this list):

  • which image (size, OS, ...)? TODO: check the recommendations in the Spirent ETSI NFV test documentation
  • which SLAs (booting time, error rate, ...)?
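
As an example of such tuning, the sketch below builds a Rally task for the NovaServers.boot_and_delete_server scenario with a fixed small image and flavor plus an SLA section on error rate and boot time; the concrete names and thresholds are placeholders still to be agreed on.

  # Tuned Rally scenario: pinned image/flavor and SLA thresholds.
  # Values below (image/flavor names, 10 iterations, 60 s, 0% failures)
  # are placeholders, not project-agreed figures.
  import json
  import subprocess

  task = {
      "NovaServers.boot_and_delete_server": [{
          "args": {"image": {"name": "cirros"},        # small test image
                   "flavor": {"name": "m1.tiny"}},
          "runner": {"type": "constant", "times": 10, "concurrency": 2},
          "context": {"users": {"tenants": 1, "users_per_tenant": 1}},
          "sla": {"failure_rate": {"max": 0},           # error-rate SLA
                  "max_seconds_per_iteration": 60}      # booting-time SLA
      }]
  }

  with open('nova-boot-and-delete-tuned.json', 'w') as f:
      json.dump(task, f, indent=2)
  subprocess.check_call(['rally', 'task', 'start',
                         'nova-boot-and-delete-tuned.json'])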

Bench suite | Orange POD | Ericsson POD | LF POD1 | LF POD2
Glance | 1 test KO (GlanceImages.create_image_and_boot_instances; call sh file, watch path in scenario) | 100% OK | - | KO (400 Bad Request, invalid URL)
Cinder | all KO (config?) | 100% OK | - | 80% OK (some timeouts)
Heat | N/A | 33% OK | - | N/A
Nova | all KO (config?) except create/list keypair and create/delete keypair | ~50% OK, problems with live migration and security groups | - | ~50% OK, problems with live migration and security groups
Authenticate | OK | 100% OK | - | 100% OK
Keystone | OK | 100% OK | - | 100% OK
Neutron | OK | some tests 100% and others 29% success, possibly due to the limited Neutron ranges; 100 iterations may be too much stress for a small environment | - | 80% OK
VM | KO (config / floating IP) | KO, probably due to the network setup | - | KO (floating IP)
Quotas | OK | 100% OK | - | 100% OK
Request | OK | 100% OK | - | KO (IncompleteRead ~ timeout)
Tempest

Tempest is a special case of scenarios that can be run by Rally.
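
A minimal way to launch the Tempest smoke set through Rally's verification commands is sketched below. It assumes Rally is installed with a registered deployment (Tempest itself is fetched by Rally on the first run), and the exact options may differ between Rally releases.

  # Launch the Tempest smoke set through Rally and list the results.
  # Assumes Rally is installed and a deployment is registered.
  import subprocess

  # Run only the smoke subset (the "smoke" row of the table below)
  subprocess.check_call(['rally', 'verify', 'start', '--set', 'smoke'])
  # List the verification runs with their pass/fail counts
  subprocess.check_call(['rally', 'verify', 'list'])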

Tempest | Orange POD | Ericsson POD | LF POD1 | LF POD2
smoke | 20 failures on 108 tests | 33 failures on 84 tests | |
all | 170 failures on 951 tests | 243 failures on 875 tests | |

List of Tempest smoke tests

Note: during the first manual run on the alpha Orange platform installed with the opensteak installer, there were many errors (196) when running the Tempest scenario and some when running the Rally scenarios (results to be analyzed).

Further study of these test cases shall be done.

SDN controller test suite

<Peter>
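
As a purely illustrative sketch of the VIM/SDN controller interworking check mentioned in the Scope section, the snippet below creates a network through Neutron and checks that it shows up in the OpenDaylight (Helium) Neutron northbound API. The ODL address, port 8080 and admin/admin credentials are assumptions about the test bed, and the actual suite (e.g. based on Robot Framework, see the R1 list) may look quite different.

  # Illustrative VIM <-> SDN controller interworking check: create a Neutron
  # network and verify it is visible in the ODL Neutron northbound API.
  # ODL address, port and credentials are test-bed assumptions.
  import os
  import requests

  from keystoneclient.auth.identity import v2
  from keystoneclient import session
  from neutronclient.v2_0 import client as neutron_client

  auth = v2.Password(auth_url=os.environ['OS_AUTH_URL'],
                     username=os.environ['OS_USERNAME'],
                     password=os.environ['OS_PASSWORD'],
                     tenant_name=os.environ['OS_TENANT_NAME'])
  neutron = neutron_client.Client(session=session.Session(auth=auth))

  net = neutron.create_network({'network': {'name': 'odl-check-net'}})
  net_id = net['network']['id']

  odl_url = 'http://%s:8080/controller/nb/v2/neutron/networks' % \
            os.environ.get('ODL_IP', '192.168.0.10')   # hypothetical address
  resp = requests.get(odl_url, auth=('admin', 'admin'))
  odl_ids = [n['id'] for n in resp.json()['networks']]
  print('network known to ODL: %s' % (net_id in odl_ids))

  neutron.delete_network(net_id)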

vPing

<Malla>

vIMS

<Andrew & Martin> see vIMS Functional Testing for details.

Open questions

  • Shall we create new scenarios (to remove tests or add new ones)?
  • We are not sure we can run the default scenario (ping VM) ⇒ which tool?

Test automation

See Octopus etherpad: https://etherpad.opnfv.org/p/octopus

Community platforms connected to CI

  • Ericsson
  • Intel
  • Huawei
  • Orange

R1 functest status

Test | Jira ref | Documentation | Manual test | Automated | Doc/BGS link | Comments
Rally Bench | https://jira.opnfv.org/browse/FUNCTEST-1 | installation procedure described: https://github.com/Orange-OpenSource/opnfv/blob/master/docs/TEST.md | OK | OK | KO | Installed on the jump host server of Intel POD 2 #1; Rally natively integrated in Fuel #2; tested with opensteak #3; Morgan. The Rally test suite can be run, but flavors and images are missing on the OpenStack SUT deployed on POD 2
Tempest | https://jira.opnfv.org/browse/FUNCTEST-2 | | OK | KO | KO | Use of Khaleesi for Foreman/Puppet #1; Rally natively integrated in Fuel #2; tested with opensteak #3; Tempest not working on the POD, same issue as on #3, patch applied but there still seems to be a problem ⇒ contact openstack-rally
vPing | https://jira.opnfv.org/browse/FUNCTEST-3 | | KO | KO | KO | Malla
vIMS | https://jira.opnfv.org/browse/FUNCTEST-4 | based on the Clearwater solution | KO | KO | KO | Martin; see vIMS Functional Testing for details
ODL | https://jira.opnfv.org/browse/FUNCTEST-5 | | ? | ? | KO | Peter ⇒ https://etherpad.opnfv.org/p/robotframework

Beyond R1

A new page has been created to list the tasks for functest beyond R1.

Dependencies

  • The project is a contributor to project “Octopus”.
  • The project leverages "Bootstrap/Get Started" (BGS).
  • The project relies on the following upstream projects:
    • OpenStack Juno release: Components: Nova, Glance, Keystone, Horizon, Neutron, Ceilometer, Heat
    • OpenDaylight Helium release: Components: MDSAL, OVSDB, RESTCONF, ML2 plugin/ODL Neutron drivers
    • Installer: TBD
    • Puppet (for instance configuration)
    • QEMU/KVM
    • OpenWRT (as example VNF - for routing, firewall, NAT)
    • Snort (as example VNF - for IDS)
    • Linux Ubuntu 14.04/Centos 7 distribution

Key Project Facts

  • Additional Contributors
    • Frank Brockners (fbrockne@cisco.com)
    • Sajeev Manikkoth (sajeevmanikkoth@gmail.com)
    • Jun Li (matthew.lijun@huawei.com)
    • Sean Chen (s.chen@huawei.com)
    • Rajeev Seth (rseth@sonusnet.com)
    • Kevin Riley (kriley@sonusnet.com)
    • Justin Hart (jhart@sonusnet.com)

Planned deliverables

The project delivers the following components:

  • Documentation of Test Suites and test cases that cover the OPNFV platform functionality testing including pass/fail criteria.
  • Test software and scripts for testing OPNFV Platform functionality that is essentially broken down into two main suites as follows:
    • OPNFV Platform Smoke test suite: A sanity testing suite for basic verification of the platform.
    • OPNFV Platform Regression test suite: A comprehensive collection of detailed test suites.
  • Automated Test Framework based on Tempest/Robot/Jenkins, with necessary scripts and tools to automatically test and verify OPNFV functionality

Proposed Release Schedule

OPNFV release #1.
