Controller Performance Testing
The Controller Performance Testing (cperf) project will serve as a performance testing environment for the SDN controller portion of the large, realistic, automated deployments required by OPNFV.
The newly created OpenDaylight Performance Group, which consists of the upstream ODL perf testing community, has committed to bootstrapping the effort. Because of the group's background, the initial focus will be ODL, but the cperf project explicitly hopes to leave the controller part of the test matrix open and will welcome collaboration with other controller communities.
The cperf project also hopes to serve as a community that fosters collaboration between benchmarking experts from academic/standards backgrounds and the upstream engineers who implement actual perf benchmarks in modern CI environments.
The Controller Performance Testing project's scope is to build performance tests for SDN controllers in realistic, large, automated deployments.
Cperf, in the spirit of OPNFV's upstream-first focus, will work to solve problems primarily in the relevant upstream communities. For example, when creating OpenDaylight performance tests, cperf will consume and contribute to the OpenDaylight Integration Team's Python and Robot test libraries rather than re-implementing that logic. When deploying OpenDaylight in a complex automated deployment, cperf will leverage the ODL Integration/Packaging project's automated deployment tools, as well as OPNFV's subsequent OpenStack integrations (via BGS/Genesis and subsequent projects).
Cperf explicitly encourages collaboration with other relevant communities, including additional SDN controller communities. There's a huge amount of work to be done, and inter-community collaboration is one of the big wins that may come from working in the context of OPNFV. We expect the considerable upstream experience of the OpenDaylight community members who will bootstrap this project to expedite such collaboration.
The OPNFV Yardstick, Qtip, and VSPERF testing projects have distinct, non-overlapping scopes (different types of perf testing). Cperf will complement Functest and Octopus with additional CI validations. Cperf will consume OPNFV's lab space via Pharos for realistic full-stack tests.
We give committer rights based on a history of contributions over time. We typically expect committers to at least have:
You don't have to be an "official" Contributor to contribute to cperf! The people above have documented that they explicitly hope to contribute to cperf, but that's optional - all contributions are equally welcome.
You don't need to ask permission to add yourself to the list of contributors above.
As described in detail in the Scope section, the cperf project will strive to contribute its changes back to upstream communities. These "deliverables" include additional performance testing logic, improvements to ODL and OpenStack automated deployment tools, and feedback and data for academic and standards communities.
The cperf project doesn't necessarily need to "release" artifacts. Its output will consist of contributions to upstream projects (ODL perf tests, OPNFV automated deployment logic, etc.) and performance data for consumption by academic/standards communities and the profiled projects. Any logic hosted in the cperf repository will be versioned and released according to semantic versioning, which can additionally align with OPNFV's release cadence.
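Semantic versioning orders releases as MAJOR.MINOR.PATCH tuples, where a patch bump signals bug fixes only and a major bump signals breaking changes. A minimal sketch of how such version strings compare (the version strings here are illustrative, not actual cperf releases):

```python
def parse_semver(version):
    """Split a 'MAJOR.MINOR.PATCH' string into a comparable tuple of ints."""
    major, minor, patch = version.split(".")
    return (int(major), int(minor), int(patch))

# Tuple comparison gives the expected ordering: a patch release is newer
# than its base, and any major bump outranks all prior minor/patch releases.
assert parse_semver("1.0.1") > parse_semver("1.0.0")
assert parse_semver("2.0.0") > parse_semver("1.9.9")
```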
Project Name: Controller Performance Testing (cperf)
Repo name: cperf
Project Category: Integration & Testing
Lifecycle State: Incubation
Primary Contact: Daniel Farrell
Project Lead: Daniel Farrell
Jira Project Name: Controller Performance Testing
Jira Project Prefix: cperf
Mailing list tag: cperf
Daniel Farrell, email@example.com, IRC: dfarrell07
Luis Gomez, firstname.lastname@example.org, IRC: LuisGomez
Jamo Luhrsen, email@example.com, IRC: jamoluhrsen
Marcus Williams, firstname.lastname@example.org, IRC: mgkwill
Tim Rozet, email@example.com, IRC: trozet
Link to TSC approval: TSC vote
Link to approval of additional submitters: None