1. Has your test framework been executed against a deployed OPNFV platform?

  * successfully tested on fuel/no controller
  * partly tested on compass/no controller
  * partly tested on joid/odl
  * first test on apex/odl

ref: [[https://build.opnfv.org/ci/view/functest/|Functest Jenkins page]]
2. Has your deploy tool been executed in more than one Pharos lab?

  * yes
3. Have your test framework and test cases been run against more than one deployment tool?

  * yes
4. Do you have all target Brahmaputra test cases available in your test suites?

  * no: the internal test cases are OK, but there is some doubt about the capability to extend the odl suite; the onos and opencontrail test suites have not been tested yet (no target solution available); no news from the other feature tests
5. Is your test framework able to be repeated on the same infrastructure at least 4 consecutive times without failures? Does it meet the agreed-upon criteria of "stable"?

  * achieved only once on fuel/no controller (which is not the target)
  * no on all the other targets
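
As an illustration of the stability criterion above, the following is a minimal sketch of how four consecutive runs could be checked automatically; the ''run_functest.sh'' wrapper name is purely hypothetical and stands in for whatever command actually launches the suite on a given deployment.

<code python>
# Minimal sketch of the "4 consecutive runs without failure" stability check.
# The command below is a hypothetical placeholder, not the actual Functest entry point.
import subprocess

RUNS = 4
COMMAND = ["./run_functest.sh"]  # hypothetical wrapper that launches the suite

passed = []
for i in range(RUNS):
    result = subprocess.run(COMMAND)
    passed.append(result.returncode == 0)
    print(f"run {i + 1}/{RUNS}: {'PASS' if passed[-1] else 'FAIL'}")

print("stable" if all(passed) else "not stable")
</code>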
6. Have all known test issues (Jira tickets against your project) been documented and that documentation either integrated into the release or made available on a public page and pointed to through documents that //are// integrated into the release?

  * JIRA tickets for integration are still open
  * JIRA tickets for feature project integration are ready for Sprint 4
  * additional JIRA tickets linked to troubleshooting still have to be planned (for the moment mainly due to misconfiguration, e.g. no external network)
7. Are all known issues (Jira Tickets) prioritized by severity of impact (critical, high, medium, low) to users?

  * integration JIRA tickets are the top priority now; prioritization will be discussed before Milestone E
8. Do all known critical Jira Tickets have a workaround in place?

  * not really: without a target lab there is no test, and no workaround seems possible for that
9. Do all non-critical Jira Tickets have a planned disposition in place (i.e. deferred: fix in a later release, or closed: won't fix)?

  * not yet
10. Is your documentation complete and checked in? If not, how do you plan to deliver it independent of the release?

  * documentation has been initiated, but the main documentation work is planned from weeks 2-3