The following is the proposal for the information model (i.e. the required data elements) for test results and tooling used in the badging program.

For each badge submission

This information would be collected from the submitter when the results are submitted to the program for review. A likely approach is a templated merge request comment that makes the specific fields clear to the submitter; a sketch of these fields as structured data follows the list below.

Applicant Information
  • Vendor Name
  • Product Name
  • Product Version
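
As an illustration only, the applicant information could be captured as a small structured record. The class and field names below are assumptions made for this sketch; the proposal only requires that the three fields above are collected.

```
from dataclasses import dataclass


@dataclass
class ApplicantInformation:
    """Applicant details collected once per badge submission (names are illustrative)."""
    vendor_name: str
    product_name: str
    product_version: str


# Hypothetical values a submitter might provide via the templated
# merge request comment:
applicant = ApplicantInformation(
    vendor_name="ExampleCo",
    product_name="ExampleCloud",
    product_version="1.2.3",
)
```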


For each set of results generated
Results ID
  • Uniquely identifies the set of test results generated during the test run; a sketch of one possible composition appears after this group of fields.
  • Must include
    • Date of test run
    • Badge Type (Cloud Native or VM)
    • Test Type (e.g. ONAP, Anuket Interop, CNCF)
Test Tool Details
  • Must list the tool name, including the project originator.
  • Version of the tool.
  • This must contain enough information that a user could reproduce the exact test run with the same test tooling.
SUT Information
  • Submitting Vendor Name
  • Product Name
  • Product Version 
  • Inventory
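
A minimal sketch, assuming the Results ID is composed from the run date, badge type, and test type joined with underscores, of how the per-result-set metadata might be structured. The class name, field names, and the exact ID format are assumptions made for this illustration; the proposal only defines which elements must be present.

```
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class ResultSet:
    """Metadata captured once per set of generated results (names are illustrative)."""
    run_date: date
    badge_type: str          # "Cloud Native" or "VM"
    test_type: str           # e.g. "ONAP", "Anuket Interop", "CNCF"
    tool_name: str           # tool name, including the project originator
    tool_version: str
    sut_vendor_name: str
    sut_product_name: str
    sut_product_version: str
    sut_inventory: List[str] = field(default_factory=list)

    @property
    def results_id(self) -> str:
        """Compose a Results ID from the run date, badge type, and test type.

        The separator and ordering are assumptions for this sketch; the
        proposal only requires that these elements are included.
        """
        badge = self.badge_type.lower().replace(" ", "-")
        test = self.test_type.lower().replace(" ", "-")
        return f"{self.run_date.isoformat()}_{badge}_{test}"


# Hypothetical example (all values are placeholders):
result_set = ResultSet(
    run_date=date(2021, 6, 1),
    badge_type="Cloud Native",
    test_type="Anuket Interop",
    tool_name="ExampleTool (ExampleProject)",
    tool_version="1.0.0",
    sut_vendor_name="ExampleCo",
    sut_product_name="ExampleCloud",
    sut_product_version="1.2.3",
)
print(result_set.results_id)  # 2021-06-01_cloud-native_anuket-interop
```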


For each test case included in submitted results
Test Case ID
  • A unique identifier for the test case that has been executed. It should be possible to use the Test Case ID to trace back to the specific requirement(s) covered by that test case.
  • Requirements and test case implementations are driven by the individual communities.
  • This must be a unique identifier across all projects and tests.
Test Case Result
  • Pass / Fail indication
Test Execution Log
  • One or more files that capture logging and debugging output from the test run. 
  • The specific format of the files is not defined by this proposal; it is determined by the test tool.
  • The link between the Test Case ID and log files MUST be clear, through either file naming or path locations; see the sketch below for one possible layout.
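
To make the required link between Test Case IDs and log files concrete, the sketch below assumes (purely for illustration) a directory layout in which each test case's logs live under a directory named after its Test Case ID. The function name and layout are hypothetical; the proposal leaves the exact naming or path convention to the submitter, as long as the link is clear.

```
from pathlib import Path
from typing import Dict, List


def collect_logs_by_test_case(results_dir: str) -> Dict[str, List[Path]]:
    """Map each Test Case ID to its execution log files.

    Assumes, for illustration only, a layout such as:
        <results_dir>/<test_case_id>/*.log
    where each directory name is a globally unique Test Case ID.
    """
    logs: Dict[str, List[Path]] = {}
    for case_dir in Path(results_dir).iterdir():
        if case_dir.is_dir():
            logs[case_dir.name] = sorted(case_dir.glob("*.log"))
    return logs


# Hypothetical usage: check that every submitted test case has at least one log.
if __name__ == "__main__":
    for test_case_id, files in collect_logs_by_test_case("results").items():
        status = "ok" if files else "MISSING LOGS"
        print(f"{test_case_id}: {len(files)} log file(s) [{status}]")
```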





