Date

Attendees

Topic Leaders: Lincoln Lavoie, Yan Yang, Rabi Abdel, Trevor Lovett

Overview

Provide an update to the community on the current status of the OVP programs and the latest 2019.12 release, followed by discussion of open questions for the evolution of the programs.

Questions / Discussions:

  1. Some operator participants have suggested that updates to the review process and review teams may help encourage more vendors to participate in the programs, by limiting exposure of test results to competitors. How can we evolve the programs to best meet the needs of the participants, while protecting their investments in their implementations?
  2. As the programs evolve, there may be more optional test cases or levels of testing, especially within the Cloud Native testing of CNFs. In these cases, how should the listings of the programs evolve to enable external viewers to clearly understand what has been tested for each badge? For example, should a "badge record" include some type of report or documentation of which test cases were run?
  3. How should previous releases of OVP be handled, and how "long" of a tail should be officially supported?

Recording

Minutes

OVP Overview

OVP Status and Evolution June 2020.pptx

Discussion Items

  • How should previous releases of OVP be handled, and how "long" of a tail should be officially supported?
    • General agreement that the program governance needs to allow for releases to be retired
    • The program will support the current release and one previous release
    • Results from "retired" releases would not be accepted for review / badge awarding
    • Previously awarded badges will always be listed on the portal (retiring a release does not remove those listings)
  • As the programs evolve, there may be more optional test cases or levels of testing, especially within the Cloud Native testing of CNFs.
    • In these cases, how should the listings of the programs evolve to enable external viewers to clearly understand what has been tested for each badge?
    • For example, should a "badge record" include some type of report or documentation of which test cases were run? (See the sketch after this list.)
    • Levels of "badges"
      • Binary: you pass and get listed, or fail and aren't listed
      • Divide tests into categories: individual categories are pass/fail
      • Scoring system: list "how well" or "how many" tests were passing
      • Idea: how do you assign some business value to a badge, such as "the xNF can be onboarded within one week, two weeks, etc.", as contrasted with just a color
  • Some operator participants have suggested that updates to the review process and review teams may help encourage more vendors to participate in the programs, by limiting exposure of test results to competitors. How can we evolve the programs to best meet the needs of the participants, while protecting their investments in their implementations?
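
A minimal sketch of what a richer "badge record" listing could look like, assuming a structure that pairs the badge identifier and OVP release with a per-test-case report. All field names and test cases below are illustrative assumptions, not an existing OVP schema (Python):

    # Hypothetical badge record; field names and test cases are illustrative only.
    badge_record = {
        "badge_id": "example-vendor-vnf-001",  # assumed listing identifier
        "ovp_release": "2019.12",              # release the results were reviewed against
        "badge_type": "VNF",                   # e.g. NFVI, VNF, or CNF
        "test_report": [
            {"case": "tc-onboarding-001", "mandatory": True,  "result": "pass"},
            {"case": "tc-lifecycle-002",  "mandatory": True,  "result": "pass"},
            {"case": "tc-scaling-003",    "mandatory": False, "result": "skipped"},
        ],
    }

    # An external viewer could then see exactly what was tested, beyond a binary badge:
    mandatory = [t for t in badge_record["test_report"] if t["mandatory"]]
    optional = [t for t in badge_record["test_report"] if not t["mandatory"]]
    print(f"Mandatory: {sum(t['result'] == 'pass' for t in mandatory)}/{len(mandatory)} passed; "
          f"Optional: {sum(t['result'] == 'pass' for t in optional)}/{len(optional)} passed")

Note that the same record could back any of the badge levels discussed above: a binary listing, per-category pass/fail, or a scored view of the results.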

Next Steps

Action items
