...

AI/ML testing session: Presentation from Lingli Deng and Yan Yang

Testing and Certification Requirements from EUAG Intelligent Network and AI Survey -v1.1.pdf

...

EUAG New Updates - Summary and Next Steps: Lei Huang

Slides from Lei Huang: EUAG Newly Updates - Summary and Next Step Plan.pdf

Current work includes ONAP top priorities, the Network Intelligence survey, the VNF Testing white paper, and representing the Telecom community to the projects. Review of the current work on the white papers.

  • Ranny Haiby raised the intelligent networking question: what is the common platform and where should it be hosted? Possibly an open lab where operators bring their data and vendors bring their tools to test the AI/ML assumptions.
  • Morgan Richomme: to do the lab work, resources need to be shared. Labs take resources, so how can they be properly shared? The lab is mostly used by the ONAP project, and open labs have to be balanced with staging labs. Labs are often unstable, so how can the lab be used to create a standard platform? More vendors are creating their own labs, so the shared lab is being used less; it was mostly used for CNFs and VNFs, and AI/ML is a different animal. Users get user access, not admin access.
  • Saad Sheikh brings up some questions about the labs: what types of labs do we actually need? End-to-end validation is the key; CNF and VNF testing alone is not enough. The integration of all the elements is what the Telecom community is more interested in.
  • SaiSeshu MUDIGANTI says that labs are a limited resource and need to be used wisely.
  • Heather Kirksey notes that Lab as a Service might be worth pursuing. Cross-project labs are harder to support. Adding lots of automation is a possibility, but it requires lots of support as the requirements are constantly changing.

Lei Huang notes that we could gather requirements from the operators, then look for a third party to host open lab testing. OVP testing is a potential fit, but it is very limited in scope. Morgan Richomme says we need to work on the testing/certification process. From a service provider's perspective, the certification cannot be too lightweight: it adds value only if it reduces the operators' effort to bring a given service/product into production. Badges need a static definition, while testing frameworks are living products that are better able to keep up with changes in technology. The static definition of the RFP process has to be balanced against the rapid changes in the technology; operators already face this problem with the long time frames between the original RFP and the final execution.

Lei Huang, Srinivasa Addepalli: how many operators are willing to share data that can be used? (Sorry, I can't ask this as I don't have access to a microphone.) There are three things that the EUAG can help with:

  1. The AI/ML use cases for 5G RAN and 5GC that operators need (note that the NWDAF-specified use cases can be satisfied with policy-based analytics).
  2. How many operators are willing to share data with open source communities to create algorithms or even models? Continuous access to data is important to make the models better; note that models are only as good as the accuracy of the input data. If there is willingness, what is the process of collaboration? It would be good to understand this from the operators' perspective.
  3. For operators that intend to create their own AI algorithms and models themselves, what do they expect from the open source communities: is it just the generic Data Analytics Framework platform?

...