
Merging, ranking and metrics reporting – Unified regression reporting flow using ranking

Conference: Verification Futures 2018
Speaker: Mark Daniel, Senior Staff Engineer in Verification, Infineon Technologies UK Ltd.
Presentation Title: Merging, Ranking and Metrics Reporting – Unified Regression Reporting Flow Using Ranking.
Abstract: Design IPs have become complex, and verifying them involves considerable challenges. Constrained-random tests generate random stimuli, and the DUT behaviour is checked against a model. This abstract outlines some of the typical problems faced in metrics reporting during the different phases of IP verification, and the evolution of a process to address them. Throughout the verification process, verification teams need to demonstrate consistent progress in verifying the design. Tests are ranked on how much they contribute to coverage. Ranked seeds from the previous regression are collated into a golden test list and are run in every regression cycle in addition to the new random tests. Tests that contribute to coverage are added to the golden test list, which grows over time.
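The ranking and golden-list growth described above can be sketched as a greedy selection over per-seed coverage. This is a minimal illustration, not the presented flow: the function names and the modelling of coverage as sets of bin IDs are assumptions; a real flow would read this data from merged coverage databases.

```python
# Hypothetical sketch: greedy ranking of test seeds by incremental coverage.
# Coverage per seed is modelled here as a set of covered bin IDs.

def rank_tests(coverage_by_seed):
    """Order seeds so each adds the most new coverage; drop non-contributors."""
    remaining = dict(coverage_by_seed)
    covered = set()
    ranked = []
    while remaining:
        # Pick the seed with the largest number of not-yet-covered bins.
        seed, gain = max(
            ((s, len(bins - covered)) for s, bins in remaining.items()),
            key=lambda x: x[1],
        )
        if gain == 0:
            break  # nothing left contributes new coverage
        ranked.append(seed)
        covered |= remaining.pop(seed)
    return ranked

def update_golden_list(golden, ranked_contributors):
    """The golden list grows over time: append newly contributing seeds."""
    return golden + [s for s in ranked_contributors if s not in golden]
```

With this shape, each regression cycle ranks the new random seeds against the coverage already held by the golden list and appends only the contributors, so the golden list grows monotonically.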

Ranking on the full random regression introduces both passing and failing tests into the golden test list, which in turn reduces the pass percentage. In this early phase, ranking on functional coverage metrics is beneficial, as functional coverage can be reported on passing tests only.
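The early-phase safeguard above amounts to filtering the regression results down to passing seeds before any ranking is done. A minimal sketch, assuming a hypothetical result structure of `{seed: (status, functional_bins)}`:

```python
# Hypothetical sketch: keep only passing seeds so failing tests
# never enter the golden list via ranking.

def passing_only(results):
    """results: {seed: (status, functional_bins)}; return passing seeds' coverage."""
    return {seed: bins for seed, (status, bins) in results.items()
            if status == "PASS"}
```

The dictionary this returns can then be handed to a coverage-ranking step, so only passing tests compete for a place in the golden test list.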

As the testbench matures, ranking passing tests on all metrics (code and functional coverage) becomes more important, so a unified regression reporting flow was developed to perform ranking on passing tests. Identifying all failing tests and maintaining their failing seeds helps classify the kinds of failures seen in a regression. Verification plans can be loaded on top of the coverage databases to report the mapping from requirements to functional coverage. All the metrics generated so far may then be fed into in-house tools such as tStatus, which produces graphs representing weekly progress, and TriCE, which helps classify failure categories.
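One common way to classify the kinds of failures seen in a regression, as described above, is to bucket failing seeds by an error signature extracted from their logs. This is an illustrative sketch only: the log format, the `ERROR` keyword, and the function name are assumptions, not details of the in-house TriCE tool.

```python
import re
from collections import defaultdict

# Hypothetical sketch: group failing seeds by a failure signature taken
# from the first ERROR line of each test's log.

def classify_failures(failing_logs):
    """failing_logs: {seed: log_text}; return {signature: [seeds]}."""
    buckets = defaultdict(list)
    for seed, log in failing_logs.items():
        match = re.search(r"(?m)^.*ERROR.*$", log)
        signature = match.group(0) if match else "UNKNOWN"
        # Mask numbers so the same failure at different simulation times
        # or addresses collapses into one category.
        signature = re.sub(r"\d+", "<N>", signature)
        buckets[signature].append(seed)
    return dict(buckets)
```

Grouping by masked signatures keeps the per-regression failure report short: one bucket per failure category, each listing the seeds needed to reproduce it.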

Key points covered include:

  • The tests are ranked on how much they contribute to the coverage.
  • A unified regression reporting flow to perform ranking on passing tests was developed.
  • The fastest-running ranked tests are added to a ‘mini regression’.
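The mini-regression selection in the last key point can be sketched as picking the fastest ranked seeds under a runtime budget. The function name, the per-seed runtime table, and the budget parameter are assumptions for illustration, not the presented flow's actual selection rule.

```python
# Hypothetical sketch: build a 'mini regression' from the fastest-running
# ranked tests, capped by a total runtime budget in seconds.

def mini_regression(ranked_seeds, runtime_s, budget_s):
    """Greedily take the fastest ranked seeds until the budget is spent."""
    picked, total = [], 0.0
    for seed in sorted(ranked_seeds, key=lambda s: runtime_s[s]):
        if total + runtime_s[seed] > budget_s:
            continue  # too slow to fit in the remaining budget
        picked.append(seed)
        total += runtime_s[seed]
    return picked
```

Such a mini regression gives a quick smoke-test signal between full regression runs while still exercising seeds known to contribute coverage.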
Speaker Bio: Mark has been with Infineon for 18 years, during which time he has worked on the verification of third-party and in-house IP and CPUs. He has expertise in constrained-random verification environments, such as eVCs and ISGs, and in flows. Most recently he has taken responsibility for Tools & Flows, enabling the use of best-in-class tools as well as compliance with ISO26262. Prior to this he led the TriCoreTM CPU verification for the AURIX2GTM. He developed a new flow for the CPU verification, ensuring that its maintainability, performance, ease of use and reporting are best in class. He architected the CPU verification environment to be highly reusable for deliveries of TriCoreTM into the AURIXTM, AURIX2GTM and next-generation families of microcontrollers.

Prior to working at Infineon, Mark worked at GEC Plessey Semiconductors (Mitel) on the development of IPs and SoCs for the set-top-box market. Mark holds a BEng in Electrical & Electronic Engineering from the University of Bath.
