Product Documentation
Virtuoso ADE Verifier User Guide
Product Version IC23.1, November 2023



Managing Verification Results

This topic provides information on how you can access the verification status of your project managed through Virtuoso ADE Verifier (Verifier).

The status of your design verification project depends on various factors, such as the number of unmapped or missing requirements. For example, if your verification plan is incomplete, the verification project cannot be deemed complete even if the overall verification status is 100 percent.

ADE Verifier lets you access the overall verification status of your project. You can drill down to the detailed status of each verification requirement in the project to generate various types of design verification reports.

You can manually override failed requirements by signing them off. You can also sign off requirements of the type Manual.

Related Topics

Factors that Impact Verification Status

Reviewing the Current Overall Verification Status

Verification Status of Requirements

Requirements Signoff in Virtuoso ADE Verifier

Detailed Verification Report

Factors that Impact Verification Status

The status of your design verification project depends on factors such as requirement mapping, the availability of simulation results, specification check results, and requirement signoffs.

Changes in these factors impact the overall design verification status. For example, if you sign off some failed requirements, the percentage of the overall verification progress increases.

The verification failures can include requirements that are not mapped, mapped implementations that are not simulated or do not have results, failed specification checks, and cases where simulation results do not match the specifications.

Location of Run Summary Data

Verifier stores the run summary data (PALS) files as .xml files in separate locations, depending on whether the cellview is editable or read-only.

When you open an existing Verifier cellview or update a referenced cellview, Verifier loads the last modified run summary data file from the default results directory of the current cellview or from the locations described above.

Evaluation of Metric Prefix

You can check for any mismatch between the implementation specifications and the requirement specifications mapped to that implementation. This check is performed when the Requirement specification and check to implementation option is set in the Preferences form.

However, by default this check compares only the specification numbers and the unit strings. It does not perform any calculation to determine whether the value and unit combinations are equivalent. Consider the following example:

Requirement: MaxSpec = 10m, unit = V;
Implementation: MaxSpec = 10, unit = mV;

Although 10m V and 10 mV denote the same value, this combination is flagged as a mismatch because the numbers and unit strings differ.

To check for such matches or mismatches, enable the preference option Evaluate metric prefix on the General tab.

The Evaluate metric prefix option is hidden when the Specification Search Order is set to Implementation Specification because no check for specification is performed in such cases.

Enable Check unit and Evaluate metric prefix, and use Requirement specification with check in the Preferences form. The metric prefix of the requirement specification unit is used for both the specification check and the evaluation. Assume that the values are as follows:

Requirement: MaxSpec = 10m, unit = V;

Implementation: MaxSpec = 10, unit = mV, Implementation results: minValue = 1.5, maxValue = 5.5;

When evaluation is done, the status is as follows:

Spec Check: Pass (10m V == 10 mV);

No specification error is reported. The unit of the requirements is used for the displayed results.

Result status shown: Verification: OverallStatus = Pass, MinValue = 1.5m, MaxValue = 5.5m, Unit = V.

In this example, the units differ between the requirement and implementation specifications. When you enable Evaluate metric prefix, the specification check passes because the numerical values are identical once the unit prefix is considered.

You can also enable the Evaluate metric prefix preference by using the environment variable evaluateMetricPrefix.

The Evaluate metric prefix preference checks only a single leading prefix, matched by the following pattern:

Y|Z|E|P|T|G|M|K(?!elvin)|k(?!ilo)|%|m(?!il|persec)|u|µ|n|p|f|a|z|y

Here, the supported prefix is K but not Kelvin; k but not kilo; m but not mil or mpersec. Apart from these prefixes, complex combinations are not supported.
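The prefix handling described above can be sketched in Python as follows. The regular expression is the one shown in the documentation; the scale table and the function names are illustrative assumptions, not Verifier's implementation.

```python
import math
import re

# Prefix pattern from the documentation: only a single leading prefix is
# recognized; the lookaheads keep "Kelvin", "kilo", "mil", and "mpersec"
# from being misread as prefixed units.
PREFIX_RE = re.compile(
    r"^(Y|Z|E|P|T|G|M|K(?!elvin)|k(?!ilo)|%|m(?!il|persec)|u|µ|n|p|f|a|z|y)?(.*)$"
)

# Illustrative scale factors; not Verifier's internal table.
SCALE = {
    "Y": 1e24, "Z": 1e21, "E": 1e18, "P": 1e15, "T": 1e12, "G": 1e9,
    "M": 1e6, "K": 1e3, "k": 1e3, "%": 1e-2, "m": 1e-3, "u": 1e-6,
    "µ": 1e-6, "n": 1e-9, "p": 1e-12, "f": 1e-15, "a": 1e-18,
    "z": 1e-21, "y": 1e-24,
}

def normalize(value, unit):
    """Fold a metric prefix on the value (e.g. "10m") or on the unit
    (e.g. "mV") into a plain number and a base unit."""
    num_str, value_prefix = re.match(
        r"^([-+]?[0-9.]+(?:[eE][-+]?[0-9]+)?)\s*(.*)$", value).groups()
    unit_prefix, base_unit = PREFIX_RE.match(unit).groups()
    scale = SCALE.get(value_prefix, 1.0) * SCALE.get(unit_prefix, 1.0)
    return float(num_str) * scale, base_unit

def specs_match(req_value, req_unit, impl_value, impl_unit):
    """With prefix evaluation enabled, 10m V and 10 mV compare as equal."""
    rv, ru = normalize(req_value, req_unit)
    iv, iu = normalize(impl_value, impl_unit)
    return ru == iu and math.isclose(rv, iv)
```

For the example above, `specs_match("10m", "V", "10", "mV")` returns True, whereas a plain string comparison of the numbers and units would flag a mismatch.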

Reviewing the Current Overall Verification Status

Verifier displays the current overall design verification status of your project on the Overall Progress bar. You can view further details in the status tooltip.

To display the overall verification status, if it is not already displayed:

To display the status snapshot tip:

The status snapshot tip includes the following information:

Additional Information

Overall Progress, Coverage, and Overall Coverage

You might sometimes see a difference between the Pass and Fail percentages shown for Overall Progress and Overall Coverage on the Verifier toolbar.

The calculation for Overall Progress is based on the number of requirements. For example, suppose the tooltip shows "14 out of 23 pass".

In this example, the total number of requirements with results is 23, and the number of requirements that have passed is 14. The Overall Progress considers only the requirements that have passed or failed.

Coverage in Verifier is calculated by considering all the design point combinations that are added to verification spaces and available in the implementation history. The coverage values are reported as "x% passed, y% failed", which can be interpreted as:

• x% of the design points from SPACE pass (green)

• y% of the design points from SPACE fail (red)

• (100-(x+y))% of the design points from SPACE are not covered (grey)

If a design variable defined in a verification space assigned to an implementation is not needed for simulation, Verifier ignores this variable when calculating the overall coverage.

The Overall Coverage value is calculated by considering all the design points specified in the verification spaces and reporting these points as Pass, Fail, or No Coverage.

Consider an example where the tooltip shows "4 out of 45 pass".

In this example, the total number of design points is 45, and the number of points that have passed is 4. The 45 design points include points that are defined in the verification space, as well as those points that are not defined in verification spaces but simulated during a run. The 4 points that show as pass are those points that are successfully implemented or covered.
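The percentage arithmetic behind these two tooltips can be sketched as follows; the helper functions are illustrative, not Verifier APIs.

```python
def overall_progress(passed, failed):
    """Overall Progress counts only requirements that passed or failed."""
    total = passed + failed
    return f"{passed} out of {total} pass", round(100.0 * passed / total, 1)

def overall_coverage(passed_points, failed_points, total_points):
    """Overall Coverage classifies every design point in the verification
    spaces as Pass (green), Fail (red), or No Coverage (grey)."""
    pct_pass = 100.0 * passed_points / total_points
    pct_fail = 100.0 * failed_points / total_points
    return pct_pass, pct_fail, 100.0 - (pct_pass + pct_fail)

# "14 out of 23 pass": 14 requirements passed out of 23 with results.
label, pct = overall_progress(passed=14, failed=9)

# "4 out of 45 pass": 4 of 45 design points covered and passing.
pct_pass, pct_fail, pct_uncovered = overall_coverage(4, 0, 45)
```

Note that the two denominators differ: progress divides by requirements with results, while coverage divides by all design points, which is why the two toolbar values rarely agree.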

When you select the Overall Coverage progress bar, the OverallStatus, MinValue, and MaxValue values are updated. Calculation of these values is based only on the design points specified in the assigned verification space. All other design points are ignored.

For example, consider the design points defined in a Setup Library.

Based on the design points defined in the library, Verifier calculates the OverallStatus, MinValue, and MaxValue values.

Selecting the option Show results for Verification Space parameter conditions ensures that Verifier calculates the coverage based on design points from verification spaces for all subsequent simulations. Additionally, this preference ensures the following:

Verification Status of Requirements

In a requirements-driven verification flow, it is important to review the status of each requirement to identify the requirements that need attention. You can access the verification results of each requirement in Verifier. You can also filter the requirements hierarchy based on the verification results.

Typically, you investigate the failed requirements to determine the corrective actions. You can make appropriate changes to the design so that the results meet the specifications, and re-simulate the implementation cellview. Alternatively, if the specifications have scope for updates, you can change the specifications in the implementation cellview and the requirement so that the results are within the new specifications. To facilitate such actions, Verifier lets you view simulation results and edit the implementation cellview in the default application of that cellview.

When you investigate the verification status, you can also consider making appropriate changes to your verification plan to achieve your design verification goals and quality parameters. For example, you can choose to make the plan more granular by adding new requirements and mapping them with their implementations, or change the specifications of some requirements.

Typically, the goal of a verification project is to achieve 100% overall verification status. However, you can consider taping out when your verification status is 95%, after a careful analysis and a good understanding of the missing 5% of the verification objectives. By analyzing the verification status of the requirements in Verifier, you can gain this understanding. If required, you can choose to manually override those 5% of requirements.

The following figure illustrates how you can identify the verification status and get relevant details from the Results tab.

The Results tab contains the following columns:

The Results tab can also display the following optional columns:

To view these optional columns, right-click anywhere in the Results tab header row, and select the required columns.

To view additional details related to the implementations mapped to the requirements, right-click anywhere in the Results tab, choose Show, and select the filter options as required.

For Monte Carlo runs with sweeps, or when SigmaToTarget values are defined with range specifications, you can get different Min or Max values for each statistical value. In such scenarios, the values are displayed as <minValue>, <maxValue> pairs. In all other cases, the Min and Max values remain the same and show a single value that uses the same notation as the requirement specifications. You can also use the calcStatisticsForNonMC environment variable to save or load the statistical values to or from the run summary data file. For Monte Carlo runs, the statistical values are always non-empty.

The Results tab also displays the detailed verification results of the selected requirement in the Information assistant, as well as the corresponding requirement details in the Requirement Editor Assistant.

You can show or hide the columns of the requirements list in the Results tab. For this, right-click anywhere on the column heading in the results area and select the columns you want to show. Verifier hides the columns that are not selected.

To reorder columns, drag a column header to the required position. Additionally, you can specify the default column order and hidden columns by using the environment variables resColumnOrder and resHiddenColumns.

Viewing Verification Results in the Information Assistant and Requirements Editor

The following figure shows how you can review the verification results of a requirement in the Information assistant and Requirement Editor.

To review the verification results:

For requirements that have passed verification, the results meet the specifications.

To filter the requirements based on overall status:

To open the implementation cellview mapped with a requirement:

Verification Status Indicators

The following table lists the keywords and icons used to indicate the verification status of requirements. The icons display with the hierarchical numbers, and the keywords display in the Overall Status column.

Keyword and Icon Description

• Pass: The requirement passed verification. If a requirement is mapped to multiple implementation outputs, the OverallStatus column shows Pass only when all implementation outputs mapped to it have passed.

• Fail: The requirement failed verification. A tooltip displays the failed and canceled points for the failed tests or outputs.

• Signoff Required: The requirement is of the type Manual and is not signed off.

• Not Run: The implementation mapped to the requirement has not been run and must be run to capture simulation results.

• Spec Check Fail: The specification checks failed for the requirement. For details, see the corresponding requirement icon.

• Not Mapped: The requirement of the type Ran OK or Spec Pass is not mapped to any implementation and needs to be mapped to its corresponding implementation.

• No Results: Verifier does not have the simulation results of the implementations associated with the requirement.

• Test Disabled: The test mapped to the requirement is disabled in the implementation cellview or the specified history, and cannot be run.

Requirements Signoff in Virtuoso ADE Verifier

Verifier lets you sign off requirements of the type Manual and requirements that have an overall status of Fail, Not Mapped, Spec Check Fail, or Signed Off. Verifier treats signed-off requirements as requirements that passed verification; therefore, when you sign off a requirement, the overall verification progress percentage increases. However, requirements can be signed off only if the corresponding simulation points are covered in verification. Requirements mapped to implementations that have uncovered points cannot be signed off. For details, see Coverage Calculation in Virtuoso ADE Verifier.

The procedure to sign off the failed, unmapped, or spec check failed requirements is similar to signing off a requirement of the type Manual. After signing off a requirement, select it in the requirements hierarchy of the Results tab and review the signoff details in the Information assistant.

You can specify the validity period of a requirement signoff. For example, you can choose to retain the signoff indefinitely or for a specific period.

The signed off requirements are documented in the verification reports. For details on these reports, see Detailed Verification Report.

When you change the mapping of a signed-off requirement, the signoff is removed because it is no longer relevant to the new mapping.

Signing off Requirements

To sign off a failed requirement:

  1. In the Results tab, right-click the failed requirement and select Signoff.
    The Sign Off form is displayed.
  2. Type a name in the User Name field.
  3. Specify the validity period of the signoff from the Lifetime of Signoff drop-down list. The available options are:
    • Once: The signoff is valid only for a specific set of simulation results and becomes invalid when newer simulation results are available.
    • Indefinitely: The signoff never expires.
    • Until Date: The signoff is valid up to the date specified in the corresponding field.
  4. Specify whether the signoff information is stored with or without applying the signoff to the requirement.
  5. Type details of the signoff, such as the reason, in the Comments field.
  6. Click OK.

The status of the requirement changes to Signed Off.

The following figure illustrates how you can sign off a failed requirement.

Additional Information

Updating the Completion Progress of Requirements Manually

In the Sign Off form, you can add or update the completion percentage for requirements of type Manual. This indicates that verification is in progress, but not complete.

To add the completion percentage for a Manual requirement:

  1. Select the Percentage Complete check box.
    This enables a list box and a slider.
  2. Do one of the following:
    • In the list box, select the percentage of completion progress that you want to specify.
    • Move the slider to select the completion percentage that you want to specify.

The specified percentage is used to calculate the status of Manual requirements and also impacts the status of Note requirements. Both values are shown in the OverallStatus column. For example, if you specify the completion percentage as 50%, the requirement is shown in the Results tab with a 50% value for OverallStatus. If you specify the value as 100%, the requirement status shows Signed Off; if you specify 0%, the status shows Fail.

You can select either Sign Off or Percentage Complete to specify the signoff for your requirements. When you select the Percentage Complete check box, the Sign Off radio options are disabled. A percentage value less than 100% is still considered a Fail for the overall status of the requirements. If you set the percentage to 100%, the requirement status automatically shows as Pass.

Viewing Plotted Results

The Results tab in Verifier helps you view the results of your verification setup. Additionally, you can view the results with their waveforms in the Virtuoso Visualization & Analysis window when the Save Simulation Data option is set to All in the Save Options form.

To view the results in the Virtuoso Visualization & Analysis window:

  1. Select a requirement from the Results page.
  2. Choose Open Results Browser from the shortcut menu.

The Virtuoso Visualization & Analysis window is displayed with the results.

The Virtuoso Visualization & Analysis window title displays the name of the Verifier cellview. If the implementation has waveform data, the loaded results are displayed in the form of a waveform, and the Browser assistant in the window displays the results directory path in the following format:

RunDir/psf/list of tests/psf

If the implementation has no waveform data, only the results information is displayed in the Virtuoso Visualization & Analysis window. For Monte Carlo simulations, the results might not be displayed because the Save Simulation Data option is disabled by default.

Detailed Verification Report

Verifier creates HTML reports that contain the overall verification status and the verification details of each requirement, including the mapped implementations and verification results. These reports are available in two formats: basic and interactive.

You can publish a report in only one of these formats at a time. By default, Verifier displays the HTML report in basic format in the default web browser. You can specify the required settings in the Reporting tab of the Preferences form.

Related Topics

HTML Reports in Basic Format

HTML Reports in Interactive Format

HTML Reports in Basic Format

The HTML report in basic format lets you view the detailed verification status by including the overall verification and run status, requirement hierarchy with links and status, and the detailed verification status table of each requirement.

Verifier stores this report in the current directory using the naming format cell.html. For example, verification.html. If required, you can specify a custom name and location for the report.

You can use the verification report to monitor the implementation run status and verification status when a batch script is running. The report updates after each implementation cellview run is completed to let you monitor the progress. For details, see Scheduling Runs.

The verification report in basic format includes the following elements:

Viewing the HTML Report in Basic Format

To view the verification report:

  1. In the CIW, set the following environment variable.
    envSetVal("verifier.html" "useTemplate" 'boolean nil)
  2. Choose Tools > Publish HTML Report.

The report is displayed in the browser window.

Typically, the requirement IDs are hidden in the HTML report. To display the requirement IDs, set the environment variable includeRequirementId to t.

In a Verifier setup with referenced cellviews, the basic HTML report by default hides the ExtRef ID at the beginning of the referenced implementation LCVH. To display the ExtRef ID, set the environment variable includeReferenceIDPrefix to t.

Related Topics

Detailed Verification Report

HTML Reports in Interactive Format

HTML Reports in Interactive Format

You can use the interactive format to display more structured HTML reports. In this format, the report contains two panes. The left pane displays two tabs: Requirements and Results, and Implementations. When you click a requirement or implementation in the left pane, its corresponding verification information is displayed in the vertically placed Information tab on the right.

Cadence recommends using a browser with JavaScript support to view reports in interactive format.

The verification report in interactive HTML layout includes the following elements:

Viewing HTML Reports in Interactive Format

To view the verification report:

  1. In the CIW, set the following environment variable.
    envSetVal("verifier.html" "useTemplate" 'boolean t)
    Setting this variable to t updates the Reporting preferences as follows:
  2. Specify a directory name for the HTML report in the Directory field.
    If the specified directory does not exist, it is created automatically.
  3. Specify a datetime suffix format in the Datetime suffix format field.
    The default format is yyyyMMdd_HHmmss.
  4. Choose Tools > Publish HTML Report.

The report is displayed in the browser window.

Viewing Additional Columns in the Verification Report

By default, the HTML report displays the HierID, Title, Description, MinSpec, MaxSpec, Unit, OverallStatus, MinValue, TypicalValue, MaxValue, and Coverage columns.

To view additional columns in the HTML report:

  1. Right-click anywhere on the column header area in the report.
    A checklist is displayed with all columns. The visible columns are selected and the additional columns are unselected.
  2. Select the required columns from the checklist.

The additional columns are displayed.

You can sort the displayed columns by clicking the required column header. Additionally, you can drag a column header to another position to reorder the displayed columns.

Related Topics

Detailed Verification Report

HTML Reports in Basic Format

Customization of HTML Reports in Interactive Format

Reporting

Customization of HTML Reports in Interactive Format

When you set the useTemplate environment variable to t and run the Tools > Publish HTML Report command, Verifier generates a template-based interactive report by performing the following operations in the given order:

  1. Creates a report directory where the published HTML report is saved. The directory name and location is determined by the Directory and Datetime suffix format fields in the Reporting tab of the Preferences form.
    If the specified directory does not exist, it is created automatically.
  2. Exports a .json file for the session into the newly created report directory. For example, HTML_dir/export.json.
  3. Saves ADE Assembler datasheets into the same report directory. This is optional.
    Use the Create ADE Assembler datasheet at run or reload time option to enable or disable the creation of these datasheets.
  4. Copies any documents or images from the Verifier session into the report directory.
  5. Copies and extracts the data from /dfII/verifier/templates/ located by the Cadence Search Framework (CSF) into the report directory.
    This extracted data is customizable. The default data and any additional information, such as the stylesheet.css or JavaScript used by the template, is available within install-path/share/cdssetup/dfII/verifier/templates.
    If you have saved customized templates in a custom directory such as .cadence/dfII/verifier/templates, the data from this directory is also copied into the report directory.
  6. Runs the Python script file publishHtml.py, which is available in /dfII/verifier/scripts/ located by CSF. The publishHtml.py script processes templates written in the Jinja template language.
    You can customize a copy of the script by saving the script file publishHtml.py at a location such as .cadence/dfII/verifier/scripts/publishHtml.py.
    The publishHtml.py script uses the HTML_dir/export.json file and the /dfII/verifier/templates/templateName template identified by CSF. Here, the value of templateName is set using the following environment variable.
    envSetVal("verifier.html" "templateName" 'string "index.html")
  7. Applies the template defined by the script and uses the export.json to expand the contents into the final output HTML_dir/index.html.
    You can modify the HTML report name, index.html, by setting the templateName environment variable with the new filename. The specified filename should exist in the template directory.

The architecture provided by the Jinja template language is flexible and lets you export your report to multiple formats, such as .txt, .csv, and .pdf. For example, the template contains the following source code to display the report header:

<h1>Cellview: {{ data["cellview"][0] }} {{ data["cellview"][1] }} {{data["cellview"][2]}}</h1>
<p>Report created on {{ data["export"]["date"] }} by {{ data["export"]["username"] }} in {{ data["export"]["cwd"] }} using {{ data["export"]["subVersion"] }}</p>

Here, {{}} indicates expressions defined in the Jinja template language and data indicates the contents of the export.json file. After the template is rendered, the final HTML output contains the following details:

<h1>Cellview: opamp090 full_diff_opamp_AC verification1</h1>
<p>Report created on 13 Oct 2021 13:29:16 by Tom in /projects/sco_ams/users/Tom/work/adexl/designs/verifier/V3_Examples using sub-version  IC6.1.8-xxx.xxxx.xxx </p>

The export.json file contains all your verification data and, with advanced use of the template, can be used to display in-depth information in the final HTML report.
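The expansion step (operations 6 and 7 above) can be approximated in plain Python. The following is a minimal stand-in for the Jinja rendering performed by publishHtml.py, using a hypothetical fragment of export.json; the real script and data file are far richer.

```python
import json
import re

# Hypothetical fragment of export.json; the real file contains the
# complete verification data for the session.
export_json = """{
  "cellview": ["opamp090", "full_diff_opamp_AC", "verification1"],
  "export": {"date": "13 Oct 2021 13:29:16", "username": "Tom"}
}"""
data = {"data": json.loads(export_json)}

template = '<h1>Cellview: {{ data["cellview"][0] }} {{ data["cellview"][1] }} {{ data["cellview"][2] }}</h1>'

def render(template, context):
    """Replace each {{ expression }} with its evaluated value, as the
    Jinja engine would (minus Jinja's filters and control flow)."""
    return re.sub(
        r"\{\{\s*(.*?)\s*\}\}",
        lambda m: str(eval(m.group(1), {}, context)),  # demo only
        template,
    )

print(render(template, data))
# <h1>Cellview: opamp090 full_diff_opamp_AC verification1</h1>
```

In practice, publishHtml.py uses the Jinja engine itself, which additionally provides filters, loops, and template inheritance.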

Related Topics

Detailed Verification Report

HTML Reports in Interactive Format

Customizing the Interactive HTML Report Template

To customize the template of your interactive HTML report:

  1. Type the following commands in the terminal window:
    cp -r $CDSHOME/share/cdssetup/dfII/verifier/templates ./.cadence/dfII/verifier
    cp -r $CDSHOME/share/cdssetup/dfII/verifier/scripts ./.cadence/dfII/verifier
  2. Create a copy of the default template file, index.html, and save it as myIndex.html in ./.cadence/dfII/verifier.
  3. Set the following variables in the Virtuoso CIW:
    envSetVal("verifier.html" "useTemplate" 'boolean t)
    envSetVal("verifier.html" "templateName" 'string "myIndex.html")
  4. Open and edit the myIndex.html file.

The interactive HTML report is customized, as required.

Related Topics

Detailed Verification Report

HTML Reports in Interactive Format

Customization of HTML Reports in Interactive Format

Examples of Interactive HTML Report Customization

This section provides some examples of how you can customize your interactive HTML reports.

Example: How to add the Cadence logo above the header of an interactive HTML report

Consider that you want to add the Cadence logo to the HTML report by modifying the report template written in the Jinja language.

A Jinja template is a mix of the target language and additional constructs for data processing. Here, the target language in use is HTML.

<!DOCTYPE html>
<html>
...
<body>
<img src="https://www.cadence.com/content/dam/cadence-www/global/en_US/images/site-images/icons/navigation-icons/cadence-logo-black.png" alt="Home" title="" class="cq-image-placeholder">
<h1>Cellview: {{ data["cellview"][0] }} {{ data["cellview"][1] }} {{data["cellview"][2]}}</h1>
<p>Report created on {{ data["export"]["date"] }} by {{ data["export"]["username"] }} in {{ data["export"]["cwd"] }} using {{ data["export"]["subVersion"] }}</p>

In the above example, the <img> tag adds the Cadence logo to the HTML report.

Example: How to use the export functionality to create alternative outputs

The template directory, install-path/share/cdssetup/dfII/verifier/templates, contains a few examples of alternative export formats, such as the example_reqs.txt and example_reqs.json template files.

Specify the following in the Virtuoso CIW:

envSetVal("verifier.html" "templateName" 'string "example_reqs.txt")

Now, when you create the report by using the Publish HTML Report command, the report is created in text format instead of HTML.

Similarly, if you set the templateName environment variable to example_reqs.json, the report is created in JSON format.

Example: How to use custom template and processing languages

Consider that you want to create a PDF report. Creating a PDF from the HTML template is a two-step process.

The following CIW command exports the template in the intermediary LaTeX format.

envSetVal("verifier.html" "templateName" 'string "example_reqs.tex")

Running the Publish HTML Report command after this generates the LaTeX report.

Now, you can specify the following command in the CIW to convert the LaTeX report to PDF format.

envSetVal("verifier.html" "templateName" 'string "example_reqs.tex pdf")

This saves the report in PDF format.

You can check the ./scripts/publishHtml.py file to observe the changes made by the conversion.

Similarly, you can create your custom report template using the Jinja template examples and change the environment variable settings accordingly.

Example: Post-processing of the custom template

Consider that you have created a custom template and now want to define post-processing steps. For this, you can overwrite the publishHtml.py file.

For example, a post-processing step can filter the text export using the string 1.1 and save the results in a file with the extension .txt2.
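A post-processing step of that shape could look like the following sketch. The file names, requirement titles, and helper function are illustrative assumptions; the real hook lives in your customized copy of publishHtml.py.

```python
# Illustrative post-processing sketch: filter a text export for lines
# containing "1.1" and save the result with a .txt2 extension.
def filter_export(lines, needle="1.1"):
    """Keep only the lines that mention the given requirement ID."""
    return [line for line in lines if needle in line]

# Hypothetical content of a text export produced from example_reqs.txt.
report = [
    "1.1 Gain requirement: Pass",
    "1.2 Phase margin requirement: Fail",
    "1.1.1 DC gain: Pass",
]
filtered = filter_export(report)

# Hypothetical output file name following the .txt2 convention.
with open("example_reqs.txt2", "w") as out:
    out.write("\n".join(filtered))
```

With this filter, only the lines for requirement 1.1 and its children reach the .txt2 file; all other requirements are dropped from the post-processed export.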

