Product Documentation
Virtuoso ADE Assembler User Guide
Product Version IC23.1, November 2023



Working with Specifications

You define performance specifications on the Outputs Setup tab of the Outputs pane. After simulation, the program displays information so that you can see whether your design met, nearly met, or failed to meet your performance specifications.

You can view specification (spec) results from one or more history items (checkpoints) on the Results tab of the Outputs pane. You can select which measurement results to display for each corner or sweep. The program displays specification information such as the corner or sweep name, the analysis name, the measurement name, the conditions, and the minimum or maximum values. You can view the spec for each measurement along with information about whether each spec passed or failed, and you can create a summary report of pass/fail results.

You can control the format for each specification such that the program uses particular units or engineering or scientific notation when displaying each spec. You can sort by measurement name and export your spec sheet to a comma-separated values file.
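The engineering-notation display described above can be sketched in Python. This is an illustrative formatter only, not the tool's implementation; the helper name to_engineering is hypothetical:

```python
import math

def to_engineering(value):
    """Format a number with an SI suffix, e.g. 0.55 -> '550m', 1.2e6 -> '1.2M'."""
    suffixes = {-15: "f", -12: "p", -9: "n", -6: "u", -3: "m",
                0: "", 3: "K", 6: "M", 9: "G"}
    if value == 0:
        return "0"
    # Pick the power-of-1000 exponent matching the value's magnitude.
    exp = int(math.floor(math.log10(abs(value)) / 3) * 3)
    exp = max(-15, min(9, exp))  # clamp to the suffixes we know
    return "%g%s" % (value / 10 ** exp, suffixes[exp])

print(to_engineering(0.55))   # 550m
print(to_engineering(1.2e6))  # 1.2M
```

The same value can equally be rendered in scientific notation with a standard format string such as "%e".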

See the following topics for details:

Working with Specifications

The outputs of type Expression, OCEAN Script, MATLAB Script, Area Goal, or Op Region Spec that are added in the Outputs Setup tab of the Outputs pane are called measurements.

You can define a performance specification for a measurement in the Spec column on the Outputs Setup tab of the Outputs pane or in the Override Specifications form.

By default, the specification defined for a measurement applies to all the corners enabled for the test. You can override the measurement specification for a corner or disable the corner specification.

For more information, see the following topics:

Defining a Specification

To define a specification for a measurement on the Outputs Setup tab, do the following:

  1. On the Outputs Setup tab of the Outputs pane, double-click in the Spec column.
    The Spec cell has two parts: a specification type (which you select from a drop-down list) and a target value.
  2. In the drop-down list that appears in the left half of the cell, select one of the following specification types, and fill in the right half of the cell with the appropriate target information, as indicated below:

    Specification Types (objective target: description)

    minimize targetValue: Minimizes the result; the target value is optional.
    maximize targetValue: Maximizes the result; the target value is optional.
    < targetValue: Returns the pass status when the result is less than the target value.
    > targetValue: Returns the pass status when the result is greater than the target value.
    range lowerBound upperBound: Returns the pass status when the result falls between the boundary values.
    tol targetValue percentageDeviation: Returns the pass status when the result is within the specified percentage deviation of the target value.
    info: Returns the pass status when the result is a scalar value and the fail status if the result is a waveform.
    wave: Compares and validates the analog and digital waveform signals for the reference and compared data sources. For details, refer to Comparing Waveforms Using Wave Specifications.

    A target specification value can be specified in the form of constant values or expressions using global variables or other output names. For more details, refer to Using Variables in Specifications.
  3. (Optional) In the Weight column next to the specification, specify the weighting factor for the specification.
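The pass/fail semantics of the scalar objectives above can be modeled in a few lines of Python. This is an illustrative sketch, not the tool's evaluation engine; the helper name check_spec is hypothetical:

```python
def check_spec(value, spec_type, *targets):
    """Illustrative pass/fail evaluation for scalar specification types."""
    if spec_type == "<":
        return value < targets[0]
    if spec_type == ">":
        return value > targets[0]
    if spec_type == "range":
        lower, upper = targets
        return lower <= value <= upper
    if spec_type == "tol":
        target, pct_deviation = targets
        return abs(value - target) <= abs(target) * pct_deviation / 100.0
    raise ValueError("unsupported spec type: %s" % spec_type)

print(check_spec(0.45, "<", 0.55))        # True: 450m meets a <550m spec
print(check_spec(1.02, "tol", 1.0, 5.0))  # True: within 5% of the 1.0 target
```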

Using Variables in Specifications

The target values for specifications can be specified as constants or as expressions that use variables. If a value is specified as an expression, the evaluated value of the expression is compared with the output value to determine the pass or fail status. By using expressions in specifications, you can vary the specifications across sweep points.

Currently, only the Single Run, Sweeps, and Corners and Monte Carlo run modes support expressions in specification target values.

The expressions can be created for specification target values in any of the following ways:

Important Points to Note

Also see: Viewing Specification Results in the Results Tab.

Overriding the Measurement Specification for a Corner

By default, the specification defined for a measurement applies to all the corners enabled for the test. For example, assume that the nominal corner and two corners named C0 and C3 are enabled for a test named myTest. If you define a measurement named Gain_commonMode for the test, the specification for the Gain_commonMode measurement applies to the nominal corner and to corners C0 and C3.

For example, in the following figure, the specification <550m for the Gain_commonMode measurement applies to all the corners displayed in the Corner column.

If you modify the specification for a measurement in the Outputs Setup tab of the Outputs pane or in the Global Spec field on the Override Specifications form, the change will apply to all the corners selected for the test. For example, the following figure indicates that when the specification for the Gain_commonMode measurement is changed from <550m to <600m, the change is applied to all the corners displayed in the Corner column.

You can override the measurement specification for a corner by specifying a different specification for the corner.

To define a specification for a measurement in the Override Specifications form, do the following:

  1. Right-click the row for a measurement in the Outputs Setup tab of the Outputs pane and choose Override Specifications from the context-sensitive menu.
    The Override Specifications form appears.
    Field Description

    Measurement

    Displays the measurements for which you can define specifications. Note the following:

    • By default, the name of the measurement you selected on the Outputs Setup tab of the Outputs pane is displayed.
    • This drop-down list displays only the names of the measurements for the test whose measurement you selected on the Outputs Setup tab. To define a spec for a measurement of another test, right-click a measurement for that test and choose Override Specifications.
    • If you have not specified the name for a measurement in the Name column on the Outputs Setup tab, the expression for that measurement is displayed as its name in this drop-down list.

    Global Spec

    Displays the specification for the measurement.

    To define or modify a specification, select one of the specification types from the drop-down list, and fill in the text field next to the drop-down list with appropriate target information for the measurement.

    Corner

    Displays the list of corners enabled for the test.

    You can disable or enable a corner specification for a specific measurement. For more information, see Disabling and Enabling Corner Specifications.

    Spec

    Displays the specification for each corner enabled for the test.

    For more information about specifying a different specification for a corner, see Overriding the Measurement Specification for a Corner.

  2. In the Measurement drop-down list, select the measurement for which you want to define the specification.
  3. In the Global Spec drop-down list, select one of the specification types, and fill in the text field next to it with appropriate target information for the measurement.
  4. In the Spec column, specify an overridden value next to the corner for which you want to use a different specification.
    For example, in the following figure, the specification <450m for corner C0 overrides the specification <550m for the Gain_commonMode measurement.
    For an overridden specification, a corner can use only a constant value. You cannot specify an override expression different from the expression specified for the global target value.
  5. Click OK.
If you later modify the specification for a measurement in the Outputs Setup tab of the Outputs pane or in the Global Spec field on the Edit Specification form, the change will apply only to the corners for which you did not override the specification. For example, the following figure indicates that when the specification for the Gain_commonMode measurement is changed from <550m to <600m, the change is applied only to the Nominal corner and corner C3. The change is not applied to corner C0 because the specification <450m for corner C0 overrides the specification for the Gain_commonMode measurement.

By default, the specification applies to all the corners selected for the test. You can do the following:

The next time you run a simulation, the program measures the result against the performance specification and displays the results in the Results tab of the Outputs pane. For more information about how results are displayed in the Results tab, see Viewing Specification Results in the Results Tab.

If you add or modify measurement expressions or specifications after running a simulation, you can click the button on the Results tab to update the results displayed in the Results tab based on the new or modified expressions and specifications.

The results will also be updated based on the new or modified expressions and specifications if you click the button on the Results tab to plot across all points. For more information about using the button, see Refreshing Graphs.

For information about setting up specifications for optimization, see Setting Up Specifications.

See also:

Removing a Corner Specification Override

To remove a corner specification override, do the following in the Override Specifications form:

  1. In the Measurement drop-down list, select the measurement for which you want to undo a corner specification override.
  2. In the Spec column, right-click the specification for the corner for which you want to undo the specification override and choose Set to Global from the context-sensitive menu.
    To undo the overrides for multiple corners, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection), right-click and then choose Set to Global from the context-sensitive menu.
    The specification for the measurement (displayed in the Global Spec field) is applied to the corner.

Disabling and Enabling Corner Specifications

By default, the specification defined for a measurement applies to all corners enabled for the test. You can do the following:

To disable or enable a corner specification for a specific measurement, do the following:

  1. Right-click the row for a measurement in the Outputs Setup tab of the Outputs pane and choose Override Specifications from the context-sensitive menu.
    The Override Specifications form appears.
  2. In the Measurement drop-down list, select the measurement for which you want to disable or enable the corner specification.
  3. Do one of the following:
    • To disable a corner specification for the measurement, clear the check box next to the corner.
      For example, in the following figure, the specification for corner C0 is disabled.
    • To enable a corner specification for the measurement, select the check box next to the corner.
    • To disable a corner specification for all the measurements for the test, select All in the Measurement drop-down list, then clear the check box next to the corner.
      For example, in the following figure, the specification for corner C0 is disabled for all the measurements for the test.
    • To enable a corner specification for all the measurements for the test, select All in the Measurement drop-down list, then select the check box next to the corner.
      You can also right-click the first column in the Override Specifications form and choose Disable All or Enable All to disable or enable all the corner specifications.
  4. Click OK.

See also:

Viewing Specification Results in the Results Tab

Comparing Waveforms Using Wave Specifications

ADE Assembler provides various types of specifications, such as <, >, range, and tol, that you can use to compare scalar values in simulation results with the given targets. Depending on the comparison results, the status of the resulting scalar values is shown as pass or fail.

Similarly, you can compare and validate the waveforms in simulation results against given target waveforms using the wave specification type, as per the acceptable tolerance values.

ADE Assembler supports waveform comparison for the following formats:

For waveform comparison, you need to specify:

When you run waveform comparison for two signals, the waveforms or signals are plotted in the Virtuoso Visualization and Analysis XL graph window.

You can identify the difference between the waves, zoom into them for a closer view, and see the passing or failing areas as per the given tolerance settings.

You can view a video demonstration of this feature at Using Waveform Specifications in Assembler.

Performing Waveform Comparison

You can use the menus and commands in ADE Assembler to specify the data sources and tolerance settings.

To compare waveform signals, do the following:

  1. Specify the global options (reference history and tolerance settings) to be used for waveform comparison. You can do this by using the Global Waveform Compare Settings form. For more details, refer to Specifying Global Options for Waveform Comparison.
  2. Ensure that your test in ADE Assembler contains an analysis of type tran. If it does not, create a new analysis of this type.
    Currently, waveform comparison is supported only for tran analysis.
  3. For tran analysis, create outputs of type signal or expr that you need to compare to a reference output from a reference data source.
    For digital outputs, you can choose bus signals. For example, /I0/net1<0:10>.
  4. Ensure that the check box in the Save column for this output is selected.
    Waveform comparison requires the raw waveform data to be saved.
  5. Specify a wave specification for the output to be compared. To do this, click in the Spec column of the output and select wave from the drop-down list.
    By default, the global comparison options specified in step 1 are applied to the wave spec for this output. However, you can override the global options at the output level. For this, click ( ) to the right of the drop-down list in the Spec column.
    You can set wave specifications for multiple outputs together by using the Create wave spec for all signals or Create wave spec for selected outputs commands available in the ( ) drop-down list on the Outputs Setup pane toolbar. For more details, refer to Applying Waveform Options to Multiple Outputs Together.
  6. (Optional) Verify the comparison options for the output in any of the following ways:
    • Move the mouse over the ( ) button. A tooltip is displayed to show the waveform comparison settings applied for this output.
    • Click ( ) to open the Waveform Compare Setup form in which the waveform comparison settings are displayed.
  7. (Optional) If required, change the values to override the options. For more details, refer to Specifying Local Options for Waveform Comparison.
  8. Run simulation to generate results that are saved in a history.
  9. View the wave comparison results on the Results tab.
    As shown in the figure below, the status of waveform comparison is displayed in the Pass/Fail column.
    Observe the following:
    • If the difference between the source and reference wave is within the tolerance range, the output row is colored green and the pass/fail status is displayed as pass. In addition, the error count next to the plot icon is set to 0.
    • If the difference is out of the tolerance range, the pass/fail status is displayed as fail and the error count is greater than 0.

    Move the mouse over the error count to view the failure details in a tooltip.
  10. Right-click in the result cell and choose Plot to view the comparison of waveforms or traces. For more details, refer to Analyzing Waveform Comparison Results.

The following topics provide more details about waveform comparison:

Specifying Global Options for Waveform Comparison

You can use the options in the Global Waveform Compare Settings form to control how all the waveforms are validated. These options apply to all the outputs with wave specs defined for them.

Choose Options – Wave Compare Settings to open the Global Waveform Compare Settings form, as shown below.

If required, you can override the global options for any specific signal. For more information, see Specifying Local Options for Waveform Comparison.

By default, the fields on the Global Waveform Compare Settings form are blank. Fill in the required values.

The following sections describe the values you can set in this form:

Reference History to be Used as Data Source

You can use a saved history from the current or any other cellview as a reference history for waveform comparison.

To choose a reference history from another cellview:

  1. Select Select reference from another cellview on the Global Waveform Compare Settings form.
    The form fields to specify the library, cell, and view of another cellview are enabled. In addition, the Reference History field is also included in the group displayed on top of the form.
  2. Use the Library, Cell, and View drop-down lists to specify the maestro cellview of the reference history.
    All the histories available in the specified cellview are listed in the Reference History drop-down list.
  3. Select a reference history from the Reference History drop-down list.

To choose a reference history from the current cellview:

  1. Select a history from the Reference History drop-down list, which shows the list of histories available in the current maestro cellview.
  2. Select the Lock Reference History check box to prevent the reference history from being overwritten. It is recommended to lock the history if you need to use its results for many waveform comparisons. Otherwise, depending on the History Entries to Save settings on the Save Options form, the selected history might be overwritten when new simulations are run.

Important Points to Note

Analysis and Tolerance Settings for Waveform Comparison

Currently, ADE Assembler supports waveform comparison only for transient analysis. To ensure that comparisons are run for the results of this analysis, select the Tran check box.

The tolerance settings that you can specify on this form are grouped in the following categories:

Analog Settings

These settings are used when comparing waveforms for analog signals.

Field Description

Relative Tolerance

The maximum acceptable % tolerance value for the worst-case relative deviation reported for analog signals.

For more information, refer to Maximum Acceptable Tolerance Values.

Absolute Tolerance

The maximum acceptable tolerance value for the worst-case absolute deviation reported for analog signals.

For more information, refer to Maximum Acceptable Tolerance Values.

Time Tolerance (seconds)

The maximum acceptable tolerance value for the time difference (skew) between transition times reported for analog signals.

For more information, refer to Time Tolerance Values.

Bias Level (y shift)

An integer value by which the waveform is shifted along the y axis.

For more information, refer to Parameters to Shift Data of a Compared Waveform Before Comparison.

Comparison Algorithm

The algorithm used to compare analog waveforms.

For more information, refer to Algorithm for Comparison.

Logic Settings

These settings are used when comparing waveforms for digital signals.

Field Description

Time Tolerance (seconds)

Specifies the maximum acceptable tolerance value for the time difference (skew) between transition times reported for digital signals.

The default unit of time for this field is seconds.

For more information, refer to Specifying the Time Tolerance Values for Digital Signals.

Glitch Filter (seconds)

Specifies the maximum glitch size (in seconds) up to which the glitches in digital signals are acceptable and ignored while comparing waveforms.

For more information, refer to Filtering Glitches in Digital Signals.

Common Settings

These settings are common for both analog and digital signals.

Field Description

Time Exclude (seconds)

The time ranges that should be excluded when validating each signal. Only the time ranges that fall outside the specified time ranges will be validated for each signal.

For more information, refer to Time Ranges to be Excluded from Compared Waveforms.

Time Delay (x shift) (seconds)

An integer value by which the waveform is shifted along the x axis.

For more information, refer to Parameters to Shift Data of a Compared Waveform Before Comparison.

Maximum Acceptable Tolerance Values

You can specify the maximum acceptable tolerance values—the maximum acceptable differences between the reference and compared analog signals. The compared signal is resampled and interpolated at the reference waveform time points and the values at these time points are compared using the specified tolerances. The tolerances define a window around the reference waveform within which the compared waveform values should fall—if they do not, they fail. For more details, refer to Waveform Compare Algorithms.

To specify the maximum acceptable tolerance values, do either or both of the following:

If only one of these tolerance values is defined, Assembler uses only that one and ignores the other. If both relative and absolute tolerance are defined, Assembler compares their values at each reference data point and uses the larger one.

If a reference waveform value is zero or very close to zero, the relative tolerance evaluates to a very small value, and the relative tolerance criterion becomes difficult to meet. In such cases, specify a small absolute tolerance value to ensure that the signals are compared with a reasonable tolerance.

For example, suppose the value of the reference waveform is 13mV at some point, the difference between the reference and compared waves is 2mV, and no time tolerance is specified.

Scenario A: Relative Tolerance is 1% (13mV * 0.01 = 0.13mV) and Absolute Tolerance is not specified.

Relative tolerance: fail. Overall result: fail.

In this case, the relative tolerance criterion is difficult to meet because the reference signal is close to zero.

Scenario B: Relative Tolerance is 1% (13mV * 0.01 = 0.13mV) and Absolute Tolerance is 10mV.

Relative tolerance: fail. Absolute tolerance: pass. Overall result: pass.

In this case, the absolute tolerance overcomes the limitation of the small relative tolerance.

The Pass/Fail column on the Results tab displays a pass status for the waveforms that fall within the specified tolerance value, and a fail status for the waveforms that fall outside the specified tolerance value. This allows you to validate only the waveforms with the fail status.
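The rule above (use the larger of the relative and absolute tolerance at each reference data point) can be sketched as follows. The function name point_tolerance is illustrative, not part of the tool:

```python
def point_tolerance(ref_value, rel_tol_pct=None, abs_tol=None):
    """Acceptance window half-width at one reference data point.

    If only one tolerance is defined, it is used alone; if both are
    defined, the larger of the two wins at each point."""
    candidates = []
    if rel_tol_pct is not None:
        candidates.append(abs(ref_value) * rel_tol_pct / 100.0)
    if abs_tol is not None:
        candidates.append(abs_tol)
    if not candidates:
        raise ValueError("specify at least one tolerance")
    return max(candidates)

diff = 2e-3  # 2mV difference between reference and compared waves

# Scenario A: 1% relative tolerance only -> window is 0.13mV, 2mV fails
print(diff <= point_tolerance(13e-3, rel_tol_pct=1.0))                 # False
# Scenario B: adding a 10mV absolute tolerance -> 2mV passes
print(diff <= point_tolerance(13e-3, rel_tol_pct=1.0, abs_tol=10e-3))  # True
```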

Time Tolerance Values

You can specify the maximum acceptable time difference (skew) between transition times on the reference and compared waveforms in the Time Tolerance field. For more details, refer to Waveform Compare Algorithms.

Note the following:

Algorithm for Comparison

ADE Assembler can use the following three algorithms for waveform comparison:

By default, ADE Assembler uses the asymmetrical algorithm. You can select any other algorithm from the Comparison Algorithm drop-down list.

For details on how these algorithms compare the waveforms, see Waveform Compare Algorithms.

Time Ranges to be Excluded from Compared Waveforms

You can specify the time ranges that should be excluded when validating each waveform. Only the time ranges that fall outside the specified time ranges will be validated for each waveform.

To specify the time ranges that should be excluded when comparing each waveform, enter them in the Time Exclude field.

Use the following syntax to specify a comma-separated list of time ranges:

from:to[, from:to]

For example, specify 70n:72n,110n:112n to exclude the time ranges from 70n to 72n and from 110n to 112n when validating each waveform.

If the specified exclude time interval falls within a failure region, the failure count is increased by 1.
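The from:to[, from:to] syntax can be parsed in a few lines of Python. This is an illustrative sketch; a helper like this is not part of the tool:

```python
def parse_time_exclude(spec):
    """Parse a comma-separated list of from:to ranges, e.g. '70n:72n,110n:112n'."""
    scale = {"f": 1e-15, "p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3}

    def to_seconds(token):
        token = token.strip()
        if token and token[-1] in scale:
            return float(token[:-1]) * scale[token[-1]]
        return float(token)

    ranges = []
    for part in spec.split(","):
        start, stop = part.split(":")
        ranges.append((to_seconds(start), to_seconds(stop)))
    return ranges

def is_excluded(t, ranges):
    """True if time point t falls inside any excluded range."""
    return any(start <= t <= stop for start, stop in ranges)

ranges = parse_time_exclude("70n:72n,110n:112n")
print(is_excluded(71e-9, ranges))   # True: inside the first excluded range
print(is_excluded(100e-9, ranges))  # False: outside both ranges
```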

Parameters to Shift Data of a Compared Waveform Before Comparison

You can shift the data of compared waveforms along the x or y axis before comparison.

To shift the data along the x axis, specify an integer value in the Time Delay field. This adds a delay of the given number of seconds: a positive value shifts the data to the right and a negative value shifts it to the left.

To shift the data along the y axis, specify an integer value in the Bias Level field. A positive value shifts the data up by the given number of units and a negative value shifts it down.
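Applied to sampled data, these shifts amount to adding constant offsets before comparison. A minimal sketch, not the tool's implementation:

```python
def shift_waveform(times, values, time_delay=0.0, bias_level=0.0):
    """Shift a sampled waveform before comparison.

    time_delay  - seconds added to every time point (positive shifts right)
    bias_level  - units added to every value (positive shifts up)"""
    return ([t + time_delay for t in times],
            [v + bias_level for v in values])

t, v = shift_waveform([0.0, 1e-9, 2e-9], [0.1, 0.2, 0.3],
                      time_delay=1e-9, bias_level=0.05)
print(t)  # each time point moved right by 1ns
print(v)  # each value moved up by 0.05
```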

Specifying the Time Tolerance Values for Digital Signals

You can specify the maximum acceptable time difference (skew) between transition times on the reference and compared digital signals in the Time Tolerance field.

A time tolerance, when specified, checks both the timing and the function of the digital signal. If the digital values are the same and the timing changes in the digital values occur within the given time tolerance in the reference and compared signals, the timing and functional check passes. Otherwise, it fails.
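One way to model this timing-and-function check: each transition in the compared signal must match a transition to the same value in the reference signal within the time tolerance. This is a simplified sketch; the actual algorithm may differ:

```python
def check_digital(ref_edges, cmp_edges, time_tol):
    """Pass if the signals have the same sequence of values and each
    compared transition occurs within time_tol of the matching
    reference transition. Edges are (time, value) pairs."""
    if len(ref_edges) != len(cmp_edges):
        return False  # functional mismatch: different number of transitions
    for (rt, rv), (ct, cv) in zip(ref_edges, cmp_edges):
        if rv != cv:
            return False  # functional mismatch: values differ
        if abs(ct - rt) > time_tol:
            return False  # timing mismatch beyond the skew tolerance
    return True

ref = [(0, 0), (10e-9, 1), (20e-9, 0)]
late = [(0, 0), (10.5e-9, 1), (20.2e-9, 0)]
print(check_digital(ref, late, time_tol=1e-9))  # True: skew within 1ns
```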

Note the following:

Filtering Glitches in Digital Signals

If there are small acceptable glitches in digital signals because of spurious noise or behavior, you can specify a time value (in seconds) in the Glitch Filter field to filter glitches that are equal to or smaller in width than the specified time value. The glitches are ignored when reference and compared digital signals are compared.

The glitch filter, when specified, filters out digital pulses faster than the entered time value before the validation process. Therefore, the glitch filter time should be set carefully to a value significantly smaller than the minimum signal pulse length.
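Conceptually, the glitch filter drops any pulse whose width is at or below the threshold. A minimal sketch, assuming the signal is given as (time, level) transition pairs; this is not the tool's implementation:

```python
def filter_glitches(transitions, max_glitch):
    """Remove pulses whose width <= max_glitch from a digital signal.

    transitions: list of (time, level) pairs, one per level change.
    A pulse narrower than or equal to max_glitch is treated as if the
    level never changed."""
    if not transitions:
        return []
    filtered = [transitions[0]]
    for time, level in transitions[1:]:
        last_time, last_level = filtered[-1]
        if level == last_level:
            continue  # no actual level change
        if len(filtered) >= 2 and time - last_time <= max_glitch:
            filtered.pop()  # the previous edge started a glitch; undo it
        else:
            filtered.append((time, level))
    return filtered

# A 1ns spike at t=10n..11n is dropped with a 2ns glitch filter:
print(filter_glitches([(0, 0), (10e-9, 1), (11e-9, 0)], max_glitch=2e-9))
```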

Applying Waveform Options to Multiple Outputs Together

When you need to apply similar wave compare specifications to all or multiple outputs defined in the Outputs Setup pane, you can use the commands available in the ( ) drop-down list on the Outputs Setup pane toolbar.

Important Points to Note

Specifying Local Options for Waveform Comparison

By default, the options specified in the Global Waveform Compare Settings form are applied to all the waveforms. You can override the global options for specific signal outputs by specifying local options for the wave specifications on those signals.

To specify local waveform comparison options for a specific signal output, click in the Spec column of that output and choose wave from the drop-down list. The Waveform Compare Setup form is displayed for that signal output.

The settings on this form are the same as the global waveform compare options.

Note the following points related to these settings:

Analyzing Waveform Comparison Results

As described earlier, the pass/fail status of signal outputs on the Results tab shows the overall result of waveform comparison for each compared signal. In addition, the error count is shown in the result cell and its tooltip, which also shows the details of the maximum difference between the reference and compared waveform.

For example, in the figure shown below, the tooltip indicates the following points:

Important Points to Note

The tooltip display is different in case of logic signals. For more details, refer to Analyzing Waveform Comparison Results for Digital Signals.

You can do a detailed analysis of waveform comparisons by viewing the graphs plotted in the graph window.

To plot the result of a waveform comparison, right-click in a result cell and choose Plot. The reference and compared waveforms are plotted in the graph window. By default, both the waves are shown in the same color. You can change their colors for a better analysis.

If the reference and compared analog signals have different time ranges, the time ranges for which only one signal is available are ignored during comparison.

A comparison graph with both the reference and compared waveforms is shown only when the Spec Markers check box on the ADE Assembler Plotting/Printing Options form is selected. It is recommended to keep this check box selected while working with waveform specifications.

Click any point on the compared waveform. As shown in the figure below, two black dashed lines indicate the tolerance region of the reference waveform, calculated using the waveform comparison options. The points for these lines are calculated as:

In addition, region bookmarks are also shown to highlight the failing regions of the compared waveform.

If the failure region falls within the specified exclude time interval, the error count is increased by 1.
The tolerance region markers provide only guidance about the region where a result may pass; they are not necessarily 100% accurate. The pass or fail status of a result depends on the comparison algorithm. The tolerance markers provide a good estimate, but in a few corner cases the markers may show different results than the pass or fail status set by the algorithm. For example, it is possible to see a point that falls outside the tolerance region but is marked as pass by the algorithm.

Zoom into a smaller area to view the details either by using the zoom slider on top of the graph or by using the right-click context menu of a region bookmark.

Tolerance region markers and region bookmarks are the properties of spec markers applied to the compared waveform. By default, these properties are visible only when you select the compared waveform trace on the graph. However, you can change their default visibility mode by setting these properties using the Spec Marker Properties for Specification form or by using the commands in the graph legend of the compared waveform. For more details, refer to Visibility Settings for Specification Markers.

The Fit Bookmark command in the context menu zooms in the graph to show the failing region of the compared waveform.

Use the Zoom to next Bookmark and Zoom to previous Bookmark commands for quick transition to the next or previous failing regions of the compared waveform. You can also use Shift + N and Shift + P bindkeys.

Move the mouse over the trace to view the values in the tracking cursor.

Based on your analysis, you can make the required changes in your design to improve the results.

If you make changes in the tolerance values in the waveform comparison options, re-evaluate the results to view the revised waveforms.

Visibility Settings for Specification Markers

This section describes the properties that you can set to customize the visibility of tolerance markers and region bookmarks on the plots for waveform comparison.

For this:

  1. Ensure that the graph legend is visible to the left of the graph. You can do this by setting the Legend Position option on the Graph Properties form to left.
  2. Set the visibility setting of tolerance markers by changing the icon in the Spec column of the legend.
  3. Set the visibility setting of region bookmarks by changing the icon in the Region column of the legend.
    Possible values that you can set in the Spec and Region columns in the graph legend are:
    • On ( ): Always show
    • Off ( ): Never show
    • OnWhenSelected ( ): Shows only when the compared waveform is selected

The following figure shows how these properties are set in the graph legend:

You can also set the spec properties on the Spec Marker Properties for Specification form.

All the spec marker properties that you can set on this form are:

Analyzing Waveform Comparison Results for Digital Signals

For logic or digital signals, the tooltip on the Results tab shows only the failure count.

Click the plot icon to plot the graph in the Virtuoso Visualization and Analysis XL window.

For a clear view of the failing region, use the Fit Bookmark command in the context menu of the compared waveform.

Also see: Visibility Settings for Specification Markers

You can now analyze the failing areas and make modifications in the active setup or the waveform comparison settings to resolve the issues.

Saving Waveform Comparison Results in a Datasheet

The datasheet saved from ADE Assembler includes the details of wave specifications and resulting values. The error count for a particular result is in the Failures: x format, as shown below.

Waveform Compare Algorithms

ADE Assembler uses the following algorithms for waveform comparison:

It is important to understand these algorithms so that you can make an appropriate choice for the waves you need to compare.

Asymmetrical Comparison

This is the default algorithm used for waveform comparison in ADE Assembler. It performs better than the other algorithms and is sufficient for most cases.

For each data point of the reference waveform, an acceptance region is calculated based on the specified absolute and relative tolerances. Note the upper black waveform and the vertical acceptance region for the second data point in the figure shown below. If the compared waveform is within the region, the point is marked as passed; otherwise, it is marked as failed. This procedure is repeated for all the reference data points.

In addition, the algorithm uses the time tolerance, if provided, for time and value comparison. Time tolerance specifies an absolute time difference that is acceptable between the two analog waveforms. An acceptance region is calculated based on the absolute and relative value tolerances and the provided time tolerance. This defines a rectangular region, as shown in the figure below.

If the compared waveform is within the region, the point is marked as passed; otherwise, it is marked as failed.

When the time tolerance value is not given (left blank), the acceptance region is calculated based on the absolute and relative tolerance values alone.
When the asymmetrical algorithm is used, it is recommended that you define the more ideal of the two waveform signals as the reference and define the signal with glitches, overshoots, or undershoots as the compared waveform.
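The per-point check described above can be sketched as follows. This is an illustrative model only, assuming linear interpolation between samples; the function names and data layout are hypothetical and not an ADE Assembler API.

```python
import bisect

def interp(points, t):
    """Linearly interpolate a sorted (time, value) waveform at time t."""
    times = [p[0] for p in points]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (t0, v0), (t1, v1) = points[i - 1], points[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def point_passes(ref_t, ref_v, cmp_pts, abstol, reltol, timetol=0.0):
    """Check one reference point against the compared waveform.

    The acceptance region spans ref_v +/- (abstol + reltol * |ref_v|)
    vertically; with a nonzero time tolerance it widens into a
    rectangle covering ref_t +/- timetol.
    """
    vtol = abstol + reltol * abs(ref_v)
    candidates = [ref_t - timetol, ref_t, ref_t + timetol]
    candidates += [t for t, _ in cmp_pts
                   if ref_t - timetol <= t <= ref_t + timetol]
    return any(abs(interp(cmp_pts, t) - ref_v) <= vtol for t in candidates)

def asymmetric_compare(ref_pts, cmp_pts, abstol, reltol, timetol=0.0):
    """Return the reference points at which the comparison fails."""
    return [(t, v) for t, v in ref_pts
            if not point_passes(t, v, cmp_pts, abstol, reltol, timetol)]
```

Because only the reference points are tested, swapping the two waveforms can change the result, which is why the more ideal signal should be the reference.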

Limitations of the Asymmetrical Algorithm

Some of the other limitations of the asymmetrical algorithm are as follows:

Symmetrical (Conservative)

In this algorithm, the comparison of waveforms is done twice: first, the reference waveform is compared with the compared waveform, and then the compared waveform is compared with the reference waveform. An area is marked as failed if either of the two runs fails at that time.

This comparison process takes more time because the comparison is performed twice.

Symmetrical (Liberal)

As in the conservative approach, the comparison of waveforms is done twice, but an area is marked as failed only if both runs fail at the same time.

This comparison process takes more time because the comparison is performed twice.
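The difference between the two symmetrical modes comes down to how the per-time failures of the two runs are combined. A minimal sketch follows; the boolean lists are hypothetical per-time-point failure flags from the two runs, not a tool API.

```python
def conservative(fails_ab, fails_ba):
    """Mark a point as failed if either run fails there (stricter)."""
    return [a or b for a, b in zip(fails_ab, fails_ba)]

def liberal(fails_ab, fails_ba):
    """Mark a point as failed only if both runs fail there (more forgiving)."""
    return [a and b for a, b in zip(fails_ab, fails_ba)]
```

For the same pair of runs, every point that fails the liberal check also fails the conservative one, so the conservative mode never reports fewer failing regions.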

Defining Operating Region Specifications

Operating region specifications define expressions that ensure that your devices are operating in a desired region (saturation, triode, and so on).

For more information, see the following topics:

Copying a Specification

To copy a specification, do the following:

  1. In the Spec column on the Outputs Setup tab of the Outputs pane, select the specification.
  2. Press Ctrl+C to copy the specification.
  3. Click in the Spec column next to the expression for which you want to paste the specification.
  4. Press Ctrl+V to paste the specification.
To copy a specification from one test to another, both tests must use the same schematic or design.

Setting Up Operating Region Specifications

To set up an operating region specification for a test, do the following:

  1. Ensure that a DC analysis with the option to save DC operating point information is set up for the test, as shown below:
    For more information about setting up a DC analysis, see Adding an Analysis.
  2. On the Outputs Setup tab, click the Add new output toolbar button to view the drop-down list of available tests.
  3. In the drop-down list, select a test and choose Op Region Spec.
    You can also right-click a test name in the Outputs Setup tab and choose Add Op Region Spec to set up an operating region specification for that test.
    The schematic for the test is displayed with the Operating Region and Navigator assistant panes docked.
  4. Specify operating region expressions using the Operating Region assistant pane or the Operating Region Specification form.
    The Operating Region assistant pane enables you to quickly specify expressions while the Operating Region Specification form provides advanced methods for specifying expressions.
    You can specify multiple operating region specifications for a single test.
    For more information, see the following topics:
  5. Click the adexl tab when you are done.
    The operating region specification for the test is displayed in the Outputs Setup tab as shown below:

Specifying Operating Region Expressions Using the Operating Region Assistant Pane

To specify operating region expressions using the Operating Region assistant pane, do the following:

  1. On the schematic or in the Navigator assistant pane, select the instance for which you want to specify operating region expressions.
    You can also select more than one instance of the same type to simultaneously specify operating region expressions for all of them.
    • To select more than one instance at a time on the schematic, do one of the following:
      • Hold down the Shift key and click instances.
      • Click and drag the mouse over the instances you want to select.
        All the instances that are within the yellow bounding box that appears are included in the selection.
    • To select more than one instance at a time in the Navigator assistant pane, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) and click the next instance to add it to the selection set.

    For example, in the following figure, four instances of the nmos2v device are selected in the Navigator assistant pane.
  2. Specify operating region expressions for the selected instances by doing the following in the Operating Region assistant pane:
    1. (Optional) If the selected instance has subcircuit instances, select the subcircuit instance for which the operating region expression is to be applied from the SubInst drop-down list.
    2. In the Expr column on the Operating Region assistant, click where it says <Click to add> and enter an operating region expression. For example:
      vgs - vth 

      To view the list of operating point parameters that you can use in expressions, click the Show Parameters button on the Operating Region assistant.
      The operating point parameters for the selected instances are displayed in the Op Point Parameters form.
      Identify the parameters from this list and create the expression in the Expr column of the Operating Region assistant.
    3. Click in the Spec column and select the specification for the expression.
    4. Click in the Value column and specify the target value for the specification.
      For example, in the following figure, two expressions have been specified for the selected instances.
  3. By default, the specified expressions are enabled. Ensure that the expressions you want to add for the selected instances are enabled, and the expressions you do not want to add for the selected instances are disabled.
    • To enable an expression, select the check box next to it in the Enable column.
    • To disable an expression, deselect the check box next to it in the Enable column.
  4. Click Add.
    The enabled expressions are added for the selected instances and displayed in the table at the bottom of the Operating Region assistant pane as shown below:
The Operating Region assistant pane displays only the operating region expressions specified for the instances that are currently selected on the schematic or in the Navigator assistant pane. However, if no instance is selected, expressions for all the instances are displayed in the Operating Region assistant. Alternatively, you can click the Show All button to view all the operating region expressions specified for the test in the Operating Region Specification form. For more information about using the Operating Region Specification form, see Modifying Operating Region Specifications.

See also:

Specifying Operating Region Expressions

You can use the advanced view of the Operating Region Specification form to specify operating region expressions. To do this:

  1. Click Show All on the Operating Region assistant pane.
    The Operating Region Specification form appears.

  2. Select Evaluate over DC Sweep to evaluate the specification over the DC sweep specified through the Choosing Analyses form for DC analysis.
    Selecting this check box adds the option enable_dcsweep_op_info=yes to the netlist.
    This check box is available from Spectre 20.1 ISR15 and Spectre 21.1 ISR6.
  3. Select the Advanced check box to display the advanced view of the form.
  4. Select the instances for which you want to add operating region expressions by doing one of the following:
    Select To

    Master

    Select instances based on the library in which they exist.

    Then, do the following to select instances:

    1. Use the Library and Cell drop-down lists to select the library and cell in which cellviews for instances in the design for the test exist.
      All the instances of the cellviews in the design for the test are displayed in the Instances list box.
    2. In the Instances list box, select the instances for which you want to add operating region expressions.
    To select more than one instance at a time, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) and click the next instance to add it to the selection set.
    3. (Optional) If the selected instance has subcircuit instances, select the subcircuit instance to which the operating region expression is to be applied from the Subcircuit drop-down list.

    Model

    Select instances based on the models specified for the instances.

    Then, do the following to select instances:

    1. From the Model drop-down list, select a model.
      All the instances in the design for the test for which the model is specified are displayed in the Instances list box.
    2. In the Instances list box, do one of the following:
      • Select All to add operating region expressions to all instances for which the model is specified.
      • Select the specific instances for which you want to add operating region expressions.
        To select more than one instance at a time, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) and click the next instance to add it to the selection set.
    The list of operating point parameters that you can use in expressions for the selected instances is displayed in the Parameters field.

  5. Specify operating region expressions for the selected instances by doing the following:
    1. In the Expression column, click where it says <Click to add> and enter an operating region expression.
    2. Click in the Spec column and select the specification for the expression.
    3. Click in the Value column and specify the target value for the specification.
      For example, in the following figure, two expressions have been specified for the selected instances.

  6. By default, the specified expressions are enabled. Ensure that the expressions you want to add for the selected instances are enabled, and the expressions you do not want to add for the selected instances are disabled.
    • To enable an expression, select the check box next to it in the Enable column.
    • To disable an expression, deselect the check box next to it in the Enable column.
  7. Click Add.
    The enabled expressions are added for the selected instances and displayed in the table at the bottom of the Operating Region Specification form as shown below.

See also:

Modifying Operating Region Specifications

To modify an operating region specification, do the following:

  1. In the Outputs Setup tab, double-click the Details field in the row for the operating region specification. An ellipsis button appears in this field. Click the ellipsis button.

The Operating Region Specification form appears displaying the operating region expressions specified for the test.

You can use this form to perform the following tasks:

Task Description

View operating region expressions

Modify operating region expressions

To modify an expression, double-click in the Expr/Param, Spec or Value columns and make the required changes.

If you need to use a common expression for multiple parameters, you can modify expressions for their rows together. For more details, see Modifying Multiple Expressions Together.

Copy and paste operating region expressions

For more information, see Copying and Pasting Operating Region Expressions.

Delete operating region expressions

For more information, see Deleting Operating Region Expressions.

Enable or disable operating region expressions

For more information, see Disabling and Enabling Operating Region Expressions.

Click the Launch Schematic Assistant button to open the schematic for the test with the Operating Region and Navigator assistant panes docked.

You can use the Operating Region assistant pane to quickly add or modify operating region expressions for instances you select on the schematic or in the Navigator assistant pane.

The Operating Region assistant pane displays only the operating region expressions for the instances that are currently selected on the schematic or in the Navigator assistant pane.

For more information, see Specifying Operating Region Expressions Using the Operating Region Assistant Pane.

Select the Advanced check box to access a detailed user interface for adding or modifying operating region expressions.

For more information, see Specifying Operating Region Expressions.

  2. Modify the operating region expressions as required.
  3. Click OK.

Modifying Multiple Expressions Together

To modify multiple operating region expressions together, do the following:

  1. Hold down the Ctrl key and select more than one expression.
  2. Right-click and choose Edit Selected Rows.

The Editing Op Region Form Rows form appears.

  3. Enter an expression in the Expr/Param field.
  4. Enter a spec value in the Value field.
  5. Click OK.

The expression that you specified in this form is updated in the Expr/Param column of all the selected rows in the Operating Region Specifications form.

Copying and Pasting Operating Region Expressions

To copy an operating region expression, do the following:

To paste an operating region expression, do the following:

Disabling and Enabling Operating Region Expressions

The Enable column in the Operating Region Specification form indicates whether an operating region expression is enabled or disabled. By default, the operating region expressions you add are automatically enabled.

To disable an operating region expression, do the following:

To enable an operating region expression, do the following:

Deleting Operating Region Expressions

To delete an operating region expression, do the following:

Saving Operating Region Expressions to a File

You can save the operating region expressions specified in the Operating Region assistant pane or the detailed view of the Operating Region Specification form to a file. You can then load the file to quickly add operating region expressions. For more information about loading operating region expressions, see Loading Operating Region Expressions from a File.

To save operating region expressions to a file, do the following:

  1. On the Operating Region assistant pane or the detailed view of the Operating Region Specification form, in the Label field, type a name for the specified set of operating region expressions.
    For example, in the following figure, the two expressions specified in the Operating Region assistant pane will be saved with the name myOpRegionExpr.
    Figure 22-1 Saving Expressions in Operating Region Assistant Pane
    Figure 22-2 Saving Expressions in Operating Region Specification Form
  2. Ensure that the expressions you want to save are enabled, and the expressions you don’t want to save are disabled.
    • To enable an expression, select the check box next to it in the Enable column.
    • To disable an expression, deselect the check box next to it in the Enable column.
  3. Click the Save button.
    The Select Operating Region Expression File form appears.
  4. Specify the name and location of the file and click Save.
    The enabled expressions are saved in the file.
    If you specify the name of an existing file, the tool prompts you to either replace the existing file or specify another file name. The tool does not append the expressions to the existing file.
    You can concatenate the expressions specified in multiple files and save them in a common file. In this case, when the file is loaded, all expressions saved with a common label name are displayed together.

Loading Operating Region Expressions from a File

You can load operating region expressions from a file into the Operating Region assistant pane and the detailed view of the Operating Region Specification form.

To load operating region expressions from a file, do the following:

  1. Click the Load button in the Operating Region assistant pane or the detailed view of the Operating Region Specification form.
    The Select Operating Region Expression File form appears.
  2. Select the file and click Open.
    The expressions in the file are displayed. The name specified for the set of expressions in the file is also displayed in the Label drop-down list.
    The names specified for the sets of expressions in the files loaded during the current ADE Assembler session are displayed in the Label drop-down list. Therefore, during the current ADE Assembler session, you can select a name from the Label drop-down list to add expressions, instead of loading the corresponding file again.
    Expressions saved in a tab- or comma-separated file can also be loaded in the Operating Region assistant pane or the Operating Region Specification form. In this case, if the expressions are not associated with a label name, the Label list displays <none>.

Deleting Operating Region Specifications

To delete an existing operating region specification, do the following:

Disabling and Enabling Operating Region Specifications

To disable an operating region specification, do the following:

To enable an operating region specification, do the following:

Viewing Specification Results in the Results Tab

When you run a simulation, the simulation results are displayed in the Results tab of the Outputs pane. For more information about running simulations, see Chapter 17, “Running Simulations.”

Figure 22-3 Display of Simulation Results in the Results Tab

After running a simulation, if you add or modify measurement expressions or specifications in the Outputs Setup tab, you can click the button on the Results tab to update the results displayed in the Results tab based on the new or modified expressions and specifications. For more information about using the button, see Re-Evaluation of Expressions and Specifications.

The results will also be updated based on the new or modified expressions and specifications if you click the button on the Results tab to plot across all points. For more information about using the button, see Refreshing Graphs.

The Results tab displays a plot icon for the signals that you have selected for plotting or saving on the Outputs Setup tab of the Outputs pane. For more information about selecting signals for plotting or saving, see Specifying Whether a Result Will Be Saved or Plotted.

For measurements defined in the Outputs Setup tab, the program measures the result against the performance specification and displays the following information for each simulation in the Results tab.

Viewing Operating Region Violations

When you run a simulation, ADE Assembler combines the individual device expressions in the operating region specification into a single output and calculates the total error. The total error is the sum of all the individual violation margins plus an offset of 1.

An operating region specification is met if the results for all its operating region expressions fall within their corresponding target values. The Results tab displays a value of 0 and a pass status if an operating region specification is met. For example, in the following figure, the operating region specification for the ACGainBW test has a pass status because its value is 0.

A violation occurs when the result for any operating region expression in the specification falls outside its specified target value. In this case, the Results tab displays a value greater than 0 that is calculated as shown below:

failed operating region result = 1 + sum of fail margin for failed expressions

The operating region result is never between 0 and 1.

The specification gets a near or fail status depending on the range of deviation from the target spec value. For example, in the following figure, the operating region specification for the AC test, Op_Region, has a near status because its value is slightly greater than 1, whereas the spec requires the value to be less than 1.
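The combination of per-expression margins into the single output can be illustrated with a short sketch. The margin values here are made up for illustration; this is not tool code.

```python
def op_region_result(fail_margins):
    """Combine per-expression fail margins into the single output.

    Returns 0 when every expression meets its spec; otherwise
    1 + the sum of the failing margins, so a failing result is
    never strictly between 0 and 1.
    """
    failing = [m for m in fail_margins if m > 0]
    if not failing:
        return 0.0
    return 1.0 + sum(failing)
```

For example, margins of [0.0, 0.0] give 0 (a pass status), while a single small violation such as [0.02, 0.0] gives 1.02, slightly greater than 1, which would be reported as a near status.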

To view operating region violations, do the following:

Two columns are shown for each corner: the column with the corner name shows the result of expression evaluation for that corner, and the Margin column next to it shows the margin by which the result deviates from the spec value. Depending on the difference between the spec and margin values, cells in the Margin column are colored green, red, or yellow to highlight the pass, fail, or near status, respectively.

You can right-click the entries in this table and select Expand Sweep to view the DC sweep result values in a separate table.

This table displays the test name, corner name, point value, and the DC sweep variable information.

This Expand Sweep option is available only when the Evaluate over DC Sweep check box is selected in the Setup tab.

Related Topics

Re-evaluating Operating Region Expressions

Highlighting Operating Region Violations on the Schematic

Unhighlighting Operating Region Violations on the Schematic

Hiding and Showing a Single Column

Hiding and Showing Multiple Columns

Hiding and Showing Only Expression Columns or Margin Columns

Re-evaluating Operating Region Expressions

Make sure that you load the setup to active before re-evaluating outputs on the Outputs Setup tab. You cannot re-evaluate operating region expressions when you are only viewing results.

If required, you can edit the expressions that do not meet the specifications and re-evaluate results. For this, do the following in the Operating Region Specification form:

  1. Open the Setup tab and add, edit, or delete expressions, as required.
  2. Open the Results tab and click Re-evaluate.

The results for all the expressions are re-evaluated for all the corners and updated on the Results tab. If you get the desired results, click OK to save the changes in the operating region specification. Then, to see the effect of the changed operating region specification on the operating region output, click the re-evaluate button on the Results tab of the ADE Assembler Outputs pane. All the expressions and operating region outputs are re-evaluated, and the operating region output now reflects the new operating region specifications.

Highlighting Operating Region Violations on the Schematic

You can highlight the instance on the schematic for which an operating region expression has failed. This helps you to quickly correct the design or the operating region expression.

To highlight the instance for which an operating region expression has failed, do the following:

  1. Right-click the row for which an expression has failed.
  2. Choose Highlight.
    The instance for which the expression failed is highlighted in yellow on the schematic. For example, in the following figure, the instance M10 for which an expression failed is highlighted on the schematic.
    To highlight the instances corresponding to multiple failed expressions, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) and click the row for the next failed expression to add more expressions to the selection set, then click the Highlight button.
In a hierarchical design, a block is highlighted in yellow if an operating region expression has failed for an instance in the block. When you descend into the block, the instance for which the expression has failed is also highlighted in yellow.

Unhighlighting Operating Region Violations on the Schematic

To clear the highlight from all the instances corresponding to the operating region violations on the schematic, do the following:

  1. Right-click the row for which an expression has failed.
  2. Choose Clear.

Hiding and Showing a Single Column

You can hide and show the required columns in the Setup or Results tab of the Operating Region Specification form.

To hide a column that is currently displayed, do the following:

  1. Right-click a column name to display a context-sensitive menu.

A tick mark next to a column name in the context-sensitive menu indicates that the column is currently displayed.

  2. Choose the name of the column you want to hide.

To show a column that is currently not displayed, repeat these steps and choose the name of a column that does not have a tick mark.

Hiding and Showing Multiple Columns

You can hide and show the required columns in the Setup or Results tab of the Operating Region Specification form.

To hide multiple columns that are currently displayed, do the following:

1. Hold down the Ctrl key and click in any cell in the columns that you want to hide.

2. Right-click in any of the selected cells and choose Hide Columns.

All the selected columns are hidden.

To display all the columns that are currently hidden, right-click in any cell and choose Show Hidden Columns.

All the hidden columns become visible on the Results tab.

Hiding and Showing Only Expression Columns or Margin Columns

If multiple columns are displayed on the Results tab of the Operating Region Specification form, you can show only expression columns or margin columns. This helps in showing only one type of results at a time.

To hide all expression columns and to show only the margin columns:

To unhide the hidden margin columns, right-click in any cell and choose Show Hidden Columns.

Working with the Specification Summary

The Spec Summary form provides a mechanism for managing and organizing simulation results for different purposes:

For more information, see the following topics:

Viewing the Spec Summary

To view the specification (spec) summary, do the following on the Results tab of the Outputs pane:

You can view the spec summary only for the results of a Single Run, Sweeps and Corners or Monte Carlo Sampling simulation run. You cannot view the spec summary for the results of a Global Optimization, Local Optimization, Size Over Corners or Sensitivity Analysis simulation run.

The spec summary for the results in the currently active Results tab is displayed in the Spec Summary form.

The toolbar in the Spec Summary form is described in the following table:

Table 22-1 Spec Summary Toolbar

Icon Name Description

Back to Summary View

Switches from the detail view to the summary view.

For more information, see Viewing the Detailed Results for Specifications.

Show Detail View

Displays the detailed results for the selected specifications in the detail view.

For more information, see Viewing the Detailed Results for Specifications.

Delete

Deletes the selected specifications.

For more information, see Deleting a Specification from the Spec Summary.

Save Spec Summary

Saves the spec summary.

For more information, see Saving a Spec Summary.

Transpose Table

Displays the results in the detail view in horizontal or vertical format.

For more information, see Viewing the Detailed Results for Specifications.

Update with Latest Results Data

Updates the spec summary with the latest results from the history item selected for each specification.

For more information, see Updating the Spec Summary with the Latest Results.

Recreate for the Current History Results

Recreates the spec summary using the results in the active Results tab.

For more information, see Recreating the Spec Summary from the Results in the Active Results Tab.

Plot Signals

Plots the results for the selected row.

For more information, see Plotting the Results for Specifications.

Export to CSV or HTML File

Exports a spec summary to an HTML or comma-separated values (CSV) file.

For more information, see Exporting a Spec Summary to an HTML or CSV File.

The columns in the Spec Summary form are described in the following table:

Table 22-2 Spec Summary Columns

Column Description

Output

Displays the names of the outputs for which the specifications are evaluated.

History

Displays the names of the history items from which the results of outputs are displayed.

Test

Displays the names of the tests for which the results of outputs are displayed.

Conditions

Displays the values of each swept parameter as a list or range of values.

For example, if a parameter p is swept through a list of values x, y, and z, then the Conditions column displays the values as
p=x,y,z

If a parameter p is swept through a range of values, the Conditions column displays the values as
p=startValue:increment:stopValue

Min

Displays the minimum measured value for each output.

Max

Displays the maximum measured value for each output.

Std Dev

Displays the standard deviation of the measured values for each output.

You can hover the mouse pointer over a standard deviation value to view the following information in a pop-up:

  • The number of simulations for the output expression.
  • The mean of the measured values for all simulations for the output expression.
  • The variance in the measured values.

Note: For Single Run, Sweeps and Corners, this is the population standard deviation. For other run modes, this is the sample standard deviation.

Spec

Displays the specifications for outputs.

Pass/Fail

Displays a pass, near or fail status for each specification.

  • pass means that all the measured values are within the limits defined by the specification.
  • near means that one or more measured values are no more than 10% outside the target value of the specification.
  • fail means that one or more measured values are greater than 10% outside the target value of the specification.

To investigate the measured values further, see Viewing the Detailed Results for Specifications.
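As a hedged illustration of the 10% band described above, the classification for a minimum-value specification (for example, spec >= target) can be sketched as follows; the function and its arguments are hypothetical, not part of the tool.

```python
def classify(measured_values, target):
    """Classify results against a minimum-value spec with a 10% near band."""
    worst = min(measured_values)
    if worst >= target:
        return "pass"  # every measured value meets the spec
    # relative deviation of the worst value outside the target
    deviation = (target - worst) / abs(target)
    return "near" if deviation <= 0.10 else "fail"
```

For a target of 1.0, measured values of [1.1, 1.05] give pass, [0.95] gives near (5% outside), and [0.8] gives fail (20% outside).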

Saving a Spec Summary

To save a spec summary in the Spec Summary form, do the following:

  1. In the Name field, type a name for the spec summary.
  2. Click the button.

The spec summary is saved to the documents directory of the maestro view and displayed in the Documents tree on the Data View pane. For more information about working with documents, see Chapter 23, “Working with Documents and Datasheets.”

The spec summary file is saved with the .specsummary extension.

Opening a Spec Summary

To open an existing spec summary, do one of the following:

The spec summary is displayed in the Spec Summary form.

Deleting a Spec Summary

To delete an existing spec summary, do the following:

Adding a Specification to the Spec Summary

You can add a specification from the results in the active Results tab or from the results of a history item.

You can add specifications only from the results of a Single Run, Sweeps and Corners or Monte Carlo simulation run. You cannot add specifications from the results of a Global Optimization, Local Optimization, Size Over Corners or Sensitivity Analysis simulation run.

To add a specification from the results in the active Results tab for a Single Run, Sweeps and Corners or Monte Carlo simulation run, do the following:

To add a specification from the results of a history item for a Single Run, Sweeps and Corners or Monte Carlo simulation run, do the following:

  1. Display the results for the history item. For more information, see Working with History Checkpoints.
  2. On the Results tab for the history item, right-click a specification and choose Add to Spec Summary.
    To add multiple specifications, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection), right-click and choose Add to Spec Summary.

Deleting a Specification from the Spec Summary

To delete a specification from the spec summary, do the following:

Changing the History Item from which Results are Displayed

The History column in the Spec Summary form displays the names of the history items from which the results of the specifications are displayed. You can change the history item for a specification to display the results from that history item.

You can change only to a history item for a Single Run, Sweeps and Corners or Monte Carlo simulation run. You cannot change to a history item for a Global Optimization, Local Optimization, Size Over Corners or Sensitivity Analysis simulation run.

To change the history item for a single specification, do the following:

  1. In the History column, click the name of the history item for the specification.
    A button with the name of the history item appears.
  2. Click the button and select a different history item.
    The results for the specification from the selected history item are displayed in the spec summary.

To change the history item for multiple specifications, do the following:

  1. Hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) and click the name of a history item for any specification in the selection.
    A button with the name of the history item appears.
  2. Click the button and select a different history item.
    The results for the specifications from the selected history item are displayed in the spec summary.

Updating the Spec Summary with the Latest Results

To update the spec summary with the latest results from the history item selected for each specification, do the following:

If the history item selected for a specification no longer exists, the row for that specification is highlighted in red. You can select a different history item for the specification (see Changing the History Item from which Results are Displayed) or delete the specification from the spec summary (see Deleting a Specification from the Spec Summary).

Recreating the Spec Summary from the Results in the Active Results Tab

To clear the current spec summary view and display all the specifications from the active Results tab, do the following:

You can recreate the spec summary only from the results of a Single Run, Sweeps and Corners or Monte Carlo simulation run. You cannot recreate the spec summary from the results of a Global Optimization, Local Optimization, Size Over Corners or Sensitivity Analysis simulation run.

Viewing the Detailed Results for Specifications

To view the detailed results for a specification, do the following:

To go back to the summary view, do the following:

Plotting the Results for Specifications

To plot the results for the specifications in the summary view, do the following:

To plot the results in the detail view, do the following:

  1. Click the button to display the results in the detail view in the vertical format.
    See Detailed Results in Vertical Format figure for an example of the detail in the vertical format.
  2. Select a row in the table on the left-hand side and click the button.
    To plot the results for multiple rows, hold down the Shift key (for contiguous selection) or the Ctrl key (for noncontiguous selection) to select the rows, and then click the button.

Exporting a Spec Summary to an HTML or CSV File

To export a spec summary to an HTML or comma-separated values (CSV) file, do the following:

  1. Click the button.
    The Export Results form appears.
  2. In the File name field, type a file name with the .html, .htm or .csv extension.
  3. Click Save.
    The program exports the results to the file you specified. By default, the file is saved in the documents directory of the maestro view and displayed in the Documents tree on the Data View pane. For more information about working with documents, see Chapter 23, “Working with Documents and Datasheets.”
If you click the button in the detail view, the exported HTML or CSV file contains the contents of both the detail view and the summary view.
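An exported CSV file can be post-processed with standard tools. The sketch below filters failing specifications with Python's csv module; the column names (Test, Output, Spec, Min, Max, Pass/Fail) and the sample rows are assumptions for illustration, since the actual exported layout depends on the tool version and the columns shown in the form.

```python
import csv
import io

# Hypothetical excerpt of an exported spec summary file.
exported = """Test,Output,Spec,Min,Max,Pass/Fail
tb_amp:tran,gain,"> 60",58.2,61.4,fail
tb_amp:tran,bandwidth,"> 1M",1.2e6,1.5e6,pass
"""

with io.StringIO(exported) as f:          # replace with open("file.csv")
    rows = list(csv.DictReader(f))

# Collect the outputs whose specification failed.
failures = [r["Output"] for r in rows if r["Pass/Fail"] == "fail"]
print(failures)   # → ['gain']
```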

Sorting Data in the Spec Summary Form

To sort data in the Spec Summary form, do the following:

Hiding and Showing Columns in the Spec Summary Form

To hide a column, do the following:

To display a hidden column, do the following:
