CSE 497A Spring 2002 Functional Verification Lecture 2/3
Vijaykrishnan Narayanan

Transcript and Presenter's Notes
1
CSE 497A
Spring 2002
Functional Verification
Lecture 2/3
Vijaykrishnan Narayanan
2
Course Administration
  • Instructor: Vijay Narayanan (vijay@cse.psu.edu)
  • 229 Pond Lab
  • Office Hours: T 10:00-11:00, W 1:00-2:15
  • Tool Support: Jooheung Lee (joolee@cse.psu.edu)
  • TA: TBA
  • Laboratory: 101 Pond Lab
  • Materials: www.cse.psu.edu/vijay/verify/
  • Texts:
  • J. Bergeron. Writing Testbenches: Functional
    Verification of HDL Models. Kluwer Academic
    Publishers.
  • Class notes - on the web

3
Grading
  • Grade breakdown:
  • Midterm Exam: 20%
  • Final Exam: 25%
  • Verification Projects (4): 40%
  • Homework (3): 15%
  • No late homework/project reports will be
    accepted
  • Grades will be posted on course home page
  • Written/email request for changes to grades
  • April 25 deadline to correct scores

4
  • Secret of Verification
  • (Verification Mindset)

5
  • The Art of Verification
  • Two simple questions
  • Am I driving all possible input scenarios?
  • How will I know when it fails?

6
Three Simulation Commandments
  • Thou shalt not move onto a higher platform until
    the bug rate has dropped off
  • Thou shalt stress thine logic harder than it will
    ever be stressed again
  • Thou shalt place checking upon all things
7
  • General Simulation Environment

[Diagram: testbench sources (C/C++, HDL testbenches, Specman e, Synopsys' VERA) and the VHDL/Verilog design pass through a compiler (event simulation compiler, cycle simulation compiler, emulator compiler; not always required) into an event simulator, cycle simulator, or emulator, which takes initialization and run-time requirements and produces testcase results.]
8
[Diagram: roles in the simulation flow: logic designer, environment developer, verification engineer (transfers testcases), model builder, project manager.]
9
  • Some lingo
  • Facilities: a general term for named wires (or
    signals) and latches. Facilities feed gates
    (AND/OR/NAND/NOR/invert, etc.), which feed other
    facilities.
  • EDA: Electronic Design Automation; the tool vendors.

10
  • More lingo
  • Behavioral: code written to perform the function
    of logic on the interface of the
    design-under-test
  • Macro: 1. A behavioral model. 2. A piece of logic.
  • Driver: code written to manipulate the inputs of
    the design-under-test. The driver understands
    the interface protocols.
  • Checker: code written to verify the outputs of
    the design-under-test. A checker may have some
    knowledge of what the driver has done. A checker
    must also verify interface protocol compliance.

11
  • Still more lingo
  • Snoop/Monitor: code that watches interfaces or
    internal signals to help the checkers perform
    correctly. Also used to help drivers be more
    devious.
  • Architecture: design criteria as seen by the
    customer. The design's architecture is specified
    in documents (e.g. POPS, Book 4, Infiniband,
    etc.), and the design must be compliant with this
    specification.
  • Microarchitecture: the design's implementation.
    Microarchitecture refers to the constructs that
    are used in the design, such as pipelines,
    caches, etc.
  • Escape: an error that appears on the test floor,
    having escaped verification

12
  • Typical Verification diagram

[Diagram: stimulus drives the device and coverage data is collected; items labeled include types, FSMs, latency, conditions, addresses, transactions, sequences, transitions.]
13
  • Verification Cycle

[Diagram: the verification cycle: create testplan, develop environment, debug hardware, regression, fabrication, hardware debug, escape analysis.]
14
  • Verification Testplan
  • Team leaders work with design leaders to create a
    verification testplan. The testplan includes
  • Schedule
  • Specific tests and methods by simulation level
  • Required tools
  • Input criteria
  • Completion criteria
  • What is expected to be found with each test/level
  • What's not covered by each test/level

15
Verification is a process used to demonstrate the
functional correctness of a design. Also called
logic verification or simulation.
16
Reconvergence Model
  • Conceptual representation of the verification
    process
  • Most important question
  • What are you verifying?

[Diagram: a transformation from an origin to a result, with verification reconverging the result on the origin.]
17
What is a testbench?
  • A testbench usually refers to the code used to
    create a pre-determined input sequence to a
    design, then optionally observe the response.
  • Generic term used differently across industry
  • Always refers to a testcase
  • Most commonly (and appropriately), a testbench
    refers to code written (VHDL, Verilog, etc) at
    the top level of the hierarchy. The testbench is
    often simple, but may have some elements of
    randomness
  • Completely closed system
  • No inputs or outputs
  • Effectively a model of the universe as far as the
    design is concerned
  • Verification challenge:
  • What input patterns to supply to the Design Under
    Verification, and what output is expected from a
    properly working design

18
  • Show Multiplexer Testbench
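A minimal sketch of such a testbench (the mux2to1 entity name and ports are assumed for illustration): the testbench is itself a completely closed system that drives every input combination and predicts the output each time.

library ieee;
use ieee.std_logic_1164.all;

entity mux_tb is           -- closed system: no inputs or outputs
end mux_tb;

architecture test of mux_tb is
  component mux2to1        -- design under verification (name assumed)
    port (a, b, sel : in std_logic; y : out std_logic);
  end component;
  signal a, b, sel, y : std_logic;
  constant bits : std_logic_vector(1 to 2) := "01";
begin
  duv : mux2to1 port map (a => a, b => b, sel => sel, y => y);

  stimulus_and_check : process
  begin
    -- drive all 8 input combinations (all possible input scenarios)
    for i in bits'range loop
      for j in bits'range loop
        for k in bits'range loop
          a <= bits(i); b <= bits(j); sel <= bits(k);
          wait for 10 ns;
          -- predict the output from the inputs and check it
          if sel = '0' then
            assert y = a report "mux output wrong for sel=0" severity error;
          else
            assert y = b report "mux output wrong for sel=1" severity error;
          end if;
        end loop;
      end loop;
    end loop;
    wait;                  -- all scenarios driven; stop
  end process;
end test;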

19
Importance of Verification
  • Most books focus on syntax, semantics and RTL
    subset
  • Given the amount of literature on writing
    synthesizable code vs. writing verification
    testbenches, one would think that the former is a
    more daunting task. Experience proves otherwise.
  • 70% of design effort goes to verification
  • Properly staffed design teams have dedicated
    verification engineers.
  • Verification engineers usually outnumber designers
    2-to-1
  • 80% of all written code is in the verification
    environment

20
  • The Line Delete Escape
  • Escape: a problem that is found on the test
    floor and therefore has escaped the verification
    process
  • The Line Delete escape was a problem on the H2
    machine
  • S/390 Bipolar, 1991
  • This escape shows an example of how a verification
    engineer needs to think

21
  • The Line Delete Escape
  • (pg 2)
  • Line Delete is a method of circumventing bad
    cells of a large memory array or cache array
  • An array mapping allows defective cells to be
    removed from the usable space

22
  • The Line Delete Escape
  • (pg 3)

If a line in an array has multiple bad bits (a
single bit usually goes unnoticed due to
ECC, error-correcting codes), the line can be
taken "out of service". In the array pictured,
row 05 has a bad congruence class entry.
23
  • The Line Delete Escape
  • (pg 4)

Data enters ECC creation logic prior to storage
into the array. When read out, the ECC logic
corrects single bit errors and tags Uncorrectable
Errors (UEs), and increments a counter
corresponding to the row and congruence class.
24
  • The Line Delete Escape
  • (pg 5)

When a preset threshold of UEs is detected from an
array cell, the service controller is informed
that a line delete operation is needed.
25
  • The Line Delete Escape
  • (pg 6)

The Service controller can update the
configuration registers, ordering a line delete
to occur. When the configuration registers are
written, the line delete controls are engaged and
writes to row 5, congruence class 'C'
cease. However, because three other cells remain
good in this congruence class, the sole
repercussion of the line delete is a slight
decline in performance.
26
  • The Line Delete Escape
  • (pg 7)

How would we test this logic? What must occur in
the testcase? What checking must we implement?
27
Verification is on critical path
28
Want to minimize Verification Time!
29
Ways to reduce verification time
  • Verification time can be reduced through:
  • Parallelism: add more resources
  • Abstraction: higher level of abstraction (e.g., C
    vs. assembly)
  • Beware, though: this means a reduction in control
  • Automation: tools to automate standard processes
  • Requires standard processes
  • Not all processes can be automated

30
  • Hierarchical Design

Allows the design team to break a system down into
logical and comprehensible components. Also
allows for repeatable components.
31
  • Hierarchical design
  • Only lowest level macros contain latches and
    combinatorial logic (gates)
  • Work gets done at these levels
  • All upper layers contain wiring connections only
  • Off chip connections are C4 pins

32
  • Current Practices for Verifying a System
  • Designer Level sim
  • Verification of a macro (or a few small macros)
  • Unit Level sim
  • Verification of a group of macros
  • Element Level sim
  • Verification of an entire logical function such as
    a processor, storage controller or I/O control
  • Currently synonymous with a chip
  • System Level sim
  • Multiple chip verification
  • Often utilizes a mini operating system

33
Ways to reduce verification time
  • Verification time can be reduced through:
  • Parallelism: add more resources
  • Abstraction: higher level of abstraction (e.g., C
    vs. assembly)
  • Beware, though: this means a reduction in
    control and additional training
  • Vera and e are examples of verification languages
  • Automation: tools to automate standard processes
  • Requires standard processes
  • Not all processes can be automated

34
Human Factor in Verification Process
  • An individual (or group of individuals) must
    interpret the specification and transform it into
    a correct function.

[Diagram: specification -> interpretation -> RTL coding, with verification checking the RTL against the specification.]
35
  • Need for Independent Verification
  • The verification engineer should not be an
    individual who participated in logic design of
    the DUT
  • Blinders: if a designer didn't think of a
    failing scenario when creating the logic, how
    will he/she create a test for that case?
  • However, a designer should do some verification
    on his/her design before exposing it to the
    verification team
  • Independent Verification Engineer needs to
    understand the intended function and the
    interface protocols, but not necessarily the
    implementation

36
  • Verification Do's and Don'ts
  • DO
  • Talk to designers about the function and
    understand the design first, but then
  • Try to think of situations the designer might
    have missed
  • Focus on exotic scenarios and situations
  • e.g., try to fill all queues even though the design
    is built to avoid any buffer-full conditions
  • Focus on multiple events at the same time

37
  • Verification Do's and Don'ts (continued)
  • Try everything that is not explicitly forbidden
  • Spend time thinking about all the pieces that you
    need to verify
  • Talk to "other" designers about the signals that
    interface to your design-under-test
  • Don't
  • Rely on the designer's word for input/output
    specification
  • Allow RIT Criteria to bend for sake of schedule

38
Ways to reduce human-introduced errors
  • Automation
  • Take human intervention out of the process
  • Poka-Yoke
  • Make human intervention fool-proof
  • Redundancy
  • Have two individuals (or groups) check each
    other's work

39
Automation
  • Obvious way to eliminate human-introduced errors:
    take the human out.
  • Good in concept
  • Reality dictates that this is not feasible
  • Processes are not defined well enough
  • Processes require human ingenuity and creativity

40
Poka-Yoke
  • Term coined in Total Quality Management circles
  • Means to mistake-proof the human intervention
  • Typically the last step in complete automation
  • Same pitfalls as automation: verification
    remains an art; it does not yield itself to
    well-defined steps.

41
Redundancy
  • Duplicate every transformation
  • Every transformation made by a human is either:
  • Verified by another individual (simplest), or
  • Performed twice, completely and separately,
    with the outcomes compared to verify that both
    produced the same or equivalent result
  • Most costly, but still cheaper than redesign and
    replacement of a defective product
  • Designer should NOT be in charge of verification!

42
What is being verified?
  • Choosing a common origin and reconvergence points
    determines what is being verified and what type
    of method to use.
  • The following types of verification all have
    different origin and reconvergence points:
  • Formal Verification
  • Model Checking
  • Functional Verification
  • Testbench Generators

43
Formal Verification
  • Once the end points of the formal verification
    reconvergence paths are understood, you know
    exactly what is being verified.
  • Two types of formal verification:
  • Equivalence checking
  • Model checking

44
Equivalence Checking
  • Compares two models to see if they are equivalent
  • Netlists before and after modifications
  • Netlist and RTL code (verify synthesis)
  • RTL and RTL (HDL modifications)
  • Post-synthesis gates to post-physical-design gates
  • Adding of scan latches, clock tree buffers
  • Proves mathematically that the origin and output
    are logically equivalent
  • Compares boolean and sequential logic functions,
    not mapping of the functions to a specific
    technology
  • Why do verification of an automated synthesis
    tool?

45
Equivalence Reconvergence Model
[Diagram: RTL is synthesized into gates; equivalence checking reconverges the gates with the RTL.]
46
Model Checking
  • Form of formal verification
  • Characteristics of a design are formally proven
    or disproved
  • Find unreachable states of a state machine
  • Determine whether deadlock conditions will occur
  • Example: if ALE is asserted, either DTACK or
    ABORT will eventually be asserted
  • Looks for generic problems or violations of user
    defined rules about the behavior of the design
  • Knowing which assertions to prove is the major
    difficulty

47
Steps in Model Checking
  • Model the system implementation using a finite
    state machine
  • The desired behavior is expressed as a set of
    temporal-logic formulas (see the sketch after
    this list)
  • Model checking algorithm scans all possible
    states and execution paths in an attempt to find
    a counter-example to the formulas
  • Check these rules
  • Prove that all states are reachable
  • Prove the absence of deadlocks
  • Unlike simulation-based verification, no test
    cases are required
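As a sketch of what such a formula looks like (the notation here is assumed; the slides do not give one), the ALE example from the previous slide can be written in CTL as

  AG (ALE -> AF (DTACK or ABORT))

read as: on all execution paths, whenever ALE is asserted, DTACK or ABORT is eventually asserted. The model checker searches every reachable state for a counter-example to this claim.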

48
Problems with Model Checking
  • Automatic verification becomes hard with an
    increasing number of states
  • 10^100 states (larger than the number of protons
    in the universe) still does not go far beyond
    300 bits of state variables
  • Absurdly small for the millions of transistors in
    current microprocessors
  • Symbolic model checking explores a larger set of
    states concurrently.
  • IBM RuleBase (Feb 7 guest lecture) is a symbolic
    model checking tool

49
Model Checking Reconvergence Model
[Diagram: the specification is interpreted into both RTL and assertions; model checking reconverges the RTL with the assertions.]
50
Functional Verification
  • Verifies design intent
  • Without it, one must trust that the transformation
    of a specification to RTL was performed correctly
  • Can prove the presence of bugs, but cannot prove
    their absence

51
Functional Reconvergence Model
[Diagram: functional verification reconverges the RTL with the specification.]
52
Testbench Generators
  • Tool to generate stimulus to exercise code or
    expose bugs
  • Designer input is still required
  • RTL code is the origin and there is no
    reconvergence point
  • Verification engineer is left to determine if the
    testbench applies valid stimulus
  • If used with parameters, can control the
    generator in order to focus the testbenches on
    more specific scenarios

53
Testbench Generation Reconvergence Model
[Diagram: testbench generation starts from the RTL and produces testbenches; there is no reconvergence point, only code coverage/proof metrics relating the testbenches back to the RTL.]
54
Functional Verification Approaches
  • Black-Box Approach
  • White-Box Approach
  • Grey-Box Approach

55
Black-Box
  • The black box has inputs, outputs, and performs
    some function.
  • The function may be well documented...or not.
  • To verify a black box, you need to understand
    the function and be able to predict the outputs
    based on the inputs.
  • The black box can be a full system, a chip, a
    unit of a chip, or a single macro.
  • Can start early

56
White-Box
  • White box verification means that the internal
    facilities are visible and utilized by the
    testbench stimulus.
  • Quickly set up interesting cases
  • Tightly integrated with implementation
  • Changes with implementation
  • Examples Unit/Module level verification

57
Grey-Box
  • Grey box verification means that a limited number
    of facilities are utilized in a mostly black-box
    environment.
  • Example Most environments! Prediction of
    correct results on the interface is occasionally
    impossible without viewing an internal signal.

58
Perfect Verification
  • To fully verify a black-box, you must show that
    the logic works correctly for all combinations of
    inputs.
  • This entails
  • Driving all permutations on the input lines
  • Checking for proper results in all cases
  • Full verification is not practical on large
    pieces of designs, but the principles are valid
    across all verification.

59
  • Reality Check
  • Macro verification across an entire system is not
    feasible for the business
  • There may be over 400 macros on a chip, which
    would require about 200 verification engineers!
  • That number of skilled verification engineers
    does not exist
  • The business can't support the development
    expense
  • Verification Leaders must make reasonable
    trade-offs
  • Concentrate on Unit level
  • Designer level on riskiest macros

60
  • Typical Bug rates per level

61
Cost of Verification
  • Necessary Evil
  • Always takes too long and costs too much
  • Verification does not generate revenue
  • Yet indispensable
  • To create revenue, design must be functionally
    correct and provide benefits to customer
  • Proper functional verification demonstrates
    trustworthiness of the design

62
Verification And Design Reuse
  • Won't use what you don't trust.
  • How to trust it?
  • Verify it.
  • For reuse, designs must be verified with more
    strict requirements
  • All claims, possible combinations and uses must
    be verified.
  • Not just how it is used in a specific environment.

63
When is Verification Done?
  • Never truly done on complex designs
  • Verification can only show presence of errors,
    not their absence
  • Given enough time, errors will be uncovered
  • Question: is the error likely to be severe
    enough to warrant the effort spent to find the
    error?

64
When is Verification Done? (Cont)
  • Verification is similar to statistical
    hypothesis testing.
  • Hypothesis: the design is functionally correct.

65
Hypothesis Matrix
                 Errors found               No errors found
Bad Design                                  Type II (False Positive)
Good Design      Type I (False Negative)
66
  • Tape-Out Criteria
  • Checklist of items that must be completed before
    tape-out
  • Verification items, along with Physical/Circuit
    design criteria, etc
  • Verification criteria are based on:
  • Function tested
  • Bug rates
  • Coverage data
  • Clean regression
  • Time to market

67
Verification vs. Test
  • The two are often confused
  • Purpose of test is to verify that the design was
    manufactured properly
  • Verification is to ensure that the design meets
    the functionality intent

68
Verification and Test Reconvergence Model
[Diagram: specification -> HW design -> netlist -> fabrication -> silicon; verification reconverges the netlist with the specification, while test reconverges the silicon with the netlist.]
69
Reminders
  • Feb 7th: IBM guest lectures
  • 2 class lecture slots (4:15-6:30 p.m.)
  • Homework due Thursday (01/24)
  • HW2 on VHDL test benches will be assigned 01/24
  • TA office hours: F 1:00-2:15 p.m., 101 Pond

70
To do
  • Read chapter 2 from the textbook
  • Try the following tools on Unix systems:
  • lint
  • vcs, sccs

71
Verification Tools
  • Automation improves the efficiency and
    reliability of the verification process
  • Some tools, such as a simulator, are essential.
    Others automate tedious tasks and increase
    confidence in the outcome.
  • It is not necessary to use all the tools.

72
Verification Tools
  • Improve efficiency (e.g., a spell checker)
  • Improve reliability
  • Automate portions of the verification process
  • Some tools, such as simulators, are essential
  • Some tools automate the most tedious tasks and
    increase the confidence in the outcome:
  • Code coverage tools
  • Linting tools
  • Help ensure that a Type II mistake does not occur

73
Verification Tools
  • Linting Tools
  • Simulators
  • Third Party Models
  • Waveform Viewers
  • Code Coverage
  • Verification Languages (Non-RTL)
  • Revision Control
  • Issue Tracking
  • Metrics

74
Linting Tools
  • UNIX C utility program
  • Parses a C program
  • Reports questionable uses
  • Identifies common mistakes
  • Makes finding those mistakes quick and easy
  • Lint-identified problems:
  • Mismatched types
  • Mismatched arguments in function calls, in either
    number or type

75
The UNIX C lint program
  • Attempts to detect features in C program files
    that are likely to be bugs, non-portable or
    wasteful
  • Checks type usage more strictly than a compiler
  • Checks for
  • Unreachable statements
  • Loops not entered at top
  • Variables declared but not used
  • Logical expressions whose value is constant
  • Functions that return values in some places but
    not others
  • Functions called with a varying number or type of
    args
  • Functions whose value is not used

76
Advantages of Lint Tools
  • Know about problems prior to execution
    (simulation for VHDL code)
  • Checks are entirely static
  • Do not require stimulus
  • Do not need to know expected output
  • Can be used to enforce coding guidelines and
    naming conventions

77
Pitfalls
  • Can only find problems that can be statically
    deduced
  • Cannot determine if the algorithm is correct
  • Cannot determine if the dataflow is correct
  • Are often too paranoid and err on the side of
    caution: errors are reported on good designs
    (Type I errors), so the output must be filtered
  • Should check and fix problems as you go; don't
    wait until the entire model/code is complete

78
Linting Tools
  • Assist in finding common programming mistakes
  • Only identify a certain class of problems
  • Linting is a static checker
  • Same deficiencies as in a C linter
  • Many false negatives are reported
  • Does assist in enforcing coding guidelines

79
Linting VHDL source code
  • VHDL is strongly typed and does not need linting
    as much as Verilog (which can assign bit vectors
    of different lengths to each other)
  • A common problem area is the use of STD_LOGIC

80
VHDL Example
library ieee;
use ieee.std_logic_1164.all;

entity my_entity is
  port (my_input : in std_logic);
end my_entity;

architecture sample of my_entity is
  signal s1 : std_logic;
  signal sl : std_logic;
begin
  stat1: s1 <= my_input;
  stat2: s1 <= not my_input;
end sample;

Warning: file x.vhd: Signal "s1" is multiply defined
Warning: file x.vhd: Signal "sl" has no drivers
81
Naming Conventions
  • Use a naming convention for signals with multiple
    drivers
  • Multiply-driven signals will give warning
    messages, but with a naming convention these can
    be ignored
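A minimal sketch of such a convention (the suffix is assumed, not from the slides): signals that are intentionally driven from multiple places carry a marker, so warnings on them can be filtered out while warnings on other signals are treated as real bugs.

signal data_bus_mdrv : std_logic;  -- "_mdrv" marks intended multiple drivers;
                                   -- lint warnings on this signal are ignorable
signal done          : std_logic;  -- no suffix: a multiple-driver warning
                                   -- here points at a real bug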

82
Cadence VHDL Lint Tool
83
HAL Checks
  • Some of the classes of errors that the HAL tool
    checks for include:
  • Interface inconsistency: unconnected ports;
    incorrect number or type of task/function
    arguments; incorrect signal assignments to input
    ports
  • Unused or undriven variables: undriven primary
    outputs; unused tasks/functions/parameters;
    event variables that are never triggered
  • 2-state versus 4-state issues: conditional
    expressions that use x/z incorrectly; case
    equality (===) that is treated as equality (==);
    incorrect assignment of x/z values
  • Expression inconsistency: unequal operand
    lengths; real/time values that are used in
    expressions; incorrect rounding/truncation
  • Case statement inconsistency: case expressions
    that contain x or z logic; case expressions that
    are out of range; correct use of parallel_case
    and full_case constructs
  • Range and index errors: single-bit memory words;
    bit/part selects that are out of range; ranged
    ports that are re-declared

84
Code reviews
  • Objective: identify functional and coding style
    errors prior to functional verification and
    simulation
  • Source code is reviewed by one or more reviewers
  • Goal: identify problems with code that an
    automated tool would not identify

85
Simulators
  • Simulators are the most common and familiar
    verification tool
  • Simulation alone is never the goal of an
    industrial project
  • Simulators attempt to create an artificial
    universe that mimics the environment that the
    real design will see
  • Only an approximation of reality
  • Digital values in std_logic have 9 possible values
  • In reality, a signal is a continuous value between
    GND and Vdd

86
Simulators
  • Execute a description of the design
  • Description is limited to a well-defined language
    with precise semantics
  • Simulators are not a static tool: they require the
    user to set up an environment in which the design
    will find itself. This setup, which provides
    inputs and monitors results, is often called a
    testbench
87
Simulators
  • Simulation outputs are validated externally
    against design intent (specification)
  • Two types
  • Event based
  • Cycle based

88
Event Based Simulators
  • Event-based simulators are driven by events
  • An attempt to increase the simulated time per
    unit of wall time
  • Outputs are a function of inputs
  • The outputs change only when the inputs do
  • Moves simulation time ahead to the next time at
    which something occurs
  • The event is the input changing
  • This event causes simulator to re-evaluate and
    calculate new output
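A minimal VHDL sketch (signal names are illustrative): an event simulator re-evaluates this process only when an event occurs on a or b, then schedules a resulting event on y and jumps simulation time straight to the next scheduled event.

and_gate : process (a, b)    -- sensitivity list: input events trigger evaluation
begin
  y <= a and b after 2 ns;   -- output event scheduled; nothing is computed
                             -- until the next input event arrives
end process;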

89
Cycle Based Simulators
  • Simulation is based on clock cycles, not events
  • All combinational functions are collapsed into a
    single operation
  • Cycle based simulators contain no timing or
    delay information
  • Assumes the entire design meets setup and hold
    time for all FFs
  • Timing is usually verified by a static timing
    analyzer
  • Can handle only synchronous circuits
  • The only event is the active edge of the clock
  • All other inputs are aligned with the clock (cannot
    handle asynchronous events)
  • A Moore machine's state changes only when the clock
    changes; a Mealy machine's outputs also depend on
    inputs, which can change asynchronously
  • Much faster than event based
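By contrast, a cycle simulator evaluates a design such as the sketch below once per clock cycle; delay annotations would be meaningless to it (names are illustrative):

sync : process (clk)
begin
  if rising_edge(clk) then
    q <= d;                  -- evaluated once per active clock edge; setup and
                             -- hold are assumed met (checked by static timing)
  end if;
end process;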

90
  • Types of Simulators
  • (cont.)
  • Simulation Farm
  • Multiple computers are used in parallel for
    simulation
  • Acceleration Engines/Emulators
  • Quickturn, IKOS, AXIS.....
  • Custom designed for simulation speed
    (parallelized)
  • Accel vs. Emulation
  • True emulation connects to some real, in-line
    hardware
  • Real software eliminates need for special
    testcase

91
  • Speed compare
  • Influencing Factors
  • Hardware Platform
  • Frequency, Memory, ...
  • Model content
  • Size, Activity, ...
  • Interaction with Environment
  • Model load time
  • Testpattern
  • Network utilization

Relative Speed of different Simulators

  Event Simulator                  1
  Cycle Simulator                  20
  Event driven cycle Simulator     50
  Acceleration                     1000
  Emulation                        100000
92
  • Speed - What is fast?
  • Cycle sim for one processor chip:
  • 1 sec real time = about 6 months of simulation
  • Sim farm with a few hundred computers:
  • 1 sec real time = about 1 day
  • Accelerator/emulator:
  • 1 sec real time = about 1 hour

93
Co-Simulation
  • Co-simulators are combination of event, cycle,
    and other simulators (acceleration, emulation)
  • Both simulators progress along time in lockstep
    fashion
  • Performance is decreased due to inter-tool
    communication.
  • Ambiguities arise during translation from one
    simulator to the other:
  • Verilog's 128 possible states to VHDL's 9
  • Analog's current and voltage into digital's logic
    value and strength.

94
Third Party Models
  • Many designs use off-the-shelf parts
  • To verify such a design, one must obtain models of
    these parts
  • Often must get the model from a 3rd party
  • Most 3rd party models are provided as compiled
    binary models
  • Why buy 3rd party models?
  • Engineering resources
  • Quality (especially in the area of system timing)

95
Hardware Modelers
  • Are for modeling new hardware; some hardware may
    be too new for models to be available
  • Example: in 2000 one still could not get a model
    of the Pentium III
  • Sometimes one cannot simulate enough of a model in
    an acceptable period of time

96
Hardware Modelers (cont)
  • Hardware modeler features
  • Small box that connects to the network and contains
    a real copy of the physical chip
  • The rest of the HDL model provides inputs to the
    chip and returns the chip's outputs to your model

97
Waveform Viewers
  • Lets you view transitions on multiple signals
    over time
  • The most common of verification tools
  • Waveform can be saved in a trace file
  • In verification:
  • You need to know the expected output and whenever
    the simulated output is not as expected
  • Both the signal value and the signal timing matter
  • Use the testbench to compare the model output
    with the expected output (see the sketch below)
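A minimal self-checking sketch (signal names assumed): rather than eyeballing waveforms, the testbench reports the exact time a mismatch occurs, and the waveform viewer is then used to diagnose it.

check : process (actual_y, expected_y)
begin
  assert actual_y = expected_y
    report "output mismatch at " & time'image(now)
    severity error;
end process;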

98
  • Coverage
  • Coverage techniques give feedback on how much the
    testcase or driver is exercising the logic
  • Coverage makes no claim on proper checking
  • All coverage techniques monitor the design during
    simulation and collect information about desired
    facilities or relationships between facilities

99
Coverage Goals
  • Measure the "quality" of a set of tests
  • Supplement test specifications by pointing to
    untested areas
  • Help create regression suites
  • Provide a stopping criterion for unit testing
  • Better understanding of the design

100
  • Coverage Techniques
  • People use coverage for multiple reasons
  • Designer wants to know how much of his/her macro
    is exercised
  • Unit/Chip leader wants to know if relationships
    between state machine/microarchitectural
    components have been exercised
  • Sim team wants to know if areas of past escapes
    are being tested
  • Program manager wants feedback on overall quality
    of verification effort
  • Sim team can use coverage to tune regression
    buckets

101
  • Coverage Techniques
  • Coverage methods include
  • Line-by-line coverage
  • Has each line of VHDL been exercised?
    (If/Then/Else, Cases, states, etc)
  • Microarchitectural cross products
  • Allows for multiple cycle relationships
  • Coverage models can be large or small

102
Code Coverage
  • A technique that has been used in software
    engineering for years.
  • By covering all statements adequately the chances
    of a false positive (a bad design tests good) are
    reduced.
  • Never 100% certain that the design under
    verification is indeed correct. Code coverage
    increases confidence.
  • Some tools may use the file I/O aspect of the
    language; others have special features built into
    the simulator to report coverage statistics.

103
Adding Code Coverage
  • If built into simulator - code is automatically
    instrumented.
  • If not built in - must add code to testbench to
    do the checking
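A minimal sketch of such manual instrumentation (the entity and names are assumed): counters record how often each branch executes, and the testbench can report them at the end of simulation.

library ieee;
use ieee.std_logic_1164.all;

entity covered_dut is
  port (clk, req : in std_logic; grant : out std_logic);
end covered_dut;

architecture instrumented of covered_dut is
  signal cnt_req, cnt_idle : natural := 0;  -- coverage counters, simulation only
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if req = '1' then
        grant <= '1';
        cnt_req <= cnt_req + 1;    -- count executions of this branch
      else
        grant <= '0';
        cnt_idle <= cnt_idle + 1;  -- count executions of the other branch
      end if;
    end if;
  end process;
end instrumented;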

104
Code Coverage
  • Objective is to determine whether you have
    overlooked exercising some code in the model
  • If so, you must also ask why that code
    is present
  • Coverage metrics can be generated after running a
    testbench
  • Metrics measure coverage of
  • statements
  • possible paths through code
  • expressions

105
Report Metrics for Code Coverage
  • Statement (block)
  • Measures which lines (statements) have been
    executed by the verification suite
  • Path
  • Measures all possible ways to execute a sequence
    of instructions
  • Expression Coverage
  • Measures the various ways paths through the code
    are executed

106
Statements and Blocks
  • Statement coverage can also be called block
    coverage
  • The ModelSim simulator can show how many times a
    statement was executed
  • Also need to ensure that executed statements are
    simulated with different values
  • And there may be code that was not meant to be
    simulated (code written specifically for
    synthesis, for example)

107
Path Coverage
  • Measures all possible ways you can execute a
    sequence of statements
  • Example (see the sketch below) has four possible
    paths
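A sketch of the kind of code the slide refers to (the code itself is assumed): two successive if statements give 2 x 2 = 4 possible paths.

paths : process (a, b)
begin
  x <= '0';
  y <= '0';
  if a = '1' then
    x <= '1';                -- first decision
  end if;
  if b = '1' then
    y <= '1';                -- second decision
  end if;
end process;

The single test (a,b) = ('1','1') executes every statement (100% statement coverage) yet exercises only one of the four paths; ('0','0'), ('0','1') and ('1','0') are also needed for 100% path coverage.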

108
Path Coverage Goal
  • Desire is to take all possible paths through the
    code
  • It is possible to have 100% statement coverage
    but less than 100% path coverage
  • The number of possible paths can be very, very
    large, so keep the number of paths as small as
    possible
  • Obtaining 100% path coverage for a model of even
    moderate complexity is very difficult

109
Expression Coverage
  • A measure of the various ways paths through code
    are taken
  • Example (see the sketch below) has 100% statement
    coverage but only 50% expression coverage
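A sketch of the kind of code the slide refers to (the code itself is assumed): the two tests below execute every statement, but exercise only one of the two ways of making the condition true.

expr : process (a, b)
begin
  if (a = '1') or (b = '1') then
    x <= '1';
  else
    x <= '0';
  end if;
end process;

With tests (a,b) = ('1','0') and ('0','0'), both branches run (100% statement coverage), but the condition is never made true by b alone, so only half of the ways through the expression are covered (50% expression coverage).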

110
100% Code Coverage
  • What do 100% path and 100% expression coverage
    mean?
  • Not much!! Just indicates how thoroughly the
    verification suite exercises the code. Does not
    indicate the quality of the verification suite.
  • Does not provide an indication about correctness
    of code
  • Results from coverage can help identify corner
    cases not exercised
  • Is an additional indicator for completeness of
    job
  • Code coverage value can indicate if job is not
    complete

111
Functional Coverage
  • Coverage is based on the functionality of the
    design
  • Coverage models are specific to a given design
  • Models cover
  • The inputs and the outputs
  • Internal states
  • Scenarios
  • Parallel properties
  • Bug Models

112
Interdependency-Architectural Level
  • The Model
  • We want to test all dependency types of a
    resource (register) relating to all instructions
  • The attributes
  • I - Instruction: add, add., sub, sub., ...
  • R - Register (resource): G1, G2, ...
  • DT - Dependency Type: WW, WR, RW, RR and None
  • The coverage task semantics:
  • A coverage instance is a quadruplet <Ij, Ik, Rl,
    DT>, where Instruction Ik follows Instruction Ij,
    and both share Resource Rl with Dependency Type
    DT.

113
Interdependency-Architectural Level (2)
  • Additional semantics
  • The distance between the instructions is no more
    than 5
  • Restrictions
  • Not all combinations are valid
  • Fixed point instructions cannot share FP
    registers

114
Interdependency-Architectural Level (3)
  • Size and grouping:
  • Original size: 400 x 400 x 100 x 5 = 80,000,000
  • Let the Instructions be divided into disjoint
    groups I1 ... In
  • Let the Resources be divided into disjoint groups
    R1 ... Rk
  • After grouping: 60 x 60 x 10 x 5 = 180,000

115
The Coverage Process
  • Defining the domains of coverage
  • Where do we want to measure coverage
  • What attributes (variables) to put in the trace
  • Defining models
  • Defining tuples and semantic on the tuples
  • Restrictions on legal tasks
  • Collecting data
  • Inserting traces into the database
  • Processing the traces to measure coverage
  • Coverage analysis and feedback
  • Monitoring progress and detecting holes
  • Refining the coverage models
  • Generating regression suites

116
Coverage Model Hints
  • Look for the most complex, error prone part of
    the application
  • Create the coverage models at high level design
  • Improve the understanding of the design
  • Automate some of the test plan
  • Create the coverage model hierarchically
  • Start with small simple models
  • Combine the models to create larger models.
  • Before you measure coverage check that your rules
    are correct on some sample tests.
  • Use the database to "fish" for hard to create
    conditions.
  • Try to generalize as much as possible from the
    data
  • "X was never 3" is much more useful than "the task
    (3,5,1,2,2,2,4,5) was never covered."

117
  • Future Coverage Usage
  • One area of research is automated coverage
    directed feedback
  • If testcases/drivers can be automatically tuned
    to go after more diverse scenarios based on
    knowledge about what has been covered, then bugs
    can be encountered much sooner in design cycle
  • Difficulty lies in the expert system knowing how
    to alter the inputs to raise the level of
    coverage.

118
Verification Languages
  • Specific to verification principles
  • Deficiencies in RTL languages (Verilog and VHDL)
  • Verilog was designed with a focus on describing
    low-level hardware structures
  • No support for data structures (records, linked
    lists, etc)
  • Not object oriented
  • VHDL was designed for large design teams
  • Encapsulates all information and communicates
    strictly through well-defined interfaces
  • These limitations get in the way of efficient
    implementation of a verification strategy

119
Verification Languages (cont)
  • Some examples of verification languages
  • Verisity's Specman Elite
  • Synopsys' Vera
  • Chronology's Rave
  • SystemC
  • Problem is that these are all proprietary;
    buying into one therefore locks you into a
    vendor.

120
Verification Languages
  • Even with a verification language, you still:
  • need to plan verification
  • design verification strategy and design
    verification architecture
  • create stimulus
  • determine expected response
  • compare actual response versus expected response

121
Revision Control
  • Need to ensure that the model verified is the
    model used for implementation
  • Managing an HDL-based hardware project is similar
    to managing a software project
  • Requires a source control management system
  • Such systems keep the latest version of a file and
    a history of previous versions, along with what
    changes are present in each version

122
Configuration Management
  • Wish to tag (identify) certain versions of a file
    so multiple users can keep working
  • Different users have different views of project

123
File Tags
  • Each file tag has a specific meaning

124
Issue Tracking
  • It is normal and expected to find functional
    irregularities in complex systems
  • Worry if you don't!!! Bugs will be found!!!
  • An issue is anything that can affect the
    functionality of the design
  • Bugs during execution of the testbench
  • Ambiguities or incompleteness of specifications
  • A new and relevant testcase
  • Errors found at any stage
  • Must track every issue that could lead to a bad
    design being manufactured were it not tracked

125
Issue Tracking Systems
  • The Grapevine
  • Casual conversation between members of a design
    team in which issues are discussed
  • No one has clear responsibility for the solution
  • System does not maintain a history
  • The Post-it System
  • The yellow stickies are used to post issues
  • Ownership of issues is tenuous at best
  • No ability to prioritize issues
  • System does not maintain a history

126
Issue Tracking Systems (cont.)
  • The Procedural System
  • Issues are formally reported
  • Outstanding issues are reviewed and resolved
    during team meetings
  • This system consumes a lot of meeting time
  • Computerized Systems
  • Issues seen through to resolution
  • Can send periodic reminders until resolved
  • History of action(s) to resolve is archived
  • Problem is that these systems can require a
    significant effort to use

127
Code Related Metrics
  • Code Coverage Metrics - how thoroughly does
    verification suite exercise code
  • Number of Lines of Code Needed for Verification
    Suite - a measure of the level of effort needed
  • Ratio of Lines of Verification Code to Lines of
    Code in the Model - measure of design complexity
  • Number of source code changes over time

128
Quality Related Metrics
  • Quality is subjective
  • Examples of quality metrics
  • Number of known outstanding issues
  • Number of bugs found during service life
  • Must be very careful to interpret and use any
    metric correctly!!!