Randomized Verification
Transcript and Presenter's Notes
1
Randomized Verification & Coverage
2
Agenda
  • Why use Randomization?
  • Case study: MPEG4 SoC verification
  • - Overview
  • - Constraints
  • Existing solution
  • The Vera solution
  • OpenVera Features
  • Constraints Hierarchy
  • Runtime
  • Coverage
  • Managing a Coverage-driven project

3
Using VERA random constraints engine to test
complex MPEG4 chips
4
Emblaze Semiconductors
  • Designing an MPEG4 system-on-chip (SoC).
  • Recently acquired by Zoran (solution-on-chip
    design, algorithm development and system
    integration).

5
Randomization
  • Used in verification of HW products.
  • Why?
  • Detecting problems we didn't think about
  • The problem
  • Dependencies between parameters given to the DUT

6
The DUT
  • Video/Image display unit
  • Supports a variety of LCDs (Liquid Crystal
    Displays)
  • Displays Video & OSD (On-Screen Display)
  • e.g. channel number

7
The DUT
8
The DUT
  • Inputs
  • Video stream
  • OSD
  • Configuration parameters
  • Output
  • Video frame to LCD (in the correct size)

9
DUT sub-units
  • Video input
  • YCbCr 420
  • RGB 565
  • 8-bit RGB (256-entry lookup table)
  • Video display sub-unit
  • Handles the above formats -> LCD, resizes frames
  • OSD sub-unit
  • Converts OSD data -> video, resizes & blends into
    the video frame

10
Configuration Parameters
  • Background color & size
  • LCD physical screen size
  • Video window size & location
  • OSD window size & location
  • LCD clock synch. parameters
  • and more

11
Test Flow
12
Input Generation
  • A set of random scenarios for each tested
    feature
  • e.g. a fixed LCD configuration, but random in all
    other aspects
  • Randomized parameters obey strict rules
  • Dependencies between randomized parameters
  • Reconstruct tests using the SEED (see the sketch
    below)
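A minimal Python sketch (not the project's Vera environment; all parameter names are illustrative) of how a single seed makes a random scenario fully reproducible:

    import random

    def generate_scenario(seed):
        """Generate one random test scenario; the same seed reproduces it exactly."""
        rng = random.Random(seed)             # all randomness derives from this seed
        lcd_w, lcd_h = rng.choice([(320, 240), (640, 480)])
        video_w = rng.randint(16, lcd_w)      # video window must fit on the LCD
        video_h = rng.randint(16, lcd_h)
        return {"lcd": (lcd_w, lcd_h), "video": (video_w, video_h)}

    # A failing test is reconstructed later simply by re-running with its seed.
    assert generate_scenario(42) == generate_scenario(42)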

13
Sources for Constraints
  • HW limitations
  • Specification demands (e.g. related units'
    response time)
  • Simulation time
  • Test goals (maximize relevant / minimize
    irrelevant tests)
  • Reality (plain logic), e.g. a video size of 1x240

14
Testing the video display unit
  • The parameters file would contain (sketched
    below):
  • Unit input: video movie frames
  • Video size (the environment will "cut" a video
    window from the video input file)
  • Video size after resizing
  • Video location
  • Similar for the OSD, with others such as LCD
    screen size and background parameters
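For illustration only (the slides do not show the real file format), the generated parameters for one test could look like the following Python dictionary; every name and value here is hypothetical:

    # Hypothetical parameters for a single video-display test
    params = {
        "input_movie":      "video_in.yuv",   # source frames fed to the unit
        "video_window":     (176, 144),       # size "cut" from the input movie
        "video_resized":    (352, 288),       # size after the resize stage
        "video_location":   (10, 20),         # position on the LCD
        "osd_window":       (80, 32),
        "osd_location":     (0, 0),
        "lcd_screen":       (320, 240),
        "background_color": 0x001F,           # RGB565 value
    }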

15
Restrictions illustration
16
Restriction example
17
The Naïve Solution
  • Randomize parameters according to a pre-defined
    order
  • Order is determined according to parameter usage
  • Each parameter is generated in a given range,
    sometimes depending on previously randomized
    parameters

18
Naïve solution example
19
Naïve solution fails
  • Constraints:
  • 0 < a, b, c < 10;  a < b < c;  c <= (a + b - 1)
  • a: 0 < a < 8
  • b: a + 1 <= b < 10
  • c: b + 1 <= c <= min(10, a + b - 1)
  • Solution: (a, b, c) = (2, 5, 6)
  • Failure: a = 1, b = 5  ->  6 <= c <= 5, an empty
    range! (reproduced in the sketch below)
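The failure mode can be reproduced with a short Python sketch of the ordered (naive) approach; the ranges mirror the toy example above, and the code is illustrative, not the project's:

    import random

    def naive_generate(rng):
        """Randomize a, b, c in a fixed order; each range depends on earlier picks."""
        a = rng.randint(1, 7)                    # 0 < a < 8
        b = rng.randint(a + 1, 9)                # a + 1 <= b < 10
        c_lo, c_hi = b + 1, min(9, a + b - 1)    # b + 1 <= c <= a + b - 1, c < 10
        if c_lo > c_hi:                          # earlier picks left no room for c
            raise ValueError(f"stuck: a={a}, b={b}, need {c_lo} <= c <= {c_hi}")
        return a, b, rng.randint(c_lo, c_hi)

    rng = random.Random(0)
    for _ in range(20):
        try:
            print(naive_generate(rng))
        except ValueError as err:
            print(err)                           # e.g. a=1, b=5 -> 6 <= c <= 5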

20
Naïve solution disadvantages
  • It fails
  • Large pieces of code; difficult to add/remove
    constraints
  • Randomization order is pre-defined (difficult when
    some parameters are supplied and others are
    randomized)

21
Brute Force method
  • A simple algorithm (see the sketch below)
  • 1. Write a constraints file
  • 2. Parse the file, generate the variable list &
    mathematical constraints
  • 3. Randomize all variables, each within its
    absolute range
  • 4. Check whether they obey the mathematical
    constraints; if not, go back to 3
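A Python sketch of this generate-and-test loop; the constraint set is the same toy example used earlier and is illustrative only:

    import random

    def brute_force(ranges, constraints, rng, max_tries=100_000):
        """Steps 3-4: randomize every variable over its full range, retry until
        all constraints hold."""
        for _ in range(max_tries):
            values = {name: rng.randint(lo, hi) for name, (lo, hi) in ranges.items()}
            if all(check(values) for check in constraints):
                return values
        raise RuntimeError("no solution found within the retry budget")

    ranges = {"a": (1, 9), "b": (1, 9), "c": (1, 9)}
    constraints = [
        lambda v: v["a"] < v["b"] < v["c"],
        lambda v: v["c"] <= v["a"] + v["b"] - 1,
    ]
    print(brute_force(ranges, constraints, random.Random(0)))

With a handful of small-range variables the retry loop converges quickly; the next slide notes how the rejection rate explodes as the variable count and constraint complexity grow.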

22
Brute Force method
  • Works well for a small number of variables (<10)
    and simple (linear) constraints
  • Fails as the number of variables and the
    complexity of the constraints grow

23
The Vera solution
  • Vera: a test bench automation product, works with
    the VCS simulator
  • Created by Synopsys, a world leader in
    semiconductor design SW, SW for SoC and
    electronic systems
  • OpenVera: a HW verification language (like e),
    includes assertions and more

24
Vera
  • Randomization of parameters and inputs
  • Coverage analysis
  • Coverage accumulation
  • Real time data and temporal checking
  • Interface to Verilog and VHDL simulators
  • Finding corner cases using randomization

25
Vera more features
  • Compatibility with SystemC
  • Intuitive and easy to use
  • DesignWare verification library of popular
    buses, I/Os etc.
  • Company claims 89% success on 1st tape-out!

26
Vera features
Bus Functional Model
27
Using Vera as a Randomization engine
  • Object-oriented usage of classes
  • Randomization is performed in main()

28
  • Constraints are managed in constraint modules
    (sketched below)
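The OpenVera source itself is not shown in the slides; as a rough, language-neutral Python sketch of the same structure, each configuration class carries its own declarative constraints and exposes a randomize() call that the main program drives (all names here are illustrative):

    import random

    class VideoWindowCfg:
        """One randomizable configuration object with its own constraint block."""
        def __init__(self, lcd_w, lcd_h):
            self.lcd_w, self.lcd_h = lcd_w, lcd_h
            self.width = self.height = self.x = self.y = None

        def constraints_ok(self):
            # Constraint module: window fits on the LCD, height is a multiple of 8.
            return (0 < self.width <= self.lcd_w and
                    0 < self.height <= self.lcd_h and
                    self.height % 8 == 0 and
                    self.x + self.width <= self.lcd_w and
                    self.y + self.height <= self.lcd_h)

        def randomize(self, rng):
            while True:                          # simple generate-and-test solver
                self.width = rng.randint(1, self.lcd_w)
                self.height = rng.randint(1, self.lcd_h)
                self.x = rng.randint(0, self.lcd_w - 1)
                self.y = rng.randint(0, self.lcd_h - 1)
                if self.constraints_ok():
                    return self

    # "main()": build the configuration objects and randomize them.
    rng = random.Random(1)
    cfg = VideoWindowCfg(320, 240).randomize(rng)
    print(cfg.width, cfg.height, cfg.x, cfg.y)

Keeping each constraint next to the fields it governs is what makes individual constraints easy to add or remove, the main weakness of the hand-ordered approach.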

29
Constraints Hierarchy
30
Integration
  • To integrate with the existing environment in the
    MPEG4 case, the randomization engine was used to
    generate pre-simulation inputs

31
Runtime aspects
  • Goal: input generation in under 3 sec
  • Easily achieved with linear constraints
  • As constraints became more complex (buffer size
    multiplication, frame height as a multiple of 8),
    Vera failed
  • BUT performance improved dramatically from v6.0
    onward
  • Still, randomization time varies greatly, as we
    will see

32
Runtime experiment
  • Running on a Linux Xeon 2.4GHz, 2GB RAM
  • A monotonic series of N random variables:
  • a1 < a2 < a3 < ... < aN
  • In a single constraint block
  • 1000 runs in each benchmark test (sketched below)
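The Vera measurements themselves cannot be reproduced here, but the shape of the experiment can be sketched in Python with a deliberately naive rejection solver (Vera's constraint engine is far more sophisticated; this only illustrates why solving time can blow up with N):

    import random
    import time

    def solve_monotonic(n, rng, lo=0, hi=1000):
        """Rejection-sample n variables until a1 < a2 < ... < aN holds."""
        tries = 0
        while True:
            tries += 1
            vals = [rng.randint(lo, hi) for _ in range(n)]
            if all(vals[i] < vals[i + 1] for i in range(n - 1)):
                return vals, tries

    rng = random.Random(0)
    for n in (2, 4, 6, 8):
        start = time.perf_counter()
        _, tries = solve_monotonic(n, rng)
        print(f"N={n}: {tries} tries, {time.perf_counter() - start:.3f} s")

For this naive solver the expected number of tries grows roughly like N!, which is one way mutually dependent variables can make randomization time explode.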

33
Runtime grows exponentially with the number of random
variables!
34
  • During Validation, scenarios with 500 random
    variables and over 200 constraints were resolved
    in less than 3 sec

35
Vera's Disadvantages
  • No floating point arithmetic (problematic in
    randomizing frequencies)
  • No user-friendly constraint debugger
  • Debugging of OpenVera code is still required

36
Other HVLs
  • e (Specman simulator) by Verisity (born in
    Israel, acquired by Cadence)
  • Jeda-X by Jeda Technologies
  • PSL (Property Specification Language)/Sugar,
    developed by IBM and standardized by Accellera

37
Coverage
  • Defining a set of possible values per variable
    (wire, register), then generating random tests
    until all values have been obtained (sketched
    below)
  • Goal: passing a set of tests which produces 100%
    coverage
  • Is there a minimum coverage required before
    tape-out?
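A small Python sketch of that value-coverage loop (illustrative only; real coverage collection happens in the HVL and simulator, and the variables and values below are made up):

    import random

    # Coverage model: the values we want to see on each monitored variable.
    coverage_goal = {
        "video_format": {"YCbCr420", "RGB565", "RGB8"},
        "resize_ratio": {"down", "none", "up"},
    }
    hit = {name: set() for name in coverage_goal}

    rng = random.Random(7)
    tests = 0
    while any(hit[name] != goal for name, goal in coverage_goal.items()):
        tests += 1
        # Each random test exercises one value of each variable.
        for name, values in coverage_goal.items():
            hit[name].add(rng.choice(sorted(values)))

    bins_hit = sum(len(v) for v in hit.values())
    bins_total = sum(len(v) for v in coverage_goal.values())
    print(f"100% coverage ({bins_hit}/{bins_total} bins) after {tests} random tests")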

38
Coverage
  • Coverage accumulation: accumulating the coverage
    percentage over consecutive regression runs
  • Regression: a set of automated tests - allows
    repeating the tests on each RTL modification

39
Coverage Types
40
Code Coverage
  • Measures execution of HDL constructs
  • Does not measure functionality
  • E.g. Branch, Condition, Toggle
  • Low overhead
  • Easy to use
  • Best performance

41
Structural Coverage
  • Tests structures
  • Has the FIFO been empty? Overflowed? Reached its
    high-water mark?

42
Functional Coverage
  • Verify functionalities
  • e.g. a hand-shake protocol
  • (no_up_less_10, no_up_gt_10)
  • Available in SystemVerilog (6.1), PSL (6.0)

43
Transaction Coverage
  • Protocol compliance checking
  • Number of reads / writes, etc.

44
Coverage vs. Directed
45
Randomization & Coverage results
  • Good uniform distribution
  • Zero hits: a design bug, a constraint block bug, or
    a design optimization needed

46
Managing Coverage Driven Verification
47
Objectives
  • Driving a coverage-driven verification project
    to a successful and predictable conclusion
  • Efficient use of indicators
  • Use and benefits of verification-driven management

48
Objectives cont.
  • Define a clear picture to managers and engineers
  • Objective and quantified view of the
    verification progress

49
Functional Verification progress
50
Indicators
  • Indicators are given relative weights
  • Weights are assigned according to (see the sketch
    below):
  • Complexity
  • Priority
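One way to picture the weighting (the features and numbers below are hypothetical, not from the project) is as a weighted average of per-feature coverage:

    # Hypothetical features: (weight from complexity/priority, coverage so far)
    indicators = {
        "video_resize": (3, 0.80),
        "osd_blend":    (2, 0.55),
        "lcd_timing":   (1, 1.00),
    }

    total_weight = sum(w for w, _ in indicators.values())
    progress = sum(w * cov for w, cov in indicators.values()) / total_weight
    print(f"overall verification progress: {progress:.0%}")   # 75%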

51
More benefits
  • Weekly goals set a strong incentive
  • Setbacks are compensated by progress elsewhere
    (overall progress is kept steady)

52
Local Indicator Chart
53
Local Indicator Chart
  • An objective measure for each feature
  • Numbers cannot be manipulated; problems are
    easily spotted
  • Working with the global and local charts produces
    specific team goals for both the design and
    verification teams

54
How do we get there?
  • Produce a coverage plan from the design spec
  • Each coverage point is coded in the HVL
  • Coverage points are grouped into chapters and
    assigned weights
  • Let the testing begin

55
Running Regressions
  • Need to maximize usage of licenses and machines
  • In coverage-driven verification, usage is 10x
    greater: regressions run during nights and
    weekends
  • Good verification, but a huge amount of data to
    manage
  • Each engineer is assigned certain failure types
    and is responsible for fixing them

56
Optimizing coverage regressions
  • Disable / enable coverage points in different
    stages of the project to achieve 100% coverage
  • Different coverage plans according to feature
    priority
  • Save machine time and licenses by identifying
    optimal tests
  • Redundant tests are eliminated

57
Summary
  • Importance of Randomization
  • HVL tools
  • Coverage
  • Coverage driven project

58
Thank you!
59
Bibliography
  • "Using VERA random constraints engine to test
    complex MPEG4 chips", Meiraz Doron and Algavish Gal
  • "Indicators help manage coverage-driven
    verification", Akiva Michelson

60
Appendix
61
YCbCr
  • YCbCr signals are created from an original RGB
    (red, green and blue) source as follows:
  • Y  = 0.299R + 0.587G + 0.114B
  • Cb = (B - Y) / 1.772 + 0.5
       = -0.168736R - 0.331264G + 0.5B + 0.5
  • Cr = (R - Y) / 1.402 + 0.5
       = 0.5R - 0.418688G - 0.081312B + 0.5
  • Here, R, G and B are assumed to range from 0 to
    1, with 0 representing the minimum intensity and
    1 the maximum.
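The same conversion written out as a small Python helper for reference:

    def rgb_to_ycbcr(r, g, b):
        """Convert R, G, B in [0, 1] to Y, Cb, Cr using the formulas above."""
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cb = (b - y) / 1.772 + 0.5
        cr = (r - y) / 1.402 + 0.5
        return y, cb, cr

    print(rgb_to_ycbcr(1.0, 1.0, 1.0))   # pure white maps to (~1.0, 0.5, 0.5)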