Integrating Physical Test into the IC Studio Workflow

Transcript and Presenter's Notes
1
Integrating Physical Test into the IC Studio
Workflow
  • Harry Peterson
  • Senior Director of IC Technology
  • Pixelworks Inc
  • Co-authors
  • Angus Tang
  • Marc Ranger

2
Introduction
  • Trends
  • What we can build is limited by our ability to
    manage complexity
  • Co-design is everywhere
  • Schedules keep shrinking
  • Interaction among disciplines becomes tighter
  • Simulators are more powerful
  • ATE becomes relatively more expensive
  • Focus of this presentation
  • Methodology for co-design of test program and
    mixed-signal chip

3
Objective: Minimize Time to (Profitable) Revenue
  • We must produce first-time-right mixed-signal
    designs that are testable and cost-effective.
  • At the same time, we must deliver
    first-time-right test capability that is
    comprehensive and cost-effective.

4
The Idea
  • Create two views for the testbench
  • Virtual
  • Physical
  • By making it easy to flip between these two
    representations, we speed up the development and
    verification of mixed-signal test programs.

5
Methodology
  • Instantiate both the physical and virtual view of
    the testbench.
  • Create a pipe between these views.
  • Use views interchangeably in order to validate
    and optimize both.
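
A minimal sketch of this methodology, assuming hypothetical class and method names (this is not IC Studio or D10 tester code): one test body is written against a common interface, with one implementation per view, so the same test can be pointed at the simulator or at the ATE.

```python
# Illustrative only: names are hypothetical, not an IC Studio or Credence API.
from abc import ABC, abstractmethod


class Testbench(ABC):
    """Common interface shared by the virtual and physical views."""

    @abstractmethod
    def apply_stimulus(self, name: str, samples: list[float]) -> None: ...

    @abstractmethod
    def capture_response(self, name: str, n_samples: int) -> list[float]: ...


class VirtualTestbench(Testbench):
    """Drives the simulator: write stimulus files, run the simulation, read waveforms."""

    def apply_stimulus(self, name, samples):
        print(f"[sim] stimulus '{name}': {len(samples)} points")

    def capture_response(self, name, n_samples):
        return [0.0] * n_samples  # placeholder for simulator output


class PhysicalTestbench(Testbench):
    """Drives the ATE: load AWG memory, arm the digitizer, run the pattern."""

    def apply_stimulus(self, name, samples):
        print(f"[ate] stimulus '{name}': {len(samples)} points")

    def capture_response(self, name, n_samples):
        return [0.0] * n_samples  # placeholder for digitizer capture


def linearity_test(tb: Testbench) -> list[float]:
    """One test body usable against either view; in effect, the 'pipe' between them."""
    ramp = [i / 1023 for i in range(1024)]
    tb.apply_stimulus("adc_in", ramp)
    return tb.capture_response("adc_out", 1024)
```

Flipping between views is then a one-line change in which testbench class is instantiated.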

6
Testbench, in the virtual world
  • Resources for generating stimuli
  • Dozens of types of signal generators exist
  • For this session, we just look at PWL and PULSE
  • Resources for analyzing responses
  • EZWave
  • Third-party (Kimotion) software mines data so
    that we can automate regression tests
  • Custom scripts bridge the gap between simulation
    software and ATE software
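
As a rough sketch (the names below are illustrative, not an ADMS or IC Studio API), a PWL or PULSE stimulus can be captured as a small data structure that the virtual view hands to the simulator and that can later be sampled for the physical view.

```python
# Illustrative stimulus descriptions; not an ADMS or IC Studio API.
from dataclasses import dataclass


@dataclass
class PulseStimulus:
    v_low: float      # volts
    v_high: float     # volts
    period_s: float
    width_s: float


@dataclass
class PwlStimulus:
    points: list[tuple[float, float]]  # (time_s, volts) breakpoints

    def sample(self, rate_hz: float) -> list[float]:
        """Linearly interpolate the PWL at a fixed sample rate."""
        out, j = [], 0
        n = int(self.points[-1][0] * rate_hz) + 1
        for i in range(n):
            t = i / rate_hz
            while j + 1 < len(self.points) and self.points[j + 1][0] < t:
                j += 1
            (t0, v0) = self.points[j]
            (t1, v1) = self.points[min(j + 1, len(self.points) - 1)]
            out.append(v0 if t1 == t0 else v0 + (v1 - v0) * (t - t0) / (t1 - t0))
        return out


# Example: a 1 ms full-scale ramp from 0 V to 1 V.
ramp = PwlStimulus(points=[(0.0, 0.0), (1e-3, 1.0)])
```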

7
Testbench, in the real (ATE) world
  • Resources for generating stimuli
  • AWG (Arbitrary Waveform Generator)
  • Pin Driver
  • Other
  • Resources for analyzing responses
  • Waveform digitizer
  • Comparator
  • Other resources
  • Active load
  • Active and passive circuits on load board or
    probe card
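
To make the mapping concrete (the function and setup dictionary below are placeholders, not the D10 programming interface), the same stimulus samples used in simulation can be turned into a generic AWG setup for the physical testbench.

```python
# Placeholder only; the real tester-specific calls live behind the D10's
# C-style programming interface and are not shown here.
def awg_setup_from_samples(samples: list[float], rate_hz: float) -> dict:
    """Derive a generic AWG setup from the stimulus samples used in simulation."""
    lo, hi = min(samples), max(samples)
    return {
        "sample_rate_hz": rate_hz,
        "amplitude_vpp": hi - lo,
        "offset_v": (hi + lo) / 2,
        "samples": samples,
    }


def load_awg(setup: dict) -> None:
    """Stand-in for the call that would load the AWG's waveform memory."""
    print(f"AWG: {len(setup['samples'])} samples at {setup['sample_rate_hz']:.3g} Sa/s, "
          f"{setup['amplitude_vpp']:.3f} Vpp, offset {setup['offset_v']:+.3f} V")


load_awg(awg_setup_from_samples([i / 1023 for i in range(1024)], rate_hz=10e6))
```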

8
EDA Tool Selection
  • Why we chose ADMS
  • Allows us to efficiently move between accurate
    transistor-level analysis and fast top-level
    analysis.
  • Efficient
  • Why we chose IC Studio
  • Good schematic capture
  • Gracefully accommodates a variety of other views
    and resources
  • Open

9
ATE Resource Selection
  • Credence D10
  • Powerful mixed-signal capability
  • Good analog performance
  • Open C style programming interface
  • Well-suited to characterization as well as to
    production testing
  • Low cost

10
Test Generation Methodology / Digital
  • Digital designers figured this out long ago.
  • But their problem was simpler
  • Simply translate simulation data files into test
    vectors for the ATE tester
  • Examples of such vector translation tools
  • VTRAN
  • V2SCO

11
Test Generation Methodology (Mixed-signal)
  • Transfer of simulation data to tester program is
    largely a manual process.
  • We need a more automatic transfer mechanism.
  • The concepts have been considered, but have not
    yet been widely integrated into the workflow.
  • References
  • [1] Viekko Loukusa, "Behavioral Test Generation
    Modeling Approach for Mixed-Signal IC
    Verification," Proceedings of the International
    Mixed-Signal Testing Workshop, 2002
  • [2] Geert Seuren et al., "Extending the Digital
    Core-Based Test Methodology to Support
    Mixed-Signal," Proceedings of the International Test
    Conference, 2004

12
Enabling technology for mixed-signal test
  • IC Studio
  • An integrated, user-friendly environment
  • Create, manage, and simulate designs at different
    levels of abstraction (SPICE, schematics, HDL)
  • Facilitates behavioral modeling and mixed-level
    simulation

13
Approach
  • Create top-level testbench to verify design in
    simulation.
  • Top-level simulation is made possible through the
    use of behavioral models.
  • The testbench is then used to generate the main
    components of the ATE test program; these include
    the test vectors and the tester instrument setups.

14
Case Study: Video AFE
  • Consists of the following blocks
  • Signal pre-conditioning
  • Input buffers
  • Clamps
  • Gain/Offset DAC
  • Three fast ADCs
  • 10-bit resolution
  • 162 MSPS
  • Timing recovery circuit
  • Line-locked PLL

15
DUT and the Simulation Testbench /1
16
DUT and the Simulation Testbench /2
  • The top-level testbench allows us to characterize
    the ADC and extract important performance
    parameters such as INL, DNL, and ENOB.
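
For reference, a minimal sketch of the math behind those numbers (it assumes an ideal full-scale ramp and skips the usual end-code trimming): DNL comes from the per-code hit counts of the ramp histogram, INL is the running sum of DNL, and ENOB uses the standard SINAD formula.

```python
def dnl_inl(code_counts: list[int]) -> tuple[list[float], list[float]]:
    """DNL and INL (in LSB) from a code histogram collected during a ramp test."""
    ideal = sum(code_counts) / len(code_counts)  # expected hits per code
    dnl = [c / ideal - 1.0 for c in code_counts]
    inl, acc = [], 0.0
    for d in dnl:
        acc += d
        inl.append(acc)
    return dnl, inl


def enob(sinad_db: float) -> float:
    """Effective number of bits from measured SINAD."""
    return (sinad_db - 1.76) / 6.02
```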

17
DUT and the Simulation Testbench /3
  • The digital stimulus block generates the
    necessary clock and control signals to the ADC.
  • Digital stimuli are implemented as a Verilog
    block.

18
DUT and the Simulation Testbench /4
  • The analog stimulus block generates a ramp signal
    so that the output of the ADC can be measured for
    linearity. Analog stimuli are modeled
    behaviorally in Verilog AMS.

19
Case Study: Video AFE
20
Mapping the simulation testbench to the ATE
testbench /1
  • The digital stimuli are translated into test
    vectors in ATE tester format.
  • The digital module of the D10 tester accepts test
    vectors in STIL format, which can be readily
    generated from simulation.
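
A heavily simplified sketch of that translation step (a real IEEE 1450 STIL file also needs Signals, Timing, and PatternBurst blocks, and the cycle data and signal names here are made up): tabular simulation data, one dictionary per cycle, is written out as STIL-style vector statements.

```python
# Skeleton only; not a complete STIL (IEEE 1450) writer.
def write_stil_pattern(cycles: list[dict[str, str]], name: str = "adc_pat") -> str:
    """Emit a STIL-style Pattern block from per-cycle signal states
    ('0'/'1' for inputs, 'L'/'H'/'X' for expected outputs)."""
    lines = [f'Pattern "{name}" {{']
    for cyc in cycles:
        body = " ".join(f"{sig}={val};" for sig, val in cyc.items())
        lines.append(f"    V {{ {body} }}")
    lines.append("}")
    return "\n".join(lines)


# Example: two cycles captured from the digital stimulus block in simulation.
print(write_stil_pattern([
    {"clk": "1", "sen": "0", "adc_out": "X"},
    {"clk": "0", "sen": "1", "adc_out": "H"},
]))
```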

21
Mapping the simulation testbench to the ATE
testbench /2
22
Mapping the simulation testbench to the ATE
testbench /3
  • Using the method described, the bulk of the ATE
    test program can be generated from the simulation
    testbench, before first silicon is available.
  • Test program development time and effort are
    reduced.
  • This flow bridges the communication gap between
    test engineers and designers.

23
Next steps
  • 1. Integrate the Kimotion Breaker tool into
    this flow, in order to more tightly close the
    loop between executable spec and verified
    silicon.
  • 2. Adopt the Mentor Checkerboard tool in order to
    gain the efficiency and quality benefits of
    greater automation and faster simulation.
  • 3. Pull software development and validation into
    the co-design and product-verification loops.
  • 4. Integrate Yield Analysis and Product
    Engineering capabilities.

24
Next steps /1: Use the Kimotion Breaker tool to help
mine simulation data so we can make specs more
executable
(Diagram labels: design variables; process and environment; specifications)
  • User-defined testbenches measure important
    circuit performances
  • Breaker exercises testbenches to find most likely
    and/or worst-case violations for each performance

(Diagram labels: Eldo, KtModels, AdvanceMS; performance models; optimized netlist; specification corners)
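
As a stand-in for what Breaker automates (the performance model, variable ranges, and spec limit below are invented for illustration), the idea is to sweep process and environment conditions through a cheap performance model and report the worst violation of each spec.

```python
# Not Kimotion Breaker itself; a brute-force illustration of worst-case mining.
import itertools


def gain_error_db(temp_c: float, vdd: float, corner: str) -> float:
    """Hypothetical performance model (stand-in for an Eldo/KtModel evaluation)."""
    skew = {"ss": 0.15, "tt": 0.0, "ff": -0.1}[corner]
    return 0.02 * (temp_c - 25) / 100 + 0.3 * (1.8 - vdd) + skew


SPEC_MAX_GAIN_ERROR_DB = 0.2

worst = max(
    ((gain_error_db(t, v, c), (t, v, c))
     for t, v, c in itertools.product((-40, 25, 85), (1.62, 1.8, 1.98), ("ss", "tt", "ff"))),
    key=lambda x: x[0],
)
print(f"worst gain error {worst[0]:+.3f} dB at {worst[1]}",
      "(violates spec)" if worst[0] > SPEC_MAX_GAIN_ERROR_DB else "(meets spec)")
```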
25
Next steps /2: Use the Mentor Checkerboard tool to
greatly improve the speed and effectiveness of
chip-level simulation
The verification engineer traditionally has access to
two types of block descriptions:
  • Transistor-accurate netlists: slow, but capture
    all effects
  • Ideal behavioral models: fast, but modeling
    extra effects requires tuning
Checkerboard verification avoids the need for
full SPICE simulation:
  • Captures the impact of block effects on system
    behavior
  • Does not capture interactions
(Diagram: system block diagram)
26
Next steps /3: Pull software development and
validation into this co-design loop
  • In the virtual world, tools already support
    this.
  • In fact, a few people have been doing this (with
    huge success) for about twenty years.
  • Example: Andy Bechtolsheim's team at Sun ran
    code and booted the SPARC 1 chipset before they
    actually taped it out.
  • In the physical world, ATE resources to support
    this flow have been widely available for only
    about two years.
  • Example: D10 (Credence)
  • Trend: the Open ATE initiative is gaining
    mainstream support (reference:
    http://www.semitest.org/news/articles/FF_19_Yuhai_Ma_05.pdf)
  • The bottom line: for many products, it is
    possible to do validation (including software
    validation) at probe test. This speeds up
    development schedules.

27
Next steps /4: Integrate yield-analysis and
product-engineering capabilities
  • Yield Analysis Options
  • Corners Analysis
  • Corners are based on digital performance criteria
  • Requires many simulations
  • Leads to overdesign, more so with newer
    processes
  • Classic Monte Carlo Analysis
  • Best accuracy
  • No overdesign
  • Requires a prohibitive number of simulations
  • Monte Carlo with KtModel Analysis
  • Requires a limited number of simulations (user
    controlled)
  • Model evaluations replace simulation in the
    Monte Carlo loop
  • No overdesign

28
Next steps /4: Integrate yield-analysis and
product-engineering capabilities
  • KtModeler improves the accuracy of behavioral
    models.
  • Behavioral simulations can see issues previously
    caught only with SPICE simulation.
  • KtModeler can also be used instead of the
    simulator in Monte Carlo runs.
  • Reduces the number of simulations required for
    Monte Carlo by 10-100x
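
A toy version of that loop, assuming a made-up response-surface model in place of a real KtModel: each Monte Carlo trial is a cheap model evaluation rather than a SPICE run, which is what makes a large sample count affordable.

```python
# Illustrative only; the response-surface model below is invented, not a KtModel.
import random

random.seed(1)


def offset_mv(vth_mismatch_mv: float, beta_mismatch_pct: float) -> float:
    """Stand-in model fitted from a small set of up-front simulations."""
    return 1.8 * vth_mismatch_mv + 0.4 * beta_mismatch_pct


SPEC_LIMIT_MV = 5.0
N = 100_000  # affordable because each trial is a model evaluation, not a simulation

fails = sum(
    abs(offset_mv(random.gauss(0, 1.5), random.gauss(0, 2.0))) > SPEC_LIMIT_MV
    for _ in range(N)
)
print(f"estimated yield: {100 * (1 - fails / N):.2f}%")
```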

29
Acknowledgements
  • Pixelworks executive support
  • Hans Olsen
  • Pixelworks engineering team
  • Jenkin Wong, Thomas Nasralla, Mike Parrish, Jay
    Sulima, Pri Nallahandi, Liang Yuan
  • Insyte (ATE Consultant)
  • Peter Lindholm
  • Mentor
  • Sam Wasche, Christian Mayer, Brandon Barnes,
    Vamsi Rachapudi
  • Credence
  • Ken Skala
  • Kimotion
  • Bart De Smedt, Walter Daems
