Burnell G. West
Credence Systems
Publication
Featured research published by Burnell G. West.
international test conference | 1990
Burnell G. West; Tom Napier
A novel digital functional test system architecture, called Sequencer Per Pin, is presented; its timing and waveform generation hardware operate on sequences of events in the same manner as an IC timing/logic simulator. The architecture implements tests as sequences of events for each pin, synchronized by global period markers, which makes it possible to perform complex tests on VLSI integrated circuits without extensive program development and debugging effort. The architecture is more flexible than previous designs, permitting more precise implementation of simulation data with fewer restrictions. The event-sequence concept significantly reduces test pattern storage requirements, and run-time assignment of pin data reduces them further by avoiding duplication of test patterns and test programs for different package configurations.
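The per-pin event idea in this abstract can be sketched in a few lines. The sketch below is purely illustrative (the names, the 100 ns period, and the event tuples are invented, not taken from the paper): each pin carries its own sequence of (offset, level) events, and a global period marker synchronizes the pins, the way a timing/logic simulator schedules events.

```python
# Illustrative model of an event-per-pin test representation.
# All names and numbers are invented for the sketch.

PERIOD_NS = 100.0  # one global tester period (assumed value)

# Per-pin event sequences: (offset within the period in ns, drive level)
pin_events = {
    "CLK":  [(0.0, 1), (50.0, 0)],   # a square clock
    "DATA": [(10.0, 1), (60.0, 0)],  # data transitions mid-period
}

def flatten(pin_events, n_periods, period=PERIOD_NS):
    """Expand per-pin event sequences into absolute-time events,
    synchronized by global period markers."""
    timeline = []
    for cycle in range(n_periods):
        marker = cycle * period  # global period marker
        for pin, events in pin_events.items():
            for offset, level in events:
                timeline.append((marker + offset, pin, level))
    timeline.sort()
    return timeline

events = flatten(pin_events, n_periods=2)
```

Because each pin stores only its own short event list, repeating or remapping a pattern (e.g., for a different package configuration) changes the pin assignment, not the stored data, which is the storage saving the abstract describes.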
international test conference | 1994
Didier Wimmers; Kris Sakaitani; Burnell G. West
This paper describes an innovative way to test certain classes of complex digital devices at speeds of 500 MHz or more in a 100 MHz tester whose architecture incorporates the ability to generate a large number of edges per tester period. The means for generating the test patterns, and the means for executing them, are both described.
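The arithmetic behind this scheme can be illustrated with a toy sketch (the edge counts and subdivision below are assumptions for illustration, not the paper's actual hardware): a 100 MHz tester period is 10 ns, so placing several independently timed edges inside each period yields an effective edge rate well above the base clock.

```python
# Illustrative sketch: multiple edges per tester period raise the
# effective data rate above the base tester frequency.
# Edge count and even subdivision are invented for the example.

BASE_PERIOD_NS = 10.0   # 100 MHz tester period
EDGES_PER_PERIOD = 5    # hypothetical edge resources per pin per period

def edge_times(n_periods):
    """Absolute edge times when each period is subdivided evenly."""
    step = BASE_PERIOD_NS / EDGES_PER_PERIOD  # 2 ns spacing -> 500 MHz edge rate
    return [p * BASE_PERIOD_NS + e * step
            for p in range(n_periods)
            for e in range(EDGES_PER_PERIOD)]

times = edge_times(2)
```

With five evenly spaced edges per 10 ns period, the edge spacing is 2 ns, matching the 500 MHz figure quoted in the abstract; real hardware would place the edges at programmable, not necessarily even, offsets.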
international test conference | 2002
Burnell G. West
The drive to reduce the overall cost of test has a long history. However, the possibility of reducing that cost by developing an open ATE architecture has only recently begun to be explored in detail. It seems clear that opening up ATE system architecture to participation by a collection of suppliers is now feasible. Open architecture standards for test systems must embody several key features, the most challenging being fixturing, synchronization, software, data volume, power delivery, and system integration and verification.
international test conference | 2000
Luca Sartori; Burnell G. West
Since 1990, architectural innovations and implementation breakthroughs have driven EPA (edge placement accuracy) from ±250 ps to ±50 ps, in a tester with the same waveform capabilities and the same cost per pin. Further progress, called for by new devices all over the application spectrum, is based on tight control of the ATE timing error budget, and addresses the fundamental challenge of source synchronous timing. The path to deep picosecond accuracy is driven by calibration technology.
international test conference | 1999
Burnell G. West
This paper analyzes the requirements of at-speed functional testing of high-speed devices. We conclude that, although the requirement forecast by the International Technology Roadmap for Semiconductors is excessive, higher accuracy at-speed functional test systems are needed to qualify the high-performance devices anticipated in the next decade. We also conclude that, while challenging, the necessary higher speeds and accuracies can be realized.
international test conference | 2003
Burnell G. West
A two-wire test strategy with simultaneous bidirectional data flow and an independent clock line enables very high site count low-cost wafer probing.
international test conference | 1999
Burnell G. West
Structural test can address only stuck-at faults unless some dynamic capability is included. The dynamic capability required starts with two-vector launch-capture delay tests, but evolves rapidly to include gated clock bursts, perhaps synchronized with primary I/Os. This implies an at-speed structural test architecture which incorporates many of the capabilities of functional test systems.
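The two-vector launch-capture delay test mentioned here can be sketched on a toy combinational path. Everything below (the XOR path, the 3 ns delay, the function names) is invented purely to illustrate the sequencing: the first vector settles the circuit, the second launches a transition, and the capture clock determines whether the transition arrived in time.

```python
# Toy sketch of a two-vector launch-capture delay test.
# The gate model and delay figures are invented for illustration.

PATH_DELAY_NS = 3.0       # modeled propagation delay of the path under test

def path(a, b):
    return a ^ b          # toy combinational logic

def launch_capture(v1, v2, capture_after_ns):
    """Apply v1 to settle the path, launch v2, then capture the output
    capture_after_ns later; the new value appears only if the path
    is faster than the capture window."""
    settled = path(*v1)           # first vector initializes the path
    launched = path(*v2)          # second vector launches the transition
    if capture_after_ns >= PATH_DELAY_NS:
        return launched           # transition has propagated in time
    return settled                # too early: stale value still at the output

# An at-speed capture (2.5 ns) on a 3 ns path catches the delay defect;
# a slow capture (4 ns) would pass it.
assert launch_capture((0, 0), (1, 0), capture_after_ns=2.5) == 0
assert launch_capture((0, 0), (1, 0), capture_after_ns=4.0) == 1
```

This is why the abstract argues that stuck-at-only structural test is insufficient: the slow path above produces the correct value eventually, so only an at-speed capture exposes it.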
international test conference | 2004
Burnell G. West; Michael F. Jones
This work introduces a digital synchronization technique for a highly reconfigurable ATE platform that overcomes inherent scaling, multisite, and other limitations in currently used instrument synchronization methods. A new strategy for integrated circuit test synchronization across several active test and measurement instruments is described. This strategy is much more flexible than currently available star or bus trigger arrangements, and is more accurate as well. The second synchronization mechanism in ATE is vector synchronization. The logic to implement the described mechanisms has been successfully implemented in a Xilinx Virtex2 FPGA.
international test conference | 2004
Burnell G. West
Is open architecture for ATE useful? Is it necessary? Is it feasible? When open architecture was first proposed, it was greeted with enthusiasm, skepticism, turmoil, and yawns. All of these were well deserved, because the concept was poorly defined and equally poorly executed. This resulted mainly from a basic misperception of the nature of the opportunity for cost reduction and value enhancement. Open architecture platforms generally limit long-term prospects by imposing platform-level architectural features, such as DUT board interface definition and synchronization, that constrain instrument mixes by slot and DUT fixturing assignments. This seriously limits one of the key anticipated advantages of OA-ATE: the reconfigurability/retooling cost savings. High-volume production is the best way to cut the cost of complex items, and the current OA-ATE trend does not facilitate it well.
international test conference | 2003
Burnell G. West
Production test and device I/O characterization are different tasks. The measurements and analysis associated with these tasks need quite different tools.

1 Serial Data Links - the future is "bright"

The 2001 ITRS predicted serial datacom rates reaching 40 Gbps by this year and doubling within a decade - and 40 Gbps production devices now exist. A 12.5 ps unit interval is tiny by today's standards; the challenges of reliable data transmission at such rates are daunting. Despite this, we can fully expect such serial data links. What is required for them to be producible? Quite simply: reliable design, consistent process technology, and solid production test capability.

2 Design Qualification vs Production Test

It is absolutely necessary to distinguish clearly between design qualification/characterization and the sort of test necessary to guarantee that a particular device is manufactured properly. Yet sorting one from the other is no mean task. It takes only a few extra ohms of interconnect to marginalize a good design; those few extra ohms will displace a transition by a few picoseconds, an unacceptably large error at these speeds.

Each bit in a high-speed serial data stream can be characterized by two key parameters: the unit interval and the individual edge displacement, expressed as a fraction of the unit interval. Edge displacement is often referred to as jitter. Jitter is an unfortunate term, as it masks what is actually going on. Yes, some edges are out of place, but why are they? Displaced edges are created by intersymbol interference, internal systematic path or propagation delay errors, clock phase error, and random noise. These errors must be identified and minimized during the design qualification phase; the design is acceptable if its error rate is sufficiently far below target that any remaining failures indicate a defective product. Key point: "random noise" in semiconductors is not very large. RJ in excess of 1 ps is more likely to be instrumentation noise than true circuit noise, or else is uncorrelated systematic error, which is likely to be correctable in design.

Laboratory equipment is generally capable of acquiring the data necessary to characterize today's I/O designs. However, it is a costly mistake to attempt to translate these characterization tasks into a production test requirement. Measuring "jitter" in a production test environment is unneeded. What might be measured (and compared to design minimum, nominal, and maximum) are bit-by-bit displacements focused specifically on the structure of the specific serializers or deserializers in the signal path. This requirement derives from the concept of defect-based testing: each element in the circuit (whether transistor, passive component, or interconnect) has a target quality requirement; if it is not manufactured to that requirement, then something - something specific - is wrong.

Yet this approach is insufficient by itself: it misses the tight coupling of the serial data stream to its internal self-aligned clock. If that clock is too erratic, then even perfect serializing or deserializing circuitry will fail. Conclusion: production test has two requirements: qualify the clock and qualify the bit generators or extractors. There should be no need for long data streams, or the concomitant long test times.
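The bit-by-bit displacement measurement advocated above reduces to simple arithmetic: compare each measured edge time to its ideal position on the unit-interval grid and express the offset as a fraction of the UI. The sketch below is illustrative only; the 12.5 ps UI comes from the text, but the measured timestamps are invented.

```python
# Illustrative per-edge displacement computation.
# Measured timestamps are invented example data.

UI_PS = 12.5  # unit interval from the text above, in picoseconds

def edge_displacements(edge_times_ps, ui=UI_PS):
    """Displacement of each measured edge from its ideal position n*ui,
    expressed as a fraction of the unit interval."""
    return [(t - n * ui) / ui for n, t in enumerate(edge_times_ps)]

measured = [0.0, 12.6, 24.9, 37.8]            # invented edge timestamps (ps)
disp = edge_displacements(measured)
worst = max(abs(d) for d in disp)             # worst-case fractional offset
```

Comparing each `disp` entry against design minimum/nominal/maximum bounds is a short, bounded check per bit position, which is why no long data streams (or long test times) are needed.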