suppliers to proceed to the next phase. This next phase is typically the benchmark phase, where the
“paper performance” is physically verified. Benchmarking is discussed in the following section.
A good evaluation matrix can provide a basis for an impartial decision and should evaluate business as well as technical criteria. Although the personal experiences of team members, both positive and negative, should be considered in the evaluation process, personal biases should not overly influence the decision. The various categories should be weighted, with a higher scaling number given to the areas of greater importance. For each category, the graded number multiplied by the scaling factor gives a subtotal, and the subtotals are then summed to produce a total score for each vendor.
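The weighting arithmetic described above can be expressed as a short script. The following Python sketch uses hypothetical category names, scaling factors, and grades purely for illustration; the actual criteria and weights would come from the evaluation team's matrix.

# Minimal sketch of a weighted tester-evaluation matrix.
# Category names, scaling factors, and grades are hypothetical placeholders.

WEIGHTS = {  # scaling factor per category (higher = more important)
    "Specifications and performance": 5,
    "Hardware": 4,
    "Software": 4,
    "Scalability": 3,
    "Useful life / upgradeability": 2,
}

def total_score(grades: dict[str, int]) -> int:
    """Multiply each graded number by its scaling factor and sum the subtotals."""
    return sum(WEIGHTS[category] * grade for category, grade in grades.items())

# Example: compare two hypothetical vendors graded on the same scale.
vendor_a = {"Specifications and performance": 8, "Hardware": 7, "Software": 6,
            "Scalability": 9, "Useful life / upgradeability": 7}
vendor_b = {"Specifications and performance": 7, "Hardware": 9, "Software": 8,
            "Scalability": 6, "Useful life / upgradeability": 8}

for name, grades in [("Vendor A", vendor_a), ("Vendor B", vendor_b)]:
    print(f"{name}: total score = {total_score(grades)}")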
A sample list of evaluation criteria includes the following:
Tester Evaluation Criteria: Technical Criteria
Specifications and performance. These are based on the vendor's published specifications; determine whether they are "guaranteed" or "typical":
Guaranteed: warranted performance that is verified by calibration and diagnostics
Typical: performance expected under normal operating conditions
Hardware
Digital: number of pins, data rate, number of timing edges, formats, EPA, OTA, vector capabilities, capture memory, sequencer capabilities, driver and comparator pin electronics
Clocks: number of pins, frequency, accuracy, jitter
Device power supplies: number of supplies available, voltage and current force/measure ranges, accuracy and speed, IDDQ capability
PMU: types—whether system, per board, or per pin; number of pins, voltage and current
force/measure ranges, accuracy and speed
SCAN: number of pins, vector depth, ac timing capabilities
Memory test: number of address and data pins, HW or SW APG, APG rate, scrambling, pattern library, pattern language, redundancy repair capabilities
Analog: number of source and measure pins, BW, sample rates, number of bits of resolution, spurious free dynamic range (SFDR), and accuracy
RF: number of source and measure pins, frequency range, power levels, accuracy, modulation formats
Software
Workstation, operating system, and networking environment
Software usability (development environment, SW languages, and/or GUIs and debug tools)
Reliability (system running and bug issues)
Revision control
Design-to-test links and ATPG tools from EDA/simulation tools
Support for STIL, CTL, other industry standards
Data logging, data collection, and analysis tools (STDF, statistical management of data)
Ease of integration into the IC manufacturer’s design and production processes
Scalability—flexibility to cover a broad set of applications
Useful life of the tester
Future upgradeability: "headroom" and flexibility of architecture
Committed sales and support life of platform