
Publication


Featured research published by Timothy J. Wiltshire.


IBM Journal of Research and Development | 1997

Advanced DUV photolithography in a pilot line environment

Christopher P. Ausschnitt; Allan C. Thomas; Timothy J. Wiltshire

As the critical path to increasing circuit density, deep-ultraviolet (DUV) lithography has played a key role in the development of new semiconductor products. At present, DUV refers to imagery at the 248-nm wavelength, with the introduction of 193-nm photolithographic systems anticipated in the next few years. This paper presents an overview of DUV lithography applications in the IBM Advanced Semiconductor Technology Center (ASTC). Since 1990, we have used DUV lithography for critical levels of advanced generations of DRAM (64Mb, 256Mb, and 1Gb) and associated families of logic products. We describe the means by which DUV capability and productivity have increased in a decreasing process window environment. Tooling, processes, and process control systems have undergone continuous improvement to accommodate increasing wafer starts and the rapid introduction of new products.


IBM Journal of Research and Development | 1992

A statistical approach to quality control of non-normal lithographical overlay distributions

Robert M. Booth; Kurt A. Tallman; Timothy J. Wiltshire; Pui L. Yee

To achieve the high reliability and performance required by integrated circuit (IC) chips in IBM Enterprise System/9000™ processors, lithography tool centerline overlay variations between masking levels were specified at ±0.3 µm, and circuit design images were transferred with 5× step-and-repeat photolithography tools. In contrast to data obtained from 1× lithography tools, the level-to-level overlay data which characterize deviations from circuit design rules did not fit a normal distribution, and quality control was not achieved with traditional statistical procedures. A methodology was empirically developed which transformed measured data into worst-case overlay points and approximated the data by a gamma distribution. More than 80% of the worst-case distributions were fit by the gamma distribution. The transformation of chip worst-case overlay data and the quality control testing applicable to 5× step-and-repeat lithography tool processes are described in this paper.
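The worst-case transformation and gamma fit described above can be sketched in a few lines. Everything here is an illustrative assumption, not the paper's actual procedure: the synthetic residuals, the choice of radial magnitude as the per-chip worst-case metric, and the ±0.3 µm limit check.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-chip overlay residuals (x, y) in micrometers,
# four measurement sites per chip.
dx = rng.normal(0.0, 0.05, size=(500, 4))
dy = rng.normal(0.0, 0.05, size=(500, 4))

# Transform measured data into one worst-case overlay value per chip
# (here: the largest radial magnitude across the sites).
worst_case = np.hypot(dx, dy).max(axis=1)

# Approximate the worst-case data by a gamma distribution
# (location fixed at zero, since overlay magnitude is non-negative).
shape, loc, scale = stats.gamma.fit(worst_case, floc=0.0)

# Quality-control check: estimated fraction of chips whose worst-case
# overlay exceeds the ±0.3 um centerline specification.
p_fail = stats.gamma.sf(0.3, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.3f}, P(overlay > 0.3 um)={p_fail:.4f}")
```

The non-negativity and right skew of worst-case magnitudes are what make a gamma model a natural candidate where a normal fit fails.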


Proceedings of SPIE | 2010

The GridMapper challenge: how to integrate into manufacturing for reduced overlay error

Allen H. Gabor; Bernhard R. Liegl; Michael Pike; Emily M. Hwang; Timothy J. Wiltshire

More sophisticated corrections of overlay error are required because technology is scaling faster than fundamental tool improvements. Starting at the 45 nm node, the gap between matched-machine overlay error (MMO) and the technology requirement has decreased to the point where additional overlay correction methods are needed. This paper focuses on the steps we have taken to enable GridMapper™, offered by ASML, as a method to reduce overlay error. The paper reviews the basic challenges of overlay error and previous standard correction practices. It then describes the implementation of GridMapper in IBM's 300 mm fabrication facility, the challenges we faced, and the improvements in overlay control observed with the use of this technique. Specifically, this paper illustrates several improvements:

1. Minimization of non-linear grid signature differences between tools
2. Optimization of overlay corrections across all fields
3. Decreased grid errors, even on levels not using GridMapper
4. Maintenance of the grid for the lifetime of a product
5. Effectiveness in manufacturing: cycle time, automated corrections for tool grid signature changes, and overlay performance similar to dedicated-chuck performance


Proceedings of SPIE | 2009

Focus and dose characterization of immersion photoclusters

Timothy A. Brunner; Daniel Corliss; Timothy J. Wiltshire; Christopher P. Ausschnitt

The process window for state-of-the-art chip manufacturing continues to decrease, driven by higher-NA exposure tools and lower k1 values. The benefits of immersion lithography for depth of focus (DoF) are well known, yet even with this immersion boost, NA = 1.35 tools can push DoF into sub-100 nm territory. In addition, immersion processes are subject to new sources of dose and focus variation. In order to realize the full potential of immersion lithography, it is necessary to characterize, understand, and attack all sources of process variation. Previous work has established our dose/focus metrology capability [1], in which we expose Process Monitor Grating (PMG) targets with high sensitivity to focus, measure the PMGs using scatterometry, and use the Ausschnitt dose/focus deconvolution approach to determine focus errors to within a few nm and dose errors to within 0.1%. In this paper, we concentrate on applying this capability to detailed measurements of immersion photoclusters utilizing ASML exposure tools. Results include:

- comparison of Twinscan 1700i and 1900i focus capability
- effectiveness of the Reticle Shape Correction (RSC) for non-flat reticles
- visualization of non-flat wafer chucks, tilted image planes, and other systematic focus error components
- tracking of tool trends over time, using automated monitor wafer flows

The highly systematic nature of the observed focus errors suggests potential for future improvements in focus capability.
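The deconvolution idea, that two monitor targets with different dose and focus sensitivities yield two CD readings from which the two error sources can be separated, can be illustrated with a toy model that is linear in (dose, focus²). The coefficients and the purely quadratic focus response below are assumptions for illustration, not the paper's calibrated models; note that a quadratic response loses the sign of the focus error, which the sketch acknowledges by returning |focus|.

```python
import numpy as np

# Assumed response model for each process-monitor target i:
#   CD_i = c0_i + c1_i * dose + c2_i * focus**2
# One focus-sensitive target (large |c2|) and one dose-sensitive target
# (large c1) make the 2x2 system well conditioned.

def deconvolve(cd1, cd2, coef1, coef2):
    """Recover (dose, |focus|) from two target CDs by solving the
    linear system in the variables (dose, focus**2)."""
    A = np.array([[coef1[1], coef1[2]],
                  [coef2[1], coef2[2]]])
    b = np.array([cd1 - coef1[0], cd2 - coef2[0]])
    dose, focus_sq = np.linalg.solve(A, b)
    return dose, np.sqrt(max(focus_sq, 0.0))

coef_focus = (100.0, 0.5, -2000.0)   # strongly focus-sensitive target
coef_dose  = (100.0, 3.0,  -200.0)   # strongly dose-sensitive target

# Forward-simulate a point with known errors, then recover them.
true_dose, true_focus = 0.8, 0.05
cd1 = coef_focus[0] + coef_focus[1] * true_dose + coef_focus[2] * true_focus**2
cd2 = coef_dose[0]  + coef_dose[1]  * true_dose + coef_dose[2]  * true_focus**2
dose, focus = deconvolve(cd1, cd2, coef_focus, coef_dose)
print(f"recovered dose={dose:.3f}, |focus|={focus:.3f}")
```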


Proceedings of SPIE | 2009

Manufacturing implementation of scatterometry and other techniques for 300-mm lithography tool controls

Timothy J. Wiltshire; Daniel Corliss; Timothy A. Brunner; Christopher P. Ausschnitt; R. Young; R. Nielson; Emily M. Hwang; J. Iannucci

Focus and dose control of lithography tools for leading-edge semiconductor manufacturing is critical to obtaining acceptable process yields and device performance. The need for these controls is increasing due to the apparent limitation of optical water-immersion lithography at NA values of approximately 1.35 and the need to use the same equipment for 45 nm, 32 nm, and 22 nm node production. There is a rich history of lithographic controls using various techniques described in the literature, including (but not limited to) Phase Grating Focus Monitoring (PGFM) [1], optical CD control using optical overlay metrology equipment (OOCD) [2,3], and, in more recent years, optical scatterometry [4,5]. Some of these techniques, even though they are technically sound, have not been practical to implement in volume manufacturing as controls for various reasons. This work describes the implementation and performance of two of these techniques (optical scatterometry and OOCD) in a volume 300 mm production facility. Data reviewed include:

- General implementation approach.
- Scatterometry dose and focus stability data for 193 nm immersion and 248 nm dry lithography systems.
- Analysis of the stability of optical scatterometry dose and focus deconvolution coefficients over time for 193 nm immersion and 248 nm dry systems.
- Comparison between scatterometry and OOCD techniques for focus monitoring of 248 nm dry systems.

The presentation also describes the practical issues with implementing these techniques, as well as some possible extensions to enhance the current capabilities.


Metrology, Inspection, and Process Control for Microlithography | 2000

Subwavelength alignment mark signal analysis of advanced memory products

Xiaoming Yin; Alfred K. K. Wong; Donald C. Wheeler; Gary Dale Williams; Eric Alfred Lehner; Franz X. Zach; Byeong Y. Kim; Yuzo Fukuzaki; Zhijian G. Lu; Santo Credendino; Timothy J. Wiltshire

The impact of alignment mark structure, mark geometry, and stepper alignment optical system on mark signal contrast was investigated using computer simulation. Several sub-wavelength polysilicon recessed film stack alignment targets of advanced memory products were studied. Simulated alignment mark signals for both dark-field and bright-field systems, computed using the rigorous electromagnetic simulation program TEMPEST, showed excellent agreement with experimental data. For a dark-field alignment system, the critical parameters affecting signal contrast were found to be mark size and mark recess depth below the silicon surface. On the other hand, film stack thickness and mark recess depth below/above the silicon surface are the important parameters for a bright-field alignment system. Optimal process parameters are determined from the simulation results, and some signal enhancement techniques are discussed.


Metrology, Inspection, and Process Control for Microlithography | 2002

Ultrafast wafer alignment simulation based on thin film theory

Qiang Wu; Gary Williams; Byeong Y. Kim; Jay W. Strane; Timothy J. Wiltshire; Eric Alfred Lehner; Hiroyuki Akatsu

The shrink of the semiconductor fabrication ground rule has continued to follow Moore's law over the past years. However, at the 100 nm node, fabrication cost starts to rise rapidly, mainly due to the increase of complexity in the fabrication process, including the use of hard masks, planarization, resolution enhancement techniques, etc. Smaller device sizes require tighter alignment tolerances, and the higher degree of complexity makes alignment detection more difficult. For example, planarization techniques may destroy mark topography, hard masks may optically bury alignment marks, and additional film layers make the alignment signal more susceptible to process variations. Therefore, in order to achieve reliable alignment, it is critical to develop accurate and fast simulation software that can characterize alignment performance based on the film stack structure. In this paper, we demonstrate that we have built an extremely fast alignment signal simulator for both direct imaging and diffractive detection systems based on simple optical theory. We demonstrate, through examples using our advanced DRAM products, that it is capable of accurately mapping the multi-dimensional parameter space spanned by various film thickness parameters within a short period of time, which allows both on-the-fly feedback on alignment performance and alignment optimization.
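A film-stack signal model of the kind the abstract describes can be built from standard 2×2 characteristic matrices of thin-film optics. The sketch below computes normal-incidence reflectance of a film stack; the refractive indices are rough textbook values assumed for illustration, and the paper's actual simulator and detection models are more elaborate.

```python
import numpy as np

def film_stack_reflectance(n_layers, d_layers, n_sub, wavelength, n_amb=1.0):
    """Normal-incidence reflectance of a thin-film stack via 2x2
    characteristic matrices. n_layers: refractive indices (top first);
    d_layers: thicknesses in the same units as wavelength."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2.0 * np.pi * n * d / wavelength   # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    # Standard amplitude-reflectance formula for the stacked matrix.
    num = n_amb * (M[0, 0] + M[0, 1] * n_sub) - (M[1, 0] + M[1, 1] * n_sub)
    den = n_amb * (M[0, 0] + M[0, 1] * n_sub) + (M[1, 0] + M[1, 1] * n_sub)
    return abs(num / den) ** 2

# Example: bare silicon vs. silicon under a 100 nm oxide film at 633 nm
# (indices ~3.88 for Si, ~1.46 for SiO2 are rough textbook values).
r_bare = film_stack_reflectance([], [], n_sub=3.88, wavelength=633.0)
r_oxide = film_stack_reflectance([1.46], [100.0], n_sub=3.88, wavelength=633.0)
print(f"bare Si R={r_bare:.3f}, 100 nm oxide R={r_oxide:.3f}")
```

Sweeping `d_layers` over a grid of film thicknesses is the kind of fast parameter-space mapping the abstract refers to; each evaluation is only a handful of small matrix products.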


Proceedings of SPIE | 2013

CD optimization methodology for extending optical lithography

C. Wong; G. Seevaratnam; Timothy J. Wiltshire; Nelson Felix; Timothy A. Brunner; Pawan Rawat; Maryana Escalante; W. Kim; Erica Rottenkolber; A. Elmalk; Vivien Wang; Christian Marinus Leewis; Paul Hinnen

This paper describes the joint development and optimization of an advanced critical dimension (CD) control methodology at IBM's 300 mm semiconductor facility. The work is initially based on 22 nm critical-level gate CD control, but the methodology is designed to support both the lithography equipment (1.35 NA scanners) and processes for 22, 20, 18, and 14 nm node applications. Specifically, this paper describes the CD uniformity of processes with and without enhanced CD control applied. The control methodology is differentiated from prior approaches [1] by combining independent process tool compensations into an overall CD dose correction signature to be applied by the exposure tool. In addition, initial investigations of product-specific focus characterization and correction are also described.
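The core combining step, folding independent per-tool CD error signatures into one dose correction applied at exposure, can be sketched as follows. The signatures, the radial sampling grid, and the dose sensitivity are all invented for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical across-wafer CD error signatures (nm) attributed to
# independent process tools, sampled on a 1-D radial grid.
radius = np.linspace(0.0, 150.0, 7)          # mm from wafer center
etch_sig = 0.004 * radius - 0.3              # edge-fast etch bias
bake_sig = -0.002 * radius + 0.2             # center-hot bake plate
depo_sig = 1e-5 * radius**2 - 0.1            # deposition non-uniformity

# Combine the independent compensations into one overall CD error
# signature, then convert it to an exposure-dose correction using an
# assumed dose sensitivity.
total_cd_error = etch_sig + bake_sig + depo_sig          # nm
dose_sensitivity = 2.5                                   # nm CD per mJ/cm^2 (assumed)
dose_correction = -total_cd_error / dose_sensitivity     # mJ/cm^2 offsets
print(np.round(dose_correction, 3))
```

Because the compensations are combined before conversion, the exposure tool only has to apply a single dose signature rather than one per upstream process.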


Proceedings of SPIE | 2012

High-order wafer alignment in manufacturing

Michael Pike; Nelson Felix; Vinayan C. Menon; Christopher P. Ausschnitt; Timothy J. Wiltshire; Sheldon Meyers; Won Kim; Blandine Minghetti

Requirements for ever-tightening overlay control are driving improvements in tool setup and matching procedures, APC processes, and wafer alignment techniques in an attempt to address both systematic and non-systematic sources of overlay error. Thermal processes used in semiconductor manufacturing have been shown to have drastic and unpredictable impacts on lithography overlay control. Traditional linear alignment can accommodate symmetric and linearly uniform wafer distortions even if these distortions vary in magnitude from wafer to wafer. However, linear alignment cannot accommodate asymmetric wafer distortions caused by variations in film stresses and rapid thermal processes. Overlay improvement techniques such as corrections per exposure can be used to compensate for known systematic errors, but systematic corrections applied on a lot-by-lot basis cannot account for wafer-to-wafer grid distortions caused by semiconductor processing. With high-order wafer alignment (HOWA), the sample size of wafer alignment data is significantly increased and the data are modeled to correct for process-induced grid distortions; HOWA grid corrections are calculated and applied for each wafer, and improved wafer-to-wafer overlay performance was demonstrated. We also examine how HOWA corrections propagate level to level in a typical alignment tree, as well as the interaction of mixing and matching high-order wafer alignment with traditional linear alignment used on less overlay-critical levels. This evaluation included assessing the impact of overlay offsets added by systematic tool matching corrections, product-specific corrections per exposure, and 10-term APC process control.
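The contrast between linear alignment and high-order wafer alignment can be illustrated by fitting polynomial grid models of increasing order to per-wafer alignment residuals. The synthetic distortion (a linear term plus a cubic bow) and the noise level below are assumptions for illustration, not measured data or the tool vendor's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Alignment mark positions (mm) and measured x-overlay residuals (nm)
# for one wafer: offset + linear grid term + cubic distortion + noise.
x = rng.uniform(-140.0, 140.0, 60)
y = rng.uniform(-140.0, 140.0, 60)
dx = 5.0 + 0.02 * x + 5e-6 * x**3 + rng.normal(0.0, 1.0, 60)

def design_matrix(x, y, order):
    """Polynomial terms x**i * y**j with i + j <= order."""
    cols = [x**i * y**j for i in range(order + 1)
                        for j in range(order + 1 - i)]
    return np.column_stack(cols)

# Fit a linear (order 1) and a high-order (order 3) grid model per wafer
# and compare the uncorrectable residual left by each.
results = {}
for order in (1, 3):
    A = design_matrix(x, y, order)
    coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
    results[order] = (dx - A @ coef).std()
    print(f"order {order}: residual 3-sigma = {3 * results[order]:.2f} nm")
```

The high-order fit absorbs the cubic distortion that the linear model cannot, which is the mechanism by which per-wafer HOWA corrections reduce wafer-to-wafer overlay error.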


Proceedings of SPIE | 2008

Determining DOF requirements needed to meet technology process assumptions

Allen H. Gabor; Andrew Brendler; Bernhard R. Liegl; Colin J. Brodsky; Gerhard Lembach; Scott M. Mansfield; Shailendra Mishra; Timothy A. Brunner; Timothy J. Wiltshire; Vinayan C. Menon; Wai-kin Li

Depth of Focus (DOF) and exposure latitude requirements have long been ambiguous. Techniques range from scaling values from previous generations to summing individual components from the scanner. Even more ambiguous is what critical dimension (CD) variation can be allowed to originate from dose and focus variation. In this paper we discuss a comprehensive approach to measuring focus variation that a process must be capable of handling. We also describe a detailed methodology to determine how much CD variation can come from dose and focus variation. This includes examples of the statistics used to combine individual components of CD, dose and focus variation.
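One common way to combine independent variation components of the kind the abstract alludes to is a root-sum-of-squares (RSS) budget. The component names and values below are illustrative assumptions, not the paper's numbers.

```python
import math

# Illustrative CD-variation budget (nm, 3-sigma) for a critical level.
components = {
    "dose":  1.2,   # CD variation induced by dose errors
    "focus": 1.8,   # CD variation induced by focus errors
    "mask":  1.0,
    "etch":  1.5,
}

# Independent random components combine as a root-sum-of-squares,
# which is far less pessimistic than a straight linear sum.
rss = math.sqrt(sum(v**2 for v in components.values()))
linear = sum(components.values())
print(f"RSS total = {rss:.2f} nm, linear total = {linear:.2f} nm")
```

Inverting the same arithmetic (subtracting the other components from the total budget in quadrature) is how a remaining allowance for focus-induced CD variation, and hence a DOF requirement, can be backed out.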
