New data from the National Institute of Standards and Technology (NIST) will assist in the design of optics for liquid immersion lithography, an old idea that recently has attracted new interest as a possible means of improving image resolution and thereby shrinking feature sizes of computer chips.
Conventional optical lithography has advanced sufficiently to achieve a resolution of 100 nanometers (billionths of a meter), but there are physical and technical limits to how much better it can get. By placing certain liquids between the final optical element and the silicon wafer, it may be possible to extend the resolution to 65 nanometers for state-of-the-art lithography using the 193-nanometer wavelength of light, or even 45 nanometers or below for future systems using the 157-nanometer wavelength.
A key characteristic of liquids to be used in immersion lithography is their refractive index, which governs how much light bends as it crosses an interface, such as that between the liquid and a lens or a silicon wafer. Air has an index close to one. By contrast, water has a refractive index almost 50 percent higher (about 1.44 at the 193-nanometer wavelength). Placing this higher-index fluid between the lens and the silicon wafer reduces the resolution-limiting effects of diffraction, enabling imaging of smaller feature sizes.
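The resolution gain from a higher-index fluid follows from the Rayleigh criterion, R = k1 · λ / (n · sin θ), where the immersion liquid's index n multiplies the effective numerical aperture. The sketch below illustrates this relationship; the k1 factor and aperture value are illustrative assumptions, not figures from the article.

```python
# Rayleigh resolution criterion for optical lithography:
#   R = k1 * wavelength / (n * sin_theta)
# Immersion raises the effective numerical aperture by the
# liquid's refractive index n. The k1 and sin_theta values
# below are illustrative assumptions only.

def resolution_nm(wavelength_nm, k1, n, sin_theta):
    """Smallest printable feature size, in nanometers."""
    return k1 * wavelength_nm / (n * sin_theta)

# Dry exposure (air, n ~ 1.00) vs. water immersion (n ~ 1.44 at 193 nm)
dry = resolution_nm(193, k1=0.35, n=1.00, sin_theta=0.93)
wet = resolution_nm(193, k1=0.35, n=1.44, sin_theta=0.93)
print(f"dry: {dry:.0f} nm")  # roughly 73 nm
print(f"wet: {wet:.0f} nm")  # roughly 50 nm
```

With everything else held fixed, switching from air to water shrinks the printable feature size by the same factor that water's index exceeds air's, which is the effect the article describes.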
Laura Ost | EurekAlert!