Near-field optics: from subwavelength illumination to nanometric shadowing

Near-field optics uniquely addresses problems of x, y and z resolution by spatially confining the effect of a light source to nanometric domains. The problems in using far-field optics (conventional optical imaging through a lens) to achieve nanometric spatial resolution are formidable. Near-field optics serves a bridging role in biology between optical imaging and scanned probe microscopy. The integration of near-field and scanned probe imaging with far-field optics thus holds promise for solving the so-called inverse problem of optical imaging.

The world of microscopy can be divided into two defined approaches: lens-based imaging and lensless imaging (an overview of microscopy today is shown in Fig. 1). This article describes a method of lensless optical imaging whereby optical resolution can be increased by as much as 10-fold in the x and y dimensions and >100-fold in the z dimension. The technique is called near-field scanning optical microscopy (NSOM).

To place this new approach within the constellation of microscopic methodologies available today, a starting point is lens-based imaging, which was the first approach to be developed for microscopic characterization with the development of the optical microscope, invented some 400 years ago. In general, lens-based microscopes are an example of far-field imaging in which the imaging element, the lens, is placed many wavelengths away from the object and focuses and images the appropriate radiation onto or from an object under analysis. All lens-based imaging is limited by criteria such as the Rayleigh resolution limit. The Rayleigh criterion defines the xy resolution that can be achieved based on the wavelength of the radiation that is used, the acceptance angle of the lens and the index of refraction of the medium in which the radiation is propagating. As is well known for lens-based optical microscopy, this means that single-wavelength illumination can resolve the rod-like geometry of a 1-μm bacterium but not the geometry of even the largest virus (pox virus, with a size of 0.25 μm)—which requires a significant reduction in the wavelength of the radiation, to the point where an electron microscope has to be used. None of the most advanced lens-based, purely optical techniques, even the most advanced methods of short-pulse excitation, have achieved the goal of imaging a virus.
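The Rayleigh criterion just described can be put in quantitative form. The following is a minimal sketch (the function name and the numerical values are illustrative assumptions, not taken from the text) of the standard formula d = 0.61λ/(n sin θ):

```python
import math

def rayleigh_limit(wavelength_nm, n, half_angle_deg):
    """Minimum resolvable separation d = 0.61 * lambda / (n * sin(theta)),
    where n * sin(theta) is the numerical aperture of the lens."""
    numerical_aperture = n * math.sin(math.radians(half_angle_deg))
    return 0.61 * wavelength_nm / numerical_aperture

# Illustrative: green light through a high-NA oil-immersion objective
d = rayleigh_limit(wavelength_nm=500, n=1.515, half_angle_deg=67.5)
print(f"Rayleigh limit: {d:.0f} nm")  # → Rayleigh limit: 218 nm
```

Even under these favorable assumed conditions the limit sits near 0.2 μm, consistent with the point above that a 1-μm bacterium is resolvable but a 0.25-μm virus is not.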

In addition to the problem of diffraction, which results in the Rayleigh resolution limit, another critical problem in all lens-based microscopy techniques is out-of-focus light (Fig. 2). This problem is clearly highlighted by the fact that the z resolution of an optical microscope, 1.6 μm, is much poorer than the xy resolution noted above. A powerful method to reduce this interference is confocal imaging, which places apertures in the illumination path before the lens and the detection path after the lens (Fig. 2). This limits the detected rays to those close to the plane of focus, within 0.7 μm. The reduction in out-of-focus light also has an effect on the xy resolution, which as a result is brought close to the theoretical limit of the Rayleigh criterion, which is 0.25 μm in x and y.

The increase in resolution, relative to conventional optical imaging, achieved by confocal imaging is only a factor of 2 in the x, y and z dimensions (Table 1). Yet confocal microscopy has transformed fields from biology to semiconductor manufacturing—highlighting the importance of even moderate improvements in the resolving power of light.

An alternative way to reduce out-of-focus light is to use a prism to create an evanescent field of radiation. This is a field of light that is attached to the surface of the prism and decays in intensity exponentially as a function of distance from the prism surface. Parts of an object that sit on the prism surface are illuminated to an extent of <0.3 μm, which is the maximal penetration into the object of such an evanescent field. This approach to illuminating objects effectively halves the z illumination relative to confocal microscopy. However, this means that only the underside of a sample is effectively illuminated, and only with samples that are transparent can the illuminated region be imaged with a lens that is placed above the prism surface. The xy resolution in such a methodology is of course limited by the same criteria that limit lens-based image formation in x and y.
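The exponential decay of the evanescent field can be sketched quantitatively. This is a hedged illustration (the interface parameters are assumptions, not from the text) of the standard expression for the 1/e decay depth at total internal reflection:

```python
import math

def evanescent_decay_depth(wavelength_nm, n1, n2, theta_deg):
    """1/e intensity decay depth d of the evanescent field when light is
    totally internally reflected at an n1/n2 interface at angle theta.
    Intensity falls off as I(z) = I0 * exp(-z / d)."""
    s = (n1 * math.sin(math.radians(theta_deg))) ** 2 - n2 ** 2
    if s <= 0:
        raise ValueError("below the critical angle: no evanescent field")
    return wavelength_nm / (4 * math.pi * math.sqrt(s))

# Illustrative glass/water interface
d = evanescent_decay_depth(wavelength_nm=500, n1=1.515, n2=1.33, theta_deg=70)
print(f"decay depth: {d:.0f} nm")
```

After a few such decay depths the intensity is negligible, consistent with the <0.3-μm maximal penetration quoted above.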

Near-field microscopy was developed in our laboratory1 to break the resolution limit in x, y and z by sending light through an aperture that is much smaller than the wavelength of light and then scanning the aperture or the sample relative to each other at a distance much smaller than a wavelength. This ensures that the light interaction occurs before the enormous effects of diffraction come into play (see Fig. 2). The xy resolution is limited to the dimension of the aperture, whereas the z extent of effective illumination is defined by the diffraction of the aperture. An evanescent field emanates from the aperture, and diffraction limits the region in z that has sufficient fluence to generate optical contrast to roughly 10 nm from the surface of a 50-nm aperture. This is 30-fold better than evanescent wave imaging. Greater z penetration can be achieved with larger apertures. For example, an aperture of 0.25 μm, which is the limit of lens-based imaging, provides an enormous signal in near-field optics and achieves penetrations in z close to those of evanescent field imaging.

Thus, near-field microscopy1-3 is a means of spatially confining light that is rightly classified as lensless, but is within the intellectual genre of approaches in lens-based imaging discussed above, such as confocal microscopy. This is in contrast to the techniques of lens-based nonlinear microscopies, such as two-photon microscopy, three-photon microscopy and second-harmonic generation (SHG), which can be classified as spectral, rather than spatial, approaches to confining the effects of light. They use ultrafast lasers interacting with materials to limit the effective fluence in z by the probability of the multiple photon interactions that are required to elicit the spectral phenomena being monitored. This is the case for all the nonlinear methods listed in Figure 1 aside from SHG.

In the 1980s, our group realized that the requirement in SHG for asymmetric distributions could effectively be applied to selectively investigate molecules at biological interfaces, such as membranes (see ref. 4 and references therein). Our extensions of these ideas led to functional nonlinear optical imaging of membrane potential5. These advances, along with the contributions of others, are described elsewhere in this issue (see Loew and Campagnola, p. 1356).

In the following review, we place near-field optics and its development and achievements within the context of state-of-the-art far-field optical imaging and its application in the life sciences. The synergism resulting from combining near-field microscopy and SHG is described later in this review.

Aperture-based NSOM

Today, near-field optical instrumentation is available that places this technique at the threshold of achieving its enormous potential for nanometric imaging with light. In essence, near-field optics is a bridging technique between two enormous worlds of biologically relevant imaging, optics on the one hand and scanned probe microscopy (SPM) on the other. This bridging role needed the appropriate instrumentation to bring the technique of near-field optics to where it now finds itself in the expanding world of microscopy. To give the reader a true sense of what has been accomplished and what is the potential for the future, it is necessary to give some background of previous approaches that led to the present instrumental capabilities (see Box 1). The present instrumentation, developed in our laboratories, can be used with any probe that is available today and any mode of operation known today. This flexibility is crucial to enable NSOM to assume a bridging, interrelating role in which it can be transparently integrated with other microscopic techniques, including optical and spectral (e.g., Raman) as well as electron and ion optical.

In the early 1980s, when we started putting our ideas of near-field optics into practice, the first goal was to clearly define how much light could come out of a well-characterized subwavelength aperture2. There was no knowledge and a great deal of skepticism as to how much light could emanate from, say, a 50-nm aperture. There was even skepticism that such an aperture emanating light could be fabricated in a silicon wafer using nanolithography, which was then in its infancy. Our initial experiments published in 1984 were a huge success2, however, which encouraged us to move forward.

The next step was to address the problem of how to design a simple, reproducible aperture that could track real surfaces, which are markedly rough. Our first apertures2 used silicon-processing techniques, but these techniques were in their infancy and incapable of handling this complicated task. We therefore invented a simple method, based squarely in biology, that gave a simple, cheap and reliable methodology for fabricating such apertures6. This technique, used in a majority of near-field optical measurements today, is based on electrophysiological methods for producing intracellular electrodes. Glass structures with apertures as small as 50 nm after metal coating were first constructed using a Flaming/Brown glass puller (Sutter, Novato, CA, USA), which relies on heating, tension and controlled pulling to taper a piece of glass. The initial glass structures pulled by this method were micropipettes, because the nichrome heaters used in the first Flaming/Brown pullers were simply not capable of melting quartz fibers. The year was 1986, the year of the invention of the ground-breaking technique called atomic force microscopy (AFM). AFM was certainly an enabling technology for the confinement, manipulation and analysis of light in nanometric domains.

In terms of light propagation, subwavelength apertures illuminated through an air medium (as in a tapered glass nanopipette) and silicon apertures show substantial scattering, because of the large mismatch between the refractive indices of glass and air and the absence of guiding of the light in the tapered region where a fiber would still allow light propagation. Thus, when carbon dioxide lasers were introduced to the technology of pipette pulling, a natural progression to quartz fibers ensued7. This resulted in higher throughput and the ability to guide light effectively in the illumination or collection mode of operation. Today, the Flaming/Brown puller based on carbon dioxide laser heating is the puller of choice.

By the early 1990s, the full import of AFM was being felt, and AFM was a natural means to bring a subwavelength aperture into the near-field in a controlled fashion if the aperture was placed at the tip of a force-sensing structure. The glass-tapering technology described above did exactly this, and the first atomic force technique that was applied to the problem of near-field optics was shear or lateral force. Of the four versions of the technique that were originally independently developed, the one based on the tuning fork or emulations of the tuning fork is nearly universally employed today8. In this technique, invented by Karrai and Grober8, a straight fiber probe is mounted on one of the tines of the tuning fork and modulated by several nanometers. When the probe tip approaches a surface, the frequency of the tine on which the fiber is mounted is altered relative to the tine that is free (Fig. 3a). This produces a change in the amplitude and phase of the modulation of the fiber-bearing tine relative to the free tine, and results in a signal that can be used by the electronics to keep the probe tip in relative proximity to the surface, by piezoelectrically altering the sample and/or tip position to keep the signal from the tuning fork constant.
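The feedback loop just described can be caricatured in a few lines. This is a toy sketch, not instrument firmware: the amplitude model, gain and setpoint are invented for illustration, and real controllers are considerably more elaborate.

```python
import math

def tine_amplitude(z_nm):
    """Toy model: the tine's modulation amplitude (normalized) is damped
    as the tip approaches the surface (z_nm = tip-sample gap)."""
    return 1.0 - math.exp(-z_nm / 10.0)

def shear_force_feedback(setpoint=0.8, gain=10.0, steps=300):
    """Integral-style loop: step the z piezo until the measured amplitude
    sits at the setpoint, i.e. at a fixed tip-sample distance."""
    z = 0.0
    for _ in range(steps):
        error = setpoint - tine_amplitude(z)
        z += gain * error          # piezo correction proportional to error
    return z

z = shear_force_feedback()
print(f"gap held near {z:.1f} nm")  # → gap held near 16.1 nm
```

The loop settles where the damped amplitude equals the setpoint; scanning then records the piezo position needed to hold that condition at each pixel.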

The ease of use of shear force has always left something to be desired. This has less to do with the tuning-fork technique—which is thought to be superior even to conventional AFM feedback when applied in standard AFM geometries of normal force9—than with the fact that the forces in the normal direction are larger and more defined than the forces in the lateral direction. Thus, for example, even a water layer, which is found on all (even dry) surfaces, contributes greatly to a lateral force signal. This leaves doubt as to the exact distance of the probe tip from a surface. Also of considerable importance for biological applications is that liquid near-field microscopy is not available in any commercial instrument using straight fiber probes. In addition, data obtained in shear-force mode are not directly comparable, in terms of AFM interactions, with conventional AFM imaging, where the standard is normal-force feedback, for the reasons detailed above. Furthermore, straight fiber probe geometries seriously perturb integration with standard optical or other microscopes by blocking the optical axis from above or below.

To circumvent these difficulties, in the early 1990s, our group10 showed that such straight glass structures could readily be cantilevered and formed excellent normal-force AFM sensors. Fujihara and coworkers11, our laboratory12 and Dunn13 showed that excellent near-field optical images could be obtained with cantilevered optical fibers (Fig. 3b) together with on-line normal-force topographic imaging. The probes can be used in either contact or noncontact mode with nano-Newton forces between the tip and the surface. Noncontact imaging requires modulating the probe with high resonance frequencies and detecting the change in amplitude and phase in the feedback signal as the probe tip interacts with a surface. These cantilevers are remarkably similar, in their geometrical and force characteristics, to the ideal in atomic force sensors that is being sought but is hard to achieve in standard silicon AFM sensors14. These features include a single-beam, rather than the standard dual-beam, cantilever structure, which has been shown to reduce cantilever twisting14, and a probe tip that is exposed to the optical axis of the microscope (compare Fig. 3b and Fig. 3c). Other probes whose application will be discussed below are shown in Fig. 3d–h.

An example of imaging with such fiber probes completed with the system described in Box 1 is shown in Figure 4. The images starting from the top left (Fig. 4a–c) are, in sequence, a low-resolution (10×) optical image of yeast cells taken with the inverted section of the dual far-field optical microscope (seen in the box), then a higher-resolution (20×) optical image to the left of the same field of view, and then a higher-resolution (50×) image with the fiber probe in place illuminating the sample. At right is a view from the upright microscope portion of the dual microscope of the probe out of contact (Fig. 4d) and in contact with the yeast cells (Fig. 4e). Without any further adjustments, the AFM image in Figure 4f is obtained with overlapping fields of view with the optical far-field image. Simultaneously, in another channel a green fluorescent protein (GFP) fluorescence NSOM image of a form of GFP expressed in this budding yeast is recorded with the same probe (Fig. 4g). An NSOM transmission optical image is also shown in Figure 4h.

Accurate placement of the AFM probe within the field of view of the optical images permits imaging from eyeball to nanometers in one integrated instrument and adds detailed topographic information to the optical information that lacks such detail. Also, in terms of SPM in general, significant information is added about the distribution of GFP fluorescence and absorption within the AFM topography. AFM, of course, has no such biologically relevant information.

NSOM optical fiber probes can be used in all aperture-based NSOM applications. These include the excellent transmission imaging that is shown in Figure 4, reflection imaging15 and collection-mode operation16. The cantilevered nature of the probe allows viewing online with the lens along with NSOM and AFM imaging (Fig. 4). The ability to have separate channels of illumination is significant for reflection NSOM imaging, where the light reflected by a sample illuminated with the probe is collected by the lens of the upright microscope. This is important in the imaging of opaque biochips and tissue samples in which transmission NSOM is not possible. It is also important for the integration of NSOM measurements with such modalities as differential interference contrast (DIC) imaging. Independent illumination of a lens of an upright microscope with a probe in place is also of particular importance when one considers new modalities of NSOM imaging (described below).

Today there is a general consensus that more light can be obtained from a straight than from a cantilevered fiber. However, our group has the most extensive comparisons to date, and these indicate that, as would be expected, most light loss occurs at the subwavelength aperture, which is common to both straight and cantilevered NSOM fiber probes. The bend in the cantilevered probe exhibits less loss, by several orders of magnitude, than the aperture and this can be compensated for. In this regard, it should be mentioned that there are two ways to produce subwavelength apertures in fibers. One is the pulling method, as described above, and the other is based on the etching approach pioneered by Ohtsu and his group in Japan17. Etched probes have a structure that should theoretically give higher throughput. However, the etching causes difficulty in coating, and thus today most work is done with pulled fiber probes. Zenobi, Deckert and coworkers18 have also developed a specialized etching technique that reduces extraneous pinholes in the coating. All such probes, however, require quite flat sample geometries, because etching produces very large, flat probe tips.

The importance of normal force feedback has stimulated several workers to try to develop an NSOM aperture with normal force sensing in silicon-based materials19, 20 (see Fig. 3d). Using standard silicon microfabrication to produce such NSOM apertures in silicon AFM sensors has been difficult and progress has been slow. Although it is still not possible to emulate the guiding of light that is inherent to the nature of an optical fiber, at least two techniques have surfaced in addition to traditional microprocessing technology for producing such apertures. The methods include simply coating a silicon nitride cantilever, which is inherently transparent, with metal and then producing an aperture with focused ion-beam techniques. Alternatively, a most innovative technique based on the evanescent field on the surface of a prism (as described above) has been used to create an aperture in a metal-coated transparent silicon nitride cantilever21.

A prerequisite for producing light at the tip of such silicon-based NSOM apertures is the ability to bring a lens in close proximity to the back side of the probe. The platform described in Box 1 allows such illumination. Essentially, this platform enables any probe and any mode that is presently known for NSOM to be effectively used. Silicon probes do not have light-guiding and light-concentration properties, as tapered fibers do, which creates several problems. First, they require large fluences of light from a lens to illuminate the aperture, resulting in a reduction in signal-to-noise ratio. Second, when illuminated, they suffer from the index mismatch problems noted above. Third, they do not allow an on-line separate illumination or collection channel with the lens of the optical microscope, which has to be dedicated to illuminating the aperture (as a result, reflection imaging is difficult and the lack of light-guiding capacity prevents scanning of the probe in collection-mode imaging). Fourth, noncontact or intermittent-contact imaging for such an illuminated aperture has been problematic and liquid imaging has remained difficult. Nonetheless, for femtosecond ablation experiments, where signal-to-noise concerns are not critical, these probes or their higher-damage-threshold glass counterparts (Fig. 3e) may be preferable.

Finally, a new development applicable to all aperture-based NSOM is work indicating that a gold nanoparticle or gold-coated asperity placed on the aperture can alter the boundary conditions so as to increase light throughput through such apertures22. Preliminary results from this research indicate that there is a chance of considerable improvement in resolution for such aperture-based NSOM, to below the nominal 50 nm that is often quoted.

Exponential growth in NSOM methodologies

Aside from standard aperture-based NSOM, there are a whole host of near-field optical methods aimed at obtaining nanometric optical information. Most of these techniques show the considerable importance of full optical integration of lensless and lens-based imaging.

External illumination—active light sources.

One technique that highlights such integration, and indicates the considerable potential for functional near-field optical imaging of samples with biological importance, calls for externally illuminating an aperture containing a fluorophore that emits a point of light as an active light source for NSOM imaging. This technique was first described by our group23. More recently, however, elegant studies have been performed by Sandoghdar and coworkers demonstrating imaging using an active light source consisting of a single molecule (for a review, see ref. 24).

Optical analog of patch clamping.

We have been investigating the potential of this technique for monitoring ion concentrations in and around cellular membranes. The basis of our approach (Fig. 5) is a cantilevered micropipette that is used as a nanovessel for an ion-sensing dye, which is excited using an epi-illumination geometry. The AFM function of the system described in Box 1 is used to provide unprecedented (<1 nm) control of a dye at any position in and around a cell. In the past, we have used pyranine dye fluorescence to monitor pH with nanometric spatial control in and around charged surfaces25. We have now extended these studies and have combined a MultiView 1000 (Nanonics Imaging) system illustrated in Figure 7 with a confocal beam-scanning system (Leica Microsystems). This confocal beam scanner was used to image a neuron while the probe was held at a constant distance above the cell surface (Fig. 5b).

The neuronal cell line that we investigated consisted of primary cultures of hippocampal neurons26. For these studies, the probe tip containing calcium green is held in place with AFM control over a neuronal cell membrane of a cell that is also filled with calcium green. The probe tip and the cell are excited in the epi-illumination geometry (illustrated in Fig. 5a) by a beam-scanning confocal microscope. Each frame in the image is one confocal scan of the cell and the probe tip that takes 250 ms to record. The distance of the tip from the surface of the cell is maintained to 1 nm with the AFM control during all of these scans. Between the fourth and the fifth confocal scans, cell calcium starts to be released from internal stores in response to a puff of caffeine injected just before the first confocal image (left image, top row). In confocal scans five and six (right images, top and middle row of Fig. 5b), the cell continues to respond, and in the seventh scan the calcium is effusing through calcium channels and the tip begins to respond. The tip continues to respond in the eighth scan, where some tip pixels are even red (which indicates the highest fluorescence intensity). At this point the cell response begins to die down. As each confocal scan takes 250 ms, the first response of the cell is at 1.25 s, whereas the first tip response is between 1.50 s and 1.75 s. The tip response begins to decrease between 2.00 s and 2.25 s. Such times are quite reasonable for such a calcium release.
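The timing in this experiment follows directly from the frame rate. A trivial sketch of the arithmetic (the helper is hypothetical; it simply takes each frame as 250 ms):

```python
SCAN_TIME_S = 0.25   # one confocal frame taken as 250 ms to record

def scan_end_time(n):
    """Elapsed time in seconds when the n-th confocal scan (1-indexed) completes."""
    return n * SCAN_TIME_S

print(scan_end_time(5))   # 1.25 — the cell is responding by the fifth scan
print(scan_end_time(7))   # 1.75 — the tip responds during the seventh scan
```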

The technique can potentially be used to monitor ionic alterations in the dendritic spines of neurons without the mechanically perturbing suction that results from patch clamping. Also, ionic fluxes around synaptic terminals can be monitored in conjunction with ultrastructural detection of membrane movement accompanying such ion fluxes. At least one suggested theory of neuronal learning proposes a physical movement of the synapse that increases the synaptic strength, and this technology could potentially be applied to investigate such processes.

External illumination—passive concentrators.

An alternative approach to an external illumination protocol is to illuminate a standard silicon cantilever from the side, with the polarization of the incident light being along the axis of the tip of the probe. In general, such experiments have used standard silicon cantilevers, and light concentration is achieved simply by the nanometric tip that concentrates the light field at the tip of the probe. It is possible that glass probes, with their very long, slender profiles with gold nanoparticles (Fig. 3g), would be even better than silicon AFM sensors for such an application, but this has not been tested in this mode of operation, which is generally termed apertureless NSOM (ANSOM).

Pioneering studies in this area were performed by Wickramasinghe and co-workers27 and Knoll and Keilmann28, among others. These investigators showed contrast that could be associated with alterations in optical interactions. The problem of coupling of topography with optics in this technique has been particularly severe and has plagued many studies in the field.

In all of these external illumination schemes it is possible to tag alterations caused by the tip-sample interaction by modulating the probe at close to its resonance frequency. This makes it possible to electronically tag the scattered light from the tip-sample interaction. In spite of this, the scattered signal monitored at the resonance frequency is still highly contaminated with spurious signals that are not associated with the tip-sample interactions. The problem can be significantly improved by monitoring the signal from this interaction at a frequency that is the second harmonic of the fundamental frequency at which the probe is modulated28, 29.
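The demodulation scheme described above can be sketched numerically. This is a hedged toy model (the frequencies and amplitudes are invented): a large spurious background rides at the tip's modulation frequency f, while the nonlinear near-field contribution appears at the second harmonic 2f, where lock-in detection isolates it.

```python
import numpy as np

f_mod = 50e3                                       # tip modulation frequency (Hz)
t = np.linspace(0, 1e-3, 100_000, endpoint=False)  # 1 ms trace, integer cycles

# Toy detector trace: large background at f, small near-field term at 2f
background = 5.0 * np.cos(2 * np.pi * f_mod * t)
near_field = 0.2 * np.cos(2 * np.pi * 2 * f_mod * t)
signal = background + near_field

def lock_in(sig, f_ref):
    """Demodulate: project the trace onto a cosine reference at f_ref."""
    ref = np.cos(2 * np.pi * f_ref * t)
    return 2 * np.mean(sig * ref)

print(f"at f:  {lock_in(signal, f_mod):.2f}")      # → at f:  5.00 (background)
print(f"at 2f: {lock_in(signal, 2 * f_mod):.2f}")  # → at 2f: 0.20 (near field)
```

Because the trace spans an integer number of cycles, the two components separate cleanly; in a real measurement the 2f channel still carries some contamination, but far less than the fundamental.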

Probably the most exciting application of this sort of external illumination protocol is the imaging of chemical alterations in a sample by monitoring the scattering of infrared radiation within the region of the electromagnetic spectrum where vibrational modes of surface molecules absorb light in chemically specific ways. Such chemical identification with high spatial resolution is very important for numerous areas of interest in biology. These extend from the chemical identification of molecular entities on biochips to the spatially resolved nanometric imaging of highly compartmentalized cell membranes. Of course, application of this latter methodology to biological imaging is subject to the problem of high absorption of infrared radiation by water.

Superresolution Raman spectral imaging.

Raman spectral imaging is a way around the absorption of infrared light by water in superresolution infrared imaging. It involves shining a laser at a sample and then monitoring the scattered light that emanates from the sample at frequencies or wavelengths different from the excitation. Of course, most of the scattered light occurs at the same frequency; this is called Rayleigh scattering, which explains why the sky is blue (shorter blue wavelengths scatter better). Raman scattering is seven orders of magnitude smaller than same-frequency Rayleigh scattering. The problem is obvious: conventional aperture NSOM, which excites a small signal from a small aperture, has little probability of exciting detectable Raman scattering. Nonetheless, researchers have been successful under extreme conditions (long scan times and strong Raman scatterers) in obtaining near-field Raman scattering30, 31.

Raman enhancement.

Several findings from the past year indicate that near-field Raman scattering could be made a practical reality. The first is that of surface-enhanced Raman scattering (SERS). It was shown nearly 30 years ago that roughened silver surfaces result in enormous enhancements in Raman signals because of the excitation of surface plasmons in metals, such as silver or gold32. More recently, Emory and Nie33 and Feld and coworkers34 have demonstrated that, at certain locations, these surfaces can produce Raman spectra from single molecules. With this observation, interest has focused on trying either to attach a metallic nanoparticle to an AFM sensor or to produce a roughened silver AFM sensor.

Using a unique method for producing a silver or gold nanoparticle at the tip of a force-sensing cantilevered nanopipette35, Sun and Shen36 have reported enhancements of 10^4 in the silicon Raman signal, using a MultiView 1000 and a standard micro-Raman 180° scattering geometry in which the laser beam illuminates the sample through the lens of the far-field microscope and this same lens collects the scattered light. In addition, Hartschuh et al.37 have reported that a silver metal wire roughened with a focused ion beam and attached to a tuning fork for feedback is capable of enhancing the signal of carbon nanotubes by a factor of 10^3, similar to the enhancements observed by Sun and Shen36 for a silicon sample. For their measurements, Hartschuh et al.37 used an inverted microscope that illuminated a transparent sample, with the tip in close proximity; in contrast, Sun and Shen36 used a standard micro-Raman geometry (RM Raman Series, Renishaw) with an upright microscope complexed to the instrumentation described in Box 1. It should be noted that both silicon and carbon nanotubes are very strong Raman scatterers; nonetheless, the results obtained thus far are impressive.

Shadow NSOM—a contrary approach to NSOM

All of the studies in NSOM thus far have considered the problem of how one produces a point of light. No one has considered the problem from the point of view of shadowing a surface of a sample from a far-field light source and obtaining the information on the nano-optical properties of the shadowed region by difference imaging (Fig. 3h). For many applications, using linear optical processes such as one-photon absorption or fluorescence, shadow NSOM is most probably not applicable, either because a very large background resulting from scattering may obscure the shadowed signal in the difference image or because, in the case of fluorescence, very strong bleaching would occur in the exposed regions. However, when the unique features of Raman spectroscopy are combined with the integrated platform described in Box 1 and the special properties of glass probes, shadow NSOM becomes an appealing possibility.

Shadow NSOM requires a clear optical axis for the NSOM platform, a probe tip that is exposed to the optical axis, and independent motion of the sample and the probe tip. For these measurements a MultiView 2000 (Nanonics Imaging Ltd, Jerusalem, Israel) was used. From a micro-Raman point of view, shadow NSOM makes use of the excellent rejection of Rayleigh-scattered light found in Raman spectrometers and of the superior capabilities of charge-coupled detectors (CCDs) with exceptional linearity in response, high dynamic range, high quantum efficiency and essentially zero dark noise.

With such an integration, a shadow NSOM Raman image is obtained by placing on the sample a glass probe coated with any metal that has high opacity but does not show enhancement phenomena (Fig. 3h). This probe tip is brought into contact with a point on the sample with subnanometric AFM control, and a CCD spectrum is stored. Using the independent motion of the probe, it is then retracted without movement of the sample. At this point, another CCD spectrum is recorded and stored for the same pixel, and the difference spectrum is computed to give the spectrum of what was shadowed by the probe tip. In one initial test with this approach, we obtained a 100-nm resolution in the line scan (Fig. 6a). This is still not close to what was discussed above for enhancement phenomena. What should be taken into consideration, however, is that the focal spot of the laser through the lens in this experiment was 3 μm because the objective used had only a 50× magnification. With higher-magnification objectives, focal spots as small as 1 μm or less should be readily achievable with the Renishaw InVia Raman microSpectrometer (Warsash Scientific) that was used. This should increase the resolution of the shadow NSOM image, because the size of the focal spot defines the noise in the measurement, and a reduction in the noise by a factor of at least 4, as would occur with a reduction of the laser spot size by a factor of 2, should bring the resolution of shadow NSOM close to what is possible with SERS enhancement methods. This number is also consistent with estimates based on rules of thumb for the sensitivity of difference Raman spectroscopy, which is estimated to be 1 part in 10^5.
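The contact-minus-retracted subtraction at a single pixel can be sketched with synthetic spectra. Everything below is simulated (the peak positions, noise level and wavenumber axis are invented), but it shows why CCD linearity and dynamic range matter: the shadowed contribution is a small difference between two large, nearly identical spectra.

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(200, 2000, 1800)        # Raman shift axis, cm^-1

def peak(center, width, height):
    return height * np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Simulated CCD readouts at one pixel: a strong far-field signal common to
# both, plus a weak feature present only when the tip is retracted
# (i.e., the part of the focal spot the tip had shadowed)
far_field = peak(520, 15, 1000.0)
shadowed = peak(1340, 20, 20.0)

tip_in_contact = far_field + rng.normal(0, 1, wavenumbers.size)
tip_retracted = far_field + shadowed + rng.normal(0, 1, wavenumbers.size)

# Difference spectrum isolates what the probe tip shadowed at this pixel
difference = tip_retracted - tip_in_contact
print(f"recovered feature near {wavenumbers[difference.argmax()]:.0f} cm^-1")
```

The strong far-field peak cancels in the subtraction, and the weak shadowed feature survives provided it exceeds the combined shot noise of the two readouts.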

Shadowing techniques should work for a variety of problems in which the observed signal is spectrally shifted from the exciting laser beam. This could include photoluminescence in systems that do not experience bleaching (e.g., inorganic materials), or nonlinear optical phenomena (e.g., second-harmonic or third-harmonic generation, which, unlike two- and three-photon fluorescence, do not require populating excited levels). Initial results are presented in Fig. 6b.

In this regard, it should be mentioned that exciting activity in nonlinear near-field optics includes very strong enhancement phenomena observed in second-harmonic generation, as well as aperture-based approaches to such nonlinear near-field imaging (for excellent reviews, see refs. 35, 38). In terms of nonlinear optical enhancement phenomena, credit should be given to Wessel, who clearly pointed out the potential of such an approach39.

Fluorescence correlation spectroscopy

Some defined areas in which near-field optics should play a dramatic role are still to be effectively explored. One such area is fluorescence correlation spectroscopy (FCS), which can be used to monitor the dynamics and concentration of fluorescent species. Near-field optics has the potential for increasing the molecular detection efficiency (MDE) in FCS by orders of magnitude. The MDE is given by the expression:
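A form consistent with the factor-of-10 and factor-of-100 scaling arguments below, assuming (our reading, as the precise expression is not reproduced here) that the MDE varies inversely with both the lateral beam waist and the axial extent of the illuminated volume, would be

```latex
\mathrm{MDE} \propto \frac{1}{w_0 \, z_0} \qquad (1)
```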

where the beam size, w₀, and the z penetration, z₀, are much smaller in near-field optics than is possible in confocal imaging, which is presently used for higher sensitivity. Near-field optics can achieve a beam size of 0.05 μm, as compared with 0.5 μm in confocal imaging, a factor of 10 improvement in the MDE.

A further increase in the MDE, of a factor of over 100, is achieved with near-field optical illumination because the radiation exits the probe tip with a large divergence. This limits the effective z extent in near-field optics by a factor of at least 100 compared with confocal or multiphoton imaging (a z₀ of 0.7 μm can be achieved in fluorescence mode for a 50-μm detection aperture and a high-numerical-aperture objective). Thus, from equation (1), an MDE at least 1,000-fold higher than what is presently obtained can be achieved. As a result, the FCS signal increases by a factor of 1,000 and, in addition, the noise is diminished by the marked reduction in the illuminated volume. This latter point has been shown in initial experiments by Levene et al.40. In addition, on-line AFM control of the illumination reduces any noise that is due to mechanical motion.
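These factors can be checked with simple arithmetic. The sketch below (a back-of-envelope calculation, assuming the MDE varies inversely with both w₀ and z₀, and taking the near-field z extent as ~100-fold below the confocal z₀ of 0.7 μm) reproduces the ~1,000-fold estimate:

```python
def mde_gain(w0_far, z0_far, w0_near, z0_near):
    """Relative MDE improvement, assuming MDE ~ 1/(w0 * z0)."""
    return (w0_far / w0_near) * (z0_far / z0_near)

# Values from the text, in micrometres; the near-field z extent is taken
# as 100x smaller than the confocal z0 (an assumption of this sketch).
gain = mde_gain(w0_far=0.5, z0_far=0.7, w0_near=0.05, z0_near=0.007)
print(round(gain))  # 1000
```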

Furthermore, there is a great advantage to having the illumination, rather than the detection, limited by an aperture. By limiting the illumination, one reduces not only w₀ but also, without the need for a detection aperture, z₀. This reduction in z also reduces background noise, thereby increasing detection efficiency while reducing out-of-focus bleaching. Finally, the outer diameters of cantilevered optical fibers are similar to those of glass intracellular electrodes. Thus, these probes can be inserted into cells with the exquisite control of AFM for FCS measurements at any point within the cell. This control is achieved by highly regulated application of atomic force, in which the force per unit area (F/A) can be accurately monitored so that the probe penetrates a region of choice with great precision and without damage.

Conclusions—a crucial bridge in integrated microscopy

Near-field optics has produced the highest resolution ever achieved in optical imaging, across a wide variety of imaging modalities: from fluorescence imaging, to reflection and collection imaging, to Raman imaging and even nonlinear imaging. There is no debate about this. Achieving such high resolution was akin to climbing Mount Everest.

Unheard of two decades ago, near-field optics is today not only a recognized field, but also (and this is what is so exciting) an area in which an exponentially growing body of ideas is coalescing on how to obtain nanometer-scale optical information using the concepts of the optical near-field in its broadest sense. Realizing these ideas, as has been documented in this article, requires SPM platforms that are totally integrated into upright or inverted optical microscopes (preferably both). We have found interesting solutions to such integration and are sure that other solutions will be found. These platforms will open up the world of integrated microscopy, which is critical for the future of optical imaging. The important issue is not which technique is more appropriate—whether NSOM, AFM, two-photon microscopy, second-harmonic generation, 4Pi or FCS—but the reality that there is great synergism among imaging modalities.

A concept not touched on in any detail in this paper, but which we have addressed in other investigations41, is the classic problem of optical imaging: optical imaging belongs to a general class of physics problems called inverse problems. Such problems have considerable import in a variety of areas, even purely biological ones such as neural network analysis, and they suffer from a lack of the information required to obtain a complete solution. In optical imaging, this required information is not provided by even the most advanced emulations of far-field optics. On the other hand, in an inverse problem, if something is independently known about the object, then the imaging modality being applied can provide much more information about the object than it could without this additional knowledge.

For example, consider the problem of optical deconvolution, which transcends the mode of imaging employed: it is applicable to confocal, CCD, 4Pi or multiphoton microscopy, and it requires accurate information about where an object ends. Because such accurate information is lacking, certain deconvolution algorithms rely on what is called blind deconvolution. In contrast, an on-line NSOM/AFM platform that does not perturb the imaging technique being used readily provides essential nanometric information on the optical borders of the object. With this on-line information, a closed-loop form of deconvolution could be implemented in which the information provided yields a calculated object that is tested on-line with the NSOM/AFM system, and further iterations are undertaken to improve the results of the calculation. Such an integration of imaging with computational methodologies should have an enormous impact on all forms of optical imaging, and near-field optics will be a bridge in this future evolution of the field.
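As a minimal sketch of the closed-loop idea (a toy illustration, not an algorithm from this work), the following applies Richardson–Lucy deconvolution with the object's support fixed by independently measured borders, standing in for the nanometric border information an on-line NSOM/AFM scan would provide:

```python
import numpy as np

def richardson_lucy_support(image, psf, support, n_iter=50):
    """Richardson-Lucy deconvolution constrained by a known object
    support mask (1 inside the measured borders, 0 outside)."""
    psf_flip = psf[::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_flip, mode="same")
        estimate *= support  # enforce the independently measured borders
    return estimate

# 1-D toy: two small, closely spaced objects blurred by a broad PSF.
truth = np.zeros(200)
truth[80:83] = 1.0
truth[95:98] = 1.0
psf = np.exp(-0.5 * (np.arange(-30, 31) / 10.0) ** 2)
psf /= psf.sum()
image = np.convolve(truth, psf, mode="same")
support = (truth > 0).astype(float)  # borders "known" from the AFM scan
recovered = richardson_lucy_support(image, psf, support)
```

With the support constraint, intensity is forced back inside the measured borders, so the two objects, merged in the blurred image, come out as separate peaks; without that independent knowledge the same iteration must guess where the object ends, which is exactly the weakness of blind deconvolution.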

In summary, near-field optics, with its associated advances in instrumentation, is at the cusp of a rapid advance, much like AFM in the early 1990s. AFM and the techniques it has spawned have played a crucial role in nanotechnology; near-field optical imaging and the technologies it is spawning (e.g., nanopipette-based nano-fountain-pen nanoprotein printing42) are likely to be equally crucial in providing information about molecules in cells, their interactions in space and time, and their involvement in the fundamental processes of biology.

Published online 31 October 2003.


  1. Lewis, A., Isaacson, M., Harootunian, A. & Muray, A. Development of a 500-Å spatial-resolution light microscope. Biophys. J. 41, 405–406 (1983).
  2. Lewis, A., Isaacson, M., Harootunian, A. & Muray, A. Development of a 500-Å spatial-resolution light microscope. I. Light is efficiently transmitted through λ/16 diameter apertures. Ultramicroscopy 13, 227–231 (1984).
  3. Pohl, D.W., Denk, W. & Lanz, M. Optical stethoscopy: image recording with a resolution of λ/20. Appl. Phys. Lett. 44, 651–653 (1984).
  4. Bouevich, O., Lewis, A., Pinnevsky, I. & Loew, L. Probing membrane potential with non-linear optics. Biophys. J. 65, 672–682 (1993).
  5. Lewis, A. et al. Second harmonic generation of biological interfaces: probing membrane proteins and imaging membrane potential around GFP molecules at specific sites in neuronal cells of C. elegans. Chem. Phys. 245, 133–144 (1999).
  6. Harootunian, A., Betzig, E., Isaacson, M.S. & Lewis, A. Superresolution fluorescence near-field scanning optical microscopy (NSOM). Appl. Phys. Lett. 49, 674–676 (1986).
  7. Betzig, E., Trautman, J.K., Harris, T.D., Weiner, J.S. & Kostelak, R.L. Breaking the diffraction barrier—optical microscopy on a nanometric scale. Science 251, 1468–1470 (1991).
  8. Karrai, K. & Grober, R.D. Piezoelectric tip-sample distance control for near-field optical microscopes. Appl. Phys. Lett. 66, 1842–1844 (1995).
  9. Giessibl, F.J. Advances in atomic force microscopy. Rev. Mod. Phys. 75, 949–983 (2003).
  10. Shalom, S., Lieberman, K., Lewis, A. & Cohen, S.R. A micropipette force probe suitable for near-field scanning optical microscopy. Rev. Sci. Instr. 63, 4061–4065 (1992).
  11. Muramatsu, H. et al. Development of near-field optic atomic-force microscope for biological materials in aqueous solutions. Ultramicroscopy 61, 266–269 (1995).
  12. Lewis, A. et al. New design and imaging concepts in NSOM. Ultramicroscopy 61, 215–221 (1995).
  13. Dunn, R.C. Near-field scanning optical microscopy. Chem. Rev. 99, 2891–2927 (1999).
  14. Sader, J.E. Susceptibility of atomic force microscope cantilevers to lateral forces. Rev. Sci. Instr. 74, 2438–2443 (2003).
  15. Lewis, A. et al. Failure analysis of integrated circuits beyond the diffraction limit: contact mode near-field scanning optical microscopy with integrated resistance, capacitance, and UV confocal imaging. Proc. Inst. Electric. Electron. Eng. 88, 1471–1481 (2000).
  16. Benami, U. et al. Near-infrared contact mode collection near-field optical and normal force microscopy of modulated multiple quantum well lasers. Appl. Phys. Lett. 68, 2337–2339 (1996).
  17. Toda, Y., Kourogi, M., Ohtsu, M., Nagamune, Y. & Arakawa, Y. Spatially and spectrally resolved imaging of GaAs quantum-dot structures using near-field optical technique. Appl. Phys. Lett. 69, 827–829 (1996).
  18. Stuckle, R.M. et al. High quality near-field optical probes by tube etching. Appl. Phys. Lett. 75, 160–162 (1999).
  19. Zhou, H., Midha, A., Mills, G., Donaldson, L. & Weaver, J.M.R. Scanning near-field optical spectroscopy and imaging using nanofabricated probes. Appl. Phys. Lett. 75, 1824–1826 (1999).
  20. Oesterschulze, E., Rudow, O., Mihalcea, C., Scholz, W. & Werner, S. Cantilever probes for SNO