Metamaterials Will Change Optics
Duke University engineers believe that continued advances in creating ever-more exotic and sophisticated human-made materials will greatly improve their ability to control light at will.
The burgeoning use of metamaterials in optics does not rely on the limited set of materials found in nature. Metamaterials are not so much single substances as entire human-made structures, engineered to exhibit properties not readily found in nature and designed to control light's many properties.
In their latest series of experiments, the Duke team demonstrated that a metamaterial construct they developed could create holograms -- like the images seen on credit or bank cards -- in the infrared range of light, something that had not been done before.
The Duke engineers point out that while this advance was achieved in a specific wavelength of light, the principles used to design and create the metamaterial in their experiments should apply in controlling light in most frequencies.
"In the past, our ability to create optical devices has been limited by the properties of natural materials," said StéphaneLarouche, research scientist in electrical and computer engineering at Duke's Pratt School of Engineering. "Now, with the advent of metamaterials, we can almost do whatever we want to do with light.
"In addition to holograms, the approach we developed easily extends to a broad range of optical devices," Larouche said. "If realized, full three-dimensional capabilities open the door to new devices combining a wide range of properties. Our experiments provide a glimpse of the opportunities available for advanced optical devices based on metamaterials that can support quite complex material properties."
The results of Larouche's experiments, which were conducted in the laboratory of senior researcher David R. Smith, a professor of electrical and computer engineering, appeared in an advanced online publication of the journal Nature Materials. The research was supported by the Army Research Office's Multidisciplinary University Research Initiative (MURI).
The metamaterial device fashioned by the Duke team doesn't look anything like a lens, though its ability to control the direction of rays passing through it surpasses that of a conventional lens. While traditional lenses are made of clear substances -- like glass or plastic -- with highly polished surfaces, the new device looks more like a miniature set of tan Venetian blinds.
These metamaterials are constructed on thin slabs of the same material used to make computer chips. Metal elements are etched upon these slabs to form a lattice-like pattern. The metal elements can be arranged in limitless ways, depending on the properties desired.
"There is unquestionable potential for far more advanced and functional optical devices if greater control can be obtained over the underlying materials," Larouche said. "The ability to design and fabricate the components of these metamaterial constructs has reached the point where we can now build even more sophisticated designs.
"We believe that just about any optical device can be made more efficient and effective using these new approaches," he said.
Text 3. NASA Sub-Scale Solid-Rocket Motor Tests Material for Space Launch System
A sub-scale solid rocket motor designed to mimic the booster design of NASA's Space Launch System, or SLS, was successfully tested today by engineers at NASA's Marshall Space Flight Center in Huntsville, Ala. The 20-second firing tested new insulation materials on the 24-inch-diameter, 109-inch-long motor. The motor is a scaled-down, low-cost replica of the solid rocket motors that will boost SLS off the launch pad.
Marshall is leading the design and development of the SLS on behalf of the agency. The new heavy-lift launch vehicle will expand human presence beyond low-Earth orbit and enable new missions of exploration across the solar system.
The test will help engineers develop and evaluate analytical models and skills to assess future full-scale SLS solid rocket motor tests. The next full-scale test, Qualification Motor-1 (QM-1), is targeted for spring 2013. Two five-segment solid rocket motors, the world's largest at 154 feet long and 12 feet in diameter, will be used in the first two 70-metric-ton capability flights of SLS.
Previous ground tests of the motors included carbon insulation to protect the rocket's nozzle from the harsh environment and 5000-degree temperatures to which it is exposed. QM-1 will include a new insulation material, provided by a new vendor, to line the motor's nozzle.
"Test firing small motors at Marshall provides a quick, affordable and effective way to evaluate the new nozzle liner's performance," said Scott Ringel, an engineer at Marshall and the design lead for this test. "We have sophisticated analytic and computer modeling tools that tell us whether the new nozzle insulation will perform well, but nothing gives us better confidence than a hot-fire test."
The test also includes several secondary objectives. The team introduced an intentional defect in the propellant with a tool designed to create a specific flaw size. By measuring the temperature inside the motor at the flaw location, the team hopes to gain a better understanding for the propellant's margin for error. Test data also will help the team better understand acoustics and vibrations resulting from the rocket motor's plume.
In addition, NASA's Engineering and Safety Center will use test data to measure a solid rocket motor's plume and how it reacts to certain materials.
Engineers from Marshall's Engineering Directorate designed the test motor with support from ATK Aerospace Systems of Huntsville, Ala. ATK of Brigham City, Utah, the prime contractor for the SLS booster, is responsible for designing and testing the SLS five-segment solid rocket motor.
Text 4. Photography
The optics of photography involves both lenses and the medium in which the electromagnetic radiation is recorded, whether it be a plate, film, or charge-coupled device. Photographers must consider the reciprocity of the camera and the shot, which is summarized by the relation

Exposure ∝ Aperture Area × Exposure Time × Scene Luminance
In other words, the smaller the aperture (giving greater depth of focus), the less light coming in, so the length of time has to be increased (leading to possible blurriness if motion occurs). An example of the use of the law of reciprocity is the Sunny 16 rule which gives a rough estimate for the settings needed to estimate the proper exposure in daylight.[87]
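The reciprocity trade-off and the Sunny 16 rule can be sketched in a few lines of Python. This is a rough illustration only; the function names are our own, and the Sunny 16 rule itself is just an approximation for frontlit subjects in full daylight.

```python
# Sunny 16 rule: in bright sun at f/16, a correct exposure uses a shutter
# time of roughly 1/ISO seconds. Reciprocity then lets us trade aperture
# area against exposure time while keeping total exposure constant.

def sunny16_shutter(iso):
    """Approximate shutter time (s) at f/16 in full sun for a given ISO."""
    return 1.0 / iso

def equivalent_shutter(base_shutter, base_fnum, new_fnum):
    """Keep exposure constant: time scales with the square of the f-number,
    because aperture area goes as 1/f-number squared."""
    return base_shutter * (new_fnum / base_fnum) ** 2

t16 = sunny16_shutter(100)            # ~1/100 s at f/16, ISO 100
t8 = equivalent_shutter(t16, 16, 8)   # f/8 admits 4x the light -> 1/4 the time
```

Opening up from f/16 to f/8 quadruples the aperture area, so the shutter time drops by the same factor of four, exactly the trade described above.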
A camera's aperture is measured by a unitless number called the f-number or f-stop, f/#, often notated as N, and given by

f/# = N = f/D

where f is the focal length and D is the diameter of the entrance pupil. By convention, "f/#" is treated as a single symbol, and specific values of f/# are written by replacing the number sign with the value. The two ways to increase the f-stop are to either decrease the diameter of the entrance pupil or change to a longer focal length (in the case of a zoom lens, this can be done by simply adjusting the lens). Higher f-numbers also have a larger depth of field due to the lens approaching the limit of a pinhole camera, which is able to focus all images perfectly, regardless of distance, but requires very long exposure times.
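The definition is simple enough to state directly in code; this minimal sketch (function name is ours) shows how stopping down or zooming in raises the f-number:

```python
def f_number(focal_length_mm, pupil_diameter_mm):
    """f/# = focal length / entrance-pupil diameter."""
    return focal_length_mm / pupil_diameter_mm

# A 50 mm lens with a 25 mm entrance pupil is an f/2 lens.
n = f_number(50, 25)        # 2.0
# Halving the pupil diameter doubles the f-number (f/4)...
n_stopped = f_number(50, 12.5)
# ...and so does doubling the focal length at fixed pupil size.
n_zoomed = f_number(100, 25)
```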
The field of view that the lens will provide changes with the focal length of the lens. There are three basic classifications based on the relationship to the diagonal size of the film or sensor size of the camera to the focal length of the lens:
Normal lens: angle of view of about 50° (called normal because this angle is considered roughly equivalent to human vision) and a focal length approximately equal to the diagonal of the film or sensor.
Wide-angle lens: angle of view wider than 60° and focal length shorter than a normal lens.
Long-focus lens: angle of view narrower than a normal lens. This is any lens with a focal length longer than the diagonal measure of the film or sensor. The most common type of long-focus lens is the telephoto lens, a design that uses a special telephoto group to be physically shorter than its focal length.
Modern zoom lenses may have some or all of these attributes.
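The classification above can be sketched as a small Python function. The exact cutoff ratios are our own illustrative choices; in practice the boundaries between categories are fuzzy.

```python
import math

def classify_lens(focal_length_mm, sensor_w_mm, sensor_h_mm):
    """Rough lens classification by focal length vs. sensor diagonal.
    The 0.8x / 1.25x cutoffs are illustrative, not standard values."""
    diagonal = math.hypot(sensor_w_mm, sensor_h_mm)
    if focal_length_mm < 0.8 * diagonal:
        return "wide-angle"
    if focal_length_mm > 1.25 * diagonal:
        return "long-focus"
    return "normal"

# Full-frame (36 x 24 mm) sensor: diagonal is about 43 mm.
classify_lens(50, 36, 24)    # "normal"
classify_lens(24, 36, 24)    # "wide-angle"
classify_lens(200, 36, 24)   # "long-focus"
```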
The absolute value for the exposure time required depends on how sensitive to light the medium being used is (measured by the film speed or, for digital media, by the quantum efficiency). Early photography used media that had very low light sensitivity, and so exposure times had to be long even for very bright shots. As technology has improved, so has the light sensitivity of film and digital cameras.
Other results from physical and geometrical optics apply to camera optics. For example, the maximum resolution capability of a particular camera set-up is determined by the diffraction limit associated with the pupil size and given, roughly, by the Rayleigh criterion.
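The Rayleigh criterion mentioned above gives the smallest angular separation a circular aperture can resolve, roughly 1.22 times the wavelength divided by the aperture diameter. A minimal sketch (function name is ours):

```python
def rayleigh_limit_rad(wavelength_m, aperture_diameter_m):
    """Minimum resolvable angle (radians) for a circular aperture,
    per the Rayleigh criterion: theta ~ 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_diameter_m

# Green light (550 nm) through a 5 mm aperture:
theta = rayleigh_limit_rad(550e-9, 5e-3)   # ~1.3e-4 rad
```

A larger aperture or a shorter wavelength lowers the diffraction limit, which is why telescopes favor big mirrors.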
Text 5. Atmospheric optics
The unique optical properties of the atmosphere cause a wide range of spectacular optical phenomena. The blue color of the sky is a direct result of Rayleigh scattering which redirects higher frequency (blue) sunlight back into the field of view of the observer. Because blue light is scattered more easily than red light, the sun takes on a reddish hue when it is observed through a thick atmosphere, as during a sunrise or sunset. Additional particulate matter in the sky can scatter different colors at different angles creating colorful glowing skies at dusk and dawn. Scattering off of ice crystals and other particles in the atmosphere are responsible for halos, afterglows, coronas, rays of sunlight, and sun dogs. The variation in these kinds of phenomena is due to different particle sizes and geometries.
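The wavelength dependence behind the blue sky is quantitative: Rayleigh scattering intensity scales as the inverse fourth power of wavelength. A short sketch (function name is ours) shows how strongly this favors blue light:

```python
# Rayleigh scattering intensity scales as 1/wavelength^4, which is why
# shorter (bluer) wavelengths dominate the scattered skylight.

def scattering_ratio(lambda_short_nm, lambda_long_nm):
    """Relative Rayleigh scattering strength of the shorter wavelength
    compared to the longer one."""
    return (lambda_long_nm / lambda_short_nm) ** 4

# Blue (~450 nm) vs. red (~650 nm):
ratio = scattering_ratio(450, 650)   # ~4.4x stronger scattering for blue
```

That factor of roughly four also explains the reddened sun at sunrise and sunset: along a long atmospheric path, most of the blue component has been scattered out of the direct beam.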
Mirages are optical phenomena in which light rays are bent due to thermal variations in the refraction index of air, producing displaced or heavily distorted images of distant objects. Other dramatic optical phenomena associated with this include the Novaya Zemlya effect, where the sun appears to rise earlier than predicted with a distorted shape. A spectacular form of refraction occurs with a temperature inversion called the Fata Morgana, where objects on the horizon or even beyond the horizon, such as islands, cliffs, ships or icebergs, appear elongated and elevated, like "fairy tale castles."
Rainbows are the result of a combination of internal reflection and dispersive refraction of light in raindrops. A single reflection off the backs of an array of raindrops produces a rainbow with an angular size on the sky that ranges from 40° to 42° with red on the outside. Double rainbows are produced by two internal reflections with angular size of 50.5° to 54° with violet on the outside. Because rainbows are seen with the sun 180° away from the center of the rainbow, rainbows are more prominent the closer the sun is to the horizon.
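The 40° to 42° figure quoted above follows from the standard minimum-deviation geometry for a single internal reflection in a spherical drop (a textbook calculation, not taken from this article). A brute-force Python sketch:

```python
import math

def primary_rainbow_angle(n):
    """Primary rainbow radius (degrees) from the minimum-deviation angle
    for one internal reflection in a spherical drop of refractive index n.
    Deviation for incidence angle i and refraction angle r is
    D = 2i - 4r + 180 degrees; the rainbow sits at the minimum of D."""
    best_dev = 360.0
    for k in range(1, 9000):
        i = math.radians(k / 100.0)        # scan incidence in 0.01 deg steps
        r = math.asin(math.sin(i) / n)     # Snell's law at the drop surface
        dev = math.degrees(2 * i - 4 * r) + 180.0
        best_dev = min(best_dev, dev)
    return 180.0 - best_dev                # angle from the antisolar point

primary_rainbow_angle(1.331)   # red light: about 42 degrees
primary_rainbow_angle(1.344)   # violet light: about 40 degrees
```

Because violet light has the slightly higher refractive index in water, its rainbow radius is smaller, which puts red on the outside of the primary bow, exactly as described above.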
Text 6. Brown Liquor and Solar Cells to Provide Sustainable Electricity
A breakthrough for inexpensive electricity from solar cells, and a massive investment in wind power, will mean a need to store energy in an intelligent way. According to research at Linköping University, published in Science, batteries made from biological waste products from pulp mills could provide the solution.
Organic solar cells based on conductive plastics are a low-cost alternative that has achieved high enough performance to be scaled up and, in turn, become competitive. However, it must be possible to store solar electricity from day to night, just as electricity from wind turbines must be stored from windy days to calm ones.
In conventional batteries, metal oxides conduct the charge. These materials, such as cobalt, are expensive and in limited supply, so low-cost solutions are sought, preferably based on renewable materials.
"Nature solved the problem long ago," says OlleInganäs, professor of biomolecular and organic electronics at Linköping University (LiU) and lead author of the article in a recent edition of Science.
He drew inspiration from photosynthesis, in which electrons charged by solar energy are transported by quinones, electrochemically active molecules based on benzene rings of six carbon atoms. As his raw material, Inganäs chose brown liquor, a by-product of paper-pulp manufacturing. Brown liquor is largely composed of lignin, a biological polymer in plant cell walls.
To utilise the quinones as charge carriers in batteries, Inganäs and his Polish colleague Grzegorz Milczarek devised a thin film from a mixture of pyrrole and lignin derivatives from the brown liquor. The film, 0.5 microns in thickness, is used as a cathode in the battery.
The goal is to offer ways to store renewable electricity where it is produced, without building large grids. In several countries, major wind power investments are planned. Meanwhile, the performance of cheap organic solar cells has now reached a critical level: a research team at the University of California, Los Angeles, recently reported converting more than 10 percent of the energy of the captured sunlight.
According to Inganäs, who has conducted research on organic solar cells for many years, this efficiency is sufficient to initiate an industrial scale-up of the technology.
"Now we need more research into new energy storage based on cheap and renewable raw materials. Lignin constitutes 20-30 percent of the biomass of a tree, so it's a source that never ends."
Text 7. Hard Electronics: Hall Effect Magnetic Field Sensors for High Temperatures and Harmful Radiation Environments
Researchers at Toyohashi University of Technology have invented Hall effect magnetic field sensors that are operable at high temperatures and harmful radiation conditions. The sensors will find applications in space craft and nuclear power stations.
Toyohashi Tech researchers have fabricated Hall effect magnetic field sensors, based on gallium nitride heterostructures with a two-dimensional electron gas, that are operable at temperatures of at least 400 °C and in extreme radiation conditions.
Silicon and III-V compound semiconductor Hall effect magnetic field sensors are widely used in the electronics industry for monitoring rotation in equipment such as optical memory disks and for banknote authentication in vending machines. However, the use of Hall sensors for monitoring magnetic fields in outer space and nuclear power stations is more challenging because of the large fluctuations in temperature and harmful radiation in these environments.
To resolve these issues, the Toyohashi Tech researchers used AlGaN/GaN two-dimensional electron gas heterostructures to fabricate high sensitivity micro-Hall effect magnetic field sensors that are stable at high temperatures and high fluxes of proton irradiation.
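For context, the basic physics of such a sensor is the Hall relation for a two-dimensional electron gas: V_H = I·B / (n_s·e), where n_s is the sheet electron density. The sketch below is generic textbook physics, not the Toyohashi team's design; the drive current, field, and sheet density are assumed values for illustration only.

```python
# Generic Hall-effect estimate for a 2D electron gas (2DEG) sensor.
# V_H = I * B / (n_s * e), with n_s the sheet carrier density.
E_CHARGE = 1.602e-19  # elementary charge, coulombs

def hall_voltage(current_a, field_t, sheet_density_per_m2):
    """Hall voltage (V) for a 2DEG carrying current_a in field_t."""
    return current_a * field_t / (sheet_density_per_m2 * E_CHARGE)

# 100 uA drive, 0.1 T field, and a sheet density of 1e17 m^-2
# (a typical order of magnitude for AlGaN/GaN 2DEGs, assumed here):
v = hall_voltage(100e-6, 0.1, 1e17)   # ~0.6 mV
```

The inverse dependence on sheet density is why a stable 2DEG matters: if radiation or heat changed n_s significantly, the sensor's calibration would drift.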
Notably, the AlGaN/GaN micro-Hall sensors were stable up to at least 400 °C, whereas sensors fabricated using GaAs and InSb degraded from about 120 °C.
Furthermore, the electron mobility and two-dimensional electron density of the AlGaN/GaN micro-Hall sensors were only slightly affected by a 1×10¹³ cm⁻² proton dose at 380 keV.
The researchers are actively seeking industrial partners to exploit the robust properties of the AlGaN/GaN 2DEG Hall sensors for operation at high temperatures and in harsh radiation environments.
One potential application is the imaging of ferromagnetic domains at the surface of permanent magnets. Adarsh Sandhu has demonstrated the imaging of magnetic domains in ferromagnetic materials with an AlGaN/GaN micro-Hall sensor in a high-temperature scanning Hall probe microscope (SHPM).
Text 8. Nanopower: Avoiding Electrolyte Failure in Nanoscale Lithium Batteries
It turns out you can be too thin -- especially if you're a nanoscale battery. Researchers from the National Institute of Standards and Technology (NIST), the University of Maryland, College Park, and Sandia National Laboratories built a series of nanowire batteries to demonstrate that the thickness of the electrolyte layer can dramatically affect the performance of the battery, effectively setting a lower limit to the size of the tiny power sources. The results are important because battery size and performance are key to the development of autonomous MEMS -- microelectromechanical machines -- which have potentially revolutionary applications in a wide range of fields.
MEMS devices, which can be as small as tens of micrometers (that is, roughly a tenth the width of a human hair), have been proposed for many applications in medicine and industrial monitoring, but they generally need a small, long-lived, fast-charging battery for a power source. Present battery technology makes it impossible to build these machines much smaller than a millimeter -- most of which is the battery itself -- which makes the devices terribly inefficient.
NIST researcher Alec Talin and his colleagues created a veritable forest of tiny -- about 7 micrometers tall and 800 nanometers wide -- solid-state lithium ion batteries to see just how small they could be made with existing materials and to test their performance.
Starting with silicon nanowires, the researchers deposited layers of metal (for a contact), cathode material, electrolyte, and anode materials with various thicknesses to form the miniature batteries. They used a transmission electron microscope (TEM) to observe the flow of current throughout the batteries and watch the materials inside them change as they charged and discharged.
The team found that when the thickness of the electrolyte film falls below a threshold of about 200 nanometers, the electrons can jump the electrolyte border instead of flowing through the wire to the device and on to the cathode. Electrons taking the short way through the electrolyte -- a short circuit -- cause the electrolyte to break down and the battery to quickly discharge.
"What isn't clear is exactly why the electrolyte breaks down," says Talin. "But what is clear is that we need to develop a new electrolyte if we are going to construct smaller batteries. The predominant material, LiPON, just won't work at the thicknesses necessary to make practical high-energy-density rechargeable batteries for autonomous MEMS."
Text 9. Better Organic Electronics: Researchers Show the Way Forward for Improving Organic and Molecular Electronic Devices
Future prospects for superior new organic electronic devices are brighter now thanks to a new study by researchers with the U.S. Department of Energy (DOE)'s Lawrence Berkeley National Laboratory (Berkeley Lab). Working at the Lab's Molecular Foundry, a DOE nanoscience center, the team has provided the first experimental determination of the pathways by which electrical charge is transported from molecule-to-molecule in an organic thin film. Their results also show how such organic films can be chemically modified to improve conductance.
"We have shown that when the molecules in organic thin films are aligned in particular directions, there is much better conductance," says MiquelSalmeron, a leading authority on nanoscale surface imaging who directs Berkeley Lab's Materials Sciences Division and who led this study. "Chemists already know how to fabricate organic thin films in a way that can achieve such an alignment, which means they should be able to use the information provided by our methodology to determine the molecular alignment and its role on charge transport across and along the molecules. This will help improve the performances of future organic electronic devices."
Salmeron and Shaul Aloni, also of the Materials Sciences Division, are the corresponding authors of a paper in the journal Nano Letters that describes this work. The paper is titled "Electron Microscopy Reveals Structure and Morphology of One Molecule Thin Organic Films." Other co-authors were Virginia Altoe, Florent Martin and Allard Katan.
Organic electronics, also known as plastic or polymer electronics, are devices that utilize carbon-based molecules as conductors rather than metals or semiconductors. They are prized for their low costs, light weight and rubbery flexibility. Organic electronics are also expected to play a big role in molecular computing, but to date their use has been hampered by low electrical conductance in comparison to metals and semiconductors.
"Chemists and engineers have been using their intuition and trial-and-error testing to make progress in the field but at some point you hit a wall unless you understand what is going on at the molecular level, for example, how electrons or holes flow through or across molecules, how the charge transport depends on the structure of the organic layers and the orientation of the molecules, and how the charge transport responds to mechanical forces and chemical inputs," Salmeron says. "With our experimental results, we have shown that we can now provide answers for these questions."
In this study, Salmeron and his colleagues used electron diffraction patterns to map the crystal structures of molecular films made from monolayers of short versions of commonly used polymers containing long chains of thiophene units. They focused specifically on pentathiophene butyric acid (5TBA) and two of its derivatives (D5TBA and DH5TBA) that were induced to self-assemble on various electron-transparent substrates. Pentathiophenes -- molecules containing five rings, each made of four carbon atoms and one sulfur atom -- are members of a well-studied and promising family of organic semiconductors.
Obtaining structural crystallographic maps of monolayer organic films using electron beams posed a major challenge, as Aloni explains.
"These organic molecules are extremely sensitive to high energy electrons," he says. "When you shoot a beam of high energy electrons through the film it immediately affects the molecules. Within few seconds we no longer see the signature intermolecular alignment of the diffraction pattern. Despite this, when applied correctly, electron microscopy becomes essential tool that can provide unique information on organic samples."
Salmeron, Aloni and their colleagues overcame the challenge through the combination of a unique strategy they developed and a transmission electron microscope (TEM) at the Molecular Foundry's Imaging and Manipulation of Nanostructures Facility. Electron diffraction patterns were collected as a parallel electron beam was scanned over the film, then analyzed by computer to generate structural crystallographic maps.
"These maps contain uncompromised information of the size, symmetry and orientation of the unit cell, the orientation and structure of the domains, the degree of crystallinity, and any variations on the micrometer scale," says first author Altoe. "Such data are crucial to understanding the structure and electrical transport properties of the organic films, and allow us to track small changes driven by chemical modifications of the support films."
In their paper, the authors acknowledge that to gain structural information they had to sacrifice some resolution.
"The achievable resolution of the structural map is a compromise between sample radiation hardness, detector sensitivity and noise, and data acquisition rate," Salmeron says. "To keep the dose of high energy electrons at a level the monolayer film could support and still be able to collect valuable information about its structure, we had to spread the beam to a 90 nanometer diameter. However a fast and direct control of the beam position combined with the use of fast and ultrasensitive detectors should allow for the use of smaller beams with a higher electron flux, resulting in a better than 10 nanometer resolution."
While the combination of organic molecular films and substrates in this study conduct electrical current via electron holes (positively-charged energy spaces), Salmeron and his colleagues say their structural mapping can also be applied to materials whose conductance is electron-based.
"We expect our methodology to have widespread applications in materials research," Salmeron says.
Aloni and Altoe say this methodology is now available at the Imaging and Manipulation of Nanostructures Facility for users of the Molecular Foundry.
Text 10. New High Definition Fiber Tracking Reveals Damage Caused by Traumatic Brain Injury
A powerful new imaging technique called High Definition Fiber Tracking (HDFT) will allow doctors to clearly see for the first time neural connections broken by traumatic brain injury (TBI) and other neurological disorders, much like X-rays show a fractured bone, according to researchers from the University of Pittsburgh in a report published online in the Journal of Neurosurgery.
In the report, the researchers describe the case of a 32-year-old man who wasn't wearing a helmet when his all-terrain vehicle crashed. Initially, his CT scans showed bleeding and swelling on the right side of the brain, which controls left-sided body movement. A week later, while the man was still in a coma, a conventional MRI scan showed brain bruising and swelling in the same area. When he awoke three weeks later, the man couldn't move his left leg, arm and hand.
"There are about 1.7 million cases of TBI in the country each year, and all too often conventional scans show no injury or show improvement over time even though the patient continues to struggle," said co-senior author and UPMC neurosurgeon David O. Okonkwo, M.D., Ph.D., associate professor, Department of Neurological Surgery, Pitt School of Medicine. "Until now, we have had no objective way of identifying how the injury damaged the patient's brain tissue, predicting how the patient would fare, or planning rehabilitation to maximize the recovery."
HDFT might be able to provide those answers, said co-senior author Walter Schneider, Ph.D., professor of psychology at Pitt's Learning Research and Development Center (LRDC), who led the team that developed the technology. Data from sophisticated MRI scanners is processed through computer algorithms to reveal the wiring of the brain in vivid detail and to pinpoint breaks in the cables, called fiber tracts. Each tract contains millions of neuronal connections.
"In our experiments, HDFT has been able to identify disruptions in neural pathways with a clarity that no other method can see," Dr. Schneider said. "With it, we can virtually dissect 40 major fiber tracts in the brain to find damaged areas and quantify the proportion of fibers lost relative to the uninjured side of the brain or to the brains of healthy individuals. Now, we can clearly see breaks and identify which parts of the brain have lost connections."
HDFT scans of the study patient's brain were performed four and 10 months after he was injured; he also had a scan performed with current state-of-the-art diffusion tensor imaging (DTI), an imaging modality that collects data points from 51 directions, whereas HDFT is based on data from 257 directions. For the HDFT scans, the injury site was compared to the healthy side of his brain, as well as to HDFT brain scans from six healthy individuals.
Only the HDFT scan identified a lesion in a motor fiber pathway of the brain that correlated with the patient's symptoms of left-sided weakness, including mostly intact fibers in the region controlling his left leg and extensive breaks in the region controlling his left hand. The patient eventually recovered movement in his left leg and arm by six months after the accident, but still could not use his wrist and fingers effectively 10 months later.
Memory loss, language problems, personality changes and other brain changes occur with TBI, which the researchers are exploring with HDFT in other research protocols.
UPMC neurosurgeons also have used the technology to supplement conventional imaging, noted Robert Friedlander, M.D., professor and chair, Department of Neurological Surgery, Pitt School of Medicine, and UPMC Endowed Professor of Neurosurgery and Neurobiology. He was not involved in this research study.
"I have used HDFT scans to map my approach to removing certain tumors and vascular abnormalities that lie in areas of the brain that cannot be reached without going through normal tissue," he said. "It shows me where significant functional pathways are relative to the lesion, so that I can make better decisions about which fiber tracts must be avoided and what might be an acceptable sacrifice to maintain the patient's best quality of life after surgery."
Dr. Okonkwo noted that the patient and his family were relieved to learn that there was evidence of brain damage to explain his ongoing difficulties. The team continues to evaluate and validate HDFT's utility as a brain imaging tool, so it is not yet routinely available.
"We have been wowed by the detailed, meaningful images we can get with this technology," Dr. Okonkwo said. "HDFT has the potential to be a game-changer in the way we handle TBI and other brain disorders."
Co-authors include lead author Samuel L. Shin, Ph.D., Allison J. Hricik, M.S., Megan Maserati, and Ava M. Puccio, Ph.D., all of the Department of Neurological Surgery; Timothy Verstynen, Ph.D., Sudhir Pathak, M.S., and Kevin Jarbo, all of LRDC; and Sue R. Beers, of the Department of Psychiatry, all of the University of Pittsburgh.
Text 11. Nanoscale Magnetic Resonance Imaging, Quantum Computer Get Nudge from New Research
Magnetic resonance imaging (MRI) on the nanoscale and the ever-elusive quantum computer are among the advancements edging closer toward the realm of possibility, and a new study co-authored by a UC Santa Barbara researcher may give both an extra nudge.
The findings recently appeared in Science Express, an online version of the journal Science.
Ania Bleszynski Jayich, an assistant professor of physics who joined the UCSB faculty in 2010, spent a year at Harvard working on an experiment that coupled nitrogen-vacancy centers in diamond to nanomechanical resonators. That project is the basis for the new paper, "Coherent sensing of a mechanical resonator with a single spin qubit."
A nitrogen-vacancy (NV) center is a specific defect in diamond that exhibits a quantum magnetic behavior known as spin. When a single spin in diamond is coupled with a magnetic mechanical resonator -- a device used to generate or select specific frequencies -- it points toward the potential for a new nanoscale sensing technique with implications for biology and technology, Jayich explained.
Among those possible future applications of such a technique is magnetic resonance imaging on a scale small enough to image the structure of proteins -- an as-yet unaccomplished feat that Jayich called "one of the holy grails of structural biology."
"The same physics that will allow the NV center to detect the magnetic field of the resonator, hopefully, will allow MRI on the nanoscale," Jayich said. "It could make MRI more accurate, and able to see more. It's like having a camera with eight megapixels versus one with two megapixels and taking a picture of someone's face. You can't see features that are smaller than the size of a pixel. So do they have three freckles, or do they all look like one big freckle?
"That's the idea," Jayich continued. "To resolve individual freckles, so to speak, to see what a protein is made up of. What we found in this paper suggests that it is possible, although a significant amount of work still needs to be done."
Though further into the future based on the approach used for this paper, Jayich said, there is also the potential for such a coupling to be advanced and exploited as a possible route toward the development of a hybrid quantum system, or quantum computer.
Jayich collaborated on the project with researchers Shimon Kolkowitz, Quirin Unterreithmeier, Steven Bennett, and Mikhail Lukin, all from Harvard; Peter Rabl, from the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences; and J.G.E. Harris, from Yale. The work was supported in part by the National Science Foundation, the Center for Ultracold Atoms, and the Packard Foundation.
Text 12. Brain-Imaging Technique Predicts Who Will Suffer Cognitive Decline Over Time
Cognitive loss and brain degeneration currently affect millions of adults, and the number will increase, given the population of aging baby boomers. Today, nearly 20 percent of people age 65 or older suffer from mild cognitive impairment and 10 percent have dementia.
UCLA scientists previously developed a brain-imaging tool to help assess the neurological changes associated with these conditions. The UCLA team now reports in the February issue of the journal Archives of Neurology that the brain-scan technique effectively tracked and predicted cognitive decline over a two-year period.
The team has created a chemical marker called FDDNP that binds to both plaque and tangle deposits -- the hallmarks of Alzheimer's disease -- which can then be viewed using a positron emission tomography (PET) brain scan, providing a "window into the brain." Using this method, researchers are able to pinpoint where in the brain these abnormal protein deposits are accumulating.
"We are finding that this may be a useful neuro-imaging marker that can detect changes early, before symptoms appear, and it may be helpful in tracking changes in the brain over time," said study author Dr. Gary Small, UCLA's Parlow-Solomon Professor on Aging and a professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA.
Small noted that FDDNP-PET scanning is the only available brain-imaging technique that can assess tau tangles. Autopsy findings have found that tangles correlate with Alzheimer's disease progression much better than do plaques.
For the study, researchers performed brain scans and cognitive assessments on the subjects at baseline and then again two years later. The study involved 43 volunteer participants, with an average age of 64, who did not have dementia. At the start of the study, approximately half (22) of the participants had normal aging and the other half (21) had mild cognitive impairment, or MCI, a condition that increases a person's risk of developing Alzheimer's disease.
Researchers found that for both groups, increases in FDDNP binding in the frontal, posterior cingulate and global areas of the brain at the two-year follow-up correlated with progression of cognitive decline. These areas of the brain are involved in decision-making, complex reasoning, memory and emotions. Higher initial baseline FDDNP binding in both subject groups was associated with a decline in cognitive functioning in areas such as language and attention at the two-year follow-up.
"We found that increases in FDDNP binding in key brain areas correlated with increases in clinical symptoms over time," said study author Dr. Jorge R. Barrio, who holds UCLA's Plott Chair in Gerontology and is a professor of molecular and medical pharmacology at the David Geffen School of Medicine at UCLA. "Initial binding levels were also predictive of future cognitive decline."
Among the subjects with mild cognitive impairment, the level of initial binding in the frontal and parietal areas of the brain provided the greatest accuracy in identifying those who developed Alzheimer's disease after two years. Of the 21 subjects with MCI, six were diagnosed with Alzheimer's at follow-up, and these six subjects had higher initial frontal and parietal binding values than the other subjects in the MCI group.
In the normal aging subjects, three developed mild cognitive impairment after two years. Two of these three participants had the highest baseline binding values in the temporal, parietal and frontal brain regions among this group.
Researchers said the next step in research will involve a longer duration of follow-up with larger samples of subjects. In addition, the team is using this brain-imaging technique in clinical trials to help track novel therapeutics for brain aging, such as curcumin, a chemical found in the spice turmeric.
"Tracking the effectiveness of such treatments may help accelerate drug discovery efforts," Small, the author of the new book "The Alzheimer's Prevention Program," said. "Because FDDNP appears to predict who will develop dementia, it may be particularly useful in tracking the effectiveness of interventions designed to delay the onset of dementia symptoms and eventually prevent the disease."
Small recently received research approval from the U.S. Food and Drug Administration to use FDDNP-PET to study people with mild cognitive impairment to determine whether a high-potency form of curcumin -- a spice with anti-amyloid, anti-tau and anti-inflammatory properties -- can prevent Alzheimer's disease and the accumulation of plaques and tangles in the brain.
UCLA owns three U.S. patents on the FDDNP chemical marker. The Office of Intellectual Property at UCLA is actively seeking a commercial partner to bring this promising technology to market.
Small and study authors Jorge R. Barrio and S. C. Huang are among the inventors.
Additional authors included Prabha Siddarth, Linda M. Ercoli, Alison C. Burggren, Karen J. Miller, Dr. Helen Lavretsky and Dr. Susan Y. Bookheimer, all of the UCLA Department of Psychiatry and Biobehavioral Sciences, and Vladimir Kepe and S.C. Huang, who are part of the UCLA Department of Molecular and Medical Pharmacology.