Wednesday, October 31, 2012

Refining climate models

Feature from Oak Ridge National Laboratory:


Refining climate models

Researchers study carbon cycling in deciduous trees

Jeff Warren and a team of ORNL researchers track the rate of carbon flow through deciduous trees, which will provide data for improved climate models.
Using dogwood trees, scientists are gaining a better understanding of the role photosynthesis and respiration play in the atmospheric carbon dioxide cycle. Their findings will aid computer modelers in improving the accuracy of climate simulations.
"The Department of Energy is very interested in experimentalists working with the climate modelers to improve the fidelity of the climate models," said Jeff Warren, a scientist in Oak Ridge National Laboratory's Environmental Sciences Division. "The models are essentially building these trees using equations, but the models may be more simplistic than they should be."
ORNL researchers are trying to determine how much carbon dioxide deciduous trees remove from the atmosphere, how much of that carbon they convert into energy-storing carbohydrates, and where they store it. Trees and humans rely on similar types of energy: while we eat carbohydrates - which are merely chains of carbon-rich sugars - trees make their carbs by removing carbon dioxide from the atmosphere.
In mid-September, Warren and a team of seven other scientists at ORNL began the last of three carbon-tracking experiments on a stand of dogwood trees at the University of Tennessee Forest Resources Research and Education Center in Oak Ridge. Using a rare and traceable form of carbon dioxide made with a stable carbon isotope, ORNL researchers were able to label trees individually, creating a pulse of carbon that could be tracked as it moved through a tree — a process similar to how doctors track sugars moving through humans during a PET scan.
Carbon enters the tree through the leaves, and over the course of a few days, travels through the tree's branches, trunk and so on. The researchers follow the carbon by sampling new leaves, old leaves, fruit, stem tissue, roots and soil.
"Carbon flow within plants has always been studied but has never fully been understood," Warren said. "Our experiment will help us understand the fate of carbon dioxide after it is taken into a plant. It'll show us which structures receive the carbon and how quickly the carbon travels through the system."
The research team isolated each tree using a structure built from hollow PVC pipes and wrapped in clear film from a local hardware store. Although these tent-like structures appear primitive, they are much more high-tech than they seem, complete with air conditioning and environmental monitoring equipment.
Acting like mobile greenhouses, the structures encase the trees, allowing researchers to release the valuable carbon dioxide label without the gas escaping into the atmosphere.
The scientists exposed four dogwood trees to the traceable carbon dioxide individually for about two hours. After labeling, they removed the greenhouse structure and covered two of the trees with a shade-cloth to reduce photosynthesis. This provided them with two different light treatments, which will be useful for assessing how well carbon uptake is represented in the climate models. The experiment was repeated in the spring, summer and fall to determine how the seasons affect carbon dioxide use by trees.
"The supercomputing model sends a fixed amount of carbon to growth and to the roots, but it doesn't send any carbon to reproductive structures like berries or seeds, or symbiotic fungi associated with the roots," Warren said. "Depending on the season, this can account for quite a bit of carbon removed from the atmosphere that the model is completely disregarding."
Additional studies on carbon use in plants are needed to determine the true impact of the ecosystem on climate change.
"When we consider that a large portion of the land surface of the Earth is covered by deciduous forests, small changes in how carbon is cycled between the forest and the atmosphere can make a large difference when predicting climate change," said Colleen Iversen, a staff scientist and ecosystem ecologist in ORNL's Environmental Sciences Division. "Models are what we use to project what the climate could look like in 100 years, so it is important for climate change scientists like us to really make sure that the model processes are representing what's happening in the natural world."
Eventually, the researchers hope to incorporate their data into the publicly available Community Land Model, one of several models linked together to show interactions between atmosphere, ocean, land and ice in the larger Community Earth System Model. Researchers worldwide run the coupled model on supercomputers at Oak Ridge and elsewhere to better understand climate change.
For more about the field experiments, watch a video of ORNL's research team and modelers working with dogwood trees: http://www.youtube.com/watch?v=adxPfd6J2Ig

— Jennifer Brouner, Oct. 31, 2012

Tuesday, October 30, 2012

Folding Funnels Key to Biomimicry

Press release from Lawrence Berkeley National Laboratory:


Berkeley Lab Finding that Protein Folding Funnels Also Apply to Self-Assembly Should Benefit Biomimicry and Nanosynthesis

OCTOBER 30, 2012
Lynn Yarris (510) 486-5375  lcyarris@lbl.gov
AFM micrograph of 2D S-layers assembled on mica shows two different pathways to crystallization, one in which the domains are 2-3 nanometers taller (white dotted circles) than the other. Differences in the two height profiles, which were measured along the horizontal dotted black lines, were the result of kinetic trapping. (Image from Molecular Foundry)
Proteins are able to self-assemble into a wide range of highly ordered structures that feature a diverse array of properties. Through biomimicry – technological innovation inspired by nature – humans hope to emulate proteins and produce our own versions of self-assembling molecules. A key to accomplishing this is understanding how protein folding – a process critical to the form and function of a protein – is extended from individual proteins to complex assemblies.
Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have now shown that a concept widely accepted as describing the folding of a single individual protein is also applicable to the self-assembly of multiple proteins. Their findings provide important guidelines for future biomimicry efforts, particularly for device fabrication and nanoscale synthesis.
“We’ve made the first direct observations that the concept of a folding funnel with kinetic energy traps for individual proteins can be equally applied to the assembly of ordered protein structures,” says Jim DeYoreo, a scientist with the Molecular Foundry, a DOE nanoscience center at Berkeley Lab, who led this research along with Berkeley Lab chemist Carolyn Bertozzi. “Our results tell us that efforts to discover and codify the design rules for the self-assembly of complex molecular systems will have to take into account the impact of kinetic traps associated with conformational transformations.”
DeYoreo and Bertozzi are the corresponding authors of a paper reporting this research, published in the Proceedings of the National Academy of Sciences (PNAS). The paper is titled “Direct observation of kinetic traps associated with structural transformations leading to multiple pathways of S-layer assembly.” Co-authoring the paper were Seong-Ho Shin, Sungwook Chung, Babak Sanii and Luis Comolli.
(From left) Sungwook Chung, Seong-Ho Shin, James DeYoreo and Carolyn Bertozzi of Berkeley Lab’s Molecular Foundry were part of a team that demonstrated the concept of folding funnels applies equally to individual proteins and ensembles of proteins. (Photo by Roy Kaltschmidt)
Proteins are essentially biomolecular nanomachines capable of performing numerous tasks because of their ability to fold themselves into a multitude of shapes and forms. When individual proteins self-assemble into ordered structures the resulting ensemble often adopts conformations that are quite distinct from those of the individual components.
“For example, collagen matrices, which constitute the organic scaffolds of bones and teeth, are constructed from triple helices of individual collagen monomers,” DeYoreo says. “These helices will further assemble into highly organized twisted fibrils that exhibit a pseudohexagonal symmetry.”
The folding funnel concept explains individual protein folding on the basis of conformational changes to reach a state of minimal free energy. An unfolded protein starts out in a state of high free energy that makes its conformation unstable. Initially, there are a number of possible three-dimensional conformations that would reduce this free energy. However, as the protein starts to fold, the free energy begins to drop and the number of possible conformations begins to decrease like the shrinking width of a funnel. The bottom of the funnel is reached when free energy is minimized and there is only one available conformation. As the free energy drops, however, there may be kinetic traps along the way that can stop the folding process and hold the protein in partially folded conformations, known as molten globules and folding intermediates, for extended periods of time. Eventually these trapped conformational states will be transformed into a stable conformation but the shape and form of that final conformation is influenced by the kinetic traps.
The folding funnel concept explains protein folding on the basis of conformational changes to reach a state of minimal free energy. An unfolded protein starts out in a state of high free energy and reaches its native state at the energy minimum. Kinetic traps along the way can create transient molten globules and folding intermediates.
“In a protein folding funnel, the funnel walls are presumed not to be smooth and the resulting bumps and valleys define kinetic traps,” DeYoreo says. “This physical picture of folding has been explored in some detail at the single molecule level, but has not been considered for protein self-assembly into extended architectures even though conformational transformations are part and parcel of the self-assembly process.”
DeYoreo, Bertozzi and their colleagues took steps to correct this knowledge deficit by studying the surface-layer (S-layer) proteins that self-assemble into a crystalline membrane around the single cells of bacteria and Archaea. This outer membrane serves as the first point of contact between the microbe and its environment and is key to the microbe’s ability to survive. Using in situ atomic force microscopy (AFM), the researchers imaged kinetic trapping in real time and at the molecular level during the 2D self-assembly of S-layer protein structures on mica surfaces.
“We observed that self-assembly of S-layer proteins tracks along two different pathways, one leading directly to the low-energy final, ordered state, and the other leading to a kinetic trap occupied by a long-lived transient state that is more disordered,” DeYoreo says. “Although either state is easily accessible during crystal nucleation, if the system falls into the high-energy state, escape to the final, low-energy state is strongly impeded at room temperature. This demonstrates the importance of kinetic traps in determining the pathway of S-layer crystallization and suggests that the concept of folding funnels is equally valid for self-assembly of extended protein structures.”
This research was supported by the DOE Office of Science.
#  #  #
Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.
The Molecular Foundry is one of five DOE Nanoscale Science Research Centers (NSRCs), national user facilities for interdisciplinary research at the nanoscale, supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize, and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE’s Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, and Sandia and Los Alamos national laboratories. For more information about the DOE NSRCs, please visit http://science.energy.gov/bes/suf/user-facilities/nanoscale-science-research-centers/.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website at science.energy.gov/.


Rethinking the Computer at 80

The following is an excerpt from an article in The New York Times:


The New York Times
Tuesday, October 30, 2012

Rethinking the Computer at 80

By JOHN MARKOFF

MENLO PARK, Calif. — Many people cite Albert Einstein’s aphorism “Everything should be made as simple as possible, but no simpler.” Only a handful, however, have had the opportunity to discuss the concept with the physicist over breakfast.

One of those is Peter G. Neumann, now an 80-year-old computer scientist at SRI International, a pioneering engineering research laboratory here.

As an applied-mathematics student at Harvard, Dr. Neumann had a two-hour breakfast with Einstein on Nov. 8, 1952. What the young math student took away was a deeply held philosophy of design that has remained with him for six decades and has been his governing principle of computing and computer security.

For many of those years, Dr. Neumann (pronounced NOY-man) has remained a voice in the wilderness, tirelessly pointing out that the computer industry has a penchant for repeating the mistakes of the past. He has long been one of the nation’s leading specialists in computer security, and early on he predicted that the security flaws that have accompanied the pell-mell explosion of the computer and Internet industries would have disastrous consequences.

“His biggest contribution is to stress the ‘systems’ nature of the security and reliability problems,” said Steven M. Bellovin, chief technology officer of the Federal Trade Commission. “That is, trouble occurs not because of one failure, but because of the way many different pieces interact.”

Dr. Bellovin said that it was Dr. Neumann who originally gave him the insight that “complex systems break in complex ways” — that the increasing complexity of modern hardware and software has made it virtually impossible to identify the flaws and vulnerabilities in computer systems and ensure that they are secure and trustworthy.

The consequence has come to pass in the form of an epidemic of computer malware and rising concerns about cyberwarfare as a threat to global security, voiced alarmingly this month by the defense secretary, Leon E. Panetta, who warned of a possible “cyber-Pearl Harbor” attack on the United States.

It is remarkable, then, that years after most of his contemporaries have retired, Dr. Neumann is still at it and has seized the opportunity to start over and redesign computers and software from a “clean slate.”

He is leading a team of researchers in an effort to completely rethink how to make computers and networks secure, in a five-year project financed by the Pentagon’s Defense Advanced Research Projects Agency, or Darpa, with Robert N. Watson, a computer security researcher at Cambridge University’s Computer Laboratory.

“I’ve been tilting at the same windmills for basically 40 years,” said Dr. Neumann recently during a lunchtime interview at a Chinese restaurant near his art-filled home in Palo Alto, Calif. “And I get the impression that most of the folks who are responsible don’t want to hear about complexity. They are interested in quick and dirty solutions.”

For more, visit www.nytimes.com.

Monday, October 29, 2012

ORNL Debuts Titan Supercomputer

News Release

Media Contact: Ron Walli
Communications and Media Relations
865.576.0226


ORNL Debuts Titan Supercomputer

Oak Ridge National Laboratory is home to Titan, the world’s most powerful supercomputer for open science, with a theoretical peak performance exceeding 20 petaflops (quadrillion calculations per second). That kind of computational capability—almost unimaginable—is on par with each of the world’s 7 billion people being able to carry out 3 million calculations per second.
OAK RIDGE, Tenn., Oct. 29, 2012 — The U.S. Department of Energy's (DOE) Oak Ridge National Laboratory launched a new era of scientific supercomputing today with Titan, a system capable of churning through more than 20,000 trillion calculations each second—or 20 petaflops—by employing a family of processors called graphics processing units first created for computer gaming. Titan will be 10 times more powerful than ORNL's last world-leading system, Jaguar, while overcoming power and space limitations inherent in the previous generation of high-performance computers.
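The headline figures are easy to sanity-check with back-of-envelope arithmetic, as in this quick sketch:

```python
# Back-of-envelope check of the Titan figures quoted above.
peak_flops = 20e15             # 20 petaflops = 20,000 trillion calculations/s
world_population = 7e9

per_person = peak_flops / world_population
print(f"calculations per second per person: {per_person:.2e}")  # ~3 million

jaguar_equivalent = peak_flops / 10  # Titan is described as 10x Jaguar
print(f"implied previous-generation peak: {jaguar_equivalent / 1e15:.0f} petaflops")
```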
Titan, which is supported by the Department of Energy, will provide unprecedented computing power for research in energy, climate change, efficient engines, materials and other disciplines and pave the way for a wide range of achievements in science and technology.
The Cray XK7 system contains 18,688 nodes, with each holding a 16-core AMD Opteron 6274 processor and an NVIDIA Tesla K20 graphics processing unit (GPU) accelerator. Titan also has more than 700 terabytes of memory. The combination of central processing units, the traditional foundation of high-performance computers, and more recent GPUs will allow Titan to occupy the same space as its Jaguar predecessor while using only marginally more electricity.
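The node configuration also implies the processor totals cited later in this release; a quick tally:

```python
# Tallying Titan's processors from the node configuration quoted above.
nodes = 18_688
cores_per_node = 16  # one 16-core AMD Opteron 6274 per node

total_cpu_cores = nodes * cores_per_node
print(f"total CPU cores: {total_cpu_cores:,}")  # 299,008, the figure cited below

total_gpus = nodes  # one NVIDIA Tesla K20 per node
print(f"total GPU accelerators: {total_gpus:,}")
```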
"One challenge in supercomputers today is power consumption," said Jeff Nichols, associate laboratory director for computing and computational sciences. "Combining GPUs and CPUs in a single system requires less power than CPUs alone and is a responsible move toward lowering our carbon footprint. Titan will provide unprecedented computing power for research in energy, climate change, materials and other disciplines to enable scientific leadership."
Because they handle hundreds of calculations simultaneously, GPUs can go through many more calculations than CPUs in a given time. By relying on its 299,008 CPU cores to guide simulations and allowing its new NVIDIA GPUs to do the heavy lifting, Titan will enable researchers to run scientific calculations with greater speed and accuracy.
"Titan will allow scientists to simulate physical systems more realistically and in far greater detail," said James Hack, director of ORNL's National Center for Computational Sciences. "The improvements in simulation fidelity will accelerate progress in a wide range of research areas such as alternative energy and energy efficiency, the identification and development of novel and useful materials and the opportunity for more advanced climate projections."
Titan will be open to select projects while ORNL and Cray work through the process for final system acceptance. The lion's share of access to Titan in the coming year will be allocated through the Department of Energy's Innovative and Novel Computational Impact on Theory and Experiment program, better known as INCITE.
Researchers have been preparing for Titan and its hybrid architecture for the past two years, with many ready to make the most of the system on day one. Among the flagship scientific applications on Titan:
Materials Science: The magnetic properties of materials hold the key to major advances in technology. The application WL-LSMS provides a nanoscale analysis of important materials such as steels, iron-nickel alloys and advanced permanent magnets that will help drive future electric motors and generators. Titan will allow researchers to improve the calculations of a material's magnetic states as they vary by temperature.
"The order-of-magnitude increase in computational power available with Titan will allow us to investigate even more realistic models with better accuracy," noted ORNL researcher and WL-LSMS developer Markus Eisenbach.
Combustion: The S3D application models the underlying turbulent combustion of fuels in an internal combustion engine. This line of research is critical to the American energy economy, given that three-quarters of the fossil fuel used in the United States goes to powering cars and trucks, which produce one-quarter of the country's greenhouse gases.
Titan will allow researchers to model large-molecule hydrocarbon fuels such as the gasoline surrogate isooctane; commercially important oxygenated alcohols such as ethanol and butanol; and biofuel surrogates that blend methyl butanoate, methyl decanoate and n-heptane.
"In particular, these simulations will enable us to understand the complexities associated with strong coupling between fuel chemistry and turbulence at low preignition temperatures," noted team member Jacqueline Chen of Sandia National Laboratories. "These complexities pose challenges, but also opportunities, as the strong sensitivities to both the fuel chemistry and to the fluid flows provide multiple control options which may lead to the design of a high-efficiency, low-emission, optimally combined engine-fuel system."
Nuclear Energy: Nuclear researchers use the Denovo application to, among other things, model the behavior of neutrons in a nuclear power reactor. America's aging nuclear power plants provide about a fifth of the country's electricity, and Denovo will help them extend their operating lives while ensuring safety. Titan will allow Denovo to simulate a fuel rod through one round of use in a reactor core in 13 hours; this job took 60 hours on the Jaguar system.
Climate Change: The Community Atmosphere Model-Spectral Element (CAM-SE) simulates long-term global climate. Improved atmospheric modeling on Titan will help researchers better understand future air quality as well as the effect of particles suspended in the air.
Using a grid of 14-kilometer cells, the new system will be able to simulate one to five years of climate per day of computing time, up from the three months or so that Jaguar was able to churn through in a day.
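Taking "three months or so" as roughly a quarter of a simulated year per day, the quoted gain works out to a factor of about four to twenty. A rough reading of the figures above, not an official benchmark:

```python
# Rough throughput comparison from the simulation rates quoted above.
jaguar_years_per_day = 0.25       # "three months or so" of climate per day
titan_years_per_day = (1.0, 5.0)  # one to five simulated years per day

for rate in titan_years_per_day:
    speedup = rate / jaguar_years_per_day
    print(f"{rate:.0f} simulated year(s)/day is roughly {speedup:.0f}x Jaguar's pace")
```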
"As scientists are asked to answer not only whether the climate is changing but where and how, the workload for global climate models must grow dramatically," noted CAM-SE team member Kate Evans of ORNL. "Titan will help us address the complexity that will be required in such models."
ORNL is managed by UT-Battelle for the Department of Energy. The Department of Energy is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov/.
###