Wednesday, November 28, 2012

DOE researchers advance scientific computing with record-setting simulations

Argonne National Laboratory Press Release:



NOVEMBER 28, 2012
WASHINGTON, D.C. – Breaking new ground for scientific computing, two teams of Department of Energy (DOE) scientists have for the first time exceeded a sustained performance level of 10 petaflops (quadrillion floating point operations per second) on the Sequoia supercomputer at the National Nuclear Security Administration’s (NNSA) Lawrence Livermore National Laboratory (LLNL).
A team led by Argonne National Laboratory used the recently developed Hardware/Hybrid Accelerated Cosmology Codes (HACC) framework to achieve nearly 14 petaflops on the 20-petaflop Sequoia, an IBM BlueGene/Q supercomputer, in a record-setting benchmark run with 3.6 trillion simulation particles. HACC gives cosmologists the ability to simulate entire survey-sized volumes of the universe at high resolution, tracking billions of individual galaxies.
Simulations of this kind are required by the next generation of cosmological surveys to help elucidate the nature of dark energy and dark matter. The HACC framework is designed for extreme performance in the weak scaling limit (high levels of memory utilization) by integrating innovative algorithms, as well as programming paradigms, in a way that easily adapts to different computer architectures. The HACC team is now conducting a fully instrumented science run with more than a trillion particles on Argonne's 10-petaflop Mira, also an IBM BlueGene/Q system.
“The performance of these applications on Mira and Sequoia provides an early glimpse of the transformational science these machines make possible — science important to DOE missions,” said Barbara Helland, of DOE’s Office of Science. “By pushing the state-of-the-art, these two teams of scientists are advancing science and also the know-how to use these new resources to produce insight and discovery.”
LLNL, in collaboration with scientists at IBM Research, created a new simulation capability called Cardioid to realistically and rapidly model a beating human heart at near-cellular resolution. The highly scalable code models in exquisite detail the electrophysiology of the human heart, including activation of heart muscle cells and cell-to-cell electrical coupling. Developed to run with high efficiency in the extreme strong-scaling limit, the scientists were able to achieve a performance of nearly 12 petaflops on Sequoia, and demonstrated the ability to model a highly resolved whole heart beating in very nearly real time (67.2 seconds of wall-clock time to model 60 seconds of real time). Using Cardioid, the team performed groundbreaking simulations demonstrating for the first time in a simulation of a whole heart the generation of a reentrant activation pattern that often leads to a kind of arrhythmia known as Torsades de Pointes, which can result in sudden cardiac death. The potential to elucidate detailed mechanisms of arrhythmia will have impact on a multitude of applications in medicine, pharmaceuticals and implantable devices.
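For scale, "very nearly real time" here means a slowdown factor of just 1.12, as a quick check on the quoted numbers shows:

```python
# "Very nearly real time": wall-clock seconds per simulated second,
# computed from the figures quoted above.
wall_s, simulated_s = 67.2, 60.0
print(f"Slowdown factor: {wall_s / simulated_s:.2f}x")  # 1.12x real time
```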
“A vital DOE/NNSA mission is to push the state-of-the-art in high performance computing to not only ensure the nation’s security but its technological and economic competitiveness,” said NNSA Advanced Simulation and Computing Director Bob Meisner. “Sequoia and Mira are powerful computational engines that allow our skilled teams to run applications such as Cardioid and HACC at very high levels of performance. What we learn from these early science applications will inform a broad range of scientific computing including our national security applications.”
Sequoia at Livermore and Mira at Argonne represent the third generation of IBM Blue Gene supercomputers. Sequoia, second on the TOP500 list with 98,304 nodes (1.57 million processor cores), and Mira, fourth on the list with 49,152 nodes (786,432 processor cores), allow execution of massive calculations in parallel. Both teams took full advantage of the five levels of parallelism available in the hardware, sustaining 58.8 percent of the machine's theoretical peak performance with Cardioid and an astounding 69.2 percent with HACC, along with near-perfect scaling. Strong scaling measures the ability to speed up a fixed-size problem by using more processors, so that, ideally, a given simulation finishes in one-hundredth of the time on 100 times as many processors. Weak scaling measures the ability to grow the problem with the processor count, so that, in the same time, a simulation 100 times larger runs on 100 times as many processors.
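As a back-of-the-envelope illustration of those figures (a hypothetical sketch, not the teams' benchmarking code): the sustained fraction is simply sustained flops divided by theoretical peak, and the two scaling regimes differ only in whether the problem size is held fixed or grown with the processor count. The sustained values below are inferred from the quoted percentages and the article's "nearly 14" and "nearly 12" petaflops; the 20.13-petaflop value is Sequoia's theoretical peak, which the article rounds to 20.

```python
# Back-of-the-envelope check of the performance figures quoted above.
# The sustained petaflops (13.93, 11.84) are inferred from the article's
# rounded numbers, not taken from the teams' benchmark logs.

PEAK_PFLOPS = 20.13  # Sequoia's theoretical peak (rounded to 20 in the article)

def sustained_fraction(sustained_pflops, peak_pflops=PEAK_PFLOPS):
    """Fraction of theoretical peak that an application actually sustains."""
    return sustained_pflops / peak_pflops

def strong_scaling_runtime(t_base, procs_base, procs):
    """Ideal strong scaling: same problem, runtime shrinks with processor count."""
    return t_base * procs_base / procs

def weak_scaling_problem_size(n_base, procs_base, procs):
    """Ideal weak scaling: same runtime, problem grows with processor count."""
    return n_base * procs / procs_base

print(f"HACC:     {sustained_fraction(13.93):.1%} of peak")  # ~69.2%
print(f"Cardioid: {sustained_fraction(11.84):.1%} of peak")  # ~58.8%

# 100x the processors: ideally 1/100th the runtime on the same problem ...
print(strong_scaling_runtime(t_base=100.0, procs_base=1_024, procs=102_400))  # 1.0
# ... or a 100x larger problem in the same time (36 billion -> 3.6 trillion particles).
print(f"{weak_scaling_problem_size(n_base=36e9, procs_base=1_024, procs=102_400):.2e}")
```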
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the U.S. and is working to address some of the most pressing challenges of our time. For more information, see science.energy.gov.
About the NNSA
Established by Congress in 2000, NNSA is a semi-autonomous agency within the U.S. Department of Energy responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, reliability and performance of the U.S. nuclear weapons stockpile without nuclear testing; works to reduce global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the U.S. and abroad.
About Argonne
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.

POLLEN: CAN HUMANS’ SEASONAL BANE BECOME A TOOL IN THE FIGHT AGAINST DISEASE?

DARPA Press Release:



November 27, 2012
DARPA Young Faculty Award researcher studies use of pollen shells for oral delivery of vaccines 
As a globally deployed force tasked with defending U.S. interests and delivering humanitarian assistance to international populations, the Department of Defense must be able to provide health care anywhere in the world at any time to protect against natural and man-made health threats. Having trained and equipped medical personnel on hand is not feasible for every mission, however, which is one reason why DoD invests in medical treatments that can be easily self-administered or administered by fellow servicemembers. Among the 2012 class of academic researchers receiving mentorship and funding through DARPA’s Young Faculty Awards (YFA) program, one individual is studying novel methods for packaging and delivery of orally consumed vaccines. His tool of choice: pollen.
Harvinder Gill, an assistant professor of chemical engineering at Texas Tech University, seeks to understand, engineer and test a pollen-based oral vaccination platform to protect against a range of infectious diseases. If successful, his research could lead to more effective, more easily administered and more easily transported vaccines for deployed troops.
What attributes does pollen have that make DoD consider it anything more than a seasonal menace to humans’ sinuses? To start with, the exterior of a pollen grain is a shell made of a naturally durable, non-allergenic polymer. The contents of the shell that actually contain the allergy-inducing plant proteins and fats can be cleaned out, rendering the shell itself neutral. The leftover space inside the shell could be filled with vaccines and delivered into the body through oral ingestion. The pollen shell’s natural toughness would help the vaccine survive conditions inside the body. The pollen could then pass through the intestinal lining to deliver vaccine. 
The value of an orally consumed vaccine is that it is efficient, painless, can be self-administered and can induce both systemic and mucosal immune responses, thus enhancing protection. But why is pollen any better for this than a traditional pill? The body’s own processes often limit the effectiveness of pills. When patients ingest vaccines and other medications, stomach acids and digestive processes can degrade the medication. Because pollen shells are durable, however, they can potentially survive inside the body and safeguard a vaccine until it can be delivered. All this means that along with the traditional image of pollen as airborne particles that cause headaches and sneezing, pollen could also eventually be known as an edible vaccine delivery vehicle.
“DARPA already has a large portfolio of biology programs aimed at protecting the health of U.S. warfighters from threats known and unknown,” said Jay Schnitzer, director of DARPA’s Defense Sciences Office, which currently oversees the YFA program. “We actively support innovative basic research like that conducted by YFA recipients because it helps open new areas for exploration and fosters valuable, lasting relationships between DoD and the research community.”
More information about Dr. Gill’s research may be found at www.gill-lab.che.ttu.edu. His laboratory is currently investigating pollen grains, micro-needles, gold nanoparticles and polymeric micro-nano particles for mucosal vaccination and cancer drug delivery.
DARPA anticipates releasing the next YFA solicitation in late 2012. Interested candidates should watch www.grants.gov and www.darpa.mil for updates.
# # #

Monday, November 26, 2012

Modern-day cleanroom invented by Sandia physicist still used 50 years later


Sandia Labs News Releases


ALBUQUERQUE, N.M. — When Willis Whitfield invented the modern-day cleanroom 50 years ago, researchers and industrialists didn’t believe it at first. But within a few short years, $50 billion worth of laminar-flow cleanrooms were being built worldwide and the invention is used in hospitals, laboratories and manufacturing plants today.
The retired Sandia National Laboratories physicist, who passed away this month at age 92, was dubbed “Mr. Clean” by TIME Magazine at the time, but the travel, scientific presentations and accolades didn’t change the unassuming scientist, who was always modest about the invention that revolutionized manufacturing in electronics and pharmaceuticals, made hospital operating rooms safer and helped further space exploration.
Sandia President and Labs Director Paul Hommert remembered Whitfield as a Sandia pioneer.
Cleanroom inventor Willis Whitfield, who passed away this month at age 92, steps out of a mobile cleanroom at Sandia National Laboratories, which could be transported to remote sites. (Photo courtesy of Sandia National Laboratories)
“Willis Whitfield represented the very best of Sandia. An exemplary researcher, a physicist who became an engineer’s engineer, Willis lived in that sweet spot where the best technical work is always done, at the intersection of skill, experience, training and intuition,” Hommert said. “His breakthrough concept for a new kind of cleanroom, orders of magnitude more effective than anything else available in the early 1960s, came at just the right time to usher in a new era of electronics, health care, scientific research and space exploration. His impact was immense; even immeasurable.”
Gil Herrera, Sandia’s director of microsystems science, technology and components, also remembered Whitfield’s contribution to the work Herrera oversees at Sandia today.
“When Willis invented the cleanroom, he did so to improve the reliability of miniature mechanical components for Sandia systems. Little did he know his invention would enable the semiconductor industry, and thereby enable all modern electronics, computers and information technologies,” Herrera said.  “It has also enabled breakthroughs in biotechnology, nanotechnology, health sciences and healthcare.”
Today, cleanrooms and clean benches based on Sandia’s design are used in the manufacture of precision mechanical assemblies for systems designed at the national security laboratory, Herrera said.
Whitfield, the son of Texas cotton farmers who learned to do for themselves by fixing their own equipment, was asked to solve a manufacturing problem for Sandia, so he invented the laminar-flow cleanroom. With slight modifications, it is still the standard.
“He built it, found out no one had done it that way before, and said, ‘I don’t understand why [no one had invented it]. It’s so simple,’” recalled his son, Jim Whitfield, who was a young child at the time. “I heard someone ask him how long did it take him to think of that idea and he said, ‘Five minutes, I just did the obvious thing.’”
Sandia historian Rebecca Ullrich said it wasn’t quite that easy.
Solving a problem
In 1959, nuclear weapons components — mainly mechanical switching parts — were becoming smaller and microscopic dust particles were preventing Sandia from achieving the quality the laboratories needed, so Whitfield’s supervisors asked his group to find a solution, Ullrich said.
While Whitfield might have come up with the idea quickly, months of research led up to that moment of discovery, Ullrich said. Whitfield discovered the practice at the time was to tightly seal cleanrooms, wear protective clothing and vacuum often. Still, the airflow was turbulent in existing cleanrooms and particles introduced were not removed. These measures didn’t create the necessary conditions for manufacturing close-tolerance parts, she said.
Whitfield looked at blowers, vents, gratings and the cost per square foot to build his invention, so it would be something people could afford.
By the end of 1960, Whitfield had his initial drawings for a 10-by-6-foot cleanroom. His solution was to constantly flush out the room with highly filtered air. In that first model, Whitfield designed a workbench along one wall. Clean air entered the room from a bank of filters that were 99.97 percent efficient at removing particles larger than 0.3 microns. For example, cigarette smoke blown in one side comes out the other as clean air, according to a 1962 Sandia Lab News article.
The air is circulated through the room at a rate of 4,000 cubic feet per minute, or about 10 complete changes of air per minute, an amount of air movement that is barely perceptible to the workers inside. The linear speed of the air is slightly more than 1 mph, about the same as the breeze felt when walking through a still room.
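Those numbers are mutually consistent, as a quick check shows. Below is a hypothetical worked example; the ceiling height is an assumption chosen to match the reported figures, since the article gives only the 10-by-6-foot floor plan.

```python
# Worked check of the airflow figures above. The ceiling height is an
# assumption; the article gives only the 10-by-6-foot footprint.
room_length_ft = 10.0
room_width_ft = 6.0
room_height_ft = 6.7   # assumed, not stated in the article

flow_cfm = 4000.0      # filtered-air supply, cubic feet per minute

volume_cu_ft = room_length_ft * room_width_ft * room_height_ft  # ~402 cu ft
changes_per_minute = flow_cfm / volume_cu_ft
print(f"Air changes per minute: {changes_per_minute:.1f}")      # ~10
```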
In a later modification, the air was passed down over the work area instead of across, getting an assist from gravity in carrying troublesome particles into the floor, which was covered with grating. Filters underneath clean the air and it is circulated back around to re-enter the room. The constant flow of clean air performs a sweeping function.
When the first cleanroom was tested “the dust counters went to nearly zero. We thought they were broken,” Whitfield said in a 1993 videotaped interview at Sandia.
Whitfield lived long enough to see his invention mark its 50th anniversary this year. Here he pauses during a tour of a cleanroom in Sandia's Microsystems Engineering Sciences and Applications (MESA) complex. (Photo by Randy Montoya)
The laminar-flow cleanroom created a work environment that was more than 1,000 times cleaner than the cleanrooms in use at the time.
According to tests at the time, the laminar-flow cleanroom’s work area contained an average of 750 dust particles one-third of a micron in size or larger per cubic foot of air. (A micron is about 40-millionths of an inch.) That compares with average dust counts of more than 1 million particles per cubic foot of air in one of the best conventional cleanrooms then in use.
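Taking those reported counts at face value, the improvement factor falls out directly; here is the one-line arithmetic on the article's own numbers:

```python
# Arithmetic behind the "more than 1,000 times cleaner" claim, using
# the dust counts reported in the article.
conventional = 1_000_000  # particles per cubic foot, best conventional cleanroom
laminar = 750             # particles per cubic foot, Whitfield's laminar-flow design
print(f"~{conventional / laminar:.0f}x cleaner")  # ~1333x
```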
Bringing the cleanroom to the world
Whitfield gave his initial paper on what was then called the “ultra-cleanroom” at the Institute of Environmental Sciences meeting in Chicago in 1962.
After the meeting and publicity, Whitfield’s phone never stopped ringing, Ullrich said. “Industry jumped all over it.”
But at a standing-room-only talk about a year later at the American Society for Contamination Control in Boston, manufacturers challenged the invention’s claims, accusing Whitfield of perpetuating a hoax, Ullrich said.
Jim Whitfield remembers his father’s story about that meeting: “The numbers he was showing were unbelievable. At this conference, people were telling him that can’t be right. Then, one of his colleagues [from Bell Labs] got up and said he thought Whitfield was wrong. His numbers are 10 times too conservative. So, he knew at that point that it was a dramatic shift in the technology.”
Others recognized it too, and within a couple years, $50 billion worth of cleanrooms had been built worldwide.  
“When you have something that everyone wants, they come to you,” Whitfield said in the 1993 interview. “The desperate need for this accelerated the gap between development and production drastically.”
RCA and General Motors Co. were early adopters of the cleanroom, and the invention revolutionized the pharmaceuticals and microelectronics industries, Ullrich said.
Bataan Memorial Methodist Hospital in Albuquerque, which later became Lovelace Medical Center, was the first hospital to use laminar-flow cleanrooms in its operating rooms to prevent infections, Ullrich said. And the Houston hospital today known as the University of Texas MD Anderson Cancer Center built 22 cleanrooms to prevent infections in leukemia patients undergoing chemotherapy, Whitfield said in 1993.
Whitfield eventually worked with NASA to provide planetary quarantines during missions to the moon and Mars and spacecraft sterilization techniques, Ullrich said.
But fame did not change Whitfield.
“He’s a nice guy, very honest, very straightforward,” Ullrich said. “He’s very modest about it. His values mean he’s going to do the right thing. He makes sure other people share credit for things.”
The cleanroom design also made it possible for Sandia to standardize cleanrooms for the first time in 1963 for the federal government.
Whitfield lived to see his invention turn 50 this year, but was unable to give an interview, so his son spoke for him, saying, “I’m sure in his heart, he feels very satisfied that he’s made such a big and positive impact on society.”
Additional photos available on Sandia’s Flickr site.

Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies and economic competitiveness.

Modeling the Breaking Points of Metallic Glasses

Lawrence Berkeley National Laboratory Press Release:


Mathematical methods developed by a Berkeley Lab researcher help explain why liquid metals have wildly different breaking points, depending on how they are made

NOVEMBER 26, 2012
Linda Vu 510.495.2402  lvu@lbl.gov
Feature
Metallic glass alloys (or liquid metals) are three times stronger than the best industrial steel, but can be molded into complex shapes with the same ease as plastic. These materials are highly resistant to scratching, denting, shattering and corrosion. So far, they have been used in a variety of products from golf clubs to aircraft components, and some smartphone manufacturers are even looking to cast their next-generation phone cases out of these alloys.
But despite their potential, the mechanical properties of these substances remain a scientific mystery. One lingering question is why they have such wildly different toughness and breaking points, depending on how they are made. Although this may not be a huge concern for small applications like smartphone cases, it will be extremely important if these materials are ever used in structural applications where they must support large loads.
A simulation of crack initiation in a metallic glass. The metallic glass on the left is initially more relaxed, due to a longer heat treatment, than the metallic glass on the right. The very different crack tip shapes and deformation patterns under the same external conditions result in a significantly reduced breaking resistance for the more relaxed glass. (Courtesy of Christopher Rycroft, LBNL)
Recently, Christopher Rycroft of the Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) Computational Research Division has developed some novel computational techniques to address this question. When Rycroft combined these techniques with a mechanical model of metallic glass developed by Eran Bouchbinder and his colleagues at Israel’s Weizmann Institute, the two were able to propose a novel explanation of the physical process behind the large variations in breaking points of metallic glasses. Their results are also in qualitative agreement with laboratory experiments.
“We hope that this work will contribute to the understanding of metallic glasses, and aid in their use in practical applications. Ultimately, we would like to develop a tool capable of making quantitative predictions about the toughness of metallic glasses depending on their preparation method,” says Rycroft.
Rycroft and Bouchbinder are co-authors on a paper recently published in Physical Review Letters.
What is a Metallic Glass? And Why is it So Difficult to Model?

Scientists define “glass” as a material that cools from a liquid state to a solid state without crystallizing—which is when atoms settle into a lattice, or a highly regular spatial pattern. Because many metal lattices are riddled with defects, these materials “deform”, or permanently bend out of shape, relatively easily.  When crystallization does not occur, the atoms settle into a random arrangement. This atomic structure allows metallic glasses to spring back into shape instead of deforming permanently.  And without the defects, some metallic glasses also have extremely efficient magnetic properties.
Christopher Rycroft
Rycroft notes that one of the biggest mysteries in condensed matter physics is how glass transitions from a liquid state to a solid state. To successfully create metallic glass, the metal has to cool relatively quickly before atomic lattices form.
“Depending on how you prepare or manipulate these metallic glasses, the breaking points can differ by a factor of 10,” says Rycroft. “Because scientists don’t completely understand how glass transitions from liquid to solid state, they have not been able to fully explain why the breaking points of these materials vary so widely.”
According to Bouchbinder, computer models also have a hard time predicting the breaking points of metallic glass because the timescale of events varies dramatically—from microseconds to seconds. For instance, researchers can bend or pull the material for several seconds before it breaks, which occurs almost instantaneously. And the material’s internal plastic deformation—the process where it irreversibly deforms—occurs on an intermediate timescale.
Eran Bouchbinder, Weizmann Institute
“We’ve actually been able to develop some numerical methods to capture these differences in timescales,” says Rycroft, of the techniques used in the recent paper.
When Rycroft incorporated these methods into Bouchbinder’s mechanical model and calibrated it against available data, the duo managed to simulate and better understand the breaking points of metallic glass alloys based on their preparation process. He notes that the model is unusual in combining novel and flexible numerical methods with recent insights about the physics of glasses. The simulations have also been able to predict the large decreases in toughness seen in laboratory experiments.
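The paper's methods themselves are not described in this article, but the flavor of the timescale problem can be conveyed with a generic strategy: take large steps while the material loads slowly, and shrink the step when a fast event begins. The sketch below is a toy adaptive-time-stepping loop in that spirit; it is not the algorithm Rycroft and Bouchbinder developed, and the relaxation model in the usage example is invented for illustration.

```python
# Toy adaptive time stepper for dynamics with widely separated timescales:
# slow driving punctuated by fast events. Illustrative only; this is not
# the numerical method from the Physical Review Letters paper.
def integrate(rate, y0, t_end, dt_max=1e-3, dt_min=1e-6, tol=1e-3):
    """Advance dy/dt = rate(t, y), shrinking the step during fast changes."""
    t, y, dt = 0.0, y0, dt_max
    while t < t_end:
        dt = min(dt, t_end - t)            # don't step past the end time
        dy = rate(t, y) * dt
        if abs(dy) > tol and dt > dt_min:
            dt = max(dt / 2.0, dt_min)     # change too fast: halve the step
            continue
        y, t = y + dy, t + dt
        dt = min(dt * 1.5, dt_max)         # quiet stretch: grow the step again
    return y

# A millisecond-scale relaxation inside a one-second run: steps shrink toward
# microseconds during the fast decay, then grow back to milliseconds after.
# A fixed microsecond step would take a million iterations; this takes thousands.
print(integrate(lambda t, y: -y / 1e-3, y0=1.0, t_end=1.0))  # ~0 (true answer e^-1000)
```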
“If you can vary the way metallic glass is prepared in computer models and capture the differences in how it breaks, you can pose a reasonable explanation for why this occurs. This might also give you a better idea about how the glass transitions from a liquid to a solid, as well as the mechanical properties of a glass,” says Rycroft. “We’ve essentially created something that might evolve into a tool for predicting the toughness of metallic glasses.”
“For quite some time I’ve wanted to calculate the fracture toughness of metallic glasses, but knew that this was a very tough mathematical and computational challenge, certainly well above my abilities, and probably above the capabilities of conventional computational solid mechanics,” says Bouchbinder. “I think that Rycroft’s methods have opened the way to new possibilities and I am enthusiastic to see where this can lead us.”
This work was done with a grant for developing new numerical algorithms with scientific and engineering applications from the Office of Advanced Scientific Computing Research within the Department of Energy’s Office of Science.
# # #
Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.

A specialized tool

Oak Ridge National Laboratory Feature



ORNL, six U.S. universities offer neutron science cybercourse

The University of Tennessee's Takeshi Egami is among the cybercourse instructors.
 
Neutron scattering is a specialized tool that allows scientists to do breakthrough research into the nature of advanced materials.
Developing expertise in neutron scattering can be difficult, however, because the field is too narrow to merit an entire graduate course but too deep to learn by reading a textbook, said University of Tennessee neutron scientist Takeshi Egami.
To address the problem, the Department of Energy's Oak Ridge National Laboratory organized interactive, streamed lectures involving UT and five other universities and is offering them free online. The lecture series has attracted online students worldwide and remains available to those who register.
"Neutron Scattering in Quantum Condensed Matter Physics" consists of 26 lectures given in the 2012 fall semester. The cybercourse began in early September and finishes early in December. The 75-minute lectures have been presented twice each week.
Centrally coordinated by the Neutron Sciences Directorate (NScD) and the Joint Institute for Neutron Science (JINS) at ORNL, the course is taught by neutron scattering researchers Collin Broholm (Johns Hopkins University), Takeshi Egami (University of Tennessee), Young S. Lee (Massachusetts Institute of Technology), Seunghun Lee (University of Virginia), Stephen Nagler (ORNL), Roger Pynn (Indiana University), and Sunil K. Sinha (University of California-San Diego).
The course was developed by Meiyun Chang-Smith, education manager for NScD and JINS. It is apparently the first time researchers from several prestigious institutions in the U.S. have collaborated to provide a neutron scattering course online.
Class topics included the theory of inelastic and elastic neutron scattering, magnetic neutron scattering, neutron sources and instrumentation, surfaces and interfaces, phonons, magnetic excitations, and critical phenomena. It was open to advanced Ph.D. students, postdoctoral researchers, and faculty members who want to learn more about neutron scattering as a tool for research in the structure and dynamics of condensed matter.
The lectures are transmitted live from the home university of the lecturer using the Adobe Connect interactive software platform, and they're recorded with a webcam. PowerPoint slides and a live cyber whiteboard feature enable the lecturer to give both the onsite and remote audiences detailed explanations.
Online registrants get a username and password to log in to use the resources on the web site, along with a URL for joining the live lectures. An interactive feature on Adobe Connect allows participants, wherever they are, to ask questions and make comments during the live lecture via a chat box. Homework assignments are posted, with the correct answers following later.
A number of students enrolled for credit at their respective schools, and opening the course to the neutron science community in the U.S. and abroad has boosted enrollment to 149 to date.
The link to access the lectures is neutrons.ornl.gov/education/qcmp/
ORNL's NScD operates the High Flux Isotope Reactor and the Spallation Neutron Source, both of which are funded by the U.S. Department of Energy Office of Basic Energy Sciences. — Agatha Bardoel, Nov. 26, 2012

Friday, November 16, 2012

In Memoriam: Stuart Freedman, Renowned Nuclear Physicist


NOVEMBER 16, 2012
Paul Preuss 510-486-6249  paul_preuss@lbl.gov
News Release
Stuart Freedman (photo Roy Kaltschmidt, Lawrence Berkeley National Laboratory)
Stuart Jay Freedman, a nuclear physicist with Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California at Berkeley, and a world-renowned investigator of fundamental physical laws, died suddenly on November 9 while attending a scientific conference in Santa Fe, NM. He was 68.
“Stuart was a truly remarkable scientist, with extraordinarily diverse interests, and still very much at the height of his powers,” says James Symons, Director of Berkeley Lab’s Nuclear Science Division. “It is somehow fitting that he spent his last few days with close friends, actively engaged in discussing new ways to make fundamental measurements requiring deep insight and ingenuity. We have lost a great physicist, but I can’t imagine that he would have wanted to leave us in any other way.”

Freedman’s friend and long-time associate, Berkeley Lab physicist Robert Cahn, recalls that “Stuart started as a particle theorist but became an extraordinarily versatile and creative experimentalist, with a reputation for getting the right answer, often when others didn’t.”
These qualities were already evident when Freedman, while still a graduate student at UC Berkeley in 1972, used the radioactive decay of calcium atoms to rule out “local hidden variable” theories. This was the first persuasive experimental demonstration that there is no escaping the non-deterministic nature of quantum mechanics. He went on to exclude a number of other possible excursions from standard physics, including the existence of naked quarks, faster-than-light particles, and very light Higgs bosons. He also shot down surprising but widely heralded results that seemed to point to very heavy neutrinos, supposedly having a mass of 17 kiloelectron volts (17 keV), about 100,000 times heavier than current expectations.

“He loved people with crazy ideas, if only for a good argument, and he was a source of brilliant ideas himself,” says Berkeley Lab’s Brian Fujikawa, who had worked closely with Freedman since 1984 and helped him perform the decisive 17-keV experiment. “Stuart used a spectrometer that eliminated likely sources of error, and on top of that he created a small ‘fake’ signal by mixing carbon-14 into the sulfur-35 source whose decays we were measuring. Since we could detect that fake in the data, if there had been a real signal in the beta spectrum at 17 keV we would have seen it.”

Some searches were less conclusive, however. Leading theorist Roberto Peccei of the University of California at Los Angeles, whose work with Helen Quinn led to the proposal of particles called axions, recalls writing an early paper with Freedman in 1978, when both were at Stanford. “It was called, appropriately, ‘Do axions exist?’ We are, incidentally, still asking the same question today.” Peccei confirms that Freedman “was not afraid to go against orthodoxy. In fact, he relished this role! The world has lost a wonderful physicist, but his impact on our field will remain.”

Freedman joined Argonne National Laboratory in 1982 and later became a professor in the University of Chicago’s Fermi Institute, where, says UChicago cosmologist Michael Turner, “he provided a crucial link between Argonne and the university.” During a time when Turner and others were establishing the connections between cosmology and particle physics, nuclear physics, and astrophysics, “Stuart provided the key connection to weak-interaction physics with his important experiments on the properties of neutrons and neutrinos.”
Freedman established the parameters of the weak interaction in the coupling of weak currents to the neutron. Because these measurements are essential to understanding nuclear fusion, Cahn says, “They make it possible for us to determine the temperature at the center of the sun.”

Says Turner, “Stuart was not only a brilliant experimentalist but a wise person who gave sage advice gently, often using his wonderfully wry sense of humor. We will sorely miss Stuart’s scientific contributions, his friendship, and wise counsel.”

In 1991 Freedman and his wife, Joyce, who had led the sponsored research office at UChicago, moved to Berkeley, where he joined Berkeley Lab and UC Berkeley while maintaining his affiliations with Argonne and Chicago. His fame for neutrino work grew, notably following the 2003 confirmation from the KamLAND experiment in Japan that different neutrinos have different tiny masses and oscillate from one “flavor” to another. KamLAND benefited from detector technology and signal processing contributed by U.S. participation, inaugurated and led by Freedman.

“The KamLAND oscillations result was one of Stuart’s proudest accomplishments,” says Jason Detwiler, an assistant professor at the University of Washington who met Freedman during the construction of KamLAND and subsequently worked with him at Berkeley Lab for many years. While the SNO experiment in Canada had established that neutrinos change flavor while traveling from the sun to Earth, “KamLAND was designed to capture antineutrinos produced by nuclear reactors, and it was Stuart’s kind of experiment – a laboratory-style experiment in which both the source and the detector were controlled. The upshot was that KamLAND produced the first clean signature of actual oscillations.”
Detwiler characterizes Freedman’s experimental style as “like a Grand Master in chess, always thinking many steps ahead. He always had the clearest view of the science and the experiment’s essential rationale.”

Spencer Klein, Deputy Director of the Nuclear Science Division, says, “Stuart was a driving force in our division, in the physics department on campus, and in the international neutrino community.” Besides neutrino oscillations, Freedman’s contributions to neutrino science include KamLAND’s detection of “geoneutrinos” originating from radioactive decays inside the Earth, and his role as U.S. spokesperson and U.S. construction project manager of the CUORE experiment at the Gran Sasso underground laboratory in Italy, a search for the as-yet-undetected process of neutrinoless double-beta decay, which if found would indicate that neutrinos are their own antiparticles.

Freedman contributed widely to the nuclear science community, including co-chairing the recent National Academy of Sciences decadal survey, Nuclear Physics: Exploring the Heart of Matter; co-chairing the National Research Council report, Scientific Opportunities with a Rare-Isotope Facility in the United States; and co-chairing the American Physical Society’s magisterial neutrino study, The Neutrino Matrix.
At the time of his death, he was the leader of the Weak Interaction Group based in the Nuclear Science Division, a wide-ranging program bringing together international collaborations like KamLAND and CUORE and smaller-scale experiments like the optical trapping of short-lived radioactive isotopes at the 88-Inch Cyclotron, to examine the weak interaction between electrons and neutrinos and the quarks that constitute protons and neutrons.
“Somehow, Stuart just kept growing as a scientist,” says Gerald Garvey of Los Alamos, an experimental nuclear physicist and expert in science policy whose collaborations with Freedman began over 30 years ago. “Most of us start slowing down after 50, but Stuart continued to get stronger and stronger.”

Freedman was born in Los Angeles on January 13, 1944 and received his education at UC Berkeley, graduating with a B.S. in Engineering Physics in 1965, an M.S. in Physics in 1967, and a Ph.D. in Physics in 1972. His teaching career took him from Princeton, to Stanford, and then in 1982 to Argonne and the University of Chicago. In 1991 he assumed joint appointments as Faculty Senior Scientist in Berkeley Lab’s Nuclear Science Division and Professor in UC Berkeley’s Department of Physics.

In 1999 Freedman was named to the Luis W. Alvarez Memorial Chair in Experimental Physics at UC Berkeley. His numerous awards and honors include election to the National Academy of Sciences in 2001, election to the American Academy of Arts and Sciences and selection as a Fellow of the American Association for the Advancement of Science, both in 2006, and the 2007 Tom W. Bonner Prize in Nuclear Physics from the American Physical Society.
Freedman, a resident of Berkeley, is survived by his wife, Joyce; his son, Paul, and daughter-in-law, Emily; his sister, Ina Jo Scheid; nephew Jason Sturman; and two grandchildren.

###

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit http://www.lbl.gov.