CNMS RESEARCH


Gordon Bell Prize Emerges From Ongoing Computational Nanoscience Endstation Effort

Achievement:

A team led by Thomas Schulthess, including Gonzalo Alvarez, Mike Summers, Thomas Maier, and Paul Kent from the Computer Science and Mathematics Division (CSMD) and the Center for Nanophase Materials Sciences (CNMS) Nanomaterials Theory Institute; Jeremy Meredith and Ed D’Azevedo from CSMD; Markus Eisenbach and Don Maxwell from the National Center for Computational Sciences (NCCS); and Jeff Larkin and John Levesque with the Cray, Inc. Center for Excellence at NCCS, recently won the most prestigious prize in high performance computing, the Gordon Bell Prize [1] for Peak Performance, for their petascale simulations of high-temperature superconductors [2]. The team developed the DCA++ code, which uses the Dynamic Cluster Approximation (DCA) to study models of high-temperature superconductors such as the two-dimensional Hubbard model and extensions that allow studies of disorder effects and nanoscale inhomogeneities.

Significance:

Cuprates are superconductors with transition temperatures as high as 150 K. Shortly after their discovery in the late 1980s, the two-dimensional Hubbard model was proposed as a simple description of these materials. Since then, the Hubbard model (Fig. 1) has been vigorously discussed within the scientific literature on the cuprates. Only recently, with the advent of quantum cluster methods, which provide a systematic way to solve the quantum many-body problem posed by the Hubbard model, and with leadership computing at ORNL, has it been possible to demonstrate unequivocally that the model describes a superconducting transition with d-wave pairing [3]. A detailed analysis of the simulations led to the prediction of the pairing interaction mechanism, as reported in a series of papers resulting from a collaboration between Thomas Maier and Doug Scalapino of the University of California at Santa Barbara [4].
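For reference, the standard single-band form of the two-dimensional Hubbard Hamiltonian discussed above (and depicted in Fig. 1) is

    H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow} ,

where t is the hopping amplitude between nearest-neighbor sites of a square lattice, U is the on-site Coulomb repulsion, and n_{i\sigma} = c^{\dagger}_{i\sigma} c_{i\sigma} is the number operator. The specific cluster sizes and parameter regimes studied with DCA++ are those reported in the papers cited above.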

With the new Cray XT5 system at ORNL’s NCCS and the improved algorithms implemented in the DCA++ code, scientists have a capability that is over a thousand times more powerful than it was only four years ago, when the first successful simulations of the Hubbard model [3] were performed. With this enhanced capability, it is now possible to study the effects of disorder and nanoscale inhomogeneities. The interrelation between superconductivity and nanoscale material inhomogeneities in cuprates has been the subject of many studies since inhomogeneous electronic and magnetic phases were discovered in neutron scattering experiments in the mid-1990s. Recent variable-temperature STM studies by Yazdani’s group at Princeton [5] showed that regions with paired charge carriers exist in these materials at temperatures well above the superconducting transition; the detailed understanding of the role such inhomogeneities play in cuprates revealed by the simulations may therefore lead to new strategies in the search for materials with even higher transition temperatures.

Computational Nanoscience Endstation:

DCA++ is among a suite of codes and simulation capabilities that comprise the computational nanoscience endstation (CNE), developed in a collaboration between the CNMS and CSMD. In analogy to experimental endstations at large experimental facilities, the CNE provides users with leading-edge scientific instrumentation (i.e., modeling software) and the expertise to perform scientific research at scale on leadership computing facilities such as NCCS (see Fig. 2). In addition to DCA++ and a toolkit to support atomistic simulations of magnetic nanosystems, the CNE currently supports large-scale electronic structure codes that allow direct ab initio simulations of nanoscale systems [6]. The CNE has been an important driver of the CNMS user program. In FY 2008, 100 of CNMS’ 406 users utilized the center’s capabilities in theory, modeling, and simulation. CNE simulations are carried out on the CNMS cluster, at NCCS, and on the large supercomputers at NERSC. Thirty scientists are users of both the CNMS and the NCCS, and the CNMS/NTI team is leading one of the large INCITE allocations at NCCS [7]. A CNMS user project led by Jihui Yang from General Motors independently received INCITE allocations for 2008 and 2009 with the help of capabilities and expertise developed in the CNE [8]. The scalable version of the Vienna Ab initio Simulation Package (VASP) that Paul Kent adapted to run on large-scale supercomputers [9] has been the most heavily run code on NERSC’s new Cray XT4 supercomputer during the 2008 allocation year. Demonstrating the impact of this work, sixteen different research groups have utilized this optimized version, which is now the default for VASP users at NERSC.

Future Work:

Future work will focus on understanding the Hubbard model pairing mechanism in the presence of material disorder and, more generally, on growing the computational endstation effort and expanding the scientific capabilities it covers. Software challenges that need immediate work to fully utilize multicore processors include (i) shared-memory programming models and hybrid programming, i.e., distributed memory (MPI) combined with shared memory, and (ii) work on novel accelerator architectures such as GPUs and Cell processors.
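As an illustration of the hybrid model mentioned in (i), the following is a minimal, generic sketch in C that combines MPI between nodes with OpenMP threads within a node; it is not code from DCA++, and the quantities computed are purely illustrative.

/* Minimal hybrid MPI + OpenMP sketch: each MPI rank spawns OpenMP threads
 * that cooperate on a local partial sum, after which the ranks combine
 * their results with a global reduction. Generic illustration only. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, rank, nranks;

    /* Request thread support so OpenMP threads can coexist with MPI. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    const long n = 1000000;   /* work items per rank (illustrative) */
    double local = 0.0;

    /* Shared-memory parallelism within the node. */
    #pragma omp parallel for reduction(+:local)
    for (long i = 0; i < n; ++i) {
        local += 1.0 / (1.0 + (double)(rank * n + i));
    }

    /* Distributed-memory reduction across the nodes. */
    double global = 0.0;
    MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        printf("ranks=%d  threads/rank=%d  sum=%f\n",
               nranks, omp_get_max_threads(), global);
    }

    MPI_Finalize();
    return 0;
}

Built with an MPI compiler wrapper and OpenMP enabled (e.g., mpicc -fopenmp) and launched with one rank per node or socket, the OpenMP threads share that rank’s memory while MPI handles communication between ranks, which is the combination of programming models referred to above.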

Acknowledgements:

Thomas Schulthess, Thomas Maier, Paul Kent, Gonzalo Alvarez, Mike Summers, and Jeremy Meredith acknowledge the support of DOE Basic Energy Sciences, Division of Scientific User Facilities, through the Center for Nanophase Materials Sciences and ORNL’s Ultrascale Computing Initiative within the Laboratory Directed Research and Development Program (LDRD) for initial code development and for subsequent scaling of the codes. Don Maxwell, Gonzalo Alvarez, and Mike Summers acknowledge the support of DOE Advanced Scientific Computing Research (ASCR) through the National Center for Computational Sciences (NCCS). Markus Eisenbach also acknowledges the support of ORNL’s Ultrascale Computing Initiative within its LDRD program. Ed D’Azevedo acknowledges the support of the DOE ASCR Mathematical, Information, and Computational Sciences Division. Jeff Larkin and John Levesque acknowledge the support of the Cray, Inc. Center for Excellence located at NCCS.

References

  1. http://awards.acm.org/bell/
  2. http://www.hpcwire.com/offthewire/ORNL_Supercomputer_Simulation_Captures_Gordon_Bell_Prize.html
  3. T. A. Maier, M. Jarrell, T. C. Schulthess, P. R. C. Kent, and J. B. White, “A systematic study of the superconductivity in the 2D Hubbard model,” Phys. Rev. Lett. 95, 237001 (2005).
  4. T. A. Maier, M. Jarrell, and D. J. Scalapino, “Pairing interaction in the two-dimensional Hubbard model studied with a dynamic cluster quantum Monte Carlo approximation,” Phys. Rev. B 74, 094513 (2006); T. A. Maier, D. Poilblanc, and D. J. Scalapino, “Dynamics of the pairing interaction in the Hubbard and t-J models of high-temperature superconductors,” Phys. Rev. Lett. 100, 237001 (2008).
  5. K. K. Gomes, A.N. Pasupathy, A. Pushp, S. Ono, Y. Ando, A. Yazdani, “Visualizing pair formation on the atomic scale in the high-TC superconductor Bi2Sr2CaCu2O8+δ”, Nature 447, 569-572 (2007).
  6. For a recent summary of scientific highlights generated with the electronic structure part of the CNE, see the Winter 2008 Issue of the SciDAC Review, http://www.scidacreview.org/0804/index.html
  7. http://www.nccs.gov/leadership-science/project-archive-list/fy09-projects/
  8. http://science.doe.gov/ascr/INCITE/2009INCITEFactSheet.pdf
  9. P.R.C. Kent, “Computational Challenges of Large-Scale Long-Time First-Principles Molecular Dynamics” J. Phys.: Conf. Series 125, 012058 (2008).
Fig. 2. Main CNMS computational nanoscience endstation objectives (center) and their relation to other scientific areas.