Search Results

Ferrenberg-Swendsen Analysis of LLNL and NYBlue BG/L p4rhmc Data

Description: These results are from the continuing Lattice Quantum Chromodynamics runs on BG/L. They come from the Ferrenberg-Swendsen analysis [?] of the combined data from the LLNL and NYBlue BG/L runs for the 32^3 x 8 lattices with the p4rhmc v2.0 QMP-MPI.X executable (semi-optimized p4 code using qmp over mpi). The jobs include beta values ranging from 3.525 to 3.535, with an alternate analysis extending to 3.540. The NYBlue data sets comprise 9k trajectories from Oct 2007, and the LLNL data come from two independent streams of ~5k trajectories each, taken from the July 2007 runs. The following outputs are produced by the fs-2+1-chiub.c program (a schematic of the reweighting step follows this entry). All outputs have had checksums produced by addCks.pl and checked by the checkCks.pl perl script after scanning.
Date: December 5, 2007
Creator: Soltz, R.
Partner: UNT Libraries Government Documents Department
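
For context, a minimal sketch of the single-histogram Ferrenberg-Swendsen reweighting step referenced above, under the assumption that the Boltzmann weight is exp(-beta*S_g); the sign and normalization conventions must be matched to those actually used in fs-2+1-chiub.c, and combining the LLNL and NYBlue streams taken at several beta values uses the multi-histogram generalization of the same idea:

    \langle O \rangle_\beta = \frac{\sum_i O_i \, e^{-(\beta - \beta_0)\, S_{g,i}}}{\sum_i e^{-(\beta - \beta_0)\, S_{g,i}}}

where i runs over configurations generated at beta_0 and S_{g,i} is the gauge action measured on configuration i.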

Review of Systematic Investigations of the Rout/Rside ratio in HBT at RHIC

Description: We review the significant difference in the ratio R_out/R_side between experiment and theory in heavy-ion collisions at RHIC. This ratio is expected to be strongly correlated with the pion emission duration (see the relation sketched after this entry). Hydrodynamic models typically calculate a value of approximately 1.5 that depends moderately on k_T, whereas the experiments report a value close to unity and independent of k_T. We review those calculations in which systematic variations in the theoretical assumptions were reported. We find that the scenario of a second-order phase transition or cross-over has been given insufficient attention and may play an important role in resolving this discrepancy.
Date: January 6, 2005
Creator: Soltz, R. A.
Partner: UNT Libraries Government Documents Department
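
For orientation, the connection asserted above is usually quoted through the schematic relation (valid for a boost-invariant source with weak space-momentum correlations, with beta_T the pair transverse velocity and Delta-tau the emission duration):

    R_{out}^2 \approx R_{side}^2 + \beta_T^2 \, \Delta\tau^2

so a ratio well above unity would indicate a long emission duration, while the measured ratio near unity points to a short one.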

First Physics from HotQCD Collaboration

Description: The following pages show results from the first series of runs on BG/L using the unoptimized code from the MILC collaboration. The calculations were run with the AsqTad improved staggered fermion action and the RHMC algorithm on a 32^3 x 8 lattice. The jobs were run mostly during October 2006 on approximately 5% of the machine. The run consisted of approximately 1000 trajectories per beta value, spanning beta = 6.458 to 6.85 and covering a temperature range of 140-210 MeV (see the note on the temperature scan after this entry).
Date: November 6, 2006
Creator: Soltz, R; Gupta, R & Grandy, J
Partner: UNT Libraries Government Documents Department
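
As a reminder of how the scan works, at fixed temporal extent N_tau the temperature is set by the lattice spacing a(beta), which shrinks as beta grows:

    T = \frac{1}{N_\tau \, a(\beta)}

so sweeping beta from 6.458 to 6.85 on the N_tau = 8 lattice traces out the quoted 140-210 MeV temperature range.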

LQCD Phase 1 Runs with P4RHMC

Description: These results represent the first set of runs at 10 β values, with 2000-7000 trajectories each, using the p4rhmc code. This initial run sequence spanned roughly two weeks in late January and early February 2007. Three scripts manage the submission of dependent jobs: subSet.pl submits a set of dependent jobs for a single run; rmSet.pl removes a set of dependent jobs in reverse order of submission; and statSet.pl runs the pstat command and prints parsed output along with directory contents (a sketch of the submission logic follows this entry). The results of running the statSet.pl command are printed for three different times during the start-up of the next sequence of runs using the milc code.
Date: February 13, 2007
Creator: Soltz, R & Gupta, R
Partner: UNT Libraries Government Documents Department
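
The following is a hedged Python sketch of the dependent-submission logic that a script like subSet.pl implements; the "psub -d" dependency flag is taken from the description of the later runs in this collection, and the script name, job-id parsing, and chain length are illustrative assumptions only.

    # chain_submit.py - hypothetical sketch of submitting a chain of dependent jobs,
    # in the spirit of subSet.pl; assumes an LCRM-style "psub -d <jobid>" dependency flag.
    import subprocess

    def submit(script, depend_on=None):
        """Submit one batch script, optionally dependent on an earlier job id."""
        cmd = ["psub"]
        if depend_on is not None:
            cmd += ["-d", depend_on]        # run only after job <depend_on> finishes
        cmd.append(script)
        out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
        return out.split()[-1]              # assume the job id is the last token printed

    def submit_set(script, n_jobs):
        """Submit n_jobs copies of the same run script as a dependent chain."""
        ids, prev = [], None
        for _ in range(n_jobs):
            prev = submit(script, depend_on=prev)
            ids.append(prev)
        return ids

    if __name__ == "__main__":
        # queue ten dependent segments of one run (script name is illustrative)
        print(submit_set("run_p4rhmc.csh", 10))

In practice the Perl scripts also locate jobs already queued for a given beta value; this sketch shows only the bare chain submission.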

Lattice QCD Thermodynamics: P4 Action for new beta and MILC Nt=6

Description: These results are from the continuing Lattice Quantum Chromodynamics runs on BG/L. We show analyzed thermodynamics results for 6-10k trajectories (β dependent) of the 32^3 x 8 runs with the p4rhmc v2.0 QMP_MPI.X executable (semi-optimized p4 code using qmp over mpi). These jobs had a number of omitted trajectories due to zero-size and over-sized data files; for this interim report those errors were removed from the output to save space. The results also include the output of a new ''histogram.perl'' script, used to plot the gauge action <S_g> = 10*(1 - <plaq>) - (1 - <rect>) for three values of β = 3.51, 3.54, 3.57 (a sketch of the histogramming step follows this entry). This output will be used to determine the new β values that will be run to define the critical temperature. We also show a preliminary analysis of the first 5,000 trajectories of the 32^3 x 6 runs for the milc code, using the new, faster su3_rhmc_susc_eos.3g1f.qmp-bgl2 RHMC executable.
Date: May 7, 2007
Creator: Soltz, R; Vranas, P & Gupta, R
Partner: UNT Libraries Government Documents Department
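
A hedged Python sketch of the histogramming step performed by a script along the lines of histogram.perl; only the S_g combination is taken from the text, while the input layout (one trajectory per line with <plaq> and <rect> columns) is an assumption.

    # gauge_action_hist.py - hypothetical sketch: histogram the gauge action
    # S_g = 10*(1 - <plaq>) - (1 - <rect>) per trajectory, as in histogram.perl.
    # Assumed input: whitespace-separated columns "trajectory  plaq  rect".
    import sys
    import numpy as np

    def gauge_action(plaq, rect):
        """Combination quoted in the report for the p4 runs."""
        return 10.0 * (1.0 - plaq) - (1.0 - rect)

    def main(path, nbins=50):
        data = np.loadtxt(path)                  # columns: traj, plaq, rect (assumed)
        s_g = gauge_action(data[:, 1], data[:, 2])
        counts, edges = np.histogram(s_g, bins=nbins)
        for lo, hi, n in zip(edges[:-1], edges[1:], counts):
            print(f"{0.5 * (lo + hi):.6f}  {n}")  # bin center and count

    if __name__ == "__main__":
        main(sys.argv[1])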

Lattice QCD Thermodynamics: 10k analysis with 1k thermalization and block size of 500.

Description: This is a re-analysis of the Lattice QCD Thermodynamics p4rhmc new-beta analysis (UCRL-TR-230742), with approximately 10k trajectories per beta point, rerun with the thermalization cut at 1000 and a block size of 500 (the blocking procedure is sketched after this entry). Some diagnostic text has been omitted to reduce the number of pages.
Date: May 10, 2007
Creator: Soltz, R; Vranas, P & Gupta, R
Partner: UNT Libraries Government Documents Department
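
A hedged Python sketch of the thermalization-cut-plus-blocking procedure described above; the per-trajectory measurement layout is an assumption.

    # block_average.py - hypothetical sketch of the re-analysis procedure:
    # drop the first 1000 trajectories as thermalization, average the rest in
    # blocks of 500, and quote the spread of block means as the error.
    import numpy as np

    def blocked_mean(values, therm_cut=1000, block_size=500):
        """Return (mean, error) after a thermalization cut and blocking."""
        v = np.asarray(values)[therm_cut:]
        nblocks = len(v) // block_size
        blocks = v[: nblocks * block_size].reshape(nblocks, block_size).mean(axis=1)
        mean = blocks.mean()
        err = blocks.std(ddof=1) / np.sqrt(nblocks)   # error of the mean from block means
        return mean, err

    if __name__ == "__main__":
        # toy usage with synthetic data standing in for a per-trajectory observable
        rng = np.random.default_rng(0)
        fake = rng.normal(0.55, 0.01, size=10000)
        print(blocked_mean(fake))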

Lattice QCD Thermodynamics First 5000 Trajectories

Description: These results represent the first LQCD analysis of approximately 5000 trajectories with each of the p4rhmc and milc codes, with some of the lower temperature runs having fewer. Both runs were for lattice dimensions of 32^3 x 8. Some 32^4 T=0 jobs were also run for p4rhmc. The p4 calculation was performed with the v2.0 QMP_MPI.X executable (semi-optimized p4 code using qmp over mpi) and the milc version of the su3 rhmc susc eos executable dated Mar 1, 2007 on ubgl in the /usr/gapps/hip/qcd/milc/bin subdirectory (svn revision 28). As with previous runs, calculations were performed along lines of constant physics, with the light quark masses 2-3 times their physical values and the light-to-strange quark mass ratio set by m_ud = 0.1 m_s. Job submissions were performed using a new subSet.pl job submission script that locates current jobs and submits additional jobs with the same beta value as pending. Note that after reaching a limit of about 35 jobs, subsequent submissions are delayed and will not be submitted directly from that state. The job submission script was used to submit revised versions of the milc and p4rhmc csh scripts. Initial thermalized lattices were available only for milc (taken from the firstPhys runs); the p4rhmc runs include thermalization. The only modifications for running on BG/L were to the directory names and the mT parameter, which determines job durations (24 hrs on BG/L vs. 4 hrs on ubgl). All finite temperature jobs were submitted to the 512 node partitions, and all T=0 runs were submitted to 2048 node partitions. The set of runs was plagued by filesystem errors on lscratch1 and lscratch2. Many jobs had to be reset (deleting the most recent output file for milc and/or lattice for p4) and resubmitted. The analysis was performed with a new set of scripts ...
Date: March 15, 2007
Creator: Soltz, R & Gupta, R
Partner: UNT Libraries Government Documents Department

The BlueGene/L Supercomputer and Quantum ChromoDynamics

Description: In summary, our update contains: (1) Perfect speedup sustaining 19.3% of peak for the Wilson D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine; nevertheless, our measurements retain perfect speedup scaling, demonstrating the robustness of our methods (a sketch of where these global sums arise follows this entry). (3) We ran on the largest BG/L system, the LLNL 64-rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter is perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16), while the total lattice, 128 x 128 x 256 x 32 sites in all, is of a size that has long been a goal of lattice QCD thermodynamic studies. This speed is about five times the speed we quoted in our submission. As we have pointed out in our paper, QCD is notoriously sensitive to network and memory latencies, has a relatively high communication-to-computation ratio which cannot be overlapped on BG/L in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and realize a 30-year-long dream for lattice QCD.
Date: October 19, 2006
Creator: Vranas, P & Soltz, R
Partner: UNT Libraries Government Documents Department
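
For reference, a minimal generic conjugate gradient in Python, annotated to show the two inner products per iteration that become the global sums mentioned above; this is a dense-matrix stand-in, not the BG/L Dirac-operator implementation.

    # cg_sketch.py - generic conjugate gradient for A x = b with A symmetric
    # positive definite; each iteration needs two inner products (r.r and p.Ap),
    # which are the global reductions referred to in the text.
    import numpy as np

    def cg(A, b, tol=1e-10, max_iter=1000):
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rr = r @ r                      # global sum #1
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rr / (p @ Ap)       # global sum #2
            x += alpha * p
            r -= alpha * Ap
            rr_new = r @ r              # global sum #1 of the next iteration
            if np.sqrt(rr_new) < tol:
                break
            p = r + (rr_new / rr) * p
            rr = rr_new
        return x

    if __name__ == "__main__":
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(cg(A, b))                 # should be close to np.linalg.solve(A, b)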

Tc with AsqTad and p4rhmc, July 20, 2007 Update

Description: We present the ongoing analysis of Lattice Quantum Chromodynamics runs on the LLNL BG/L supercomputer. This installment adds the density analysis of the p4rhmc for the first few thousand trajectories and the chiral condensate (psi-bar psi) history for hot and cold starts at two values of beta (the definition is recalled after this entry).
Date: July 18, 2007
Creator: Soltz, R.; Vranas, P. & Gupta, R.
Partner: UNT Libraries Government Documents Department
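
For context, the quantity tracked in those hot- and cold-start histories is the light-quark chiral condensate, schematically

    \langle \bar{\psi}\psi \rangle = \frac{T}{V}\, \frac{\partial \ln Z}{\partial m_q}

whose Monte Carlo time history from ordered (cold) and disordered (hot) starts at fixed beta is a standard diagnostic for locating the transition region.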

Tc with AsqTad and p4rhmc, July 23 Update

Description: We present the ongoing analysis of Lattice Quantum Chromodynamics runs on the LLNL BG/L supercomputer. This installment adds the density analysis of the p4rhmc for the first five thousand trajectories and the AsqTad action results at N_tau = 6 for ten thousand trajectories.
Date: July 25, 2007
Creator: Soltz, R.; Vranas, P. & Gupta, R.
Partner: UNT Libraries Government Documents Department

Second Update to the Gordon Bell Competition Entry gb110s2

Description: Since the update to our entry of October 20th we have made a significant improvement. We understand that this is past the deadline for updates and very close to the conference date. However, Lawrence Livermore National Laboratory has just updated the BG/L system software on their full 64-rack BG/L supercomputer to IBM-BGL Release 3. As we discussed in our update of October 20, this release includes our custom L1 and SRAM access functions that allow us to achieve higher sustained performance. Just a few hours ago we got access to the full system and obtained the fastest sustained performance point. On the full 131,072-CPU-core system QCD sustains 70.9 teraflops for the Dirac operator and 67.9 teraflops for the full Conjugate Gradient inverter. This is about 20% faster than our last update. We attach the corresponding speedup figure. As you can tell, the speedup is perfect. This figure is the same as Figure 1 of our October 20th update except that it now includes the 131,072-CPU-core point.
Date: November 12, 2006
Creator: Vranas, P & Soltz, R
Partner: UNT Libraries Government Documents Department

First LQCD Physics Runs with MILC and P4RHMC

Description: An initial series of physics LQCD runs was submitted to the BG/L science bank with the milc and p4rhmc codes. Both runs were for lattice dimensions of 32^3 x 8. The p4 calculation was performed with v2.0 QMP_MPI.X (semi-optimized p4 code using qmp over mpi) and milc v7.2, also using RHMC, but not specifically optimized for BlueGene. Calculations were performed along lines of constant physics, with the light quark masses 2-3 times their physical values and the light-to-strange quark mass ratio set by m_ud = 0.1 m_s. Job submission was performed using the standard milc and p4 scripts provided on the ubgl cluster. Initial thermalized lattices for each code were also provided in this way. The only modifications for running on BG/L were to the directory names and the mT parameter, which determines job durations (24 hrs on BG/L vs. 4 hrs on ubgl). The milc scripts were set to resubmit themselves 10 times, and the p4 scripts were submitted serially using the ''psub -d'' job dependency option. The runp4rhmc.tcsh script could not be used to resubmit due to the 30m time limit imposed on interactive jobs. Most jobs were submitted to the smallest, 512 node partitions, but both codes could also run on the 1024 node partitions with a gain of only 30-50%. The majority of jobs ran without error. Stalled jobs were often indicative of a communication gap within a partition that LC was able to fix quickly. On some occasions a zero-length lattice file was deleted to allow jobs to restart successfully. Approximately 1000 trajectories were calculated for each beta value (see Table). The analysis was performed with the standard analysis scripts for each code, make_summary.pl for milc and analysis.tcsh for p4rhmc. All lattices, log files, and job submission scripts have been archived to permanent storage for ...
Date: January 18, 2007
Creator: Soltz, R & Gupta, R
Partner: UNT Libraries Government Documents Department

Overview, goals, and preliminary results of E910 laboratory directed research and development at Lawrence Livermore National Laboratory

Description: E910 is a large acceptance proton-nucleus experiment at the BNL AGS. The experiment completed its first run in the Spring of 1996, collecting more than 20 million pA events, using Be, Cu, Au, and U targets. We present preliminary results for momentum conservation, slow proton distributions, and dN/dy for negative tracks. 16 refs., 5 figs.
Date: January 24, 1997
Creator: Soltz, R.A. & Collaboration, E910
Partner: UNT Libraries Government Documents Department

Energy Loss and Flow of Heavy Quarks in Au+Au Collisions at √s_NN = 200 GeV

Description: The PHENIX experiment at the Relativistic Heavy Ion Collider (RHIC) has measured electrons with 0.3 < p_T < 9 GeV/c at midrapidity (|y| < 0.35) from heavy flavor (charm and bottom) decays in Au+Au collisions at √s_NN = 200 GeV. The nuclear modification factor R_AA relative to p+p collisions shows a strong suppression in central Au+Au collisions, indicating substantial energy loss of heavy quarks in the medium produced at RHIC energies. A large azimuthal anisotropy, v_2, with respect to the reaction plane is observed for 0.5 < p_T < 5 GeV/c, indicating non-zero heavy flavor elliptic flow. A simultaneous description of R_AA(p_T) and v_2(p_T) constrains the existing models of heavy-quark rescattering in strongly interacting matter and provides information on the transport properties of the produced medium (the standard definitions of R_AA and v_2 are recalled after this entry). In particular, a viscosity to entropy density ratio close to the conjectured quantum lower bound, i.e. near a perfect fluid, is suggested.
Date: February 26, 2007
Creator: Soltz, R; Klay, J; Enokizono, A; Newby, J; Heffner, M & Hartouni, E
Partner: UNT Libraries Government Documents Department
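
For reference, the two observables discussed above are conventionally defined as

    R_{AA}(p_T) = \frac{dN_{AA}/dp_T}{\langle N_{coll} \rangle \, dN_{pp}/dp_T},
    \qquad
    v_2 = \langle \cos 2(\phi - \Psi_{RP}) \rangle

so R_AA < 1 signals suppression relative to binary-collision-scaled p+p, and a non-zero v_2 signals azimuthal anisotropy with respect to the reaction plane Psi_RP.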

Exploring Lifetime Effects in Femtoscopy

Description: We investigate the role of lifetime effects from resonances and emission duration tails in femtoscopy at RHIC in two Blast-Wave models. We find that the non-Gaussian components compare well with published source-imaged data, but the value of R_out obtained from Gaussian fits is sensitive to the non-Gaussian contributions when realistic acceptance cuts are applied to the models.
Date: September 6, 2005
Creator: Brown, D.; Soltz, R.; Newby, J. & Kisiel, A.
Partner: UNT Libraries Government Documents Department

Imaging Three Dimensional Two-Particle Correlations for Heavy-Ion Reaction Studies

Description: The authors report an extension of the source imaging method for analyzing three-dimensional sources from three-dimensional correlations. The technique consists of expanding the correlation data and the underlying source function in spherical harmonics and inverting the resulting system of one-dimensional integral equations (the decomposition is sketched after this entry). With this strategy, they can image the source function quickly, even with the extremely large data sets common in three-dimensional analyses.
Date: June 27, 2005
Creator: Brown, D; Enokizono, A; Heffner, M; Soltz, R; Danielewicz, P & Pratt, S
Partner: UNT Libraries Government Documents Department
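
Schematically, the imaging inverts the Koonin-Pratt relation between the measured correlation C and the source function S,

    C(\mathbf{q}) - 1 = \int d^3 r \, K(\mathbf{q}, \mathbf{r}) \, S(\mathbf{r}),

and expanding both C and S in spherical harmonics reduces this single three-dimensional relation to an independent one-dimensional integral equation for each (l, m) moment, which is what makes the inversion fast even for large data sets.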

Femtoscopy in Relativistic Heavy Ion Collisions

Description: Analyses of two-particle correlations have provided the chief means for determining spatio-temporal characteristics of relativistic heavy ion collisions. We discuss the theoretical formalism behind these studies and the experimental methods used in carrying them out. Recent results from RHIC are put into context in a systematic review of correlation measurements performed over the past two decades. The current understanding of these results is discussed in terms of model comparisons and overall trends.
Date: July 29, 2005
Creator: Lisa, M; Pratt, S; Soltz, R A & Wiedemann, U
Partner: UNT Libraries Government Documents Department