SEARCH FOR CRITICAL FLUCTUATIONS IN PB+PB COLLISIONS AT THE CERN SPS.
Mikhail Kopytine, SUNY at Stony Brook, for the NA44 Collaboration.
Talk at the Relativistic Heavy Ion Experiment Session of the XXXth International Symposium on Multiparticle Dynamics, October 9-15, 2000, Tihany, Hungary

Abstract:
NA44 uses a 512 channel Si pad array covering pseudorapidity 1.5 < eta < 3.3 to study events of charged hadron production in Pb+Pb collisions at the CERN SPS. We apply a multiresolution analysis, based on a Discrete Wavelet Transformation, to probe the texture of particle distributions in individual events by simultaneous localization of features in space and scale. Scanning a broad range of multiplicities, we look for traces of a possible critical behaviour in the power spectra of local density fluctuations. Measured results are compared with detailed simulations of detector response, using as input heavy ion event generators.
Date and time of the talk: October 14, 11:30-11:45 am.

Sometimes people talk about fluctuations and mean global, event-averaged fluctuations. That, however, is not what we deal with in this work, and here is one of the reasons. The problem is to see whether, in the history of a typical collision, there has been a special event -- a phase transition. The RMS fluctuation is an average which can be taken over a set of observable events. Unlike condensed matter physicists, we only have access to the frozen-out stage of the time evolution, and a global fluctuation measured on an ensemble of such events is not very informative. Our case is more like that of cosmology. For the Universe we, first, have only one event realization; second, we observe it after freeze-out. Nevertheless, there are ways to test dynamical hypotheses about its origin, and they are based on the study of texture, or local fluctuations. One can easily tell the actual Universe from the man-made image of the Universe shown on the gift-wrapper below. Trying to formulate the difference, we intuitively invoke a texture analysis procedure which tells us, on the basis of the visual information, that the different scales play different roles in the two images.


This example shows how a phase transition manifests itself in texture. The images show how a collection of spins in a 2D Ising model goes through the critical point. The large correlation length, i.e. the large-scale texture enhancement, can serve as an experimental signature of the critical point. Similar models exist for QCD, where the hadron field, rather than the magnetisation, plays the role of an order parameter in the second-order phase transition. Instead of a temperature scan, we do a multiplicity scan (multiplicity being related to the baryon chemical potential).
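For orientation, here is a minimal sketch (in Python, not part of the NA44 analysis) of the textbook model behind these images: a 2D Ising lattice evolved with the Metropolis algorithm below, near, and above the critical temperature. Near T_c the spin domains, and with them the large-scale texture, grow.

    import numpy as np

    rng = np.random.default_rng(0)

    def metropolis_sweeps(spins, T, n_sweeps=100):
        """Run n_sweeps Metropolis sweeps at temperature T (units J = k_B = 1)."""
        L = spins.shape[0]
        for _ in range(n_sweeps):
            for _ in range(L * L):
                i, j = rng.integers(0, L, size=2)
                # Sum of the four nearest neighbours, periodic boundaries.
                nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                      spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2.0 * spins[i, j] * nn   # energy cost of flipping spin (i, j)
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] = -spins[i, j]

    L = 32
    for T in (1.5, 2.27, 4.0):                 # below, near, and above T_c ~ 2.269
        spins = rng.choice(np.array([-1, 1]), size=(L, L))
        metropolis_sweeps(spins, T)
        # |m| is large below T_c and drops through it; the spin image near T_c
        # shows domains on all scales -- the texture signature discussed above.
        print(f"T = {T:4.2f}   <|m|> = {abs(spins.mean()):.3f}")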

Commentary: an RQMD event in the GEANT simulation of the detector response. The magnetic field and delta-electron generation are on. The target thickness is 1.15 g/cm^2. We only use the delta-free side of the detector.


Commentary: Because the double differential d2N/d_eta/d_phi is independent of phi, dN/d_eta = 2*pi*d2N/d_eta/d_phi.
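Spelled out as a one-line derivation, the step is just the azimuthal integration:

    \frac{dN}{d\eta} = \int_0^{2\pi} \frac{d^2N}{d\eta\,d\phi}\, d\phi
                     = 2\pi\, \frac{d^2N}{d\eta\,d\phi},

where the second equality holds because the integrand does not depend on phi.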


Commentary: DWT (Discrete Wavelet Transformation) decomposition of three different kinds of images: a chess board, a smooth gradient surface, and a set of 1000 random white-noise field samples. The first two cases are opposite in the scale localization of the information they carry. The third case has the remarkable property of scale independence. In the words of Norbert Wiener ("Generalized Harmonic Analysis"): "... the energy of a haphazard sequence of impulses is uniformly distributed in frequency. ... Theoretically this equipartition of energy might be used in the absolute calibration of acoustical instruments."
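The equipartition is easy to check numerically. A minimal sketch, assuming the PyWavelets package (pywt) as a stand-in for the WAILI library used in the analysis: the mean squared Haar detail coefficient of white noise comes out the same on every scale.

    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    samples = rng.normal(0.0, 1.0, size=(1000, 256))   # 1000 white-noise "events"

    n_levels = 8
    power = np.zeros(n_levels)
    for event in samples:
        # wavedec returns [approximation, coarsest details, ..., finest details].
        coeffs = pywt.wavedec(event, 'haar', level=n_levels)
        for m, details in enumerate(coeffs[1:]):       # skip the approximation
            power[m] += np.mean(details ** 2)
    power /= len(samples)

    for m, p in enumerate(power):
        print(f"scale {m} (coarse -> fine): power = {p:.3f}")   # ~1.0 on all scales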


The texture analysis with the DWT is based on the multiresolution analysis theorem. Any piece-wise continuous function with a finite integral can be represented, to any desired accuracy, by a histogram. The histogram undergoes a number of rebinnings, and the scale gets coarser with each one. The right column shows the information lost in each rebinning step, which is the difference between the two rebinning stages. Clearly, if you keep doing this, any desired amount of information can be discarded. You also notice that the right-hand histograms have a very simple structure: they consist of juxtaposed pairs of box functions of equal amplitude and opposite sign. These are the Haar wavelets. Because all the lost information is found in the right column, the original function can be RECONSTRUCTED to any desired accuracy from the wavelets. The theorem states that the set of wavelets forms an orthonormal basis in which the given function can be decomposed and reconstructed with any degree of accuracy.
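The rebinning step is simple enough to write down. A minimal sketch in Python (the factor-of-1/2 convention mirrors the rebinning picture above; the orthonormal Haar basis differs only by factors of sqrt(2)):

    import numpy as np

    def haar_step(hist):
        """One rebinning: return (coarse averages, information the rebinning lost)."""
        pairs = hist.reshape(-1, 2)
        coarse = pairs.mean(axis=1)                    # the twice-coarser histogram
        detail = (pairs[:, 0] - pairs[:, 1]) / 2.0     # the Haar coefficients
        return coarse, detail

    def haar_unstep(coarse, detail):
        """Invert the rebinning exactly by interleaving sums and differences."""
        hist = np.empty(2 * len(coarse))
        hist[0::2] = coarse + detail
        hist[1::2] = coarse - detail
        return hist

    hist = np.array([4., 6., 10., 12., 8., 8., 0., 2.])
    coarse, detail = haar_step(hist)
    print(coarse)                        # [ 5. 11.  8.  1.]
    print(detail)                        # [-1. -1.  0. -1.]
    print(haar_unstep(coarse, detail))   # recovers the original histogram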

The family of 2D functions, obtained from the three colorful functions by translations (characterized by integers i and j) and dilations/contractions (integer m), forms an orthonormal basis in the space of piece-wise continuous functions whose second power is integrable [1]. Decomposition in a wavelet basis can be used to extract power spectra of fluctuations in density fields (see work [2], where the method is explained in the 1D case). In this work, we use the WAILI library of wavelet software. The bottom formula defines the power spectrum in 2D.
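The slide formula itself is not reproduced in this text; a plausible form, following the 1D construction of [2], averages the squared wavelet coefficients over the translations at a given scale (here lambda labels the three 2D wavelet types; this reconstruction is my assumption, not a transcription of the slide):

    P^{\lambda}(m) \;=\; \frac{1}{2^{2m}} \sum_{i,j}
        \left| \left\langle f ,\, \psi^{\lambda}_{m,i,j} \right\rangle \right|^{2}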

Power spectra for just one of the multiplicity bins we use. The first striking thing is that the "true" power spectra (red points) are indeed enhanced on the coarse scale. One has to clarify the reason for this enhancement. Obviously, one of the reasons is the shape of the double differential d2N/d_eta/d_phi distribution. The averaged event, formed by summing the amplitude patterns of all the measured events in a given multiplicity bin (7374 in this case) and dividing by the number of events, has much reduced texture, because fluctuations average out; it does, however, retain the texture associated with the d2N/d_eta/d_phi shape, the dead channels, and the finite geometrical cross-section of the beam. A still better way to get rid of the "trivial" texture is to use mixed events (shown). Our event mixing is done by event-number scrambling (not to be confused with channel-number scrambling!). The mixed events therefore preserve the texture associated with the geometrical distortion, the inherent dN/d_eta shape, and the dead channels, while the statistical fluctuations are not suppressed. I call this static texture; its opposite is the dynamic texture which we are after, and which is not coupled to the detector.
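In code, the mixing amounts to one event-number permutation per channel. A minimal sketch (the events x channels amplitude array is a hypothetical data layout, not the NA44 format):

    import numpy as np

    rng = np.random.default_rng(2)

    def mix_events(amplitudes):
        """amplitudes: (n_events, n_channels) array of Si pad amplitudes."""
        n_events, n_channels = amplitudes.shape
        mixed = np.empty_like(amplitudes)
        for k in range(n_channels):
            # Event-NUMBER scramble, independently per channel: channel k of a
            # mixed event takes its amplitude from channel k of a random real
            # event. Channel identity -- and with it every static, detector-
            # coupled pattern -- is preserved, while within-event (dynamic)
            # correlations between channels are destroyed.
            mixed[:, k] = amplitudes[rng.permutation(n_events), k]
        return mixed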

Treatment of static texture:

Source                                                  | subtract empty target | event mixing*: subtract mixed events | event mixing*: preserve sectors | MC    | irreducible remainder estimate**
--------------------------------------------------------|-----------------------|--------------------------------------|---------------------------------|-------|--------------------------------------------------
finite beam Xsection: 1 x 2 mm                          | irrelevant            | no                                   | irrelevant                      | yes ! | 0.14 for the coarsest scale, negligible otherwise
detector offset dx = 1.1 mm, dy = 0.3 mm                | irrelevant            | yes !                                | irrelevant                      | yes ! |
shape                                                   | irrelevant            | yes !                                | irrelevant                      | yes ! |
dead pads                                               | irrelevant            | yes !                                | irrelevant                      | yes ! |
background hits                                         | yes !                 | yes !                                | yes !                           | no    | < 0.15, generally decreases with scale fineness
channel Xtalk: 7% for neighbours, negligible otherwise  | irrelevant            | yes !                                | yes !                           | no    |
statistical fluctuations                                | irrelevant            | yes !                                | irrelevant                      | yes ! | 0

* event mixing = event NUMBER scrambling, NOT channel scrambling
** info for orientation only (quoted for the diagonal texture correlation in the 379 < multiplicity < 463 bin); see the data plots for details and all cases

By static texture we mean texture which reproduces its pattern event after event. This can be either because it is coupled with detector channels (dead pads, geometry distortion, channel cross-talk, etc.) or because of "static" physics features such as the distribution shape. We eliminate the static texture from the texture correlation observable by empty-target subtraction and by subtraction of the mixed-event power spectra. For comparison with models, the MC simulation includes the known static texture effects and undergoes the same elimination procedure. By the "irreducible remainder" I mean the residual effect which may survive this elimination procedure.
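Schematically (in my notation, inferred from the plots, where the observable is the deviation of P_m(true) - P_m(mixed) from 0), the elimination chain reads:

    \Delta P_m \;=\; \big[ P_m^{\mathrm{true}} - P_m^{\mathrm{mixed}} \big]_{\mathrm{Pb\ target}}
             \;-\; \big[ P_m^{\mathrm{true}} - P_m^{\mathrm{mixed}} \big]_{\mathrm{empty\ target}}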

Here I group the static texture effects, by similarity of manifestation and treatment, into "lemon", "peach", and "white" groups. In the lemon group, the contributor with the dominant irreducibility (shown in dark yellow) is taken to represent the irreducibility level of the whole group. In the peach group, the effects are hard to separate, and the error bar evaluation method (to be described shortly) represents their net effect. The white case is trivial.




The horizontal extent of the boxes shows the boundaries of each multiplicity bin. The vertical extent of each box is the systematic error bar in that bin; the statistical errors are shown by the traditional error bars on the points. The systematic error bars were evaluated by removing the Pb target and switching the magnetic field polarity to expose the given side of the detector to delta electrons while minimizing nuclear interactions. This gives an "analog" generator of uncorrelated noise. All correlations in this noise generator (i.e. deviations of P_m(true)-P_m(mixed) from 0) are considered to be systematic errors. This component of the systematic error thus acquires a sign, which is why the systematic errors are asymmetric. The other component (significant only on the coarsest scale) is the uncertainty in our knowledge of the beam's geometrical cross-section. The finiteness of the beam cross-section accounts for the rising texture correlation in the top plots, as was confirmed by the MC studies.





References:
[1] Ingrid Daubechies, Ten Lectures on Wavelets, SIAM, Philadelphia, 1992.
[2] L.-Z. Fang, J. Pando, "Large-scale Structures revealed by Wavelet Decomposition", astro-ph/9701228.