
On Competition
April 10, 2006

Posted by gordonwatts in physics.

A few weeks ago DZERO released a Bs mixing result. This result is important for many physics reasons, but it is also a bit of a “personal” victory for DZERO. It took a tremendous amount of hard work, done under considerable pressure: “CDF is about to release their results any second now!” We had to be first: to first order, CDF’s detector is better designed to make this measurement.

Tommaso has a great post about it from the CDF point of view, where he examines some of the reasons why CDF is taking longer than DZERO to get the Bs result out:

The publication process of CDF data analyses is baroque, bordering the grotesque. Once a group finalizes their result and presents it at internal meetings, the result has to be blessed. This involves three rounds of scrutiny, the full documentation of the analysis in internal notes, and often the fight with skeptics who like to sit at meetings and play “shoot the sitting duck” with the unfortunate colleague presenting the result. Usually, when an important result is on, the physicists who produced it are asked to perform additional checks of various kinds, and defend it with internal referees. When all of that is through, and not a day earlier, the result can be shown at Physics conferences.

The internal review procedures at DZERO are also quite lengthy. I don’t think I’m revealing any state secrets:

  1. Author distributes an analysis note containing complete details of the analysis to the physics group. The group is required to have carefully examined the analysis within one week. If there are problems it will take longer, of course. Usually this step does happen in one week because, if the analyzers have done their homework, they have been presenting the analysis in the group for months.
  2. An Editorial Board (EB) is formed (this usually proceeds in parallel with step 1 since it can take a while to find people who have time).
  3. Once the group review is finished, all material is given to the EB. There is no specific deadline for the EB to examine the analysis; they can take as long as they wish. This is the “shoot the sitting duck” phase that Tommaso refers to above. I’ve seen this take as little as two weeks and up to many months, depending on the analysis and the people involved.
  4. Once step 3 is done, a note for external readers is put together (the conference note). The EB reviews this, usually very quickly, and it is then forwarded to the collaboration. The collaboration has one week to make comments.
  5. It usually takes several more weeks after this to respond to all the comments — many of which are wording and phrase change requests — before the analysis result is actually released to the public.

I don’t think this is much different from what CDF has to go through — perhaps a bit more streamlined. We are all afraid that something wrong will make it out; hence all the layers of cross-checking. The entire collaboration is on the author list; this is how the collaboration makes sure that the results that go out are correct. It can be a pain!

At another spot in the post Tommaso states:

Now for CDF. We have a better suited detector for B physics measurements – the Silicon Vertex Tracker, a hardware trigger that measures track momentum and impact parameter in 10 microseconds, with a precision close to that achievable offline – and more experience in the field (D0 only installed a silicon vertex detector and central spectrometer for Run II). We have more data on tape, more people working at analyses, more everything – some claim ours is bigger, too. But we lost.

I think he is right about just about everything except for two things. The rumor mill seems to agree that CDF’s result will be better than ours: it should be, for the reasons he mentions. But I’m not so sure CDF has more data on tape with the Silicon running. DZERO has been amazingly fortunate in that our Si detector has been a champ (let’s hope we don’t mess it up!). You can see how much data we have on tape in this cool plot.

UPDATE: I’d written this to come out a bit later, but CDF is releasing their results today! So it doesn’t make much sense to hold onto the post for several more days…

Comments»

1. Experimental sociology | Cosmic Variance - April 17, 2006

[…] A little late, but I didn’t want to let slip this interesting discussion about the agonizing process of making experimental particle physics results ready for public consumption from Tommaso Dorigo and Gordon Watts. You’ll recall that we mentioned a couple of weeks ago the new results from Fermilab’s Tevatron on B-mixing, a measurement that puts interesting new constraints on the possibilities for physics beyond the Standard Model. The first announcement was from the D0 (“D-Zero”) experiment; as Collin pointed out in the comments, the CDF experiment followed with their own results soon thereafter. […]

2. Life as a Physicist » Blog Archive » CDF Nails It! - April 19, 2006

[…] Along the x axis is the difference in mass between a Bs meson (a B meson containing a b and a s quark) and its anti-matter partner, the anti-Bs meson. The y-axis is the amplitude, but for sake of brevity, I'll call it the probability (I'm being very sloppy here!). Tommaso goes into a good deal of detail in his post, so I won't repeat it here. You can see how CDF's data resolves the high-probability peak (at 17.25 ps-1) much better than D0 does (at 19 ps-1). That is for all the reasons we previously talked about both on this blog and on Tommaso's blog. […]

3. Life as a Physicist » More On Competition - April 20, 2006

[…] Tommaso's and my posts on the Bs rivalry got picked up by the Cosmic Variance blog in, ironically, a post talking about sociology (I have got to update the look on my blog; it's ugly compared to everyone else's!). One of the things both Tommaso and I discussed is what it takes to get a result out of a large experimental collaboration. The comments to the Cosmic Variance post picked up on some of this. In particular, there were several comments that could be summed up by this one by Scott O: The SNO collaboration goes a step further still. It is collaboration policy not to show any result in public unless it has both gone through extensive internal review and has been submitted for publication to a refereed journal. In other words, there is no such thing as a “SNO preliminary result”. The attitude is that if it’s not ready to submit for publication, it’s not ready to show in public either. Obviously this slows down the publication process, but personally I think there’s a lot to be said for it as well. […]

