
Building Decay Candidates In DaVinci

Learning Objectives

  • Learning about the different classes of standard particles in LHCb
  • Understanding the difference between Minimum Bias and NoBias data
  • Learning how to build candidates from NoBias data

LHCb's Standard Particles

The reconstruction process of the LHCb experiment takes place in several steps. First, the hit information from the tracking detectors (the VELO, the UT, and the SciFi) is used to build track candidates via a fit. We also take the information from clusters in the calorimeters to build photon candidates, or even \(\pi^0\) mesons, which almost always decay into two photons. This gives us protoparticles: tracks (or clusters) with PID variables but without a specific particle species assignment.

Photons and \(\pi^0\) candidates

Photons are not classified as tracks because they do not leave hits on the tracking subdetectors. Instead, we build photons by looking at clusters in the electromagnetic calorimeter (ECAL).

Sometimes, however, a photon interacts with the material of one of the subdetectors and converts into a pair of oppositely-charged particles. We call these converted photons, and some analyses can use them to improve the resolution of their decays.

Another interesting case study is the \(\pi^0\) meson, which decays into two photons with a branching fraction of about 98%. These two photons can sometimes deposit their energy in the same cell of the ECAL, making it harder to distinguish them; we call these merged \(\pi^0\)s. If the photons occupy separate cells, they are referred to as resolved \(\pi^0\)s.

Protoparticles are then assigned a particle species by giving them a corresponding mass hypothesis (e.g. a 'pion' is simply a protoparticle whose mass has been set to 140 MeV). Depending on the track type, we can build long muons, downstream electrons, upstream kaons, etc.

For example, we can select and filter the pions in an event using the RecoConf module standard_particles, as sketched below.
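A minimal sketch of what such a snippet might look like follows; the imports and the make_long_pions builder mirror the patterns used later in this lesson, while the 2 GeV momentum threshold is purely illustrative:

import Functors as F
from GaudiKernel.SystemOfUnits import GeV
from RecoConf import standard_particles
from RecoConf import algorithms_thor

# Build the container of long pions from the event's protoparticles
pions = standard_particles.make_long_pions()

# Keep only pions with a momentum above 2 GeV (illustrative cut)
pion_code = F.require_all(F.P > 2 * GeV)
filtered_pions = algorithms_thor.ParticleFilter(pions, F.FILTER(pion_code))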

RecoConf vs. Hlt2Conf

Starting in DaVinci v65r0, some modules from Hlt2Conf were ported over to RecoConf, including standard_particles and algorithms_thor.

Using standard_particles, we have the following functions available:

• make_long_cb_{electrons, muons, pions, kaons, protons}: Maker of long ChargedBasic particles per type
• make_has_rich_long_cb_{pions, kaons}: Maker of long ChargedBasic particles per type, requiring information from the RICH
• make_photons: Basic builder for photons
• make_long_electrons_{no, with}_brem: Basic builders for long electrons with or without Bremsstrahlung corrections
• make_long_and_{upstream, downstream}_electrons_{no, with}_brem: Basic builders for long electrons together with upstream or downstream electrons, with or without Bremsstrahlung corrections
• make_long_upstream_and_downstream_electrons_{no, with}_brem: Basic builders for long, upstream, and downstream electrons, with or without Bremsstrahlung corrections
• make_long_{muons, pions, kaons, protons, deuterons, helium3}: Basic builders for long muons, pions, kaons, protons, deuterons, or helium-3
• make_ismuon_long_muon: Basic builder for long muons satisfying the ISMUON condition
• make_{up, down}_{electrons_no_brem, muons, pions, kaons, protons, deuterons, helium3}: Basic builders for upstream or downstream electrons (no Bremsstrahlung correction), muons, pions, kaons, protons, deuterons, or helium-3
• make_ttrack_{pions, protons, muons, kaons}: Basic builders for T-track pions, protons, muons, or kaons
• make_has_rich_{long, down, up}_{pions, kaons, protons, deuterons, helium3}: Basic builders for long, downstream, or upstream pions, kaons, protons, deuterons, or helium-3, requiring information from the RICH
• make_has_rich_ttrack_{pions, protons, muons, kaons}: Basic builders for T-track pions, protons, muons, or kaons, requiring information from the RICH
• make_{resolved, merged}_pi0s: Basic builders for resolved and merged \(\pi^0\)s
• filter_leptons_loose: Basic filter for a loose preselection of leptons
• make_detached_{dielectron, mue, mumu}: Detached di-lepton builders
• make_detached_{dielectron, mue}_with_brem: Detached di-lepton builders with Bremsstrahlung correction included
• make_dimuon_base: Basic maker for a di-muon combination

Filtering and combining particles

As exemplified above, particles coming from these builders can later be filtered using ThOr functors with the help of ParticleFilter. For instance, this is how we would select long kaons with PID_K > 5 and a momentum higher than \(1~\mathrm{GeV}\):

# Build long kaons from the event's protoparticles
make_kaons = standard_particles.make_long_kaons()
# Require good kaon PID and a minimum momentum
code_kaons = F.require_all(F.PID_K > 5, F.P > 1 * GeV)
kaons = algorithms_thor.ParticleFilter(make_kaons, F.FILTER(code_kaons))

These final-state particles can later be combined into a parent object. This can be either an actual parent particle, like a \(D_s^-\) decaying into two kaons and a pion, or simply a container of two tracks, which we can use to easily apply cuts on combinations of particles that are not necessarily the only ones coming from a vertex.

For either case, we use the ParticleCombiner algorithm from algorithms_thor. It takes multiple keyword arguments, including a name, a list of inputs, a decay descriptor, and cuts on the combination (CombinationCut) and on the vertex (CompositeCut).

The combination cuts typically include selections on the DOCA (and the DOCA \(\chi^2\)) as well as on the invariant mass of the combination inputs. The vertex selections can also include a cut on the invariant mass, this time calculated using the information from the vertex fit. (This is typically more precise, so the selections tend to be tighter.)

Other vertex cuts can include selections on CHI2DOF, which in this case represents the quality of the vertex; on the lifetime of the parent, BPVLTIME; on the minimum IP or IP \(\chi^2\) of the parent, MINIP or MINIPCHI2; on the flight distance in the \(z\) coordinate, BPVVDZ; among others.
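For illustration, a vertex cut combining several of these functors could look as follows; the thresholds here are arbitrary, and pvs is assumed to be the container of primary vertices (e.g. from make_pvs in RecoConf.reconstruction_objects):

from GaudiKernel.SystemOfUnits import mm, picosecond

vertex_code = F.require_all(
    F.CHI2DOF < 9,                       # quality of the vertex fit
    F.BPVLTIME(pvs) > 0.2 * picosecond,  # lifetime w.r.t. the best PV
    F.MINIPCHI2(pvs) > 4,                # detachment from all PVs
    F.BPVVDZ(pvs) > 1 * mm,              # flight distance along z
)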

In addition, you can choose the combination algorithm via the ParticleCombiner argument: ParticleVertexFitter is used when you wish to reconstruct the vertex through a fit to the particles' tracks, while ParticleAdder is used when you simply want a combination of tracks with no vertex fit.

For instance, assume we want to build a di-kaon object so that we can apply cuts on the pair, and later combine it with a pion to form the \(D_s^-\):

# pvs (the primary vertices) and in_range are assumed to be available,
# e.g. via make_pvs() from RecoConf.reconstruction_objects and
# `from Functors.math import in_range`
dikaon = ParticleCombiner(
    Inputs=[kaons, kaons],
    ParticleCombiner="ParticleAdder",
    DecayDescriptor="phi(1020) -> K+ K-",
    CombinationCut=F.require_all(F.SDOCA(1, 2) < 0.2 * mm),
    CompositeCut=F.require_all(F.MINIP(pvs) > 1 * mm),
)

ds = ParticleCombiner(
    Inputs=[dikaon, pions],
    ParticleCombiner="ParticleVertexFitter",
    DecayDescriptor="[D_s- -> phi(1020) pi-]cc",
    CombinationCut=F.require_all(in_range(1900 * MeV, F.MASS, 2030 * MeV)),
    CompositeCut=F.require_all(in_range(1920 * MeV, F.MASS, 2010 * MeV), F.CHI2DOF < 4),
)

Running on NoBias data

As we saw during the First Analysis Steps lessons, the NoBias stream collects data that has not passed any trigger selection, selecting a random event out of every few hundred thousand.

Minimum Bias vs NoBias

At LHCb, there are two similar things which are often confused: "Minimum Bias MC" and "NoBias data". Minimum Bias MC refers to simulated samples in which no decay is preferred; it is a simulated approximation of exactly what we see in real collisions. Such samples are used, for example, to check that distributions in MC are realistic, to estimate the expected throughput of trigger lines, or to study ghost tracks.

NoBias, on the other hand, is a trigger line which makes no selection at all (data flows through the trigger only so that it can be reformatted). The NoBias stream can contain events without a beam-beam crossing, while in Minimum Bias simulation proton-proton collisions are forced.

This data is then accessible to analysts and members of the Collaboration, who may use it for different purposes, including various checks or even analyses. Similarly, we can simulate Minimum Bias data by simply colliding protons and allowing the resulting particles within the LHCb acceptance to decay according to the PDG's branching fractions. The list of decays accessible via Minimum Bias simulation, along with their probabilities, is given in the massive DECAY.dec file.

The NoBias stream persists all information, so we can build our own decay candidates offline using ParticleAdder, ParticleCombiner, and other such functions. We can even reuse the earlier script, reflecting the change of stream in the YAML options file.

It should be noted, however, that since we are using NoBias data, only very few events will contain the decays we are looking for; to see a signal, we therefore need to process a large number of events (100,000 in the example below).

In the following example, we run over NoBias data (see the options file here):

input_files:
- /eos/lhcb/wg/dpa/wp7/Run3SK/NoBias/00246705_00000054_1.nobias.dst
- /eos/lhcb/wg/dpa/wp7/Run3SK/NoBias/00246705_00000055_1.nobias.dst
input_type: ROOT
output_type: ROOT
input_raw_format: 0.5
simulation: false
input_process: 'TurboPass'
input_stream: 'NoBias'
lumi: true
data_type: 'Upgrade'
evt_max: 100000
print_freq: 10000
ntuple_file: tuple.root
geometry_version: run3/2024.Q1.2-v00.00
conditions_version: master
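The Python side could then be sketched roughly as follows. This is a minimal, illustrative example, not the lesson's original script: it assumes the pions, kaons, and ds objects built above, DaVinci's make_config entry point, and FunTuple for the ntuple; the field and variable names are hypothetical.

import Functors as F
from DaVinci import Options, make_config
from FunTuple import FunctorCollection
from FunTuple import FunTuple_Particles as Funtuple

def main(options: Options):
    # Variables to store for the D_s candidate (illustrative choice)
    variables = {"Ds": FunctorCollection({"M": F.MASS, "PT": F.PT})}

    # Book an ntuple with one branch per field; `ds` is the combiner
    # built earlier in this lesson
    tuple_ds = Funtuple(
        name="DsToPhiPi_Tuple",
        tuple_name="DecayTree",
        fields={"Ds": "[D_s- -> phi(1020) pi-]CC"},
        variables=variables,
        inputs=ds,
    )

    # Run the candidate building and the tupling
    user_algorithms = {"DsToPhiPi": [ds, tuple_ds]}
    return make_config(options, user_algorithms)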