Probability distributions for the center of gravity are fundamental tools for track fitting. The center of gravity is a widespread algorithm for position reconstruction in tracker detectors for particle physics. Its standard use is always accompanied by an easy (Gaussian) guess for the probability distribution of the positioning errors. This incorrect assumption degrades the results of the fit. The explicit error forms show evident Cauchy–Agnesi tails that render the use of variance minimizations problematic. Therefore, it is important to report probability distributions for some combinations of random variables essential for track fitting: x=ξ/(ξ+η), y=(ξ−η)/[2(ξ+η)], w=ξ/η, x=θ(x3−x1)(−x3)/(x3+x2)+θ(x1−x3)x1/(x1+x2) and x=(x1−x3)/(x1+x2+x3). The first two are partial forms of the two-strip center of gravity. The fourth is the complete two-strip center of gravity, and the fifth is a partial form of the three-strip center of gravity. Given the complexity of the fourth equation, only approxi...
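The two-strip and three-strip combinations above can be made concrete with a small sketch (our own illustration, not the paper's code; θ is the Heaviside step function, the strip pitch is set to 1, and the labeling of x1, x2, x3 as neighbouring strip signals with x2 the central one is our assumption):

```python
import numpy as np

def heaviside(t):
    """Heaviside step function: theta(t) = 1 for t > 0, else 0."""
    return 1.0 if t > 0 else 0.0

def two_strip_cog(x1, x2, x3):
    """Complete two-strip center of gravity (strip pitch = 1).

    Implements x = theta(x3-x1)(-x3)/(x3+x2) + theta(x1-x3) x1/(x1+x2):
    the Heaviside factors select the larger of the two neighbours of
    the central strip signal x2.
    """
    return (heaviside(x3 - x1) * (-x3) / (x3 + x2)
            + heaviside(x1 - x3) * x1 / (x1 + x2))

def three_strip_partial(x1, x2, x3):
    """Partial form of the three-strip center of gravity."""
    return (x1 - x3) / (x1 + x2 + x3)
```

Note how the output is a nonlinear ratio of the signals; this is the origin of the non-Gaussian error distributions discussed in the abstract.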
A very simple Gaussian model is used to illustrate an interesting fitting result: a linear growth of the resolution with the number N of detecting layers. This rule goes well beyond the well-known rule, proportional to √N, for the resolution of the usual fits. The effect is obtained with the appropriate form of the variance for each hit (observation). The model reconstructs straight tracks with N parallel detecting layers; the track direction is the parameter selected to test the resolution. The results of the Gaussian model are compared with realistic simulations of silicon micro-strip detectors. These realistic simulations suggest an easy method to select the essential weights for the fit: the lucky model. Preliminary results of the lucky model show an excellent reproduction of the linear growth of the resolution, very similar to that given by the realistic simulations. Maximum-likelihood evaluations complete this exploration of the growth in resolution.
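As a reference point, the heteroscedastic straight-line fit that such models build on can be sketched as follows (a generic weighted least-squares slope estimator under our own assumptions, not the paper's lucky model):

```python
import numpy as np

def weighted_track_direction(z, y, var):
    """Weighted least-squares fit of y = a + b*z; returns (b, var_b).

    z   : detecting-layer positions along the track axis
    y   : measured transverse hit positions
    var : per-hit variances (heteroscedastic weights w_i = 1/var_i)
    """
    z = np.asarray(z, float)
    y = np.asarray(y, float)
    w = 1.0 / np.asarray(var, float)
    # Standard normal-equation sums for a weighted linear fit.
    S, Sz, Szz = w.sum(), (w * z).sum(), (w * z * z).sum()
    Sy, Szy = (w * y).sum(), (w * z * y).sum()
    det = S * Szz - Sz * Sz
    b = (S * Szy - Sz * Sy) / det      # fitted slope (track direction)
    var_b = S / det                    # formal variance of the slope
    return b, var_b
```

A hit assigned a very large variance is effectively discarded by this fit, which is the mechanism that appropriate per-hit variances exploit.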
To complete a previous paper, the probability density functions of the center of gravity as a positioning algorithm are derived with classical methods. These methods, as suggested by textbooks of probability, require the preliminary calculation of the cumulative distribution functions. They are more complicated than those previously used for these tasks. In any case, the cumulative probability distributions could be useful. The combinations of random variables are those essential for track fitting: $x={\xi}/{(\xi+\eta)}$, $x=\theta(x_3-x_1) (-x_3)/(x_3+x_2) +\theta(x_1-x_3)x_1/(x_1+x_2)$ and $x=(x_1-x_3)/(x_1+x_2+x_3)$. The first combination is a partial form of the two-strip center of gravity. The second is the complete form, and the third is a simplified form of the three-strip center of gravity. The cumulative probability distribution of the first expression was reported in previous publications. The standard assumption is that $\xi$, $\eta$, $x_1$, $x_2$ and $x_3$ are indepe...
The center of gravity is a widespread algorithm for position reconstruction in particle physics. For track fitting, its standard use is always accompanied by an easy guess for the probability distribution of the positioning errors. This is an incorrect assumption that degrades the results of the fit. The explicit error forms show evident Cauchy (Agnesi) tails that render the use of variance minimizations problematic. Here, we report the probability distributions for some combinations of random variables, impossible to find in the literature but essential for track fitting: $x={\xi}/{(\xi+\eta)}$, $y={(\xi-\eta)}/[2{(\xi+\eta)}]$, $w=\xi/\eta$, $x=\theta(x_3-x_1) (-x_3)/(x_3+x_2) +\theta(x_1-x_3)x_1/(x_1+x_2)$ and $x=(x_1-x_3)/(x_1+x_2+x_3)$. The first three are directly connected to each other and are partial forms of the two-strip center of gravity. The fourth is the complete two-strip center of gravity. Due to its very complex form, it allows only approximate expressions of the probabil...
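A quick Monte Carlo illustrates why Gaussian noise on the strip signals produces Cauchy-like tails in $x=\xi/(\xi+\eta)$: the denominator can fluctuate close to zero. The means and widths below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
# Gaussian strip signals with mean comparable to the noise width
# (illustrative values chosen to make the tails visible).
xi = rng.normal(loc=1.0, scale=0.5, size=n)
eta = rng.normal(loc=1.0, scale=0.5, size=n)

x = xi / (xi + eta)              # partial two-strip center of gravity

# Rare near-zero denominators create Cauchy-like tails: the sample
# standard deviation blows up, while the central peak stays narrow.
core = x[np.abs(x - 0.5) < 0.5]  # the peak region 0 < x < 1
print("std(all) =", np.std(x), " std(core) =", np.std(core))
```

This is exactly why the abstract warns that variance minimization becomes problematic for such error forms.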
To complete a previous work, the probability density functions for the errors in the center of gravity as a positioning algorithm are derived with the usual methods of the cumulative distribution functions. These methods introduce substantial complications compared to the approaches used in a previous publication on similar problems. The combinations of random variables considered are: Xg3 = θ(x2 − x1)(x1 − x3)/(x1 + x2 + x3) + θ(x1 − x2)(x1 + 2x4)/(x1 + x2 + x4) and Xg4 = θ(x4 − x5)(2x4 + x1 − x3)/(x1 + x2 + x3 + x4) + θ(x5 − x4)(x1 − x3 − 2x5)/(x1 + x2 + x3 + x5). The complete and partial forms of the probability density functions of these expressions of the center-of-gravity algorithms are calculated for general probability density functions of the observation noise. The cumulative probability distributions, never calculated elsewhere, are the essential steps of this study.
Silicon micro-strip detectors are fundamental tools for high energy physics. Each detector is formed by a large set of narrow parallel strips, produced by special surface treatments (diode junctions) on a slab of very high quality silicon crystal. Their development and use required a large amount of work and research. A very concise overview is given of these important components and of their applications. Some details are devoted to the basic subject of track reconstruction in silicon strip trackers. Recent demonstrations have substantially modified the usual understanding of this subject.
A standard criterion in statistics is to define an optimal estimator as the one with the minimum variance. Thus, optimality is proved with inequalities among the variances of competing estimators. The demonstrations of inequalities among estimators are essentially based on the Cramer, Rao and Frechet methods. They require special analytical properties of the probability functions, globally indicated as regular models. With an extension of the Cramer–Rao–Frechet inequalities and Gaussian distributions, the optimality (efficiency) of the heteroscedastic estimators compared to any other linear estimator was proved. However, Gaussian distributions are too restrictive a selection to cover all the realistic properties of track fitting. Therefore, a well-grounded set of inequalities must overcome the limitation to regular models. Hence, the inequalities for least-squares estimators are generalized to any model of probabilities. The new inequalities confirm the results obtained for th...
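The variance advantage of heteroscedastic (inverse-variance weighted) estimators over the plain mean can be checked numerically in the simplest setting, estimating a constant from hits of unequal precision (all numbers below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Heteroscedastic observations of the same quantity mu = 2.0:
# half of the "hits" are precise (sigma = 0.1), half are noisy
# (sigma = 1.0).  Values are illustrative, not from the paper.
mu = 2.0
sigmas = np.array([0.1, 1.0] * 5)              # 10 hits per track
n_tracks = 50_000
y = mu + rng.standard_normal((n_tracks, sigmas.size)) * sigmas

w = 1.0 / sigmas**2                            # inverse-variance weights
est_weighted = (y * w).sum(axis=1) / w.sum()   # heteroscedastic estimator
est_plain = y.mean(axis=1)                     # ignores the weights

# Gauss-Markov: among linear unbiased estimators, the inverse-variance
# weighted mean has the minimum variance, here 1/sum(w) = 1/505
# against sum(sigma^2)/N^2 = 0.0505 for the unweighted mean.
print(est_weighted.var(), est_plain.var())
```

For Gaussian noise this weighted mean is also efficient; the abstract's point is precisely that such Gaussian arguments do not cover the non-regular models met in track fitting.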
Two new fitting methods are explored for momentum reconstruction. They give a substantial increase in momentum resolution compared to standard fits. The key point is the use of a different (realistic) probability distribution for each hit (heteroscedasticity). In the first fitting method an effective variance is calculated for each hit; the second method uses a maximum-likelihood search. The tracker model is similar to the PAMELA tracker with its two-sided detectors. Here, each side is simulated as a momentum reconstruction device. One of the two is similar to the silicon micro-strip detectors in wide use in running experiments. The gain obtained in momentum resolution is measured as the virtual magnetic field and the virtual signal-to-noise ratio required by the standard fits to overlap with the best of the new methods. For the low noise side, the virtual magnetic field must be increased 1.5 times to reach the overlap, and 1.8 times for the other. For the high noise side, the increas...
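The maximum-likelihood idea can be sketched in one dimension with a heavy-tailed hit likelihood (we use a Cauchy-like density as a stand-in for the paper's realistic distributions; the model, width and grid are our assumptions):

```python
import numpy as np

def ml_slope(z, y, gamma=0.1, grid=np.linspace(-1.0, 1.0, 2001)):
    """Maximum-likelihood slope for the track model y = b*z.

    Each hit contributes log f(y_i - b*z_i) with a Cauchy-like density
    of half-width gamma; the heavy tails automatically de-weight badly
    measured hits (outliers).  A grid search keeps the sketch
    dependency-free; a real fit would use a proper optimizer.
    """
    z = np.asarray(z, float)
    y = np.asarray(y, float)
    r = y[None, :] - grid[:, None] * z[None, :]    # residuals per candidate slope
    loglik = -np.log(gamma**2 + r**2).sum(axis=1)  # up to an additive constant
    return grid[np.argmax(loglik)]

z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_true = 0.3 * z
y_out = y_true.copy()
y_out[2] += 5.0                    # one badly mismeasured hit
print(ml_slope(z, y_out))          # stays close to the true slope 0.3
```

A least-squares fit would be dragged by the outlier; the heavy-tailed likelihood essentially discards it, mirroring the hit-discarding effect reported for the new methods.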
An experimental set-up, dedicated to isolating an error present in the $\eta$-algorithm, gave an unexpected result. The average of a center of gravity algorithm at orthogonal particle incidence turns out to be non-zero. This non-zero average signals an asymmetry in the response function of the strips, and introduces a further parameter into the corrections: the shift of the center of gravity of the strip response with respect to its geometrical position. A strategy to extract this parameter from a standard data set is discussed. Some simulations with various asymmetric response functions are explored for this test. The method is able to easily detect the asymmetry parameters introduced in the simulations. Its robustness is tested against angular rotations, and we see an almost linear variation with the angle. This simple property is used to simulate a determination of a Lorentz angle with and without the asymmetry of the response function.
The Cramer–Rao–Frechet inequality is reviewed and extended to track fitting. A widespread opinion attributes to this inequality a limitation of the resolution of track fits with the number N of observations. It will be shown that this opinion is incorrect: the weighted least-squares method is not subject to that N-limitation, and the resolution can be improved beyond those limits. In previous publications, simulations with realistic models and simple Gaussian models produced interesting results: linear growths of the peaks of the distributions of the fitted parameters with the number N of observations, much faster than the √N of the standard least squares. These results could be considered a violation of the well-known 1/N rule for the variance of an unbiased estimator, frequently reported as the Cramer–Rao–Frechet bound. To clarify this point beyond any doubt, a direct proof of the consistency of those results with this inequality would be essential. Unfortunately, such proof ...
The construction of well-tuned probability distributions is illustrated in a synthetic way; these probability distributions are optimized to produce faithful realizations of the impact-point distributions of particles in silicon strip detectors. Their use for track fitting shows a drastic improvement: a factor of two, for the low noise case, and a factor of three, for the high noise case, with respect to the standard approach. The tracks are well reconstructed even in the presence of hits with large errors, with a surprising effect of hit discarding. The applications illustrated are simulations of the PAMELA tracker, but other types of trackers can be handled similarly. The probability distributions are calculated for the center of gravity algorithms, and they are very different from Gaussian probabilities. These differences are crucial to accurately reconstruct tracks with high-error hits and to produce the effective discarding of the too-noisy hits (outliers). The similarity of our distributions to the Cauchy distribution forced us to abandon the standard deviation for our comparisons and instead use the full width at half maximum. A set of mathematical approaches must be developed for these applications; some of them are standard in a wide sense, even if very complex. One is essential and, in its absence, all the others are useless. Therefore, in this paper, we report the details of this critical approach. It extracts physical properties of the detectors, and allows the insertion of the functional dependence on the impact point into the probability distributions. Other papers will be dedicated to the remaining parts.
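The switch from standard deviation to full width at half maximum can be sketched with a simple histogram-based estimate (a hypothetical helper, not the authors' code). For the standard Cauchy density 1/[π(1+x²)] the FWHM is exactly 2, while the sample standard deviation never stabilizes because the second moment does not exist.

```python
import numpy as np

def empirical_fwhm(samples, bins=100, span=(-5.0, 5.0)):
    """Full width at half maximum estimated from a histogram."""
    counts, edges = np.histogram(samples, bins=bins, range=span)
    centers = 0.5 * (edges[:-1] + edges[1:])
    half = counts.max() / 2.0
    above = centers[counts >= half]      # bin centers above half maximum
    return above.max() - above.min()

rng = np.random.default_rng(1)
cauchy = rng.standard_cauchy(100_000)

# The FWHM is a stable width measure even for Cauchy-like samples,
# whereas their standard deviation is dominated by a few extreme draws.
print("empirical FWHM:", empirical_fwhm(cauchy))
```

The same estimator works for any peaked error distribution, which is what makes it a usable common yardstick for the non-Gaussian center-of-gravity errors.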
The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimations, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard fits (least squares) with two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase in the magnetic field and signal-to-noise ratio that makes the standard fit reconstructions overlap with our best distributions. The data and the simulations are tuned on the tracker of a running experiment and its double-sided micro-strip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor of 1.5 for the least...
Papers by Giovanni Landi