Papers by Alexandre Gimenez Alvarez
Brazilian Journal of Radiation Sciences, Feb 19, 2019
The greatest impact of tomography technology currently occurs in medicine. Its success is due to the fact that the human body has standardized dimensions and a well-established composition. These conditions are not found in industrial objects. In industry, there is great interest in using tomography to examine the inner part of (i) manufactured industrial objects or (ii) the machines and their means of production. In these cases, the purpose of the tomography is (a) to control the quality of the final product and (b) to optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. This scanning system is a non-destructive, efficient and fast method for providing sectional images of industrial objects, and it is able to show dynamic processes and the dispersion of material structures within these objects. In this context, it is important that the reconstructed image present high spatial resolution together with satisfactory temporal resolution, so the reconstruction algorithm has to meet these requirements. This work analyzes three iterative reconstruction algorithms, namely the Maximum Likelihood Estimation Method (MLEM), the Maximum Likelihood Transmission Method (MLTR) and the Simultaneous Iterative Reconstruction Technique (SIRT). The analyses involved the measurement of the contrast-to-noise ratio (CNR), the root mean square error (RMSE) and the Modulation Transfer Function (MTF), in order to determine which algorithm best meets the conditions for optimizing the system. The algorithms and the image quality analyses were implemented in Matlab® 2013b.
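To make the comparison concrete, the sketch below shows one multiplicative MLEM update together with the RMSE and CNR figures of merit mentioned in the abstract. It is a minimal illustration in Python/NumPy rather than the Matlab® 2013b used by the authors; the system matrix `A`, the sinogram `y` and the region-of-interest masks are hypothetical placeholders, not the paper's data.

```python
import numpy as np

def mlem_step(x, A, y, eps=1e-12):
    """One multiplicative MLEM update.

    x : current image estimate (flattened, non-negative)
    A : system matrix, shape (n_measurements, n_pixels)
    y : measured projection data, shape (n_measurements,)
    """
    forward = A @ x                        # expected counts for the current estimate
    ratio = y / np.maximum(forward, eps)   # measured / expected counts
    sensitivity = A.T @ np.ones_like(y)    # normalization: back-projection of ones
    return x * (A.T @ ratio) / np.maximum(sensitivity, eps)

def rmse(reconstruction, reference):
    """Root mean square error against a known phantom."""
    return np.sqrt(np.mean((reconstruction - reference) ** 2))

def cnr(image, signal_mask, background_mask):
    """Contrast-to-noise ratio between a signal ROI and a background ROI."""
    signal = image[signal_mask]
    background = image[background_mask]
    return abs(signal.mean() - background.mean()) / background.std()
```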
Journal of Cosmology and Astroparticle Physics, Sep 1, 2020
We present an updated analysis of the gamma-ray flux from the directions of classical dwarf spheroidal galaxies, deriving new constraints on WIMP dark matter (DM) annihilation using a decade of Fermi-LAT data. Among the major novelties, we infer the dwarfs' J-factors by including new observations without imposing any a priori parametric profile for the DM distribution. While statistically compatible with results obtained from more conventional parameterisations, this procedure reduces the theoretical bias imposed on the data. Furthermore, we retain the full data-driven shape of the J-factors' empirical probability distributions when setting limits on DM, without imposing log-normality as is typically done. In conjunction with the data-driven J-factors, we improve on a new method for estimating the probability distribution function of the astrophysical background at the dwarf position [1], fully profiling over background uncertainties. We show that, for most "classical" dwarfs, the background systematic uncertainty dominates over the uncertainty on their J-factors. Raw distributions of J- and D-factors (the latter being the analogue of the J-factor for decaying DM) are available upon request.
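For reference, the constrained quantities follow the standard conventions: the annihilation flux scales with the J-factor and the decay flux with the D-factor. The textbook definitions are sketched below; the symbols ⟨σv⟩, m_χ, τ_χ and dN_γ/dE are the conventional ones, not values taken from the paper.

```latex
\frac{d\Phi_{\mathrm{ann}}}{dE} = \frac{\langle\sigma v\rangle}{8\pi m_\chi^{2}}\,
\frac{dN_\gamma}{dE}\, J ,
\qquad
J = \int_{\Delta\Omega}\! d\Omega \int_{\mathrm{l.o.s.}}\! d\ell\;
\rho_{\mathrm{DM}}^{2}(\ell,\Omega)

\frac{d\Phi_{\mathrm{dec}}}{dE} = \frac{1}{4\pi\, m_\chi \tau_\chi}\,
\frac{dN_\gamma}{dE}\, D ,
\qquad
D = \int_{\Delta\Omega}\! d\Omega \int_{\mathrm{l.o.s.}}\! d\ell\;
\rho_{\mathrm{DM}}(\ell,\Omega)
```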
KnE engineering, Dec 27, 2018
At present, the generation of 3D objects by computer has become a fundamental tool for the development of most sciences. Modeling a three-dimensional surface on a computer whose display is a two-dimensional graphic screen presents some challenges, such as simulating depth on the graphic screen. To overcome this drawback, the authors propose using Vector Differential Analysis (Differential Geometry): calculating the normal vector to the surface makes it possible to eliminate hidden sections and to distinguish external faces from internal faces so that they can be textured differently. In the same way, taking advantage of the properties of the gradient vector, it is possible to simulate light intensities on the surfaces.
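A minimal sketch of the idea described above, assuming a parametric surface S(u, v) sampled on a grid: the normal is taken as the cross product of the partial derivatives, back-facing patches are culled via the sign of the dot product with the view direction, and diffuse light intensity comes from the dot product with the light direction. Written in Python/NumPy; the function names and the sphere example are illustrative, not taken from the paper.

```python
import numpy as np

def surface_normals(X, Y, Z):
    """Unit normals of a parametric surface sampled on a (u, v) grid,
    computed as the cross product of the partial derivatives S_u x S_v."""
    Xu, Xv = np.gradient(X)
    Yu, Yv = np.gradient(Y)
    Zu, Zv = np.gradient(Z)
    Su = np.stack([Xu, Yu, Zu], axis=-1)
    Sv = np.stack([Xv, Yv, Zv], axis=-1)
    n = np.cross(Su, Sv)
    return n / np.maximum(np.linalg.norm(n, axis=-1, keepdims=True), 1e-12)

def visible_faces(normals, view_dir):
    """Back-face culling: a patch is kept when its outward normal points
    towards the camera (negative dot product with the viewing direction)."""
    return np.einsum('ijk,k->ij', normals, view_dir) < 0

def lambert_shading(normals, light_dir):
    """Diffuse light intensity in [0, 1] from the angle between the
    surface normal and the light direction."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    return np.clip(np.einsum('ijk,k->ij', normals, light_dir), 0.0, 1.0)

# Illustrative example: a unit sphere parameterized by (u, v)
u, v = np.meshgrid(np.linspace(0, np.pi, 64), np.linspace(0, 2 * np.pi, 64), indexing='ij')
X, Y, Z = np.sin(u) * np.cos(v), np.sin(u) * np.sin(v), np.cos(u)
N = surface_normals(X, Y, Z)
mask = visible_faces(N, view_dir=np.array([0.0, 0.0, -1.0]))
shade = lambert_shading(N, light_dir=np.array([1.0, 1.0, 1.0]))
```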
Brazilian Journal of Radiation Sciences, 2019
This paper describes the Monte Carlo simulation, using MCNP4C, of a multichannel third-generation tomography system containing two radioactive sources, 192Ir (316.5 – 468 keV) and 137Cs (662 keV), and a set of fifteen NaI(Tl) detectors, each 1 inch in diameter and 2 inches thick, in fan-beam geometry, positioned diametrically opposite the sources. Each detector moves in 10 steps of 0.24°, totaling 150 virtual detectors per projection, and the system then rotates by 2 degrees. The Monte Carlo simulation was performed to evaluate the viability of this configuration. For this, a multiphase phantom containing polymethyl methacrylate (PMMA, ρ ≈ 1.19 g/cm³), iron (ρ ≈ 7.874 g/cm³), aluminum (ρ ≈ 2.6989 g/cm³) and air (ρ ≈ 1.20479E-03 g/cm³) was simulated. The number of histories simulated was 1.1E+09 per projection, and the tally used was F8, which gives the pulse height in each detector. The data obtained from the simulation were used to reconstruct the simulated phantom using the statisti...
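The sampling scheme stated in the abstract (15 detectors, 10 sub-steps of 0.24°, 2° rotation between projections) fixes the sinogram dimensions, which the short Python sketch below reproduces. The total scan range and the assumption that each detector's sub-steps tile a contiguous fan are hypothetical, not taken from the paper.

```python
import numpy as np

# Sampling scheme from the abstract: 15 physical detectors, each stepped
# 10 times by 0.24 deg, giving 150 virtual detectors per projection.
N_DETECTORS = 15
STEPS_PER_DETECTOR = 10
STEP_DEG = 0.24
ROTATION_STEP_DEG = 2.0
SCAN_RANGE_DEG = 360.0          # assumption: one full rotation (not stated in the abstract)

virtual_per_projection = N_DETECTORS * STEPS_PER_DETECTOR     # 150
n_projections = int(SCAN_RANGE_DEG / ROTATION_STEP_DEG)       # 180

# Fan angles of the virtual detectors, assuming each detector's 10 sub-steps
# tile a contiguous 2.4 deg span (hypothetical layout).
detector_pitch_deg = STEPS_PER_DETECTOR * STEP_DEG
fan_angles = np.array([d * detector_pitch_deg + s * STEP_DEG
                       for d in range(N_DETECTORS)
                       for s in range(STEPS_PER_DETECTOR)])

print(virtual_per_projection, n_projections, fan_angles.size)  # 150 180 150
```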
Brazilian Journal of Radiation Sciences, 2019
In this work, the pathway of the chemical product and the kinetic parameters were evaluated in a laboratory plant, using 0.4 GBq (10 mL) of 67Ga citrate as radiotracer and 18 NaI(Tl) radiation detectors. The AnaComp program was used to estimate the kinetic parameters of the acetone production. The yield of the acetone production was estimated as the percentage ratio between the area under the curve (AUC) of the final product compartment profile and that of the concentration found inside the chemical reactor, giving a yield of 87% during the first 30 minutes of reaction.
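A minimal sketch of the yield estimate described above, assuming the two concentration-versus-time profiles are available as sampled arrays; the variable and function names are illustrative, not the paper's.

```python
import numpy as np

def yield_percent(t, product_profile, reactor_profile):
    """Yield as the percentage ratio of the areas under the
    concentration-versus-time curves of the product compartment and the reactor."""
    auc_product = np.trapz(product_profile, t)
    auc_reactor = np.trapz(reactor_profile, t)
    return 100.0 * auc_product / auc_reactor

# Illustrative use with hypothetical detector count-rate profiles sampled each minute
t = np.arange(0, 31)                      # first 30 minutes of reaction
# product_profile, reactor_profile = ...  # measured count rates (not reproduced here)
# print(yield_percent(t, product_profile, reactor_profile))
```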
Brazilian Journal of Radiation Sciences, 2019
The greatest impact of computed tomography (CT) applications currently occurs in medicine. In industry, there is much interest in using CT in order to examine the interior of (i) industrial objects and (ii) machines and their means of production. The purpose of this tomography is to (a) control the quality of the final product and (b) optimize production and analyze the quality of the means of production. An instant non-scanning tomography system is being developed at IPEN. This tomography system, comprising different collimators, was simulated with Monte Carlo using MCNP4C. The image quality was evaluated with Matlab® 2013b by analyzing the contrast-to-noise ratio (CNR), the root mean square error (RMSE), the signal-to-noise ratio (SNR) and the spatial resolution through the Modulation Transfer Function (MTF(f)), in order to identify which collimator fits the tomography system under development best. Three situations were simulated: (i) no collimator; (ii) a ø 25 x 50 mm² cylindrical collimator with a septum of ø5.0...
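As a pointer to how the MTF(f) quoted above is commonly obtained from a reconstructed slice, the sketch below derives it from an edge profile across a sharp boundary of the phantom. It is written in Python/NumPy rather than the Matlab® used in the paper, and the function and parameter names are illustrative assumptions.

```python
import numpy as np

def mtf_from_edge(edge_profile, pixel_size_mm):
    """Modulation Transfer Function from an edge spread function (ESF).

    edge_profile  : 1-D samples across a sharp edge in the reconstructed image
    pixel_size_mm : sample spacing, so frequencies come out in cycles/mm
    """
    lsf = np.gradient(edge_profile)                        # line spread function = d(ESF)/dx
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                                     # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_size_mm)     # spatial frequency, cycles/mm
    return freqs, mtf
```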