Papers by Amin Shokrollahi
Oil & Gas Science and Technology – Revue d’IFP Energies nouvelles
Nowadays, the remarkable growth of energy consumption has shifted global attention to the production and utilization of heavy crude oils such as bitumen resources around the globe. Amongst the bitumen properties, density is an important parameter affecting bitumen recovery efficiency and transportation quality. To ease bitumen production, n-alkanes are usually injected into the reservoir to reduce its viscosity and density; however, few models in the literature focus on proper estimation/prediction of the density of diluted bitumen mixtures. In the present work, a new method is proposed, using Gene Expression Programming (GEP) for the first time, to accurately predict the bitumen/n-tetradecane mixture density as a function of thermodynamic conditions, namely solvent composition, pressure, and temperature. The proposed model predicts the mixture density with an average Absolute Relative Deviation (AARD%) of 0.3016% and an R-squar...
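As a small illustration of the reported error metric (not taken from the paper), the AARD% between measured and predicted densities can be computed as follows; the density values below are placeholders, not data from the study:

```python
import numpy as np

def aard_percent(measured, predicted):
    """Average Absolute Relative Deviation between measured and predicted values, in percent."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    return 100.0 * np.mean(np.abs((predicted - measured) / measured))

# placeholder mixture densities in kg/m^3 (illustrative only)
rho_meas = np.array([985.2, 972.4, 960.1])
rho_pred = np.array([984.9, 973.0, 959.8])
print(f"AARD% = {aard_percent(rho_meas, rho_pred):.4f}")
```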
Petroleum, 2016
Controlling sand production in the petroleum industry has been a long-standing problem for more than 70 years. To provide technical support for sand control strategy, it is necessary to predict the conditions at which sanding occurs. To this end, for the first time, the least squares support vector machine (LSSVM) classification approach, as a novel technique, is applied to identify the conditions under which sand production occurs. The model presented in this communication takes into account different parameters that may play a role in sanding. The performance of the proposed LSSVM model is examined using field data reported in the open literature. It is shown that the developed model can accurately predict sand production in a real field. The results of this study indicate that implementing LSSVM modeling can effectively help completion designers to make a timely sand control plan with the least deterioration of production.
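For orientation only, here is a minimal LSSVM binary classifier in its standard dual formulation (training reduces to solving one linear system). The RBF kernel, hyperparameters, and the random stand-in features are assumptions for the sketch, not the paper's actual setup:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # y in {-1, +1}; solve the LSSVM linear system for the bias b and multipliers alpha.
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_new, X, y, alpha, b, sigma=1.0):
    # Classify new points by the sign of the kernel expansion plus bias.
    return np.sign(rbf_kernel(X_new, X, sigma) @ (alpha * y) + b)

# toy usage with random data standing in for sanding-condition features
rng = np.random.default_rng(0)
X = rng.random((40, 5))                            # hypothetical well/rock parameters
y = np.where(X[:, 0] + X[:, 1] > 1.0, 1.0, -1.0)   # hypothetical sanding label
alpha, b = lssvm_train(X, y)
print(lssvm_predict(X[:5], X, y, alpha, b))
```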
Journal of Unconventional Oil and Gas Resources, 2015
Natural gas is a very important energy source. The production, processing, and transportation of natural gas can be affected significantly by gas hydrates. Pipeline blockages due to hydrate formation cause operational problems and a decrease in production performance. This paper presents an improved artificial neural network (ANN) method to predict the hydrate formation temperature (HFT) for a wide range of gas mixtures. A new approach was used to define the input variables for formation of a hydrate structure according to each species present in natural gas mixtures. This approach resulted in a strong network with precise predictions, especially in the case of sour gases. This study also presents a detailed comparison of the results predicted by this ANN model with those of other correlations and thermodynamics-based models for estimation of the HFT. The results show that the proposed ANN model predictions are in much better agreement with the experimental data than the existing models and correlations. Finally, outlier detection was performed on the entire data set to identify any defective measurements in the experimental data.
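As a rough sketch of this kind of regression model (not the network architecture or input definition used in the paper), a small feed-forward ANN mapping assumed inputs, per-species composition plus pressure, to HFT might look like the following; the data here are random placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((200, 8))          # placeholder: 7 component mole fractions + pressure
y = rng.random(200) * 20 + 273    # placeholder HFT values in kelvin

# standardize inputs, then fit a small two-hidden-layer network
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict(X[:3]))       # predicted HFT for the first three mixtures
```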
In this paper, I will give a brief introduction to the theory of low-density parity-check codes and their decoding. I will emphasize the case of correcting erasures, as it is still the best understood and most accessible case. At the end of the paper, I will also describe more recent developments.
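For concreteness, here is a toy peeling decoder for the erasure case emphasized above (my own illustration with a tiny parity-check matrix, not code from the paper): whenever a parity check involves exactly one erased bit, that bit is recovered from the parity of the known bits, and the process repeats until no such check remains.

```python
import numpy as np

def peel_decode(H, word, erased):
    # H: binary parity-check matrix; word: received bits (dummy values at erased
    # positions); erased: boolean mask of unknown positions.
    word, erased = word.copy(), erased.copy()
    progress = True
    while progress and erased.any():
        progress = False
        for row in H:
            idx = np.flatnonzero(row & erased)        # erased positions in this check
            if len(idx) == 1:                         # check with exactly one erasure
                known = row.astype(bool) & ~erased
                word[idx[0]] = word[known].sum() % 2  # parity determines the erased bit
                erased[idx[0]] = False
                progress = True
    return word, erased    # any True left in 'erased' means the bit was not recovered

# toy example: checks c0+c1=0 and c1+c2=0, bit 2 erased in the received word (1, 1, ?)
H = np.array([[1, 1, 0], [0, 1, 1]])
word = np.array([1, 1, 0])                      # erased position holds a dummy value
erased = np.array([False, False, True])
print(peel_decode(H, word, erased))             # recovers (1, 1, 1), nothing left erased
```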
2000 IEEE International Symposium on Information Theory (Cat. No.00CH37060), 2000
Multiple antennas can greatly increase the data rate and reliability of a wireless communication link in a fading environment, but the practical success of using multiple antennas depends crucially on our ability to design high-rate space-time constellations ...
2000 IEEE Wireless Communications and Networking Conference. Conference Record (Cat. No.00TH8540), 2000
We construct signal constellations for differential transmission with multiple base-station antennas. The signals are derived using the theory of fixed-point-free groups and are especially suitable for mobile cellular applications because they do not require the handset to have more than one antenna or to know the time-varying propagation environment. Yet we achieve full transmitter diversity and excellent performance gains over ...
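To illustrate the general differential-transmission idea (this sketch uses a simple diagonal cyclic-group constellation with assumed parameters, not the fixed-point-free group constellations constructed in the paper): each transmitted block is the previous block multiplied by a unitary signal matrix, and the receiver detects the data by comparing consecutive received blocks, with no knowledge of the channel.

```python
import numpy as np

M, L = 2, 8                        # transmit antennas, constellation size
u = np.array([1, 3])               # assumed diagonal exponents for the demo
V = [np.diag(np.exp(2j * np.pi * u * z / L)) for z in range(L)]   # unitary constellation

rng = np.random.default_rng(1)
H = (rng.normal(size=(M, 1)) + 1j * rng.normal(size=(M, 1))) / np.sqrt(2)  # unknown channel

S_prev = np.eye(M)                 # reference block
Y_prev = S_prev @ H                # previous received block
z_true = 5
S = V[z_true] @ S_prev             # differential encoding of the data symbol z_true
Y = S @ H + 0.05 * (rng.normal(size=(M, 1)) + 1j * rng.normal(size=(M, 1)))

# Differential detection: pick the matrix that best links consecutive received blocks.
z_hat = min(range(L), key=lambda z: np.linalg.norm(Y - V[z] @ Y_prev))
print(z_true, z_hat)
```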
Source and channel coding using fountain codes (also known as rateless erasure codes) have attracted much interest in recent years. The capacity-achieving property of these codes, along with their low decoding complexity, has made them increasingly popular both for compression and for protection against channel disruptions. One of the major problems of compression and source coding is the Slepian-Wolf problem. The problem gained importance because it shows that, whether the sources X and Y are encoded separately or jointly, the total rate sufficient for reconstructing them remains the same. The report is mostly devoted to the fountain-code approach to the Slepian-Wolf problem. A suitable Raptor code design allows us to achieve any arbitrary point of the Slepian-Wolf region. Two different points of the Slepian-Wolf region are achieved, lying only 3.8% and 5.4% off the Slepian-Wolf limit, respectively. This fountain-code approach is expected to outperform classical code approaches, such as Low-Density Parity-Check (LDPC) codes, in terms of decoding complexity and proximity to the desired Slepian-Wolf point. In a later part of the thesis, Belief Propagation (BP) is applied to LT-Markov sub-graphs; the Gilbert-Elliott (GE) channel is the assumed Markov channel in this work. We obtain an overhead of 16.7% when applying the earlier Binary Symmetric Channel (BSC) design to LT-Markov sub-graphs.
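As background for the fountain-code machinery used here, the following is a minimal LT encoder sketch (the robust-soliton parameters and block length are my own choices, not those of the thesis): each output symbol is the XOR of a randomly chosen set of source symbols, with the set size drawn from the robust-soliton degree distribution.

```python
import numpy as np

def robust_soliton(k, c=0.1, delta=0.5):
    # Robust-soliton degree distribution (rho + tau, normalized) over degrees 1..k.
    s = c * np.log(k / delta) * np.sqrt(k)
    rho = np.array([1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)])
    tau = np.zeros(k)
    for d in range(1, int(k / s)):
        tau[d - 1] = s / (k * d)
    tau[int(k / s) - 1] = s * np.log(s / delta) / k
    mu = rho + tau
    return mu / mu.sum()

def lt_encode(source_bits, n_out, rng):
    k = len(source_bits)
    dist = robust_soliton(k)
    symbols = []
    for _ in range(n_out):
        d = rng.choice(np.arange(1, k + 1), p=dist)          # sample a degree
        nbrs = rng.choice(k, size=d, replace=False)          # pick d source symbols
        symbols.append((nbrs, source_bits[nbrs].sum() % 2))  # XOR of the chosen symbols
    return symbols

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=100)
encoded = lt_encode(bits, 120, rng)   # roughly 20% overhead worth of output symbols
```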
We outline a procedure for using pseudorandom generators to construct binary codes with good properties, assuming the existence of sufficiently hard functions. Specifically, we give a polynomial-time algorithm which, for all integers n and k, constructs polynomially many linear codes of block length n and dimension k, most of which achieve the Gilbert-Varshamov bound. The success of the procedure relies on the assumption that the exponential-time class E = DTIME[2^O(n)] is not contained in the sub-exponential space class DSPACE[2^o(n)].
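For reference, the Gilbert-Varshamov bound mentioned above guarantees binary codes of rate R ≥ 1 − H(δ) at relative distance δ, where H is the binary entropy function. A tiny helper (mine, not from the paper) that evaluates this bound:

```python
import math

def binary_entropy(p):
    # H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gv_rate(delta):
    # Largest rate guaranteed by the binary GV bound at relative distance delta.
    return max(0.0, 1.0 - binary_entropy(delta))

print(gv_rate(0.11))   # ~0.5: rate-1/2 binary codes with relative distance ~0.11 exist
```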
Lecture Notes in Computer Science, 2005
We analyze a generalization of a recent algorithm of Bleichenbacher et al. for decoding interleaved codes on the Q-ary symmetric channel for large Q. We show that for any m and any ε > 0 the new algorithms can decode up to a fraction of at least (βm/(βm + 1))(1 − R − 2Q^(−1/(2m))) − ε errors, where β = ln(q^m − 1)/ln(q^m), and that the error probability of the decoder is upper bounded by O(1/q^n), where n is the block length. The codes we construct do not have, a priori, any bound on their length.
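To make the decoding-radius expression concrete, here is a quick numerical evaluation (my own illustration, assuming Q = q^m and using sample values of q, m, and R that are not from the paper):

```python
import math

def decoding_radius(q, m, R):
    # Evaluate (beta*m/(beta*m + 1)) * (1 - R - 2*Q^(-1/(2m))) with Q = q^m (assumed).
    Q = q ** m
    beta = math.log(Q - 1) / math.log(Q)
    return (beta * m / (beta * m + 1)) * (1 - R - 2 * Q ** (-1 / (2 * m)))

# sample parameters: q = 256, rate R = 1/2, increasing interleaving depth m
for m in (1, 2, 4, 8):
    print(m, round(decoding_radius(q=256, m=m, R=0.5), 4))
```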
The polynomial time algorithm of Lenstra, Lenstra, and Lovász [15] for factoring integer polynomials, and variants thereof, have been widely used to show that various computational problems in number theory have polynomial time solutions. Among them is the problem of factoring polynomials over algebraic number fields, which is itself used as a major subroutine for several other algorithms. Although a theoretical breakthrough, algorithms based on factorization of polynomials are notoriously slow and hard to implement, with running times ranging between $O(n^{12})$ and $O(n^{18})$ depending on which variant of the lattice basis reduction is used. Here, n is an upper bound for the maximum of the degrees and the bit-lengths of the coefficients of the polynomials involved. On the other hand, in many situations one does not need the full power of factorization, so one may ask whether there exist faster algorithms in these cases. In this paper we develop more efficient Monte Carlo algorithms to ...
We design sequences of low-density parity-check codes that provably perform at rates extremely close to the Shannon capacity. The codes are built from highly irregular bipartite graphs with carefully chosen degree patterns on both sides. Our theoretical analysis of the codes is based on [1]. Additionally, based on the assumption that the underlying communication channel is symmetric, we prove that the probability densities at the message nodes of the graph satisfy a certain symmetry. This enables us to derive a succinct description of the density evolution for the case of a belief propagation decoder. Furthermore, we prove a stability condition which implies an upper bound on the fraction of errors that a belief propagation decoder can correct when applied to a code induced from a bipartite graph with a given degree distribution. Our codes are found by optimizing the degree structure of the underlying graphs. We develop several strategies to perform this optimization. We also ...
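Density evolution is simplest to state for the binary erasure channel; the following toy recursion (my own illustration, not the paper's analysis, which covers general symmetric channels) tracks the erasure fraction passed along edges of an irregular ensemble described by edge-degree polynomials λ(x) and ρ(x), via x_{t+1} = ε·λ(1 − ρ(1 − x_t)).

```python
import numpy as np

def density_evolution_bec(eps, lam, rho, iters=2000):
    # lam[i] / rho[i]: fraction of edges attached to variable / check nodes of degree i+1.
    lam, rho = np.asarray(lam), np.asarray(rho)
    exp_l = np.arange(len(lam))          # exponents for lambda(y) = sum_i lam[i] * y^i
    exp_r = np.arange(len(rho))          # exponents for rho(z)   = sum_i rho[i] * z^i
    x = eps
    for _ in range(iters):
        y = 1 - np.sum(rho * (1 - x) ** exp_r)
        x = eps * np.sum(lam * y ** exp_l)
    return x    # close to 0 means the ensemble decodes successfully at erasure rate eps

# (3,6)-regular ensemble: every edge has variable degree 3 and check degree 6.
lam = [0, 0, 1.0]            # lambda(x) = x^2
rho = [0, 0, 0, 0, 0, 1.0]   # rho(x) = x^5
print(density_evolution_bec(0.40, lam, rho))   # below the ~0.429 threshold: tends to 0
print(density_evolution_bec(0.45, lam, rho))   # above the threshold: stays bounded away from 0
```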
Algebraic coding theory is one of the areas that routinely gives rise to computational problems involving various structured matrices, such as Hankel, Vandermonde, and Cauchy matrices, and certain generalizations thereof. Their structure has often been used to derive efficient algorithms; however, the use of the structure was pattern-specific, without applying a unified technique. In contrast, in several other areas where structured matrices are also widely encountered, the concept of displacement rank was found to be useful for deriving efficient algorithms in a unified manner (i.e., not depending on a particular pattern of structure). The latter technique allows one to "compress," in a unified way, different types of n × n structured matrices to only αn parameters. This typically leads to computational savings (in many applications the number α, called the displacement rank, is a small fixed constant).
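As a quick illustration of the displacement-rank idea (a Toeplitz example of my own, not taken from the paper): applying the shift operator to a Toeplitz matrix and subtracting leaves a matrix of rank at most 2, so the whole n × n matrix is captured by O(n) parameters.

```python
import numpy as np
from scipy.linalg import toeplitz

n = 8
rng = np.random.default_rng(0)
T = toeplitz(rng.random(n), rng.random(n))   # arbitrary Toeplitz matrix
Z = np.diag(np.ones(n - 1), k=-1)            # lower shift matrix: Z e_i = e_{i+1}

# Displacement of T with respect to (Z, Z): nonzero only in the first row and column.
displacement = T - Z @ T @ Z.T
print(np.linalg.matrix_rank(displacement))   # prints 2, the displacement rank of T
```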
IEEE Transactions on Information Theory, 2015