Papers by Taha Hussein Ali
RSC Advances, 2021
Chemical investigation of Aptenia cordifolia roots extract, using chromatographic and spectroscopic techniques, resulted in isolation and identification of eight known compounds.

This study investigates the VAR time series data of the overall expenditures and income in the Kurdistan Region of Iraq. It applies multivariate wavelet shrinkage within the VAR model, comparing it to traditional methods to identify the most appropriate model. The chosen model will then be used to predict general expenditures and revenues for the years 2022-2026. The analysis involved assessing the stationarity of the expenditure and revenue time series, which are interrelated variables during the interval 1997-2021, and identifying the overall trend through differencing to achieve stationarity. The proposed method incorporated multivariate wavelet shrinkage in the VAR model to address data contamination in expenditures and revenue using various wavelets like Coiflets, Daubechies, Symlets, and Fejér-Korovkin at different orders. Threshold levels were estimated using the SURE method and soft thresholding rules to denoise the data for the following analysis within the VAR model. Model selection was based on Akaike and Bayes information criteria. The analysis, conducted using MATLAB, indicated the superiority of the proposed method over traditional methods, forecasting a continued rise in expenditures and revenues for the Iraqi Kurdistan region from 2022 to 2026. The findings suggest that advanced techniques can offer more accurate economic forecasts, benefiting regional planning and policy-making.
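To illustrate the denoising step described above, the following is a minimal MATLAB sketch of wavelet soft thresholding applied to two series before they enter a VAR model. It is not the paper's implementation: a single-level Haar transform and the universal threshold stand in for the higher-order wavelets and SURE threshold used in the study, and the data and variable names are illustrative assumptions.

% Minimal sketch of wavelet soft-threshold denoising for two series
% (e.g., expenditures and revenues). A one-level Haar transform and the
% universal threshold are used as a simpler stand-in for the paper's
% higher-order wavelets and SURE threshold; all names are illustrative.
rng(1);
n = 64;                                      % even length assumed
X = cumsum(randn(n, 2)) + 0.5*randn(n, 2);   % two noisy illustrative series
Xden = zeros(size(X));
for j = 1:2
    x = X(:, j);
    % one-level Haar analysis (orthonormal)
    a = (x(1:2:end) + x(2:2:end)) / sqrt(2);   % approximation coefficients
    d = (x(1:2:end) - x(2:2:end)) / sqrt(2);   % detail coefficients
    % robust noise estimate and universal threshold
    sigma  = median(abs(d)) / 0.6745;
    lambda = sigma * sqrt(2*log(n));
    % soft thresholding of the detail coefficients
    d = sign(d) .* max(abs(d) - lambda, 0);
    % one-level Haar synthesis
    xr = zeros(n, 1);
    xr(1:2:end) = (a + d) / sqrt(2);
    xr(2:2:end) = (a - d) / sqrt(2);
    Xden(:, j) = xr;
end
% Xden would then replace X as the input to the VAR modelling step.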

In this paper, a Bayesian estimator of the survival function is used, based on a mixed distribution with the exponential distribution as the primary distribution and the Gamma distribution as the probability function of the data. The data were collected from Rizgari Hospital in Erbil for brain stroke patients between 2015 and 2016. The estimator was compared with the traditional method, which assumes exponential and Gamma distributions, on the basis of goodness-of-fit tests. Since the calculated χ² value (10.767) is less than the tabulated χ² value (11.345) for the variable x, we accept the null hypothesis (H0), which states that the data follow an exponential distribution; this is confirmed by the p-value (0.014), which is greater than the 1% significance level, so we conclude that the data have an exponential distribution. For the variable t, the data follow the Gamma distribution, because the calculated χ² is less than the tabulated χ² (0.476 and 11.335, respectively), so the null hypothesis is again accepted; this is also confirmed by the p-value (0.924), which is greater than the 1% level. The analysis was carried out with the EasyFit program as well as the MATLAB and SPSS statistical packages. We conclude that the proposed mixed survival function for brain stroke patients is appropriate and efficient. H0: the variable x follows an exponential distribution.
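The following MATLAB sketch shows the general shape of the construction described above: a survival function built from an exponential component and a Gamma component, together with the chi-square decision rule reported in the abstract. The mixing weight and parameter values are illustrative assumptions and are not the estimates obtained in the paper.

% Sketch: survival function of a two-component mixture (exponential + Gamma)
% and the chi-square goodness-of-fit decision rule. Parameter values and the
% mixing weight below are illustrative only.
t      = (0:0.5:30)';        % time grid (e.g., months of follow-up)
thetaE = 8;                  % exponential mean (assumed)
aG     = 2;  bG = 5;         % Gamma shape and scale (assumed)
w      = 0.4;                % mixing weight (assumed)
S_exp = exp(-t/thetaE);                  % exponential survival
S_gam = 1 - gammainc(t/bG, aG);          % Gamma survival (regularized)
S_mix = w*S_exp + (1 - w)*S_gam;         % mixture survival function
plot(t, [S_exp, S_gam, S_mix]); grid on
legend('Exponential', 'Gamma', 'Mixture'); xlabel('t'); ylabel('S(t)')
% Goodness-of-fit decision as reported in the abstract:
chi2_cal = 10.767;           % calculated statistic for variable x
chi2_tab = 11.345;           % tabulated critical value at the 1% level
if chi2_cal < chi2_tab
    disp('Fail to reject H0: x follows an exponential distribution.')
end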

IRAQI JOURNAL OF STATISTICAL SCIENCES, Nov 30, 2023
The research presents a comparative study of time series analysis and forecasting using VAR models, which depend on the existence of a significant relationship between the studied variables, and ARIMAX models, which depend on the linear effect of the independent variables (model input) on the dependent variable (model output). The models were analyzed using time series data for the Iraqi general budget for the period 2004-2020, represented by foreign reserves and government spending. The government expenditure series was forecast for the years 2021-2024, and the efficiency of the estimated models was compared through the mean square error (MSE) criterion. The analysis was carried out using MATLAB, and the results showed that the VAR model was more efficient than the ARIMAX model for these data and that the increase in Iraqi foreign reserves and government spending will continue during the coming period (2021-2024).
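As a rough illustration of the VAR side of this comparison, the sketch below fits a VAR(1) to a bivariate system by ordinary least squares, computes the in-sample MSE per equation, and produces multi-step forecasts. The data are simulated and the lag order is fixed at one for brevity; these are assumptions, not the paper's specification (the Econometrics Toolbox functions varm/estimate could replace the hand-rolled regression).

% Sketch: VAR(1) estimation by OLS and h-step forecasting for a bivariate
% system (e.g., foreign reserves and government spending). Simulated data;
% lag order p = 1 assumed.
rng(2);
T = 17;                                   % e.g., annual data 2004-2020
Y = cumsum(0.3 + randn(T, 2));            % illustrative trending series
% Regression: Y_t = c + A*Y_{t-1} + e_t
Ylag = Y(1:end-1, :);
Ycur = Y(2:end,   :);
Z    = [ones(T-1, 1), Ylag];              % regressors with intercept
B    = Z \ Ycur;                          % 3 x 2 coefficient matrix
c    = B(1, :)';                          % intercepts
A    = B(2:3, :)';                        % lag coefficient matrix
% In-sample mean squared error per equation
E   = Ycur - Z*B;
MSE = mean(E.^2);
% h-step-ahead forecasts (e.g., 2021-2024)
h     = 4;
Yhat  = zeros(h, 2);
ylast = Y(end, :)';
for k = 1:h
    ylast     = c + A*ylast;
    Yhat(k,:) = ylast';
end
disp(MSE); disp(Yhat)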

Quality and Reliability Engineering International
In this research, three new robust multivariate charts corresponding to the |S| chart were proposed, which are robust to outliers, using three methods: the Rousseeuw and Leroy algorithm, the Maronna and Zamar estimator, and the family of 'concentration algorithms' of Olive and Hawkins. The proposed charts were compared with Shewhart's classical method on the basis of the total variance (trace of the covariance matrix), the generalized variance (determinant of the covariance matrix), and the difference between the upper and lower control limits, in order to obtain the charts most efficient against outliers, using simulation and real data and a MATLAB program designed for this purpose. The study concluded that the proposed charts dealt with the problem of the influence of outliers and were more efficient than the classical method; in addition, the proposed robust chart based on the Orthogonalized Gnanadesikan-Kettenring estimator was more efficient than the rest of the pr...
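For orientation, the sketch below shows only the classical side of this comparison: the generalized-variance statistic (determinant of each subgroup covariance) monitored against simulation-based probability limits. The robust charts proposed in the paper would replace the classical subgroup covariance with robust estimators such as OGK; the subgroup sizes, in-control covariance, and probability limits below are assumptions.

% Sketch: classical generalized-variance (|S|) chart statistic with
% Monte Carlo control limits. Settings are illustrative only.
rng(3);
p = 2;  n = 10;  m = 25;                 % variables, subgroup size, subgroups
Sigma0 = [1 0.5; 0.5 2];                 % assumed in-control covariance
L = chol(Sigma0, 'lower');
% Monitoring statistic for each subgroup: determinant of its covariance
detS = zeros(m, 1);
for i = 1:m
    Xi      = (L * randn(p, n))';        % one in-control subgroup
    detS(i) = det(cov(Xi));
end
% Control limits by Monte Carlo under the in-control model
B  = 20000;
d0 = zeros(B, 1);
for b = 1:B
    Xb    = (L * randn(p, n))';
    d0(b) = det(cov(Xb));
end
q   = sort(d0);
LCL = q(max(1, round(0.00135*B)));       % lower probability limit
UCL = q(round(0.99865*B));               % upper probability limit
out = find(detS < LCL | detS > UCL);     % subgroups signalling out-of-control
fprintf('LCL = %.3f, UCL = %.3f, signals = %d\n', LCL, UCL, numel(out));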
Optics and Lasers in Engineering, 2024
TANMIYAT AL-RAFIDAIN, 2007

Mağallaẗ tikrīt li-l-ʻulūm al-idāriyyaẗ wa-al-iqtiṣādiyyaẗ, May 25, 2022
The research presents a new hybrid model proposed for accurate time series prediction, which combines wavelet transforms to de-noise the data before using it in an artificial neural network applied to time series. To examine the effectiveness and efficiency of the proposed method for artificial neural network prediction models, it was first applied to generated time series data (first-order autoregression) through several simulation examples, changing the parameter values and sample size, with each data set generated 25 times; second, it was applied to real data representing the monthly average price of an ounce of gold in the Kurdistan Region. The simulation and real-data results of the proposed and traditional methods were compared using a MATLAB program designed for this purpose, based on the criteria MSE, MAD, and R2. The research concluded that the proposed method is more accurate than the traditional method in estimating the parameters of the time series model.
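The sketch below shows the evaluation framework this abstract refers to: simulating a first-order autoregressive series and computing the MSE, MAD, and R2 criteria. A plain least-squares AR(1) fit stands in for the paper's wavelet/neural-network hybrid; the parameter value and sample size are assumptions.

% Sketch: simulate an AR(1) series and compute the comparison criteria
% (MSE, MAD, R^2). A least-squares AR(1) fit stands in for the hybrid model.
rng(4);
n   = 200;  phi = 0.7;
e   = randn(n, 1);
y   = filter(1, [1 -phi], e);            % AR(1): y_t = phi*y_{t-1} + e_t
% Least-squares AR(1) fit and one-step-ahead fitted values
x      = y(1:end-1);
z      = y(2:end);
phihat = (x' * z) / (x' * x);
zhat   = phihat * x;
res    = z - zhat;
MSE = mean(res.^2);
MAD = mean(abs(res));
R2  = 1 - sum(res.^2) / sum((z - mean(z)).^2);
fprintf('phi-hat = %.3f, MSE = %.3f, MAD = %.3f, R2 = %.3f\n', ...
        phihat, MSE, MAD, R2);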
Psychiatry and Clinical Psychopharmacology, 2017
In recent years, ecstasy has become a very popular recreational drug. To date, very little data is available regarding its toxic side effects. Specifically, reports relating ecstasy use to the clinical diagnosis of leukoencephalopathy are quite rare. In this report, we present an interesting case of a 33-year-old female with a recent history of ecstasy abuse who presented with delirium and cognitive impairment. In addition, we review published case reports that explore the connection between MDMA and leukoencephalopathy.
Communications in Statistics - Simulation and Computation, 2021
This paper proposes a new improvement of the Nadaraya-Watson kernel non-parametric regression estimator and the bandwidth of this new improvement is obtained depending on universal threshold level ...
In this research, a new improvement of the Nadaraya-Watson kernel non-parametric regression estimator is proposed and the bandwidth of this new improvement is obtained depending on the three differ...
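For reference, the following is a minimal sketch of the standard Nadaraya-Watson estimator with a Gaussian kernel. The two papers above derive the bandwidth from wavelet threshold levels; here a simple rule-of-thumb bandwidth is used instead, purely as an assumption for illustration.

% Sketch: Nadaraya-Watson kernel regression with a Gaussian kernel.
% The bandwidth is a rule-of-thumb choice, not the papers' threshold-based one.
rng(5);
n  = 150;
x  = sort(2*pi*rand(n, 1));
y  = sin(x) + 0.3*randn(n, 1);           % noisy illustrative data
h  = 1.06 * std(x) * n^(-1/5);           % rule-of-thumb bandwidth (assumed)
xg = linspace(min(x), max(x), 200)';     % evaluation grid
mhat = zeros(size(xg));
for i = 1:numel(xg)
    u       = (xg(i) - x) / h;
    w       = exp(-0.5*u.^2);            % Gaussian kernel weights
    mhat(i) = sum(w .* y) / sum(w);      % weighted local average
end
plot(x, y, '.', xg, mhat, '-', xg, sin(xg), '--')
legend('data', 'Nadaraya-Watson fit', 'true mean')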
Polytechnic Journal, Mar 17, 2018

IRAQI JOURNAL OF STATISTICAL SCIENCES
There are many statistical methods for forecasting time series without any input variables, such as the autoregressive integrated moving average (ARIMA) models. In this research, some linear dynamic systems, represented by ARIMA models with exogenous input variables (ARIMAX models), were used to forecast crude oil prices (the output variable) for the OPEC organization with the help of crude oil production (the input variable), based on data from 1973 to 2018. Both the traditional ARIMAX method and the proposed method (bivariate wavelet filtering) were applied to the time series data in order to select one of them for forecasting by comparing measures of accuracy such as MSE, FPE, and AIC. Crude oil prices for OPEC were then modelled using the traditional ARIMAX models and ARIMAX models with bivariate wavelet filtering, in particular the bivariate Haar wavelet. The main conclusions of the research were that bivariate wavelet filtering made the proposed model more appropriate for forecasting crude oil prices than the traditional models, and that the forecast crude oil price in 2020 will be fairly lower than in 2019.
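A minimal sketch of the input-output structure behind an ARIMAX-type model is shown below: a first-order autoregression of the output (price) on its own lag and on an exogenous input (production), fitted by OLS and summarized by MSE and a Gaussian AIC. The data, orders, and names are illustrative assumptions, not the paper's fitted model.

% Sketch: ARX(1) model y_t = c + a*y_{t-1} + b*x_t + e_t fitted by OLS,
% a minimal stand-in for the ARIMAX setting (output: oil price,
% input: oil production). Data and orders are illustrative.
rng(6);
T = 46;                                   % e.g., annual data 1973-2018
x = 25 + cumsum(randn(T, 1));             % exogenous input (production)
y = zeros(T, 1);
for t = 2:T
    y(t) = 2 + 0.8*y(t-1) + 0.3*x(t) + randn;   % simulated output (price)
end
Z    = [ones(T-1,1), y(1:end-1), x(2:end)];     % regressors
th   = Z \ y(2:end);                            % [c; a; b]
res  = y(2:end) - Z*th;
k    = numel(th);
nobs = T - 1;
MSE  = mean(res.^2);
AIC  = nobs*log(MSE) + 2*k;               % Gaussian AIC up to a constant
fprintf('c=%.2f a=%.2f b=%.2f  MSE=%.3f  AIC=%.1f\n', th, MSE, AIC);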
Books by Taha Hussein Ali

Salahaldin University, 2024
Numerical analysis is a branch of mathematics and computer science dealing with the study of algorithms for the numerical solution of problems formulated and studied in other branches of mathematics. It is an applied mathematics technique that allows staggeringly large amounts of data to be processed and analyzed for trends, thereby aiding in forming conclusions, and it provides massive increases in the speed and usefulness of calculations. The tasks of numerical analysis are twofold. First, as a branch of mathematical analysis, it includes the development of fast and reliable numerical methods together with a suitable error analysis, and it is concerned with all aspects of the numerical solution of a problem, from the theoretical development and understanding of numerical methods to their practical implementation as reliable and efficient computer programs. Second, on the computer science side, numerical analysis software is being embedded in popular software packages, e.g., spreadsheet programs, allowing many people to perform modelling even when they are unaware of the mathematics involved in the process. This requires creating reliable, efficient, and accurate numerical analysis software, and it requires designing problem-solving environments (PSE) in which it is relatively easy to model a given situation. The overall goal of the field of numerical analysis is the design and analysis of techniques that give approximate but accurate solutions to hard problems. The broad objectives are to learn about existence and uniqueness criteria for numerical methods and about convergence criteria. The specific objectives of the course are that the student should be able to find numerical approximations to the roots of an equation by Newton's method and the secant method; apply Taylor and Maclaurin series to mathematical problems; and use finite-difference interpolation (Newton forward and Newton backward) and Lagrange interpolation. MATLAB provides many routines for standard tasks in computing, ranging from elementary math operations, over linear algebra, statistics and random numbers, interpolation, optimization, Fourier analysis and filtering, and sparse matrix computation, to computational geometry. More information can be found, as always, in the MATLAB documentation. A number of software packages are predominant in the academic environment, and versions of these packages are available for most common computer systems. The text assumes that the student is using MATLAB for computations.
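As a small taste of the course topics listed above, the MATLAB sketch below works through Newton's method for root finding and Lagrange interpolation. The function, starting point, and interpolation nodes are illustrative choices, not exercises from the book.

% Sketch of two course topics: Newton's method and Lagrange interpolation.
% --- Newton's method for f(x) = x^2 - 2 (root sqrt(2)) ---
f  = @(x) x.^2 - 2;
df = @(x) 2*x;
x  = 1.5;                                 % starting guess
for k = 1:8
    x = x - f(x)/df(x);                   % Newton update
end
fprintf('Newton root estimate: %.10f\n', x);
% --- Lagrange interpolation through nodes (xi, yi), evaluated at xq ---
xi = [0 1 2 3];
yi = exp(xi);                             % sample a known function
xq = 1.5;
P  = 0;
for j = 1:numel(xi)
    Lj = 1;
    for m2 = 1:numel(xi)
        if m2 ~= j
            Lj = Lj * (xq - xi(m2)) / (xi(j) - xi(m2));
        end
    end
    P = P + yi(j) * Lj;                   % add j-th Lagrange basis term
end
fprintf('Lagrange estimate of exp(1.5): %.6f (true %.6f)\n', P, exp(1.5));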

Salahaldin University, 2023
This book has been prepared for beginners to help them understand basic to advanced functionality of MATLAB. After completing Chapter 1 (which includes an explanation of the MATLAB language), you will find yourself at a moderate level of expertise in using MATLAB, from where you can take yourself to the next levels.
On the other hand, I have long been fascinated by the interplay of variables in multivariate data and by the challenge of unraveling the effect of each variable. My continuing objective has been to present the power and utility of multivariate analysis in a highly readable format.
Practitioners and researchers in all applied disciplines often measure several variables on each subject or experimental unit. In some cases, it may be productive to isolate each variable in a system and study it separately. Typically, however, the variables are not only correlated with each other, but each variable is influenced by the other variables as it affects a test statistic or descriptive statistic. Thus, in many instances, the variables are intertwined in such a way that when analyzed individually they yield little information about the system. Using multivariate analysis, the variables can be examined simultaneously in order to access the key features of the process that produced them. The multivariate approach enables us to (1) explore the joint performance of the variables and (2) determine the effect of each variable in the presence of the others.
Multivariate analysis provides both descriptive and inferential procedures—we can search for patterns in the data or test hypotheses about patterns of a priori interest. With multivariate descriptive techniques, we can peer beneath the tangled web of variables on the surface and extract the essence of the system. Multivariate inferential procedures include hypothesis tests that (1) process any number of variables without inflating the Type I error rate and (2) allow for whatever intercorrelations the variables possess. A wide variety of multivariate descriptive and inferential procedures is readily accessible in statistical software packages.
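One concrete instance of the joint inferential procedures described above is the one-sample Hotelling T^2 statistic, which tests all variables simultaneously rather than running separate univariate tests. The sketch below is illustrative only; the data and hypothesized mean vector are assumptions, and in practice the scaled statistic would be compared with an F critical value.

% Sketch: one-sample Hotelling T^2 statistic, a multivariate test that
% treats all variables jointly. Data and the hypothesized mean are illustrative.
rng(7);
n  = 30;  p = 3;
X  = randn(n, p) + 0.3;                  % sample with a small mean shift
mu0 = zeros(p, 1);                       % hypothesized mean vector
xbar = mean(X)';
S    = cov(X);
T2   = n * (xbar - mu0)' * (S \ (xbar - mu0));
% Under H0, (n-p)/(p*(n-1)) * T2 follows an F(p, n-p) distribution;
% compare the scaled statistic with the chosen F critical value.
Fstat = (n - p) / (p*(n - 1)) * T2;
fprintf('T^2 = %.3f, scaled F statistic = %.3f\n', T2, Fstat);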

This book has been prepared for beginners to help them understand basic to advanced functionality of MATLAB. After completing Chapter 1 (which includes an explanation of the MATLAB language), you will find yourself at a moderate level of expertise in using MATLAB, from where you can take yourself to the next levels.
On the other hand, in spite of the availability of highly innovative tools in statistics, the main tool of the applied statistician remains the linear model. The linear model involves the simplest and seemingly most restrictive statistical properties: independence, normality, constancy of variance, and linearity. However, the model and the statistical methods associated with it are surprisingly versatile and robust. More importantly, mastery of the linear model is a prerequisite to work with advanced statistical tools, because most advanced tools are generalizations of the linear model. The linear model is thus central to the training of any statistician, applied or theoretical.
This book develops the basic theory of linear models for regression, analysis of variance, and analysis of covariance. Applications are illustrated by examples and problems using real data. This combination of theory and applications will prepare the reader to further explore the literature and to more correctly interpret the output from a linear models computer package and MATLAB.
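As a minimal sketch of the core computation the book develops, the MATLAB lines below estimate the linear model y = X*beta + e by ordinary least squares and report coefficient estimates with standard errors. The design matrix and true coefficients are simulated assumptions for illustration, not an example from the book.

% Sketch: ordinary least squares for y = X*beta + e, the basic computation
% underlying regression, ANOVA, and ANCOVA. Data are simulated.
rng(8);
n    = 50;
X    = [ones(n,1), randn(n,2)];          % intercept plus two predictors
beta = [1; 2; -0.5];                     % true coefficients (illustrative)
y    = X*beta + 0.4*randn(n,1);
bhat = X \ y;                            % least-squares estimate
yhat = X * bhat;                         % fitted values
res  = y - yhat;
s2   = (res' * res) / (n - size(X,2));   % unbiased residual variance
covb = s2 * inv(X' * X);                 % estimated covariance of bhat
disp(bhat'); disp(sqrt(diag(covb))');    % estimates and standard errors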
This introductory linear models book is designed primarily for a one-semester course for advanced undergraduates or MS students. It includes more material than can be covered in one semester so as to give an instructor a choice of topics and to serve as a reference book for researchers who wish to gain a better understanding of regression and analysis of variance. The book would also serve well as a text for PhD classes in which the instructor is looking for a one-semester introduction, and it would be a good supplementary text or reference for a more advanced PhD class for which the students need to review the basics on their own.
Our overriding objective in the preparation of this book has been clarity of exposition. We hope that students, instructors, researchers, and practitioners will find this linear models text more comfortable than most. In the final stages of development, we asked students for
Thesis Chapters by Taha Hussein Ali
Salahaldin University, 2024
Time series analysis refers to problems in which observations are collected at regular time intervals and there are correlations among successive observations. Applications cover virtually all areas of Statistics, but some of the most important include economic and financial time series and many areas of environmental or ecological data.
In this course, I shall cover some of the most important methods for dealing with these problems. In the case of time series, these include the basic definitions of autocorrelations etc., then time-domain model fitting including autoregressive and moving average processes, spectral methods, and some discussion of the effect of time series correlations on other kinds of statistical inference, such as the estimation of means and regression coefficients.
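The sample autocorrelation function is the basic quantity the chapter starts from, so a short MATLAB sketch of its computation is given below. The series is a simulated AR(1) and the lag range is an arbitrary assumption.

% Sketch: sample autocorrelation function (ACF) of a simulated AR(1) series.
rng(9);
n   = 300;  phi = 0.6;
y   = filter(1, [1 -phi], randn(n,1));   % AR(1) series
maxLag = 20;
yc  = y - mean(y);
c0  = sum(yc.^2) / n;                    % sample variance (lag 0)
rho = zeros(maxLag, 1);
for k = 1:maxLag
    rho(k) = sum(yc(1+k:end) .* yc(1:end-k)) / (n * c0);
end
stem(1:maxLag, rho); hold on
plot([1 maxLag],  [1 1]*1.96/sqrt(n), 'r--');    % approx. 95% bounds
plot([1 maxLag], -[1 1]*1.96/sqrt(n), 'r--'); hold off
xlabel('lag'); ylabel('sample ACF')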