Papers by Mohiy M. Hadhoud
CRC Press eBooks, Dec 15, 2012
Advances in intelligent systems and computing, Nov 10, 2015
This paper investigates the effect of input quality on the performance evaluation of Arabic OCR systems. The experimental results show that Arabic OCR systems achieve an acceptable error rate on images with low noise and a high error rate on images with heavy noise, and that Arabic OCR accuracy can be increased by filtering the noisy images. Robust word-based and character-based accuracy metrics are used to evaluate the performance of different Arabic OCR engines on different samples such as newspapers, books, and regular text.
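Character-based OCR accuracy metrics like the ones mentioned above are conventionally derived from edit distance. A minimal sketch of a character error rate (CER) computation; the function names are illustrative, not taken from the paper:

```python
def edit_distance(ref: str, hyp: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    m, n = len(ref), len(hyp)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[m][n]

def char_error_rate(ref: str, hyp: str) -> float:
    """CER = edit distance between reference and OCR output, per reference character."""
    return edit_distance(ref, hyp) / max(len(ref), 1)
```

A word error rate is computed the same way with the strings split into word lists first.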
Mobile ad hoc network (MANET) suffers from temporary link failures and route changes. Moreover, TCP performs poorly when most packet losses are due to congestion. Most research on improving TCP performance over MANET requires feedback from lower layers. Several attempts have been proposed for a layered TCP improvement, yet their percentage enhancements are not satisfactory. In this paper,
International Journal of Computing, Aug 1, 2014
An efficient algorithm for detecting frontal-view faces in color images is proposed. The proposed algorithm has a special task: it detects faces in the presence of skin-tone regions (human body, clothes, and background). Firstly, a pixel-based color classifier is applied to segment the skin pixels from the background. Next, a hybrid clustering algorithm is applied to partition the skin region. We introduce a new symmetry approach, which is the main distinguishing feature of the proposed algorithm. It measures a symmetry value, searches for the real center of the region, and then removes the extra unsymmetrical skin pixels. Cost functions are adopted to locate the two real eyes of the candidate face region. A template matching process is performed between an aligned frontal face model and the candidate face region as a verification step. Experimental results reveal that our algorithm can detect faces successfully under wide variations.
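The first stage, a pixel-based skin color classifier, can be sketched with a widely used explicit RGB rule (Kovac et al.); this is a common stand-in, not necessarily the classifier used in the paper:

```python
def is_skin_rgb(r: int, g: int, b: int) -> bool:
    """Explicit RGB skin rule for daylight illumination (Kovac et al.).
    Each pixel is classified independently; clustering and symmetry
    analysis would follow on the resulting skin mask."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)
```

Applying this test to every pixel yields the binary skin mask that the clustering stage partitions into candidate face regions.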
International Conference on Aerospace Sciences & Aviation Technology, May 1, 1999
Traditionally, digital filters were used for signal-to-noise ratio (SNR) improvement. This paper proposes a modified method to enhance digital filter performance, based on combining the coherent time averaging technique with digital filtering. The paper discusses the processing of single and multiple input sinusoidal signals with additive noise. By using this method in processing multiple signals, we can isolate the weak in-band signals from the strong out-of-band signals, which consequently improves the detectability of weak signals. Cascaded digital filtering is applied using similar or different types of digital filters. A comparison study of single-stage and cascaded-stage digital filters is provided. It is shown that this modified method leads to better SNR improvement of weak signals than the direct application of digital filtering.
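The coherent time averaging component can be demonstrated in a few lines: averaging N time-aligned records of a repetitive signal reduces uncorrelated noise power by a factor of N, a theoretical SNR gain of 10·log10(N) dB. The parameters below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n = 1000, 50, 1000              # sample rate (Hz), tone frequency (Hz), samples
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * f0 * t)     # clean repetitive signal

def snr_db(clean, noisy):
    """SNR of `noisy` relative to the known clean signal, in dB."""
    noise = noisy - clean
    return 10 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))

records = signal + rng.normal(0.0, 1.0, size=(100, n))  # 100 noisy repetitions
avg = records.mean(axis=0)                              # coherent average
gain_db = snr_db(signal, avg) - snr_db(signal, records[0])  # ~20 dB for N=100
```

A digital filter stage can then be cascaded after the averager, which is the combination the paper studies.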
During the last decade, Intrusion Detection Systems (IDSs) have played an important role in defending critical computer systems and networks from cyber-attacks. Anomaly detection techniques have received a particularly great amount of attention because they offer an intrinsic ability to detect unknown attacks. In this paper, we propose an enhanced hybrid anomaly detection approach based on the negative selection algorithm and metaheuristics. The enhancements include tuning some of its parameter values automatically rather than predefining them. The NSL-KDD dataset, a modified version of the widely used KDDCUP99 dataset, is used for performance evaluation. The KDDCUP99 dataset is criticized for its inability to reflect recent network traffic behaviour, so a real-time experiment was performed to capture and construct a recent dataset and confirm the performance of the proposed enhancements. Performance evaluation shows that the proposed approach outperforms competing machine learning algorithms on both datasets.
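The negative selection algorithm at the core of the approach can be sketched as follows: candidate detectors are generated at random and censored if they match any "self" (normal) sample; the survivors flag anything they do match as anomalous. This minimal sketch uses the common r-contiguous-bits matching rule; all names and parameters are illustrative, not the paper's tuned configuration:

```python
import random

def matches(detector: str, sample: str, r: int) -> bool:
    """r-contiguous-bits rule: detector matches if it agrees with the
    sample on r consecutive positions."""
    return any(detector[i:i + r] == sample[i:i + r]
               for i in range(len(sample) - r + 1))

def generate_detectors(self_set, length, r, count, seed=1):
    """Randomly generate detectors, censoring any that match a self sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < count:
        cand = ''.join(rng.choice('01') for _ in range(length))
        if not any(matches(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, r):
    """A sample matched by any surviving detector is flagged as non-self."""
    return any(matches(d, sample, r) for d in detectors)

self_set = ["0000000000"]
detectors = generate_detectors(self_set, length=10, r=4, count=5)
```

The paper's contribution is, in part, choosing parameters like `r` automatically via metaheuristics instead of fixing them by hand as done here.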
International Journal of Computer Theory and Engineering, 2009
As the popularity of wireless networks increases, so does the need to protect them. Encryption algorithms play a main role in information security systems. On the other side, those algorithms consume a significant amount of computing resources such as CPU time, memory, and battery power. This paper illustrates the key concepts of security, wireless networks, and security over wireless networks. Wireless security is demonstrated by applying the common security standards (802.11 WEP and 802.11i WPA/WPA2), and an evaluation of the power consumption of six of the most common encryption algorithms on wireless devices is provided, namely: AES (Rijndael), DES, 3DES, RC2, Blowfish, and RC6. A comparison has been conducted for those encryption algorithms at different settings for each algorithm, such as different sizes of data blocks, different data types, battery power consumption, data transmission through a wireless network, and finally encryption/decryption speed. Experimental results are given to demonstrate the effectiveness of each algorithm.
CRC Press eBooks, Dec 15, 2012
This paper presents a noniterative regularized inverse solution to the image interpolation problem. The suggested solution is based on segmenting the image to be interpolated into overlapping blocks and interpolating each block separately; the purpose of the overlapping is to avoid edge effects. A global regularization parameter is used in interpolating each block. In this implementation, a single matrix inversion of moderate dimensions is required for the whole interpolation process, so the suggested solution avoids the large computational complexity of the matrices of large dimensions otherwise involved. The performance of the suggested image interpolation algorithm is compared to the standard iterative regularized interpolation scheme and to polynomial interpolation schemes such as the cubic spline image interpolation algorithm.
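The regularized inverse behind such schemes is the closed form x = (HᵀH + λI)⁻¹Hᵀy, solved once per block. A minimal 1-D sketch, assuming a simple averaging decimation model for H (the paper's actual imaging model and block handling may differ):

```python
import numpy as np

def regularized_upsample(y, factor=2, lam=0.01):
    """Noniterative regularized inverse: solve (H^T H + lam*I) x = H^T y,
    where H is an assumed decimation operator that averages each group of
    `factor` high-resolution samples into one low-resolution sample."""
    n = len(y) * factor
    H = np.zeros((len(y), n))
    for i in range(len(y)):
        H[i, i * factor:(i + 1) * factor] = 1.0 / factor
    A = H.T @ H + lam * np.eye(n)       # regularized normal equations
    return np.linalg.solve(A, H.T @ y)  # one moderate-size solve per block

y = np.array([1.0, 3.0, 5.0])           # low-resolution block
x = regularized_upsample(y)             # high-resolution estimate (length 6)
```

Only the per-block matrix `A` is ever inverted, which is the source of the computational saving the abstract describes.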
Internet and network applications are growing very fast, so the need to protect such applications has increased. Encryption algorithms play a main role in information security systems. On the other side, those algorithms consume a significant amount of computing resources such as CPU time, memory, and battery power. This paper provides an evaluation of six of the most common encryption algorithms, namely: AES (Rijndael), DES, 3DES, RC2, Blowfish, and RC6. A comparison has been conducted for those encryption algorithms at different settings for each algorithm, such as different sizes of data blocks, different data types, battery power consumption, different key sizes, and finally encryption/decryption speed. Simulation results are given to demonstrate the effectiveness of each algorithm.
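The speed comparison in studies like this boils down to a throughput benchmark per cipher. A minimal harness sketch; the trivial XOR "cipher" is a self-contained placeholder only, and in practice each slot would be filled with a real AES/DES/3DES/RC2/Blowfish/RC6 call from a cryptography library:

```python
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Placeholder cipher for the harness; NOT secure. Swap in a real
    cipher implementation when benchmarking actual algorithms."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def throughput_mb_s(cipher, data: bytes, key: bytes, repeats: int = 20) -> float:
    """Encrypt `data` `repeats` times and report megabytes per second."""
    t0 = time.perf_counter()
    for _ in range(repeats):
        cipher(data, key)
    elapsed = time.perf_counter() - t0
    return (len(data) * repeats) / (1024 * 1024) / elapsed

data = bytes(range(256)) * 1024          # 256 KiB test buffer
rate = throughput_mb_s(xor_cipher, data, b"secret-key")
```

Running the same harness over different buffer sizes and data types reproduces the kind of comparison table the paper reports.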
Maǧallaẗ Kulliyyaẗ Dār Al-ʿulūm, Jan 23, 2021
Digital Signal Processing, Jul 1, 2008
2017 27th International Conference on Computer Theory and Applications (ICCTA)
Planning a software release is one of the most challenging tasks in software engineering. It involves assigning requirements to a sequence of releases in the most beneficial way within the limited effort, budget, and time available. The complexity of software release planning stems from the incompleteness and uncertainty that characterize the problem. To handle these challenges efficiently, the Interval Evidential Reasoning Aggregation Algorithm can be utilized. Although several methods have been proposed to handle the incompleteness and uncertainty of the software release planning problem, they are all based on optimization methods. The objective of this paper is to propose a novel model that handles these challenges using multiple-attribute aggregation algorithms, such as the Interval Evidential Reasoning Aggregation Algorithm, rather than optimization methods. This leads to developing high-quality, attractive software releases with satisfied stakeholders. For validation purposes, the Interval Evidential Reasoning Aggregation Algorithm is applied to plan a new release for updating a faculty website project.
This paper proposes a simplified fractal image compression algorithm which is implemented on a block-by-block basis. The algorithm achieves a compression ratio of up to 1:10 with a peak signal-to-noise ratio (PSNR) as high as 35 dB. The idea of the proposed algorithm is to first segment the image into blocks to set up reference blocks. The image is then decomposed again into range blocks, and a search process is carried out to find the reference block with the best match. The transmitted or stored values, after compression, are the reference block values and the indices of the reference blocks that achieve the best match. If there is no match, the average value of the range block is transmitted or stored instead. The paper also examines the effect of using the spiral architecture instead of square block decomposition and searching in fractal compression. Comparisons among the conventional square approach, the proposed simplified fractal compression, and the standard JPEG are...
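The encode step described above (match each range block against the reference pool, fall back to the block mean) can be sketched directly. This is an illustrative simplification, not the paper's exact codec, and it omits the spiral-architecture variant:

```python
import numpy as np

def encode_blocks(image, block=4, tol=50.0):
    """For each range block, store the index of the best-matching reference
    block; if the best mean-squared error exceeds `tol`, store the block
    mean instead. Reference blocks are taken on a coarser grid."""
    h, w = image.shape
    refs = [image[i:i + block, j:j + block]
            for i in range(0, h, block * 2)
            for j in range(0, w, block * 2)]
    code = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            rng_blk = image[i:i + block, j:j + block]
            errs = [np.mean((rng_blk - r) ** 2) for r in refs]
            best = int(np.argmin(errs))
            if errs[best] <= tol:
                code.append(('ref', best))       # index of matching reference
            else:
                code.append(('mean', float(rng_blk.mean())))  # fallback
    return code, refs

img = np.full((8, 8), 10.0)                      # trivial flat test image
code, refs = encode_blocks(img)
```

Decoding simply replays the code stream, copying the indexed reference block or filling with the stored mean.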
2017 14th IEEE Annual Consumer Communications & Networking Conference (CCNC), 2017
Participatory sensing is an emerging paradigm in which citizens voluntarily use their mobile phones to capture and share sensed data from their surrounding environment in order to monitor and analyze some phenomena. Participating users can disrupt the system by contributing corrupted, fabricated, or erroneous data. Different reputation systems have been proposed to monitor participants' behavior and to estimate their honesty. Some attacks were not considered by the existing reputation systems in this context, including corruption, collusion, and the on-off attack. In this paper, we propose a more robust and efficient reputation system designed for these applications. Our reputation system incorporates a mechanism to defend against those attacks. Experimental results indicate that our system can accurately estimate the quality of contributions even if collusion is committed. It can tolerate up to 60% of colluding adversaries involved in the sensing campaign. This enables our system to aggregate the data more accurately compared with the state-of-the-art. Moreover, the system can detect adversaries even if they launch an on-off attack and strategically contribute some good data with high probability (e.g. 0.8).
Participatory sensing is an emerging paradigm in which citizens voluntarily use their mobile phones to capture and share sensed data from their surrounding environment in order to monitor and analyze some phenomena (e.g., weather, road traffic, pollution, etc.). Malicious participants can disrupt the system by contributing corrupted, fabricated, or erroneous data. Different trust and reputation systems have been proposed in the literature to monitor participants' behavior and to estimate their honesty. A trust mapping function is exploited to assign a score to each contribution which reflects its quality as perceived by the application server. Thus, the application server can aggregate the data more accurately. In this paper, we compare different trust mapping functions and measure the accuracy of the aggregated data based on those functions.
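A trust mapping function simply converts a participant's reputation into a weight for aggregation. A sketch of three commonly compared shapes (linear, step, sigmoid) and the weighted aggregation they feed; the specific functions and parameters here are illustrative, not necessarily the ones the paper compares:

```python
import math

def linear_map(rep):
    """Use the reputation directly as a weight, clamped to [0, 1]."""
    return max(0.0, min(1.0, rep))

def step_map(rep, thresh=0.6):
    """All-or-nothing: only sufficiently reputable participants count."""
    return 1.0 if rep >= thresh else 0.0

def sigmoid_map(rep, k=10.0, mid=0.5):
    """Smooth transition around a midpoint reputation."""
    return 1.0 / (1.0 + math.exp(-k * (rep - mid)))

def weighted_aggregate(values, reps, mapping):
    """Trust-weighted mean of contributed values."""
    weights = [mapping(r) for r in reps]
    total = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total if total else None
```

Swapping `mapping` while holding the data fixed is exactly the kind of comparison the paper performs, measuring which shape yields the most accurate aggregate.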
IJCI. International Journal of Computers and Information, 2016
Participatory sensing is an emerging paradigm in which citizens voluntarily use their mobile phones to capture and share sensed data from their surrounding environment in order to monitor and analyze some phenomena (e.g., weather, road traffic, pollution, etc.). Participating users can disrupt the system by contributing corrupted, fabricated, or erroneous data. Different reputation systems have been proposed to monitor participants' behavior and to estimate their honesty. Some attacks were not considered by the existing reputation systems in the context of participatory sensing applications, including corruption, collusion, and the on-off attack. In this paper, we propose a more robust and efficient reputation system designed for these applications. Our reputation system incorporates a mechanism to defend against those attacks. Experimental results indicate that our system can accurately estimate the quality of contributions even if collusion is committed. It can tolerate up to 60% of colluding adversaries involved in the sensing campaign. This enables our system to aggregate the data more accurately compared with the state-of-the-art. Moreover, the system can detect adversaries even if they launch an on-off attack and strategically contribute some good data with high probability (e.g. 0.8).
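A common defense against the on-off attack mentioned above is an asymmetric reputation update: trust is earned slowly and lost quickly, so an attacker who alternates good and bad contributions cannot rebuild reputation between bursts. A minimal sketch of that idea (the update rule and parameters are illustrative, not the paper's mechanism):

```python
def update_reputation(rep: float, good: bool,
                      reward: float = 0.05, penalty: float = 0.25) -> float:
    """Asymmetric update: small multiplicative gain on good contributions,
    large fixed drop on bad ones; reputation stays in [0, 1]."""
    if good:
        return min(1.0, rep + reward * (1.0 - rep))
    return max(0.0, rep - penalty)

# An on-off attacker contributing good data with probability 0.8 still
# bleeds reputation under this rule:
rep = 0.5
for outcome in [True, True, True, True, False] * 10:   # 80% good, periodic bad
    rep = update_reputation(rep, outcome)
```

With symmetric reward and penalty the same attacker would hold a high average reputation, which is exactly the weakness the asymmetric rule removes.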
Image Super-Resolution and Applications, 2012