Software testing is an important but expensive activity. To achieve efficient and effective testing, test cases are designed on the basis of conditions. While designing test cases, many are developed that are of no use or duplicate others. Exhaustive testing requires executing the program with all possible combinations of values for program variables, which is impractical due to resource limitations. Redundant or useless test cases simply increase the testing effort and hence the cost. Our goal is to reduce the time spent in testing by reducing the number of test cases. For this we have incorporated fuzzy techniques to reduce the number of test cases so that more efficient and accurate results may be achieved. Fuzzy clustering is a class of cluster-analysis algorithms that allocates similar test cases to clusters, which helps in finding the redundancy introduced by test cases. We propose a methodology based on fuzzy clustering by which the test suite can be significantly reduced. The final test suite resulting from the methodology yields good results for condition/path coverage.
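The clustering idea behind the methodology can be sketched with a minimal fuzzy c-means implementation. This is a pure-Python sketch on hypothetical condition-coverage vectors, not the paper's actual algorithm: test cases with near-identical coverage receive similar cluster memberships, which flags them as redundant.

```python
import random

def fuzzy_c_means(points, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means: returns membership matrix u[i][j] of point i in cluster j."""
    random.seed(seed)
    n, d = len(points), len(points[0])
    # Random initial memberships, normalized so each point's row sums to 1.
    u = [[random.random() for _ in range(c)] for _ in range(n)]
    u = [[x / sum(row) for x in row] for row in u]
    for _ in range(iters):
        # Update cluster centers as membership-weighted means.
        centers = []
        for j in range(c):
            w = [u[i][j] ** m for i in range(n)]
            centers.append([sum(w[i] * points[i][k] for i in range(n)) / sum(w)
                            for k in range(d)])
        # Update memberships from distances to the centers.
        for i in range(n):
            dists = [max(1e-9, sum((points[i][k] - centers[j][k]) ** 2
                                   for k in range(d)) ** 0.5) for j in range(c)]
            for j in range(c):
                u[i][j] = 1.0 / sum((dists[j] / dists[l]) ** (2 / (m - 1))
                                    for l in range(c))
    return u

# Hypothetical test cases encoded as binary condition-coverage vectors:
# near-duplicates land in the same cluster, exposing redundancy.
cases = [[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
u = fuzzy_c_means(cases, c=2)
labels = [row.index(max(row)) for row in u]
```

One representative per cluster could then be kept to reduce the suite while preserving the covered conditions.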
Software engineering activities, especially project planning, scheduling, monitoring and control, depend on accurate estimates of cost and effort. In the initial stage of the Software Development Life Cycle (SDLC), it is hard to measure software effort accurately, which may lead to project failure. Here, an empirical comparison of existing software cost estimation models, based on the techniques used in those models, has been carried out using statistical criteria. On the basis of the findings of this empirical evaluation, a Neuro-Fuzzy software cost estimation model has been proposed that incorporates the best practices found in other models and optimizes software cost estimation. The proposed model gives good results overall compared to the other software cost estimation methods considered for the defined parameters, but its performance also depends on the type of project, the data, and the technique used in implementation.
It is a complex task to optimize a query as well as to validate the correctness and effectiveness of a query optimizer. A query optimizer should estimate and compare the costs of executing a query using different execution strategies and choose the strategy with the lowest cost estimate. To compare different strategies fairly and realistically, accurate cost estimation is required. Measuring the quality of query optimization is challenging because modern query optimizers provide advanced optimization strategies and adaptive techniques. This paper describes different ways to improve the performance of SQL Server queries and index optimization, with occasional references to particular SQL code, and shows how to achieve the best performance for given tables and queries by providing tips for query optimization in Microsoft SQL Server. The paper provides a detailed overview of query optimization, optimization techniques, the testing of optimization techniques used to validate the query optimizer of Microsoft's SQL Server, and issues in query optimization testing.
With the recent economic slump and the constant pressure to deliver more services at a lower cost, the cloud delivery model offers lower cost and rapid service provisioning. IT economics are changing rapidly, and large companies in particular are looking for new ways to secure capital at a lower cost to maintain the viability of the company. Task scheduling problems are of first-class importance to the overall efficiency of cloud computing facilities. Most existing scheduling algorithms optimize a single quality-of-service (QoS) parameter; if we consider more than one QoS parameter, the problem becomes more challenging. To address the problem, we introduce a scheduling strategy for multiple workflows with multiple QoS constraints in cloud computing, along with an optimized algorithm for task scheduling and its implementation. Furthermore, load balancing is a method to distribute workload across one or more servers, network interfaces, hard drives, or other computing resources; using these components together with load balancing also improves redundancy.
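The single-QoS baseline that the multi-QoS strategy extends can be sketched as a greedy load balancer: each task goes to the currently least-loaded server, optimizing one QoS dimension (completion time). This is an illustrative sketch with hypothetical task lengths, not the paper's algorithm.

```python
import heapq

def schedule(tasks, n_servers):
    """Greedy load balancing: assign each task to the least-loaded server.
    Returns per-server loads and the makespan (maximum load)."""
    heap = [(0.0, s) for s in range(n_servers)]  # (current load, server id)
    heapq.heapify(heap)
    loads = [0.0] * n_servers
    for t in sorted(tasks, reverse=True):        # longest-processing-time-first heuristic
        load, s = heapq.heappop(heap)
        loads[s] = load + t
        heapq.heappush(heap, (loads[s], s))
    return loads, max(loads)

# Six tasks (hypothetical execution times) across three servers.
loads, makespan = schedule([5, 3, 8, 2, 7, 4], n_servers=3)
```

A multi-QoS version, as described above, would add further constraints (e.g. cost or deadline) to the server-selection step rather than minimizing load alone.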
Software cost estimation is one of the most challenging tasks in software engineering. For estimation, Function Points are useful in the business application software domain but problematic in the real-time software domain. Full Function Points (FFP) are useful for functionality-based estimation, specifically for real-time and embedded software. COSMIC-FFP is a functional size measurement method, developed by the Common Software Measurement International Consortium (COSMIC), that takes the user's view of functional requirements. By using the COSMIC-FFP model, an early prediction of the functional complexity of the software can be made throughout the software development life cycle, within given budget and reliability constraints. In this paper, a detailed analysis of the COSMIC-FFP model, with its process flow, has been discussed.
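The core of COSMIC measurement is counting data movements: each Entry, Exit, Read, or Write contributes one COSMIC Function Point (CFP), and a functional process's size is the sum of its movements. A minimal sketch, using hypothetical functional processes of a small embedded controller:

```python
def cosmic_cfp(processes):
    """COSMIC functional size: each data movement (Entry, eXit, Read, Write)
    contributes 1 CFP; a process's size is the sum of its movements."""
    sizes = {name: sum(moves.values()) for name, moves in processes.items()}
    return sizes, sum(sizes.values())

# Hypothetical functional processes with their data-movement counts.
processes = {
    "read_sensor": {"E": 1, "X": 0, "R": 1, "W": 1},
    "raise_alarm": {"E": 1, "X": 1, "R": 1, "W": 0},
}
per_process, total = cosmic_cfp(processes)
```

The total CFP can then feed an effort model early in the life cycle, before code exists.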
The scope of this process covers the plan-the-plan phase as well as execution of the work package against the work package plan until completion of the work package. All discussions and meetings for status reporting and project closure are part of the scope of this process and are described in this paper. The purpose of this paper is to elaborate the Project Monitoring and Control process using process flow diagrams, entry/exit criteria, inputs/outputs, and the steps or actions performed at each phase.
Image compression is used to reduce the amount of data required to represent a digital image. The aim of this paper is to analyze various image compression methods and the factors on which image compression techniques are based, and to examine the performance of image compression through a detailed empirical evaluation of wavelet functions (wavelets), the Discrete Cosine Transform (DCT) and Neural Networks (NN) in terms of retained energy, peak signal-to-noise ratio, output image size, etc. Image compression using the various techniques has been implemented in MATLAB R2013.
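The DCT's value for compression comes from energy compaction: for smooth signals, most energy lands in the first few coefficients, so the rest can be discarded with little error. A 1-D pure-Python sketch (the paper's implementation is in MATLAB and 2-D; the signal here is a hypothetical DC level plus one low-frequency cosine):

```python
import math

def dct(x):
    """Naive orthonormal 1-D DCT-II."""
    N = len(x)
    out = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
        scale = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(scale * s)
    return out

def idct(X):
    """Inverse of the orthonormal DCT-II (i.e. a DCT-III)."""
    N = len(X)
    out = []
    for n in range(N):
        s = X[0] * math.sqrt(1.0 / N)
        s += sum(X[k] * math.sqrt(2.0 / N) *
                 math.cos(math.pi * (n + 0.5) * k / N) for k in range(1, N))
        out.append(s)
    return out

# DC level plus one low-frequency cosine: energy sits in the first coefficients,
# so zeroing the other 12 of 16 loses almost nothing (the lossy-compression idea).
signal = [2 + math.cos(math.pi * (n + 0.5) * 2 / 16) for n in range(16)]
coeffs = dct(signal)
kept = coeffs[:4] + [0.0] * 12
reconstructed = idct(kept)
error = max(abs(a - b) for a, b in zip(signal, reconstructed))
```

Real codecs apply the same idea blockwise in 2-D, followed by quantization and entropy coding.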
Software test cases play an important role in software testing. While designing test cases, many are developed that are of no use or redundant in nature. Such test cases increase the effort and cost involved in testing and decrease software testing efficiency. In this paper, a Genetic Algorithm based methodology has been proposed that can significantly reduce the test suite and optimize the efficiency of software testing. Testing efficiency can be increased by identifying the most critical paths. The proposed Genetic Algorithm based approach selects software path clusters that are weighted in accordance with the criticality of the path. The proposed algorithm has been evaluated by applying a simple Genetic Algorithm and fitness scaling to a case study. The GA with fitness scaling helps improve the efficiency of software testing by effectively reducing the number of software test cases.
Agile methodology, which utilizes iterative development and prototyping, is widely used in a variety of industry projects as a lightweight development method that can accommodate changing requirements. Short iterations are used, which are required for efficient product delivery. Traditional software development processes are not efficient enough to manage rapid changes in requirements. Despite the advantages of Agile, critics of agile methodology state that it fails to pay attention to architectural and design issues and is therefore bound to produce small design decisions. In this paper we identify the impacts that agile methodology has on software development processes with respect to quality, within the organizational, methodical, and cultural framework.
This study investigates the characteristics of sorting algorithms with reference to the number of comparisons made for a given number of elements. Sorting algorithms are used by many applications to arrange elements in increasing/decreasing order or some other permutation. Sorting algorithms such as Quick Sort, Merge Sort, Heap Sort, Insertion Sort and Bubble Sort have different complexities depending on the number of elements to sort. The purpose of this investigation is to count the number of comparisons and swap operations, plot a line graph of these counts, and from it extract the values for a polynomial equation. The values a, b and c obtained are then used for drawing a parabola graph. The study concludes which algorithm to use for a large number of elements: for larger arrays, the best choice is Quick Sort, which sorts the elements recursively and leads to faster results. The least squares method and matrix inversion are used to obtain the constants a, b and c for each sorting algorithm's polynomial equation. After calculating the values, a graph is drawn for each sorting algorithm for the polynomial equation, i.e. Y = AX^2 + BX + C or Y = AX lg X + BX + C.
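The least-squares-with-matrix-inversion step can be sketched directly: build the 3x3 normal equations for y = ax^2 + bx + c and solve them by inverting the system (Cramer's rule here, as a minimal form of matrix inversion). The comparison counts below are the exact bubble-sort worst case n(n-1)/2, used as an illustration.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the 3x3 normal equations."""
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)                    # moment sums of x
    t = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))  # mixed x^p * y sums
    M = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), n]]
    v = [t(2), t(1), t(0)]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(M)
    coeffs = []
    for col in range(3):               # Cramer's rule: swap column `col` for v
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = v[r]
        coeffs.append(det3(Mc) / d)
    return coeffs                      # [a, b, c]

# Bubble sort's worst-case comparison count n(n-1)/2 = 0.5*n^2 - 0.5*n.
xs = [10, 20, 30, 40, 50]
ys = [x * (x - 1) / 2 for x in xs]
a, b, c = fit_quadratic(xs, ys)
```

For the n lg n family (Y = AX lg X + BX + C), the same normal-equations approach applies with x lg x replacing x^2 as the first basis function.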
Software cost estimation is one of the most challenging tasks in software engineering. Over the past years estimators have used parametric cost estimation models to establish software cost; however, the challenges to accurate cost estimation keep evolving with advancing technology. A detailed review of the various cost estimation methods developed so far is presented in this paper. Planned effort and actual effort have been compared in detail by applying the methods to NASA projects. This paper uses back-propagation neural networks for software cost estimation. A neural network based model has been proposed that takes the project's KLOC as input, uses COCOMO model parameters, and gives effort as output. An artificial neural network represents the complex set of relationships between effort and the cost drivers and is a potential tool for estimation. The proposed model automates the software cost estimation task and helps the project manager provide fast and realistic estimates of project effort and development time, which in turn give software cost.
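As a baseline for what the neural network learns from KLOC, the basic COCOMO equations relate size to effort and development time. The constants below are the published basic-COCOMO values for organic-mode projects; the effort adjustment factor stands in for the cost drivers the network would also consume.

```python
# Basic COCOMO: effort (person-months) = a * KLOC^b * EAF,
# development time (months) = c * effort^d.
# (a, b, c, d) below are the published organic-mode constants.
COCOMO_ORGANIC = (2.4, 1.05, 2.5, 0.38)

def cocomo_estimate(kloc, params=COCOMO_ORGANIC, eaf=1.0):
    """eaf is the effort adjustment factor (product of cost-driver ratings);
    eaf=1.0 reduces this to the basic model."""
    a, b, c, d = params
    effort = a * (kloc ** b) * eaf
    time = c * (effort ** d)
    return effort, time

# A hypothetical 32-KLOC project.
effort, time = cocomo_estimate(32)
```

A back-propagation network trained on (KLOC, cost drivers) → effort pairs approximates this nonlinear mapping directly from project data instead of fixing the exponents in advance.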
Handwritten character recognition is a difficult problem due to the great variation in writing styles and character sizes. Multiple handwriting styles from different persons are considered in this work. An image with higher resolution will take much longer to process than a lower resolution image. In practical image acquisition systems and conditions, shape distortion is common because different people's handwriting gives characters different shapes. The recognition process in this work has been divided into two phases. In the first phase, image preprocessing is done: the image is first converted into binary form based on a threshold value obtained through Otsu's method, and noise is then removed using a median filter. Feature extraction follows, performed here with the Fourier descriptor method using the Fourier transform, and the correlation between templates built from training data and the test data is obtained. In the second phase, a multilayer feed-forward neural network is created and trained through the back-propagation algorithm. After training, testing is done to match the pattern with the test data. Results for various convergence objectives of the neural network are obtained and analyzed.
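The binarization step can be sketched with a self-contained Otsu's method: pick the gray-level threshold that maximizes the between-class variance of the background/foreground split. The 8-level histogram below is a hypothetical toy example, not data from this work.

```python
def otsu_threshold(hist):
    """Otsu's method: choose the threshold maximizing between-class variance
    of the (background, foreground) split of the gray-level histogram."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0.0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w_bg += h                        # pixels at or below threshold t
        if w_bg == 0:
            continue
        w_fg = total - w_bg              # pixels above threshold t
        if w_fg == 0:
            break
        sum_bg += t * h
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram over 8 gray levels: dark ink near level 1,
# light paper near level 6 - the threshold should fall between the modes.
hist = [4, 10, 5, 0, 0, 6, 12, 3]
t = otsu_threshold(hist)
binary_rule = lambda pixel: 1 if pixel > t else 0
```

A median filter would then replace each binary pixel by the median of its neighborhood to knock out salt-and-pepper noise before feature extraction.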
Preprocessing techniques are the first step in a character recognition system. This paper deals with the various preprocessing techniques involved in character recognition for different kinds of images, ranging from simple handwritten form-based documents to documents containing colored, complex backgrounds and varied intensities. Here we discuss all the important preprocessing techniques: skew detection and correction, image enhancement techniques such as contrast stretching, binarization, noise removal techniques, normalization and segmentation, and morphological processing techniques.
Software engineering aims to produce a quality software product that is delivered on time, within the allocated budget, and with the requirements expected by the customer; unfortunately, this goal is rarely achieved. A software life cycle is the series of identifiable stages that a software product undergoes during its lifetime. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. This research is concerned with the methodologies that examine the life cycle of software through development models, known as software development life cycle models. We present traditional models, i.e. Waterfall, Iterative and Spiral, as well as modern development methodologies such as Agile methodologies (including Extreme Programming, Scrum and Feature Driven Development) and component-based software development methodologies. All of these models have advantages as well as disadvantages. Therefore, the main objective of this research is to present the different models of software development by showing the good and bad practices of each model, and to make a comparative analysis of traditional and modern methodologies.
Features play a very important role in the area of image processing. Before extracting features, various image preprocessing techniques such as binarization, thresholding, resizing and normalization are applied to the sampled image. After that, feature extraction techniques are applied to obtain features that will be useful in classifying and recognizing images. Feature extraction techniques are helpful in various image processing applications, e.g. character recognition. As features define the behavior of an image, they determine the storage taken, the efficiency of classification and, of course, the time consumed. In this paper, we discuss various types of features and feature extraction techniques, and explain which feature extraction technique works better in which scenario. We refer to features and feature extraction methods in the context of character recognition applications.
Component Based Software Engineering (CBSE) constructs a quality software system by reusing existing components. For the construction of a high-quality software system, reusability plays an important role. Software components should be designed and implemented in such a way that many different programs can reuse them. Reuse of software can increase productivity and quality by reducing the effort, time and cost spent designing and developing reusable software components. In this paper, a neuro-fuzzy model has been proposed that uses software component design patterns for analysis and the Chidamber and Kemerer (CK) metrics for evaluation, optimization and categorization of reusability in component-based software. The work is divided into two phases. In the first phase, reusability is analyzed, optimized and empirically evaluated with high precision using the CK metrics and an unsupervised Self-Organizing Map (SOM) neural network. In the second phase, reusability is categorized as very low, low, medium, high or very high using a supervised Back-Propagation Neural Network (BPNN) and fuzzy inference rules applied to the CK metric values. The proposed model may help a software designer evaluate and optimize the reusability of components while designing software, in order to build a quality software system.
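Three of the CK metrics that feed such a model can be computed from a simple class model. This sketch uses a hypothetical three-class hierarchy and unit method weights; the paper derives its values from component design patterns.

```python
# Hypothetical class model: each class lists its methods (complexity 1 each)
# and its parent - enough to compute three of the six CK metrics.
classes = {
    "Shape":  {"methods": ["area", "draw"],   "parent": None},
    "Circle": {"methods": ["area", "radius"], "parent": "Shape"},
    "Ring":   {"methods": ["area"],           "parent": "Circle"},
}

def wmc(cls):
    """Weighted Methods per Class: sum of method complexities (weight 1 here)."""
    return len(classes[cls]["methods"])

def dit(cls):
    """Depth of Inheritance Tree: edges from the class up to the root."""
    depth, parent = 0, classes[cls]["parent"]
    while parent is not None:
        depth, parent = depth + 1, classes[parent]["parent"]
    return depth

def noc(cls):
    """Number of Children: count of direct subclasses."""
    return sum(1 for c in classes.values() if c["parent"] == cls)
```

Vectors of such metric values per component are what the SOM clusters in the first phase and the BPNN-plus-fuzzy-rules stage categorizes in the second.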
The goal of Component Based Software Engineering (CBSE) is to deliver high-quality, more reliable and more maintainable software systems in a shorter time and within a limited budget by reusing and combining existing quality components. A high-quality system can be achieved by using quality components, a framework and an integration process, each of which plays a significant role. Consequently, the techniques and methods used for quality assurance and assessment of a component-based system differ from those of traditional software engineering methodology. In this paper, we present a model for optimizing the Chidamber and Kemerer (CK) metric values of component-based software. A deep analysis of a series of CK metrics of software component design patterns is done and metric values are drawn from them. Using an unsupervised neural network, the Self-Organizing Map, we propose an optimized model for component-based software engineering based on reusability, which depends on the CK metric values. Average, standard-deviation and optimized values for the CK metrics are compared and evaluated to show the optimized reusability of the component-based model. Index Terms – Chidamber and Kemerer (CK) metrics; Component Based Software Engineering (CBSE); Neural Network (NN); Self-Organizing Map (SOM).
It is a complex task to optimize query as well as to validate the correctness and effectiveness o... more It is a complex task to optimize query as well as to validate the correctness and effectiveness of query optimizer. A query optimizer should estimate and compare the costs of executing a query using different execution strategies and should choose the strategy with the lower cost estimate. To fairly and realistically compare different strategies accurate cost estimation is required. This is a challenging task to measure quality of query optimization as modern query optimizers provide more advanced optimization strategies and adaptive techniques. This paper describes different ways to improve the performance of SQL Server queries, index optimization with occasional references to particular SQL code and how to achieve the best performance for the given tables and queries by giving some tips for query optimization in Microsoft SQL Server. The paper provides a detailed overview of query optimization, Optimization techniques, testing of optimization techniques that are used to validate the query optimizer of Microsoft's SQL Server and issues in query optimization testing.
Software testing is a most important but expensive activity. To get the most efficient and effect... more Software testing is a most important but expensive activity. To get the most efficient and effective testing, test cases are designed on the basis of conditions. While designing test cases, many test cases are developed that are of no use or produced in duplicate. Exhaustive testing requires program execution with all possible combinations of values for program variables, which is impractical due to resource limitations. Redundant test cases or the test cases that are of no use, simply increases the testing effort and hence increases the cost. Our goal is to reduce the time spent in testing by reducing the number of test cases. For this we have incorporated fuzzy techniques to reduce the number of test cases so that more efficient and accurate results may be achieved. Fuzzy clustering is a class of algorithms for cluster analysis in which the allocation of similar test cases is done to clusters that would help in finding out redundancy incorporated by test cases. We proposed a methodology based on fuzzy clustering by which we can significantly reduce the test suite. The final test suite resulted from methodology will yield good results for conditions/path coverage.
Software Engineering especially project planning, scheduling, monitoring and control are based on... more Software Engineering especially project planning, scheduling, monitoring and control are based on accurate estimate of the cost and effort. In the initial stage of Software Development Life Cycle (SDLC), it is hard to accurately measure software effort that may lead to possibility of project failure. Here, an empirical comparison of existing software cost estimation models based on the techniques used in those models has been elaborated using statistical criteria. On the basis of findings of empirical evaluation of existing models, a Neuro-Fuzzy Software Cost Estimation model has been proposed to hold best practices found in other models and to optimize software cost estimation. Proposed model gives good result as compared to other considered software cost estimation methods for the defined parameters in overall but it is also dependent on type of project, data and technique used in implementation.
It is a complex task to optimize query as well as to validate the correctness and effectiveness o... more It is a complex task to optimize query as well as to validate the correctness and effectiveness of query optimizer. A query optimizer should estimate and compare the costs of executing a query using different execution strategies and should choose the strategy with the lower cost estimate. To fairly and realistically compare different strategies accurate cost estimation is required. This is a challenging task to measure quality of query optimization as modern query optimizers provide more advanced optimization strategies and adaptive techniques. This paper describes different ways to improve the performance of SQL Server queries, index optimization with occasional references to particular SQL code and how to achieve the best performance for the given tables and queries by giving some tips for query optimization in Microsoft SQL Server. The paper provides a detailed overview of query optimization, Optimization techniques, testing of optimization techniques that are used to validate the query optimizer of Microsoft's SQL Server and issues in query optimization testing.
With the contempo slump and the immutable crush to deliver more services at a lower cost. Deliver... more With the contempo slump and the immutable crush to deliver more services at a lower cost. Delivery model offers lower cost, and can make quick construction services. IT economics are changing rapidly, and large companies, in particular, looking for new ways to secure capital at a lower cost to maintain the viability of the company. Task scheduling problems are first class related to the overall efficiency of cloud computing facilities. Most developed algorithms for automation planning approach in one parameter of quality of service (QoS). However, if we consider more than one QoS parameter then the problem becomes more challenging. To address the problem, we need to introduce a scheduling strategy for multi-workflows with multiple QoS constrained for cloud computing. We need to introduce an optimized algorithm for task scheduling in cloud computing and its implementation. Furthermore, Load Balancing is a method to distribute workload across one or more servers, network interfaces, hard drives, or other computing resources. Use these components with the load balancing, on the one chamber, grow well in redundancy.
Software cost estimation is one of the most challenging tasks in software engineering. For the es... more Software cost estimation is one of the most challenging tasks in software engineering. For the estimation, Function points are useful in the business application software domain and problematic in the real-time software domain. Full Function Points (FFP) are useful for functionality-based estimation, specifically for real-time and embedded software. Functional size measurement method that has user view of functional requirements developed by Common Software Measurement International Consortium (COSMIC) called COSMIC-FFP. By using COSMIC-FFP model, an early prediction of the functional complexity of the software throughout the software development life cycle within given budget constraints, reliability can be done. In this paper, a detailed analysis with process flow of COSMIC-FFP model has been discussed.
The scope of this process covers plan the plan phase as well as execution of the work package aga... more The scope of this process covers plan the plan phase as well as execution of the work package against the work package plan till completion of the work package. All discussions and meetings for status reporting, and project closure are part of the scope of this process which are described in this paper. The purpose of this paper is to elaborate the Project Monitoring and Control process using process flow diagrams, Entry/Exit Criteria, Input/Output and steps or action which are performed at each phase.
Image compression is used to reduce the amount of data required to represent a digital image. The... more Image compression is used to reduce the amount of data required to represent a digital image. The aim of this paper is to analyze the various image compression methods, factors on which image compression techniques are based and examine the performance of image compression using a detailed empirical evaluation of wavelet functions (wavelets), Discrete Cosine Transformation (DCT) and Neural Network (NN) in terms of retained energy, peak signal to noise ratio, output image size etc. Image compression using various techniques has been implemented using MatLab R2013.
Software test cases play an important role in Software Testing. A lot of test cases are developed... more Software test cases play an important role in Software Testing. A lot of test cases are developed while designing test cases that are of no use or redundant in nature. These types of test cases increase the effort and cost involved in testing and results in decrease of software testing efficiency. In this paper, a Genetic Algorithm based methodology has been proposed which can significantly reduce the test suite and leads to optimization of the efficiency of software testing. The testing efficiency can be increased by identifying the most critical paths. The proposed Genetic Algorithm based approach select the software path clusters that are weighted in accordance with the criticality of the path. Proposed algorithm has been evaluated by applying simple Genetic Algorithm and Fitness Scaling on a case study. GA with Fitness scaling helps in improving the efficiency of software testing by effectively reducing software test cases.
Abstract -Agile methodology that utilizes iterative development and prototyping are widely used i... more Abstract -Agile methodology that utilizes iterative development and prototyping are widely used in variety of industry projects as a light weight development method which can satisfy to the changes of requirements. Short iterations are used that are required for efficient product delivery. Traditional software development processes are not much efficient to manage the rapid change in requirements. Despite the advantages of Agile, criticism on agile methodology states that it fails to pay attention to architectural and design issues and therefore is bound to produce small design-decisions. Here, in this paper we identify the impacts that agile methodology has on software development processes with respect to quality within the organizational, methodical, and cultural framework. He has guided many research scholars and M.Tech scholars. His areas of specialization are Software Engineering and Computer Graphics. He has done many projects sanctioned from UGC.He has more than 18 years teaching experience.
This study investigates the characteristic of the sorting algorithms with reference to number of ... more This study investigates the characteristic of the sorting algorithms with reference to number of comparisons made for the specific number of elements. Sorting algorithms are used by many applications to arrange the elements in increasing/decreasing order or any other permutation. Sorting algorithms, like Quick Sort, Merge Sort, Heap Sort, Insertion Sort, Bubble Sort etc. have different complexities depending on the number of elements to sort. The purpose of this investigation is to determine the number of comparisons, number of swap operations and after that plotting line graph for the same to extract values for polynomial equation. The values a, b and c got is then used for drawing parabola graph. The study concludes what algorithm to use for a large number of elements. For larger arrays, the best choice is Quick sort, which uses recursion method to sort the elements and leads to faster results. Least square method and Matrix inversion method is used to get the value of constants a, b and c for each polynomial equation of sorting algorithms. After calculating the values, Graph is drawn for each sorting algorithm for the polynomial equation i.e. Y=AX 2 + BX + C or Y=AX lgX + BX + C.
Software cost estimation is one of the most challenging tasks in software engineering. Over the p... more Software cost estimation is one of the most challenging tasks in software engineering. Over the past years the estimators have used parametric cost estimation models to establish software cost, however the challenges to accurate cost estimation keep evolving with the advancing technology. A detailed review of various cost estimation methods developed so far is presented in this paper. Planned effort and actual effort has been comparison in detail through applying on NASA projects. This paper uses Back-Propagation neural networks for software cost estimation. A model based on Neural Network has been proposed that takes KLOC of the project as input, uses COCOMO model parameters and gives effort as output. Artificial Neural Network represents a complex set of relationship between the effort and the cost drivers and is a potential tool for estimation. The proposed model automates the software cost estimation task and helps project manager to provide fast and realistic estimate for the project effort and development time that in turn gives software cost.
Handwritten character recognition is a difficult problem due to the great variation in writing styles and character sizes. Multiple handwriting styles from different persons are considered in this work. An image with a higher resolution will certainly take much longer to process than a lower-resolution one. In practical image acquisition systems and conditions, shape distortion is common, because different people's handwriting gives the characters different shapes. The recognition process in this work is divided into two phases. In the first phase, image preprocessing is performed: the image is first converted into binary form using a threshold value obtained through Otsu's method, and noise is then removed using a median filter. Feature extraction follows, carried out here with the Fourier descriptor method using the Fourier transform, and the correlation between templates built from the training data and the test data is obtained. A multilayer feed-forward neural network is then created and trained with the back-propagation algorithm. After training, testing is done to match patterns against the test data. Results for various convergence objectives of the neural network are obtained and analyzed.
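The binarization step named above can be made concrete. Otsu's method picks the grayscale threshold that maximizes the between-class variance of the two pixel populations (ink and paper); the sketch below, on a flattened list of 8-bit pixels with made-up values, shows the standard histogram-based formulation.

```python
def otsu_threshold(pixels):
    """Return the 0-255 threshold maximizing between-class variance (Otsu)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    sum_b = w_b = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]                     # weight of the "background" class
        if w_b == 0:
            continue
        w_f = total - w_b                  # weight of the "foreground" class
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                  # class means
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# a bimodal "image": dark ink pixels near 30, light paper pixels near 220
pixels = [28, 30, 32, 30, 29] * 20 + [218, 220, 222, 221, 219] * 20
t = otsu_threshold(pixels)
binary = [1 if p > t else 0 for p in pixels]   # 1 = paper, 0 = ink
```

On such clearly bimodal data the chosen threshold falls between the two clusters, so binarization separates ink from paper exactly; the median filter and Fourier-descriptor stages would then operate on this binary image.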
Preprocessing techniques are the first step in a character recognition system. This paper deals with the various preprocessing techniques involved in character recognition for different kinds of images, ranging from simple handwritten form-based documents to documents with colored, complex backgrounds and varied intensities. Here we discuss all the important preprocessing techniques: skew detection and correction, image enhancement through contrast stretching, binarization, noise removal, normalization, segmentation, and morphological processing.
Software engineering aims to produce a quality software product that is delivered on time, within the allocated budget, and with the requirements expected by the customer; unfortunately, this goal is rarely achieved. A software life cycle is the series of identifiable stages that a software product undergoes during its lifetime. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. This research is concerned with the methodologies that examine the life cycle of software through development models, known as software development life cycle (SDLC) models. We present traditional models, i.e. Waterfall, Iterative and Spiral, as well as modern development methodologies such as Agile methodologies (including Extreme Programming, Scrum and Feature Driven Development) and component-based software development. All of these models have advantages as well as disadvantages. Therefore, the main objective of this research is to present the different software development models by showing the good and bad practices of each; a comparative analysis of traditional as well as modern methodologies is made.
Features play a very important role in the area of image processing. Before features are obtained, various image preprocessing techniques such as binarization, thresholding, resizing and normalization are applied to the sampled image. Feature extraction techniques are then applied to obtain features that are useful in classifying and recognizing images. Feature extraction techniques are helpful in various image processing applications, e.g. character recognition. As features define the behavior of an image, they determine its footprint in terms of storage taken, efficiency in classification and, of course, time consumption. In this paper we discuss various types of features and feature extraction techniques, and explain which feature extraction technique works better in which scenario. Throughout, we refer to features and feature extraction methods in the context of character recognition applications.
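One of the simplest character-recognition features of the kind surveyed here is zoning: split the binary image into a grid of blocks and use each block's foreground density as one feature. A minimal sketch on a made-up 4x4 "character":

```python
def zoning_features(img, zones=2):
    """Split a binary image into zones x zones blocks and return each
    block's foreground (ink) density as one feature."""
    rows, cols = len(img), len(img[0])
    zr, zc = rows // zones, cols // zones  # block height and width
    feats = []
    for zi in range(zones):
        for zj in range(zones):
            block = [img[r][c]
                     for r in range(zi * zr, (zi + 1) * zr)
                     for c in range(zj * zc, (zj + 1) * zc)]
            feats.append(sum(block) / len(block))
    return feats

# a tiny 4x4 binary image whose "ink" fills the left half
img = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [1, 1, 0, 0],
       [1, 1, 0, 0]]
features = zoning_features(img)   # -> [1.0, 0.0, 1.0, 0.0]
```

The resulting fixed-length vector is cheap to store and compare, which illustrates the storage/efficiency trade-off noted above: richer descriptors (e.g. Fourier descriptors) capture more shape information at higher computational cost.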
Image compression is used to reduce the amount of data required to represent a digital image. The aim of this paper is to analyze the various image compression methods and the factors on which image compression techniques are based, and to examine the performance of image compression through a detailed empirical evaluation of wavelet functions (wavelets), the Discrete Cosine Transform (DCT) and Neural Networks (NN) in terms of retained energy, peak signal-to-noise ratio, output image size, etc. Image compression using the various techniques has been implemented in MATLAB R2013.
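Peak signal-to-noise ratio, one of the evaluation criteria named above, has a short standard definition: PSNR = 10 log10(MAX^2 / MSE), where MAX is the peak pixel value (255 for 8-bit images). A minimal sketch on flattened pixel lists with illustrative values:

```python
import math

def mse(original, compressed):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((o - c) ** 2 for o, c in zip(original, compressed)) / len(original)

def psnr(original, compressed, max_val=255):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    m = mse(original, compressed)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)

orig = [0, 64, 128, 192, 255]
off_by_one = [1, 65, 129, 193, 254]     # every pixel off by exactly 1
value = psnr(orig, off_by_one)          # MSE = 1, so about 48.13 dB
```

A lossless reconstruction gives MSE = 0 and hence infinite PSNR, while typical lossy DCT or wavelet compression lands in roughly the 30-50 dB range, which is what the paper's comparisons report against output size and retained energy.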
Component Based Software Engineering (CBSE) constructs a quality software system by reusing existing components. Reusability plays an important role in the construction of a high-quality software system. Software components should be designed and implemented in such a way that many different programs can reuse them. Reuse of software can increase the productivity and quality of software by reducing the effort, time and cost spent in designing and developing reusable software components. In this paper, a neuro-fuzzy model has been proposed that uses software component design patterns for analysis and the Chidamber and Kemerer (CK) metrics for evaluation, optimization and categorization of reusability in component-based software. The work is divided into two phases. In the first phase, reusability is analyzed, optimized and empirically evaluated with high precision using the CK metrics and an unsupervised Self-Organizing Map (SOM) neural network. In the second phase, reusability is categorized as very low, low, medium, high or very high using a supervised back-propagation neural network (BPNN) and fuzzy inference rules applied to the CK metric values. The proposed model may help a software designer to evaluate and optimize the reusability of components while designing software, so as to build a quality software system.
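The five-level fuzzy categorization in the second phase can be illustrated with triangular membership functions over a normalized reusability score. The score range and the membership-function breakpoints below are assumptions for illustration, not the paper's calibrated values; a real system would derive the score from the CK metric values via the trained BPNN.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# illustrative breakpoints over a reusability score normalized to [0, 1]
LEVELS = {
    "very low":  (-0.25, 0.00, 0.25),
    "low":       (0.00, 0.25, 0.50),
    "medium":    (0.25, 0.50, 0.75),
    "high":      (0.50, 0.75, 1.00),
    "very high": (0.75, 1.00, 1.25),
}

def categorize(score):
    """Return the level with the highest membership degree for the score."""
    return max(LEVELS, key=lambda lvl: tri(score, *LEVELS[lvl]))
```

For example, a score of 0.5 has full membership in "medium", while 0.95 belongs mostly to "very high" with partial membership in "high"; the max-membership rule simply reports the dominant label.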
The goal of Component Based Software Engineering (CBSE) is to deliver high-quality, more reliable and more maintainable software systems in a shorter time and within a limited budget by reusing and combining existing quality components. A high-quality system can be achieved by using quality components together with a framework and an integration process, each of which plays a significant role. Consequently, the techniques and methods used for quality assurance and assessment of a component-based system differ from those of traditional software engineering methodology. In this paper, we present a model for optimizing the Chidamber and Kemerer (CK) metric values of component-based software. A deep analysis of a series of CK metrics of software component design patterns is performed and metric values are drawn from them. Using an unsupervised neural network, the Self-Organizing Map, we propose an optimized model for component-based software engineering based on reusability that depends on the CK metric values. Average, standard-deviation and optimized values of the CK metrics are compared and evaluated to show the optimized reusability of the component-based model. Index Terms – Chidamber and Kemerer (CK) metrics; Component Based Software Engineering (CBSE); Neural Network (NN); Self-Organizing Map (SOM).
It is a complex task to optimize a query, as well as to validate the correctness and effectiveness of a query optimizer. A query optimizer should estimate and compare the costs of executing a query using different execution strategies and choose the strategy with the lowest cost estimate. To compare different strategies fairly and realistically, accurate cost estimation is required. Measuring the quality of query optimization is challenging, as modern query optimizers provide increasingly advanced optimization strategies and adaptive techniques. This paper describes different ways to improve the performance of SQL Server queries, covering index optimization with occasional references to particular SQL code, and explains how to achieve the best performance for given tables and queries through tips for query optimization in Microsoft SQL Server. The paper provides a detailed overview of query optimization, optimization techniques, the testing of optimization techniques used to validate the query optimizer of Microsoft's SQL Server, and issues in query optimization testing.
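The core index-optimization tip (an equality filter on an unindexed column forces a full table scan; adding an index turns it into a targeted seek) can be demonstrated by inspecting the plan before and after creating the index. SQLite stands in for SQL Server here so the sketch stays self-contained, and the table, column and index names are made up for illustration; in SQL Server the equivalent inspection would use the graphical or SHOWPLAN execution plan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY,"
            " customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # the last column of each EXPLAIN QUERY PLAN row describes the access path
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # reports a full table SCAN
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # reports a SEARCH using idx_orders_customer
```

The same before/after comparison is also a simple way to test an optimizer: if the plan does not change after an obviously useful index is added, either the cost model or the statistics feeding it deserve scrutiny.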
Papers by Gaurav Gupta