Papers by International Journal of Scientific Research in Science, Engineering and Technology IJSRSET
International Journal of Scientific Research in Science, Engineering and Technology, 2020
In the fast-evolving financial industry, predictive modeling has emerged as an essential tool for strategic decision-making and risk assessment. Traditional data warehouses, however, often lack the agility required to support these complex, data-intensive predictive processes. With the integration of Artificial Intelligence (AI), finance data warehouses are undergoing a paradigm shift. This paper examines how AI-driven approaches are enhancing the predictive modeling capabilities of finance data warehouses, focusing on advanced data processing, machine learning algorithms, and real-time data analytics. By analyzing AI's role in this transformation, the article provides insights into how finance organizations can leverage AI-powered data warehouses to improve accuracy in predictions, streamline data handling, and accelerate decision-making processes. This work contributes to the ongoing discussion on AI's transformative potential in the financial sector, aiming to inform and guide future innovations.
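As a minimal sketch of the predictive workflow described above, a rolling-window forecast over monthly figures pulled from a (hypothetical) finance warehouse might look like the following; the function name and revenue values are illustrative, not taken from the paper:

```python
# Minimal sketch: rolling-window forecast over monthly revenue figures
# from a hypothetical finance data warehouse.

def rolling_forecast(series, window=3):
    """Predict the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

revenue = [120.0, 132.0, 128.0, 141.0, 150.0, 158.0]
print(rolling_forecast(revenue, window=3))  # mean of the last three months
```

In a production warehouse this forecasting role would be played by trained machine learning models rather than a simple mean, but the windowed access pattern over warehouse data is the same.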
International Journal of Scientific Research in Science, Engineering and Technology, 2024
Integrating multimodal data, such as medical imaging, electronic health records (EHRs), and genomic data, is critical for comprehensive healthcare diagnostics. However, these data sources' heterogeneity and high dimensionality present challenges in developing robust and accurate diagnostic models. This paper proposes a hybrid deep learning architecture that combines Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Transformer models to achieve efficient multimodal data fusion for healthcare diagnostics. The proposed architecture leverages CNNs for extracting spatial features from image data, RNNs for capturing temporal dependencies in sequential data, and Transformers for cross-modality attention and fusion. A comprehensive evaluation on benchmark healthcare datasets, such as MIMIC-III, ChestX-ray14, and UK Biobank, demonstrates the model's superior diagnostic accuracy, interpretability, and generalization compared to existing methods. This study highlights the potential of hybrid deep learning architectures for improving diagnostic precision, enabling early disease detection, and facilitating personalized treatment strategies in real-world clinical settings. Future work will focus on enhancing model interpretability and reducing computational complexity for more practical deployment.
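A conceptual sketch of the fusion step described above, with stand-in functions in place of trained CNN, RNN, and Transformer encoders (all names and values are purely illustrative, not from the paper):

```python
# Conceptual sketch of late fusion for multimodal diagnostics: each modality
# is reduced to a fixed-length feature vector by its own encoder (a CNN for
# images, an RNN for time series in the paper's design), then the vectors
# are concatenated into one joint representation for a shared classifier.
# The encoders here are stand-in functions, not trained networks.

def encode_image(pixels):          # stand-in for a CNN feature extractor
    return [sum(pixels) / len(pixels), max(pixels)]

def encode_sequence(vitals):       # stand-in for an RNN over time steps
    return [vitals[-1], vitals[-1] - vitals[0]]

def fuse(*feature_vectors):        # late fusion by concatenation
    fused = []
    for v in feature_vectors:
        fused.extend(v)
    return fused

image_feats = encode_image([0.1, 0.9, 0.5, 0.3])
seq_feats = encode_sequence([98.6, 99.1, 100.4])
print(fuse(image_feats, seq_feats))  # one joint feature vector
```

The paper's Transformer component would replace plain concatenation with learned cross-modality attention; the sketch only shows where the modalities meet.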
International Journal of Scientific Research in Science, Engineering and Technology, 2023
This research examines the use of AWS technology with SAP applications, with specific attention to business continuity and disaster recovery. Special emphasis is placed on the advantages of AWS solutions: elasticity, automated backup and scaling, and global redundancy, which render traditional high-availability and disaster-recovery solutions obsolete. Challenges such as integration complexity and the cost of integration are highlighted, together with their solutions. The work shows how AWS tools enhance the efficiency, extensibility, and fault tolerance of SAP systems in order to maintain business and operational sustainability, and illustrates the numerous benefits of integrating AWS technologies to modernize SAP environments and harden them against vulnerabilities.
International Journal of Scientific Research in Science, Engineering and Technology, 2023
Customer-Managed Encryption Keys (CMEK) offer a solution that allows customers to retain control over their encryption keys while leveraging cloud infrastructure. This research paper analyzes the cryptographic techniques, performance considerations, security challenges, and compliance requirements associated with CMEK, and explores its architecture, implementation, and implications across the various cloud service models. The research also delves into advanced concepts and future directions, including homomorphic encryption and blockchain-based key management.
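The envelope-encryption pattern that underlies CMEK can be sketched as follows; XOR stands in for a real cipher purely for illustration, and all names are hypothetical:

```python
# Sketch of envelope encryption behind CMEK: the cloud service encrypts data
# with a data-encryption key (DEK), and the DEK itself is wrapped by a
# customer-managed key (KEK) that stays in the customer's key-management
# system. XOR is a toy stand-in for a real cipher -- never use it as
# actual encryption.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

kek = secrets.token_bytes(32)            # customer-managed key
dek = secrets.token_bytes(32)            # per-object data key

ciphertext = xor_bytes(b"account ledger", dek)   # data encrypted with DEK
wrapped_dek = xor_bytes(dek, kek)                # DEK wrapped by the KEK

# Decryption path: unwrap the DEK with the KEK, then decrypt the data.
recovered_dek = xor_bytes(wrapped_dek, kek)
print(xor_bytes(ciphertext, recovered_dek))      # b'account ledger'
```

The design point is that revoking or rotating the KEK renders every wrapped DEK, and hence the data, unreadable without the cloud provider ever holding the customer's key.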
International Journal of Scientific Research in Science, Engineering and Technology, 2022
SAP data center migrations are the subject of this paper, with special attention to methods, problems, and opportunities. The migration strategies cover planning, data transfer techniques, and the application of advanced tools and technology. Siemens, The Coca-Cola Company, and Shell serve as examples to explain how the approaches work and the results achieved. Common problems, such as data accuracy and system failure, are discussed alongside pertinent solutions and designs. The paper also outlines the positive business effects, such as improved performance and cost effectiveness, and describes long-term effects such as scalability and flexibility. Trends and technologies such as cloud computing and AI are also covered to understand the future of SAP migrations.
International Journal of Scientific Research in Science, Engineering and Technology, 2021
This research paper investigates the application of window-based refresh strategies to enhance the performance of data extracts in large-scale data management systems. Traditional extract, transform, load (ETL) processes often struggle with the increasing volume and velocity of data in modern environments. Window-based refresh strategies offer a promising solution by focusing on specific subsets of data during each refresh cycle. This study examines various window-based techniques, including time-based, size-based, and hybrid approaches, and evaluates their effectiveness in improving extract performance. Through extensive analysis and empirical testing, we demonstrate that window-based strategies can significantly reduce processing time and resource utilization while maintaining data consistency and integrity. The paper also explores optimization techniques, challenges, and future research directions in this field.
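The time-based variant of the windowing idea can be sketched in a few lines; the table, column names, and high-water-mark value are illustrative, not from the paper:

```python
# Sketch of a time-based refresh window: instead of reloading the full
# table, each cycle re-extracts only the rows whose timestamp falls inside
# the current window (here, everything since the last high-water mark).
from datetime import datetime

rows = [
    {"id": 1, "updated": datetime(2021, 3, 1)},
    {"id": 2, "updated": datetime(2021, 3, 5)},
    {"id": 3, "updated": datetime(2021, 3, 9)},
]

def extract_window(table, high_water_mark):
    """Return only the rows changed after the last successful refresh."""
    return [r for r in table if r["updated"] > high_water_mark]

changed = extract_window(rows, datetime(2021, 3, 4))
print([r["id"] for r in changed])  # [2, 3]
```

A size-based window would bound the row count per cycle instead of the time span; the hybrid approaches the paper evaluates combine both limits.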
International Journal of Scientific Research in Science, Engineering and Technology, 2020
This article examines the optimization of telemetry data processing pipelines for massively multiplayer gaming platforms. Since the volume, velocity, and variety of gameplay data continue to increase, real-time data handling must be optimized for the sake of system performance and player experience. Drawing on MPI, Apache Spark, and machine learning models, the work identifies approaches for predictive analytics and real-time data processing. It examines how cloud environments address fault tolerance and proposes different ways of collecting and processing data and of deploying models. Future advances in AI and edge computing are also expected to address problems with data privacy, latency, and scalability.
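The core reduction such a pipeline performs can be sketched in pure Python; event names, timestamps, and the window size are illustrative, and in production this role would be played by a stream processor such as Spark:

```python
# Sketch of the windowed aggregation a telemetry pipeline performs before
# gameplay events reach storage or a model: events are bucketed into
# fixed-size time windows and reduced to per-window counts.
from collections import Counter

def windowed_counts(events, window_seconds):
    """events: iterable of (timestamp_seconds, event_type) pairs."""
    counts = Counter()
    for ts, kind in events:
        bucket = int(ts // window_seconds) * window_seconds
        counts[(bucket, kind)] += 1
    return dict(counts)

events = [(1.2, "kill"), (3.8, "kill"), (5.1, "chat"), (6.0, "kill")]
print(windowed_counts(events, window_seconds=5))
```

Pre-aggregating near the source in this way is what keeps volume and velocity manageable downstream, which is the optimization concern the article raises.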
International Journal of Scientific Research in Science, Engineering and Technology, 2019
This research paper explores the critical domain of automated testing for Distributed Shared Memory (DSM) systems in multi-processor environments. As the complexity of multi-core and distributed computing systems continues to grow, ensuring the reliability and performance of DSM implementations becomes increasingly challenging. This study investigates various automated testing strategies, including test generation techniques, fault injection mechanisms, and concurrency detection methods. It also examines automated test execution frameworks, real-time monitoring solutions, and advanced verification and validation techniques. The research highlights the challenges faced in DSM testing, such as scalability issues and non-determinism, and proposes future directions for research, including the integration of artificial intelligence and cloud-based testing platforms. The findings of this study contribute to the advancement of DSM testing methodologies and provide valuable insights for both researchers and practitioners in the field of distributed systems and parallel computing.
International Journal of Scientific Research in Science, Engineering and Technology, Jun 8, 2019
ERP systems are critical in the administration of corporate activities and handle huge volumes of transactional and operational information. However, because ERP systems combine many operations of an organization into one system, they are prone to anomalies arising from erroneous data input or even hacking, causing operational insecurity and loss. This work seeks to understand how Artificial Intelligence (AI) and Machine Learning (ML) can be used to detect abnormalities in ERP systems. Conventional methods of anomaly detection do not allow for detailed recognition and handling of complex patterns; thus, AI and ML are well suited for dynamic systems. The paper discusses different forms of anomalous situations in ERP systems and examines the potential of different learning techniques for increasing the effectiveness of anomaly identification. The framework presented provides for the incorporation of ML-based anomaly recognition into ERP systems to optimize operational efficiency as well as real-time error identification.
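The simplest form of the detection task can be sketched with a statistical flagging rule; the invoice values are illustrative, and the paper's framework would use learned models rather than this z-score baseline:

```python
# Minimal sketch of anomaly flagging on ERP transaction amounts: values
# more than `threshold` standard deviations from the mean are flagged.
# Learned models (e.g. isolation forests or autoencoders) replace this
# z-score rule in the ML-based framework the paper describes.
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=3.0):
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

invoices = [120, 135, 128, 119, 131, 5000]  # one suspicious posting
print(flag_anomalies(invoices, threshold=2.0))  # [5000]
```

The advantage of learned models over such a rule is exactly what the abstract argues: they can capture complex multi-field patterns (vendor, amount, timing) that a single-variable threshold cannot.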
International Journal of Scientific Research in Science, Engineering and Technology, 2024
This study explores the extent to which SmartPM's AI-driven analytics can predict and mitigate impending schedule delays on complex infrastructure projects. Based on exhaustive case-study research and associated data, the study asserts that AI-driven predictive analytics can significantly improve project outcomes by surfacing potential delays before they occur and by providing actionable insight into possible mitigation strategies.
International Journal of Scientific Research in Science, Engineering and Technology, 2023
Due to the quick development of multimedia technology, an increasing amount of multimedia data is being generated and transmitted over the internet. The digital image is one form of multimedia data, and it can take different formats and sizes. Hence the main objective of image compression: transmitting or storing the data in an effective manner by removing redundant or irrelevant information without significant loss of visual quality. Image compression is utilized in various services, including TV broadcasts, satellite imagery, military communications, webinars, medical imaging, weather reporting, etc. This paper provides a survey of the main techniques in image compression, covering both lossy and lossless approaches. Lossy compression techniques lose some of the image details during compression, while lossless compression techniques preserve the image information completely. Vector and scalar quantization, transform coding, block truncation coding, etc. are examples of lossy approaches; run-length coding, entropy encoding, statistical coding, etc. are examples of lossless techniques. This paper will assist researchers in learning about these techniques and choosing the appropriate ones for their work.
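Run-length coding, one of the lossless techniques the survey lists, is simple enough to sketch directly; the pixel row is illustrative:

```python
# Run-length coding in its simplest form: replace each run of identical
# pixel values with a (value, count) pair. It is effective on images with
# large uniform areas, and the decode step recovers the input exactly,
# which is what makes the technique lossless.

def rle_encode(pixels):
    encoded = []
    for p in pixels:
        if encoded and encoded[-1][0] == p:
            encoded[-1][1] += 1
        else:
            encoded.append([p, 1])
    return encoded

def rle_decode(encoded):
    return [p for p, count in encoded for _ in range(count)]

row = [255, 255, 255, 0, 0, 255]
packed = rle_encode(row)
print(packed)                         # [[255, 3], [0, 2], [255, 1]]
assert rle_decode(packed) == row      # lossless round trip
```

The lossy techniques in the survey (quantization, transform coding) instead discard information the eye is unlikely to miss, which is why they achieve higher ratios at the cost of exact recovery.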
International Journal of Scientific Research in Science, Engineering and Technology
This paper discusses the security of data in cloud computing: a study of data in the cloud and the security aspects related to it. The paper goes into the details of data protection methods and approaches used throughout the world to ensure maximum data protection by reducing risks and threats. Availability of data in the cloud is beneficial for many applications, but it poses risks by exposing data to applications which might already have security loopholes in them. Similarly, the use of virtualization for cloud computing might put data at risk when a guest OS is run over a hypervisor without knowing the reliability of the guest OS, which might have a security loophole in it. The paper also provides insight on data security aspects for data in transit and data at rest. The study covers all the levels of SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service).
International Journal of Scientific Research in Science, Engineering and Technology, 2024
This study investigates the creation of copper nanoparticles (CuNPs) using chemical reduction and laser ablation in liquid (LASIS). UV-visible spectroscopy is used to examine the optical characteristics of the nanoparticles created by these techniques. The purpose of the study is to compare the stability, efficacy, and particle size of CuNPs produced using different techniques. When comparing the LASIS method to the chemical reduction process, Transmission Electron Microscopy (TEM) examination revealed that the former produced smaller and more uniform nanoparticles. This work demonstrates the effectiveness of both synthesis techniques, with LASIS clearly outperforming the other in the production of superior CuNPs with more control over particle size and dispersion. A thorough explanation of the chemical reduction method and LASIS used in the synthesis of copper nanoparticles is provided, and UV-visible spectroscopy is used to characterize the resulting particles.
This research analyzes the role of the protocol department in facilitating the activities of regional heads in Central Buton District, as well as identifying the factors that hinder its effectiveness. The research method employed is descriptive qualitative, detailing findings through interviews, observations, and document analysis. The results indicate that the protocol department of the Central Buton District Secretariat has not fully played its expected role in facilitating the activities of the regent. While the protocol department is able to maintain the government's image by placing the regent appropriately at events held within the internal environment, it faces challenges in positioning the regent favorably at events organized by other regional governments. The subpar performance of the protocol department is also evident in the frequent tardiness of the regent to event venues or other meetings, causing delays in the commencement of activities. These delays are attributed to the regent's busy schedule, which the protocol department has not effectively managed. Furthermore, the constraints faced by the protocol department in facilitating the activities of regional heads in Central Buton District include: lack of employee compliance with work discipline, unhealthy competition among employees, inadequate evaluation systems, and lenient sanctions against offenders.
International Journal of Scientific Research in Science, Engineering and Technology, 2024
Recent advancements in technology have enabled the storage of voluminous data. As this data is abundant, there is a need to create summaries that would capture the relevant details of the original source. Since manual summarization is a very taxing process, researchers have been actively trying to automate this process using modern computers that could try to comprehend and generate natural human language. Automated text summarization has been one of the most researched areas in the realm of Natural Language Processing (NLP). Extractive and abstractive summarization are two of the most commonly used techniques for generating summaries. In this study, we present a new methodology that takes the aforementioned summarization techniques into consideration and based on the input, generates a summary that is seemingly better than that generated using a single approach. Further, we have made an attempt to provide this methodology as a service that is deployed on the internet and is remotely accessible from anywhere. This service provided is scalable, fully responsive, and configurable. Next, we also discuss the evaluation process through which we came up with the best model out of many candidate models. Lastly, we conclude by discussing the inferences that we gained out of this study and provide a brief insight into future directions that we could explore.
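The extractive half of such a pipeline can be sketched with a word-frequency scorer; the sample document is illustrative, and the paper's service also runs an abstractive model, which this toy example does not attempt:

```python
# Sketch of extractive summarization: score each sentence by the frequency
# of its words across the document and keep the top-scoring sentences.
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    return " ".join(scored[:n_sentences])

doc = ("Summarization condenses text. Summarization saves reading time. "
       "The sky is blue.")
print(extractive_summary(doc, n_sentences=1))
# → Summarization saves reading time.
```

An abstractive model would instead generate new sentences; the hybrid approach the study proposes chooses between (or combines) the two based on the input.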
Aim: In the present study, the relationship between a tumor marker panel and breast cancer in an Iraqi population was investigated.
Patients and Methods: 100 women with breast cancer and 100 healthy controls were included in the study. Serum samples from all patients and controls were assayed for CEA, CA 15-3, CA 27-29, BRCA1, and BRCA2.
Results: Mean serum values of CA 15-3, CA 27-29, CEA, BRCA1, and BRCA2 were significantly higher in women with breast cancer than in controls. Odds ratio and relative risk confirmed the association between serum increases of the five markers and breast cancer. The AUC of the ROC curves indicated the high sensitivity of their determination in breast cancer.
Conclusion: The present study shows evidence that simultaneous determination of serum CA 15-3, CA 27-29, and CEA provides potential markers for early diagnosis of breast cancer metastasis and for treatment monitoring.
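The AUC statistic used above has a direct probabilistic reading: it is the probability that a randomly chosen case scores higher on the marker than a randomly chosen control. A pairwise sketch (ties counted as 0.5; the marker values below are illustrative, not study data):

```python
# AUC of the ROC curve computed directly from its probabilistic definition:
# the fraction of (case, control) pairs in which the case scores higher,
# with ties counted as half.

def auc(case_scores, control_scores):
    wins = sum(
        1.0 if c > h else 0.5 if c == h else 0.0
        for c in case_scores
        for h in control_scores
    )
    return wins / (len(case_scores) * len(control_scores))

marker_cases = [38.0, 45.0, 52.0, 30.0]      # illustrative serum values
marker_controls = [18.0, 22.0, 25.0, 31.0]
print(auc(marker_cases, marker_controls))    # 0.9375
```

An AUC near 1.0 means the marker separates cases from controls almost perfectly; 0.5 means it is no better than chance.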
In the Pithampur industrial area there is no proper drainage system, which may become a major cause of drinking water contamination in the future as the number of industries increases day by day. The industries also have residential areas that generate domestic waste, and the discharge was released untreated because of the absence of a drainage system.
A study was conducted to determine the water quality of surface and ground water sources near a busy industrial area at Indore. Buddhi, et al. (2009) reported that the ground water quality of this industrial area had deteriorated. Similar observations were found in the present study, as substantiated by the values of hardness, chlorides, and alkalinity.
The major problems indicated by the residents were health issues and the decline in crop yield.
Finally, it was concluded that the ground water in sector no. 1 of the Pithampur industrial area was hard and had a higher concentration of chlorides, though the surface water had relatively less hardness and chloride content. The overall quality indicated that these waters were within the permissible limits and could be utilized for drinking after suitable treatment.
This paper discusses the most important roles of the smart card, the smart card reader/writer, and the microcontroller unit. A smart card is a chip card embedded with a computer chip that stores and transacts data. This data is associated with value, information, or both, and is stored and processed within the card's chip. The card data is transacted through a reader that is part of a computing system. The design of a unique ID card for personal transactions means that one ID card is used for different applications such as attendance, access control, money transactions, etc. This development is very helpful for the public in daily life.
In the present study, 2.5% (by weight) SiC particles were incorporated into 6351 aluminium alloy by Friction Stir Processing (FSP) to form particulate composite layered materials. Samples were subjected to constant rotational and traverse speeds of the FSP tool, with and without SiC reinforcements. Samples were then machined by friction stir machining on a CNC machine at constant rotational and traverse speed using HSS tools of 16 mm and 18 mm diameter. Microstructural observations of the modified surfaces were carried out by optical microscopy. Mechanical properties were evaluated by hardness testing on a Vickers hardness tester and by surface roughness values measured on a surface roughness tester.
Aim: To determine the frequency rate of breast cancer in different age groups.
Patients and methods: A total of 148 breast cancer cases were included in the study. The analysis of variables was performed with stratification at 10-year age intervals.
Results: The age distribution of breast cancer indicated that 20.3% of cases were aged ≤20 years and 14.9% were aged 16-18 years. In addition, more than half of the cases (52.7%) were under 30 years of age. Furthermore, 79.7% of breast cancer cases were in women ≤40 years of age; only 5.4% of cases were older than 45 years. The odds ratio confirmed a significant association between age and breast cancer development in our study cohort. The highest frequency was in women aged 21-30 years, followed by those aged 31-40 years. The age of women with breast cancer significantly influenced the mean serum values of CEA and ER, whether the analysis was performed on group or individual stratification. In addition, the mean serum level of P53 in women with breast cancer was significantly different when the analysis was performed on individual stratification; however, no significant difference was found between age groups. The same pattern was demonstrated for CA 27-29 and PR.
Conclusion: Age at diagnosis was about two decades earlier than that reported in Western countries.
In this paper, experiments were conducted using the Taguchi methodology with design-of-experiments (DOE) techniques, and a mathematical model was developed to predict material removal rate and average surface roughness from input parameters such as pulse-on time (Ton), pulse-off time (Toff), current (I), and voltage (V).
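In Taguchi analysis, trial responses are typically summarized by a signal-to-noise ratio before effects are compared; for a maximized response such as material removal rate, the larger-the-better form applies. A sketch with illustrative values (not the paper's measured data):

```python
# Taguchi's larger-the-better signal-to-noise ratio, the usual criterion
# when maximizing a response such as material removal rate across DOE
# trial replicates:
#   S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
import math

def sn_larger_is_better(responses):
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / y**2 for y in responses) / n)

mrr_trials = [12.5, 13.1, 12.8]  # replicate MRR readings (illustrative)
print(round(sn_larger_is_better(mrr_trials), 2))
```

For a minimized response such as surface roughness, the smaller-the-better form swaps 1/y² for y²; the factor level with the highest mean S/N is then taken as optimal.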