In today's information world, large amounts of data are collected and analyzed every day. Cloud computing is the best-known model for supporting large and complex data. Organizations are moving toward cloud computing to benefit from its cost reduction and elasticity, but cloud computing carries potential risks and vulnerabilities, and one of the major obstacles to adoption is its security and privacy concerns. Encryption alone is not a complete solution for securing data stored in the cloud, so we propose a method that combines encryption with data deduplication to enhance the privacy of data in the cloud. Data deduplication is a specialized data compression technique that eliminates duplicate copies of repeating data in storage. In turn, this technique saves the cost and time overhead involved in redundantly accessing and processing data compared to normal operations.
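The combination of encryption and deduplication described above typically relies on convergent encryption: the key is derived from the content itself, so identical plaintexts always yield identical ciphertexts, which the cloud can then deduplicate without seeing the data. A minimal sketch, using a toy XOR keystream (for illustration only, not the papers' actual cipher):

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    # Toy SHA-256 counter-mode keystream -- illustration only,
    # not a real encryption scheme.
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    """The key is the hash of the plaintext, so identical files
    always produce identical ciphertexts."""
    key = hashlib.sha256(data).digest()
    ciphertext = bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))
    return key, ciphertext

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ciphertext, _keystream(key, len(ciphertext))))

# Two users encrypting the same file obtain the same ciphertext,
# so the cloud can deduplicate it without learning the plaintext.
k1, c1 = convergent_encrypt(b"quarterly report")
k2, c2 = convergent_encrypt(b"quarterly report")
assert c1 == c2
assert convergent_decrypt(k1, c1) == b"quarterly report"
```

The design trade-off is that determinism enables deduplication but also leaks content equality, which is why several of the schemes below add ownership proofs or popularity-based protection on top.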
IAEME PUBLICATION, 2020
In recent years, both businesses and private individuals have used cloud storage providers to outsource their data to a cloud environment. Cloud computing uses the various resources available on the internet to offer many services to data holders. Data breach incidents make end-to-end encryption necessary to protect data from intruders, but existing deduplication schemes do not work well on encrypted data and also suffer from security issues, since semantically secure encryption defeats the storage-optimization techniques used to deduplicate cloud data; schemes based on data ownership challenges, deduplication, and proxy re-encryption address this. Our proposed method introduces the concept of data popularity: data known to many users does not require protection as strong as unpopular data. We extend the original scheme by focusing on data popularity, analyze the efficiency of the proposed scheme, and highlight its functionality clearly. The efficiency of the system is analyzed on the properties of a real dataset, with a clear evaluation of the system. The deduplication scheme focuses in particular on the handling of sensitive decryption shares and popular data in cloud storage. The experimental results show that the proposed scheme is secure under the Symmetric External Diffie-Hellman (SXDH) assumption in the random oracle model.
Most of the IT industry is heading toward cloud-based storage services, but cloud storage is known to be weak in security and privacy in public cloud environments. To tackle these security challenges, we propose a new client-side deduplication scheme for secure data storage and sharing among cloud users through the public cloud. In our proposal, the owner of a file encrypts the data he intends to upload to the cloud with a per-data key, so data access is controlled by the data owner together with a log file that records the retrieval rights of cloud users; an authorised user can decipher an encrypted file with his private key.
Nowadays, data storage in the cloud is a fresh and popular technology. It lets you store large amounts of data at low cost and retrieve the stored information from the cloud at any time, so it is favoured by users and researchers alike. But a problem faced by many data storage systems is that storing duplicate copies of files serves no purpose, so the main challenge for cloud services is managing an ever-increasing quantity of data while keeping only unique copies. To make data management scalable, a technique called deduplication comes into the picture: it stores only one unique file rather than keeping multiple copies with indistinguishable content. Maintaining data privacy and secrecy is a key concern in any cloud environment, and data confidentiality for users and owners is mainly achieved using convergent encryption as an alternative to previous encryption techniques. A duplicate check is used to find records that are duplicated, and based on its results only unique content is stored. Our experimental results show lower utilization of storage space along with lower network bandwidth consumption.
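The duplicate-check idea above can be sketched as a single-instance store: the client fingerprints the file, the server checks the fingerprint against its index, and only unseen content is physically stored while repeats get a reference pointer. A minimal in-memory sketch (class and method names are invented for illustration):

```python
import hashlib

class DedupStore:
    """Single-instance store: one physical copy per unique content,
    plus a reference count for each logical upload."""
    def __init__(self):
        self.blobs = {}      # content hash -> data (one physical copy)
        self.refcount = {}   # content hash -> number of logical owners

    def duplicate_check(self, fingerprint: str) -> bool:
        return fingerprint in self.blobs

    def upload(self, data: bytes) -> str:
        fp = hashlib.sha256(data).hexdigest()
        if self.duplicate_check(fp):
            self.refcount[fp] += 1     # only a reference pointer is added
        else:
            self.blobs[fp] = data      # first copy is actually stored
            self.refcount[fp] = 1
        return fp

store = DedupStore()
store.upload(b"backup-2020.tar")
store.upload(b"backup-2020.tar")   # duplicate: no new physical copy
store.upload(b"photos.zip")
print(len(store.blobs))            # 2 physical blobs for 3 uploads
```

Because the client can send the fingerprint before the data, the bandwidth saving follows directly: a positive duplicate check means the upload is skipped entirely.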
Asset Analytics, 2019
Data duplication is a data quality problem in which the same record is stored multiple times in the same or different database systems. It can lead to data redundancy, wasted cost, lost income, a negative impact on response rate, ROI, and brand reputation, poor customer service, inefficiency and lack of productivity, decreased user adoption, inaccurate reporting, less informed decisions, and poor business processes. The problem can be countered with data deduplication, often termed intelligent compression or single-instance storage. Data deduplication eradicates duplicate copies of information, reducing storage overheads and improving various performance parameters. Recent studies on data deduplication have shown that considerable data redundancy exists in primary storage in cloud infrastructure, and this redundancy can be reduced in the primary storage system of a cloud architecture using deduplication. The research work carried out highlights the established methods of data deduplication in terms of capacity and performance parameters. The authors propose a performance-oriented data (POD) deduplication scheme that improves the performance of the primary storage system in the cloud. In addition, a security analysis using an encryption technique has been performed and demonstrated to protect sensitive data after the deduplication process completes.
In this paper we study a hybrid cloud approach to secure authorized deduplication. Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and it has been rapidly adopted in clouds to reduce storage space. To protect the privacy of sensitive data while supporting deduplication, convergent encryption has been used to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Unlike traditional deduplication systems, the differential privileges of users are considered in the duplicate check in addition to the data itself. We also present several new deduplication constructions supporting authorized deduplication in a hybrid cloud environment. Security analysis demonstrates that our scheme is secure under the definitions specified in the proposed security model. As a proof of concept, we implement a prototype of our authorized deduplication scheme and conduct testbed experiments with it. We show that the scheme incurs minimal overhead compared to normal operations.
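The privilege-aware duplicate check described above can be sketched by binding the check token to both the content fingerprint and a privilege key, so that users holding different privileges cannot match each other's files even when the content is identical. A minimal sketch (the privilege names and keys are invented for illustration; the paper's actual token construction may differ):

```python
import hashlib
import hmac

# Hypothetical privilege keys, e.g. issued by the private cloud in a
# hybrid deployment (names invented for illustration).
PRIVILEGE_KEYS = {"engineering": b"k-eng", "finance": b"k-fin"}

def duplicate_token(data: bytes, privilege: str) -> str:
    """Duplicate-check token bound to both the content and the user's
    privilege: a match requires the same data AND the same privilege."""
    fingerprint = hashlib.sha256(data).digest()
    return hmac.new(PRIVILEGE_KEYS[privilege], fingerprint,
                    hashlib.sha256).hexdigest()

t1 = duplicate_token(b"design.doc", "engineering")
t2 = duplicate_token(b"design.doc", "engineering")
t3 = duplicate_token(b"design.doc", "finance")
assert t1 == t2   # same content + same privilege -> duplicate detected
assert t1 != t3   # same content, different privilege -> no match
```

Keeping the privilege keys in a private cloud while the public cloud only compares opaque tokens is what makes the hybrid architecture natural for this scheme.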
2015
Cloud computing provides users with many resources as services, such as highly available storage space. Managing the ever-increasing volume of data in the cloud is a critical task, and data deduplication is a technique for making that data management scalable: a data compression technique that eliminates duplicate copies of repeating data in cloud storage to reduce the amount of storage space. To better protect data security during deduplication, convergent encryption can be used: the data is encrypted before being outsourced to the storage server, and access must be authorized. The differential privileges of users can also be considered while supporting deduplication. The data is encrypted with a convergent key derived from the hash of the data itself. A critical issue in making convergent encryption practical is efficiently and reliably managing the huge number of convergent keys derived for different data...
IJARIIT, 2018
Cloud computing is an information technology concept that plays a vital role in data processing and data storage, and it also plays a crucial role in the Internet of Things (IoT). Data stored in the cloud should be secured to prevent unauthorized access, which is where encryption comes in. To maintain users' privacy and the security of the data, it is stored in the cloud in encrypted (ciphertext) form. Since only encrypted data is stored in the cloud, storage usage can be reduced to a great extent, which matters most for very large datasets such as big data. Many deduplication schemes avoid duplicate data, but their main problems are a lack of security and a lack of flexibility in secure data access control; because of these two problems, very few of them are put into practice. In this work, we use attribute-based encryption to deduplicate encrypted data while providing secure data access control.
International Journal of Security and Its Applications, 2016
At present, cloud computing furnishes large volumes of data storage space as well as massive parallel computing at an affordable rate. Because of these advantages it has become widespread, and extreme quantities of data are stored in the cloud; but the rise in data size has raised numerous new obstacles. Deduplication is a significant data compression method for eliminating carbon copies of replicated data, used in the cloud to reduce the volume of storage space and to consume less network bandwidth. Convergent encryption has been proposed to encrypt the data before it is sent out, preserving the confidentiality of sensitive data during deduplication. For better data security, this work makes the initial effort to formally address the problem of authorized data deduplication. Several new deduplication constructions provide an authorized duplicate check in a hybrid cloud architecture while incurring negligible overhead over standard operations.
Nipun Chhabra, 2019
Data deduplication techniques were invented to eradicate duplicate data, so that only single copies of data are stored. Deduplication decreases the disk space required to store backups by tracking and eliminating second copies of data inside the storage unit: only one instance of the data is stored originally, and subsequent instances are given a reference pointer to the original stored data. In a big data storage environment, huge amounts of data need to be secured, so proper management, fraud detection, and analysis of data privacy are important topics to consider. This paper examines and evaluates the prevailing deduplication techniques, which are presented in tabular form. In this survey, it was observed that the confidentiality and safety of data are compromised at many levels in prevalent deduplication techniques. Although much research is being carried out in various areas of cloud computing, work pertaining to this topic is still scant.