Affiliations: M.Tech (CSE), Priyadarshini Institute of Technology & Science for Women; Associate Professor (Dept. of CSE), Priyadarshini Institute of Technology & Science for Women
Data deduplication is a technique for reducing storage requirements by eliminating redundant data: only one unique instance of the data is actually retained on the storage media. Data deduplication is also called "intelligent compression" or "single-instance storage". It has been widely used in cloud storage to reduce storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed. To better protect data security, we make an attempt to formally address the problem of authorized data deduplication. Different from traditional deduplication systems, the differential privileges of users are further considered in the duplicate check, besides the data itself. We present a new deduplication construction supporting authorized duplicate check in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure in terms of the definitions specified in the proposed security model. We show that our proposed authorized deduplication scheme incurs minimal overhead compared to normal operations.
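The two ideas in the abstract can be sketched in a few lines: convergent encryption derives the key from the data itself, so identical plaintexts encrypt to identical ciphertexts and can be deduplicated, while an authorized duplicate check binds the server-side lookup tag to a privilege key. This is a minimal illustrative sketch, not the paper's implementation; the function names, the toy SHA-256 keystream cipher, and the HMAC-based token are assumptions (a real system would use a deterministic AES mode and the paper's token protocol).

```python
import hashlib
import hmac


def convergent_key(data: bytes) -> bytes:
    """Convergent key K = H(data): identical plaintexts yield identical keys."""
    return hashlib.sha256(data).digest()


def encrypt(key: bytes, data: bytes) -> bytes:
    """Toy deterministic cipher: XOR with a SHA-256 counter keystream.

    Illustration only -- a production system would use AES in a
    deterministic mode. XOR-ing twice with the same key decrypts.
    """
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, stream))


def duplicate_token(data: bytes, privilege_key: bytes) -> bytes:
    """Privilege-bound duplicate-check token: only a user holding the
    privilege key can produce a token that matches on the server."""
    tag = hashlib.sha256(data).digest()
    return hmac.new(privilege_key, tag, hashlib.sha256).digest()


# Server-side store keyed by token: the ciphertext is kept only once.
store: dict = {}


def upload(data: bytes, privilege_key: bytes) -> bool:
    """Return True if the data was actually stored (first copy);
    False means a duplicate was detected and only a pointer is needed."""
    token = duplicate_token(data, privilege_key)
    if token in store:
        return False
    store[token] = encrypt(convergent_key(data), data)
    return True


priv = b"privilege-level-1"          # hypothetical privilege key
print(upload(b"quarterly report contents", priv))   # first copy: stored
print(upload(b"quarterly report contents", priv))   # duplicate: not stored
print(len(store))                                   # one ciphertext kept
```

Note that a user under a different privilege key produces a different token for the same data, so the duplicate check succeeds only for authorized users, which is the differential-privilege property the abstract describes.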
K. Pushpalatha, B. Ranjithkumar. "Secure Approved Deduplication in Hybrid Cloud". International Journal of Computer Engineering in Research Trends (IJCERT), ISSN: 2349-7084, Vol. 2, Issue 10, pp. 853-856, October 2015. URL: https://ijcert.org/ems/ijcert_papers/V2I1011.pdf