10 years of news and resources for members of the IEEE Signal Processing Society
A paper titled "A Comprehensive Study of the Past, Present, and Future of Data Deduplication," published in the Proceedings of the IEEE, Sep. 2016, provides a comprehensive survey of the state of the art in data deduplication technologies for storage systems, covering key technologies, main applications, open problems, and future research directions.
"The amount of digital data has been growing at an explosive rate in recent years, and the trend is expected to continue into the future. This "data deluge" has created new challenges in the area of mass storage systems. One of the most important challenges is effectively managing the storage costs generated by the high volume of data. A number of storage technologies, such as thin provisioning, automated tiering, and data reduction, offer interesting solutions for efficiently managing data and thus reducing storage cost. Some solutions also consider the use of solid-state drives and energy efficiency. Notably, workload analyses conducted by a number of corporations show that much of the data in primary and secondary storage systems is redundant, meaning it could be eliminated without any loss of information. As a result, data deduplication technologies have become an important research topic and have strongly influenced the development of commercial storage systems, increasing storage efficiency and reducing storage costs.
The general methodology of data deduplication is an extension of Lempel–Ziv data (file) compression, which employs dictionary-based byte-level similarity search. In data deduplication, storage capacity overhead is reduced by identifying "chunks" of data with the same content within huge data sets and storing them only once. It eliminates redundant data at the file or subfile level and identifies duplicate content by its cryptographically secure hash signature (i.e., collision-resistant fingerprint), which is shown to be much more computationally efficient than traditional compression approaches in large-scale storage systems. This paper provides a comprehensive survey of the state of the art in data deduplication technologies for storage systems, covering key technologies in depth and discussing the main applications and industry trends of data deduplication. It concludes with a discussion of the open problems and future research directions facing deduplication-based storage systems."
From: Wen Xia, Hong Jiang, Dan Feng, et al., "A Comprehensive Study of the Past, Present, and Future of Data Deduplication," Proceedings of the IEEE, Sep. 2016, pp. 1681-1710.
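The chunk-and-fingerprint idea described above can be illustrated with a minimal sketch: split each stored object into chunks, fingerprint each chunk with a collision-resistant hash, and keep only one physical copy per fingerprint. The sketch below is a simplified illustration, not the paper's implementation; it uses fixed-size chunking and SHA-256, whereas production deduplication systems typically use content-defined chunking, and the class and names here are hypothetical.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking for simplicity; real systems often use content-defined chunking


class DedupStore:
    """Toy deduplicating store: each unique chunk is stored once, keyed by its
    SHA-256 fingerprint; a file is recorded as an ordered list of fingerprints."""

    def __init__(self):
        self.chunks = {}  # fingerprint (hex str) -> chunk bytes
        self.files = {}   # file name -> list of fingerprints ("recipe")

    def put(self, name, data):
        """Split data into chunks, fingerprint each, and store new chunks only."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)  # duplicate chunks are not stored again
            recipe.append(fp)
        self.files[name] = recipe

    def get(self, name):
        """Reassemble a file from its recipe of fingerprints."""
        return b"".join(self.chunks[fp] for fp in self.files[name])

    def stored_bytes(self):
        """Physical bytes actually kept, after deduplication."""
        return sum(len(c) for c in self.chunks.values())
```

Storing two files that share content shows the space saving: if both files contain the same 4 KiB chunk, it occupies physical storage once, while each file's recipe still references it, so `get` reconstructs every file losslessly.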
Announcements:
Call for Nominations: Chair, Women in Signal Processing Committee and Chair, Young Professionals Committee (10 July 2020)
ALASKA 2 Steganalysis Challenge is Open (13 July 2020)
Nominations Open for 2020 SPS Awards (1 September 2020)
Call for Nominations: SPS Chapter of the Year Award (15 October 2020)
© Copyright 2020 IEEE – All rights reserved. Use of this website signifies your agreement to the IEEE Terms and Conditions.
A not-for-profit organization, IEEE is the world's largest technical professional organization dedicated to advancing technology for the benefit of humanity.