Welcome to the Digital Dark Ages: an era of unprecedented information gathering likely to leave no lasting impression on the future, thanks in large part to a cross-departmental lack of understanding of the business requirements for data archiving.

Or at least that's the tenor of a recent study conducted by SNIA's 100 Year Archive Task Force, undertaken to shed light on the long-term fate of digital information as dictated by today's datacenter migration and archiving policies.

But whereas the late 5th and 6th centuries earned the "dark" moniker due to a dearth of texts from which to reconstruct historical events and philosophical leanings, today's informational blind spot may in fact be developing due to a lack of attention paid to establishing the reconstruction mechanism itself.

"Long term, from an IT perspective, seems to come down to the 15-year period, during which three to four infrastructure refreshes will have occurred," says Jeff Porter, chairman of the SNIA Data Management Forum. "IT will have the ability to keep the bytes and bring them along, but not necessarily the ability to maintain the applications, and it's at that point that the risk of not being able to interpret the data begins to creep in."

Because of this, 60 percent of the survey's 276 respondents said they are "highly dissatisfied" with the likelihood of being able to read their retained information 50 years hence. And although 50 years in computing time may seem an eternity, according to the survey of IT, security, legal, inventory management, and archive professionals, 80 percent of organizations retain at least some portion of their data stores for more than 50 years, and 68 percent said they have data they must keep for more than a century.

"We were quite surprised by those numbers," Porter says. The irony, of course, is that the very accumulation of data to be archived is among the chief factors pushing long-term archiving concerns even further down the IT priority stack.
There are simply too many other business-critical endeavors not getting done to put resources into ensuring the readability of data a quarter century or more down the line.

"Archiving has not been viewed as a valuable business service to date," Porter notes. "But regulatory compliance requirements and the risk of fines really have businesses looking at these issues today, and they're discovering that archive has value and can help reduce their cost and limit their risk to legal exposures."

Chief among the SNIA task force's post-survey directives is getting application providers and storage system vendors on the same page when it comes to providing organizations with a means of reproducing original content unaltered over time. "You have to get the application vendors involved. You have to move and store this data in a format that can be logically reproduced in the future, and we don't even know what the data formats will be 30 to 40 years from now," Porter says.

To achieve this, SNIA is betting on XAM (eXtensible Access Method), which it believes will provide much-needed metadata communication between applications and storage systems, thereby easing the ability to move data around in heterogeneous storage environments without application awareness.

Whether XAM can truly lead to an archiving Rosetta stone that will open today's troves to future generations remains to be seen. After all, XAM itself has yet to be seen: XAM-based technology is slated for its inaugural demonstration at Storage Networking World in Dallas in October. That said, any enterprise that has attempted to reclaim knowledge locked away in legacy data formats knows that the case for some sort of comprehensive solution to long-term data reconstruction issues is compelling.
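The core idea behind a metadata layer like XAM (keeping a description of the data alongside the data itself, independent of any one application) can be sketched in a few lines. The sketch below is purely illustrative: the function and field names are hypothetical and are not the actual XAM API, which the source article does not detail. It shows an archive object that records the content's format, provenance, and a fixity hash, so that a future system could at least know what the bytes are and verify they are intact without the application that produced them.

```python
import hashlib
import time


def make_archive_object(content: bytes, mime_type: str, source_app: str) -> dict:
    """Bundle raw content with self-describing metadata.

    Hypothetical illustration of the self-describing-archive idea;
    these field names are not drawn from the XAM specification.
    """
    return {
        "metadata": {
            "mime_type": mime_type,            # how to interpret the bytes
            "source_application": source_app,  # what produced them
            "archived_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "sha256": hashlib.sha256(content).hexdigest(),  # fixity check
        },
        # Payload stored next to its description, not in a separate silo.
        "content": content.hex(),
    }


record = make_archive_object(b"quarterly report", "text/plain", "ReportWriter 2.1")
print(record["metadata"]["mime_type"])
```

The design point is simply that interpretation hints travel with the data: a reader decades later can check the hash, read the declared format, and decode the payload without consulting the long-gone source application.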