‘Digital Dark Age’ Imminent, Warns ‘Father of the Internet’
A “Digital Dark Age” could be looming, warned Vint Cerf, vice president and chief Internet evangelist at Google. His warning came this week at the annual meeting of the American Association for the Advancement of Science in San Jose.
Cerf, a former program manager at the United States’ Defense Advanced Research Projects Agency (DARPA) who funded the groups that developed TCP/IP and has been recognized as a “Father of the Internet,” warned that as technology advances, there is a risk of losing access to data stored on older technology.
“Old formats that contain documents, photos and other data may not be readable with the latest version of the software,” Cerf said, adding that backwards compatibility is not guaranteed, and over time there could be vast archives of digital content that simply can’t be accessed.
“Digital memory is a double edged sword,” said Susan Schreiner, analyst for C4 Trends.
“It has made it easier for people to share and store data, but it is absolutely a problem if you don’t have the hardware to access it,” Schreiner told TechNewsWorld.
New Take on an Old Concern
While Cerf suggested that this is a growing issue, notably as consumers adopt more and more devices running ever-updated operating systems, the concern over data being left behind is not entirely new.
“An interesting thing about Cerf’s comments is that they spotlight an issue that’s been a subject of concern for at least two decades, usually due to the inaccessibility of some important document or another,” said Charles King, principal analyst at Pund-IT.
“Those issues typically spark individual recovery efforts but nothing like a concerted, industry-wide strategy or project to address the problem,” King told TechNewsWorld. “Around 2005, a number of folks began proposing that critical historical data and archives be posted online, and that’s generally where things have headed since then.
“But that doesn’t address the central point — that the IT marketplace has locked consumers and businesses into an ‘eternal upgrade cycle’ where new technologies are launched, mature, fail to keep up with new developments and are eventually superseded by next generation technologies,” King added.
The Digital Lost World
The problem remains that as systems are updated old programs don’t run, but this is a problem that has existed for years.
“My first computer was an Atari,” added Schreiner.
“I have a box of old Atari disks with articles on it, but no way to access it,” she noted. “You really have to keep the old hardware around and if you don’t have it there is no way to access the old data.”
Even when the old hardware is available, there is no guarantee that the data on it can be moved to more modern systems.
“There is no elegant solution, no sophisticated solution when you’re talking about the older digital files,” Schreiner explained. “There is a shortness associated with digital memory that is catching people off-guard.
“People who only have done digital photography find out the hard way; one computer crash and it is all gone,” she stressed.
“The changing formats of data are frustrating for many, just ask anyone who went from albums, to CDs, to digital formats,” added Jim McGregor, principal analyst at Tirias Research.
“Unfortunately, digital formats continue to change,” McGregor told TechNewsWorld. “However, as more information moves to the cloud, converting these formats will likely become easier because it can be done by the servers and as a service to the user.”
Concerns About the Cloud
The issue is whether the cloud itself is the perfect solution.
“You can lose access for periods of time, you could lose data if it is not properly backed up or a disaster occurs, and it is more prone to security intrusion attacks,” said McGregor.
The security concerns could also outweigh those of lost data.
“It strikes me as ironic that on the one hand people worry about security, privacy and having too much of their personal information out there on the Web, and on the other hand there’s this gradual decay of available information,” explained Dr. Joost van Dreunen, cofounder of Super Data Research.
X-Rays Mark the Spot
Cerf suggested that an X-ray snapshot of the content, its application and, most importantly, the operating system could be one way to ensure that future generations are able to reproduce the technology needed to retrieve the data.
There may be two problems with this solution, however. One is that the X-ray itself would have to be preserved — and likely not digitally. The other is that it might require a corporate entity to maintain the X-ray, which brings the entire problem full circle.
“If your vendor of choice goes broke, you’re usually completely out of luck, but even successful vendors can be less than helpful when it comes to supporting out-of-date devices and data,” noted King.
“Frankly, I don’t see a way out of this quandary absent some sort of voluntary agreement or industry regulations,” King added. “In the current business climate and absent serious vocal demands by IT customers, neither of those outcomes seems likely.”
Future Data Mining
The data mining of tomorrow might not be a hunt for personal information or usage patterns, but rather could literally be a form of digital recovery of lost data.
“In 50 years from now there will no doubt be Internet archeologists data mining old information and using archaic software applications to uncover its stories,” van Dreunen told TechNewsWorld. “One day the Internet may be largely useless, inaccessible information.
“What will prove to be the most important skill above all, I think, is figuring out how to navigate this data wasteland,” van Dreunen added.
For now, the best option might be to do what data storage experts suggest: back up the data and ensure it isn’t simply left behind as platforms evolve.
“The old adage is the best; use multiple sources to backup your data and do it frequently,” said McGregor. “Cloud resources and services will be a key resource for all consumers going forward, but it is still best to have local storage as well if the information is deemed critical.
“On another note, I wouldn’t paint such a dire picture,” McGregor suggested. “Things change over time and network enhancements, cloud services and connectivity continue to improve the capabilities and reliability of technology.”