Of course the other problem is how much do you bootstrap. I mean, do you start with fire and the wheel? Smelting copper? Electricity? Quantum mechanics? I would think the "ideal" archival material would have many layers, starting with human-readable, each including instructions to read the next layer.
Example....
First layer: information important to "primitive" people, plus instructions to grind glass to make a magnifier.
Second layer: too small to read clearly without a magnifier. Similar information, but with instructions for making a microscope.
Third layer: too small to read without a microscope. More information and instructions...
Not sure where we could find this magical material though.
> Or the glass crystals in Zardoz.

Interesting...I didn't think that many people had seen that film.
> If you made an optical cube and a laser-scanner that can read dimensionally, this scales HUGE!
> So, The Expanse data cube could be reality.
> (I can't recall the movie/show that had data cubes other than perhaps Star Trek, and those I recall were chips/squares.)

They had both.
> No such film exists. Puts fingers in ears and goes la la la la.

It was not...the best of films...nor the worst of films. Conceptually interesting, but some bizarre "huh?" scenes for sure.
> From a purely cost point of view - the laser to write the data and the gear needed to read it back are expensive.
> Combined with the slow speed, something like this needs dramatic cost reductions in very complicated and niche gear to be even remotely viable for anything but the most important and least dense data.
> EDIT: I'd love to discuss this with the folks who downvoted me.

Is that you, John "There is no evidence anyone wants (the mouse) as an input device" Dvorak?
> CRC Error at sector: 836864268543
> Retry Abort
> Edit: My assumption is the downvotes are from people too young to understand the reference. If you know, you know.

One possible reason for a downvote is that optical storage since CDs has included various error correction schemes to account for a certain level of scratches. Granted, if the data is too damaged, good luck.
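That error-correction point can be sketched with a toy example. CDs actually use cross-interleaved Reed-Solomon coding (CIRC), but the core idea, redundant bits that let the reader correct a damaged spot, already shows up in a minimal Hamming(7,4) code. This is a hypothetical illustration, not the actual CD scheme:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword with 3 parity bits."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    """Correct any single flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit, 0 if clean
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1          # repair the "scratch"
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[4] ^= 1                          # simulate a scratch: flip one bit
assert hamming74_decode(word) == data
```

Flip any single bit of the 7-bit word and the decoder still recovers the original data; a real scratch damages whole runs of bits, which is why CIRC adds interleaving and much stronger codes on top.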
> One possible reason for a downvote is that optical storage since CDs has included various error correction schemes to account for a certain level of scratches. Granted, if the data is too damaged, good luck.

I recall that in Monkey Island this joke was patched out purely to save the tech support hotline the headache. It's a shame they didn't patch it back into the remake, long after anyone would have actually believed it.

Also, the worst is when disk 21 of your 25-disk install is bad.
> The medium may last forever... if nobody cracks it.
> And in practice, 20 years later you'll probably struggle to find drives to read it.
> If you want to preserve data, keep copying it.

Yes, I think a lot of folks miss the bit about being able to access the data after it's written. The "drive" needed to read the glass medium requires lots of expertise to set up and maintain, not to mention is prohibitively expensive.
And in Voyager, bio-neural gel packs... and in at least one episode, those gels got sick.
> Okay, true, there are things generating petabytes of data. But no human being does, and neither do the vast majority of corporations and other entities.
> Most people probably generate at most a few hundred GB of truly original data in their lifetimes unless they take a lot of video.

That was my thought too. Yes, we can generate petabytes of data easily. But I imagine that data we intend to archive for 10,000 years will be curated to some degree - not just raw data dumps.
> Time to come clean, John... was that pun intentional or accidental?

Completely accidental.
> That was my thought too. Yes, we can generate petabytes of data easily. But I imagine that data we intend to archive for 10,000 years will be curated to some degree - not just raw data dumps.

You're no doubt right, but as others have mentioned, some of the most valuable information historians have found has been stuff no one would have thought to archive at the time. The Roman curses at Bath, for example (chosen only because I've been there). I doubt anyone doing serious data archival at the time would have committed them to laser-etched glass. But "Docimedis has lost two gloves and asks that the thief responsible should lose their minds and eyes in the goddess' temple." gives a lot of insight into what "normal life" was like there and then.
> Interesting...I didn't think that many people had seen that film.

I hadn't thought about Zardoz for years, but I can still conjure up faint memories of the flying head.
> Why are we developing a system that will store data for millennia? The way we're going, we won't be around for more than 20 more years.

Right?! Why even bother getting out of bed!
> LTO is a terrible medium for long-term archival.
> The tapes may be rated for "30 years," but in ~10 years they won't be manufacturing drives capable of reading current tapes.
> E.g., LTO-10 drives can't even read LTO-9 tapes.

Wow, something must have changed - I was always under the impression that LTO drives could write and read generation N-1 tapes, and read N-2.
> In 10,000 years we will either be far past the singularity and have no need for something like that, as we could recall all data in all of human history, or be using wooden clubs to fight off others and break the demon machines and everything associated with them.

How exactly do you believe "the singularity" can recover data that was never recorded? Some math isn't reversible. 2+2=4. Now start with the end of that equation: 4. What produced it? Is there any way to tell? I know there's a postulate that if we simply measured EVERY single particle and its momentum, we could "reverse" all the math of solved physics and figure out literally everything that's ever occurred. That... makes a lot of assumptions. No computer, no matter how advanced, can measure every particle in the universe at once: simultaneity breaks down at the universal scale, the uncertainty principle forbids such a measurement in the first place, and true randomness at the quantum level makes some things literally irreversible even in principle. But there's also no way to say that a singularity could just "solve" physics completely. No intellect, no matter how advanced, can do so without observations.
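The "what produced the 4?" point can be made concrete: a many-to-one operation discards information, so the output alone does not determine the inputs, no matter how much compute you throw at inverting it. A tiny illustrative sketch:

```python
# Addition over non-negative integers is many-to-one: the output 4
# does not determine the inputs, so "running it backwards" is ambiguous.
def preimages_of_sum(total):
    """All non-negative integer pairs (a, b) with a + b == total."""
    return [(a, total - a) for a in range(total + 1)]

candidates = preimages_of_sum(4)
print(candidates)  # five equally valid histories, and no way to pick one
assert len(candidates) == 5
```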
> Dr. Peter Gainsford (aka kiwihellenist) wrote at length about it here, including explaining the source of the myth of its destruction setting back civilization: Carl Sagan on a segment of Cosmos in 1980.

What a fascinating read that is! The following particularly jumped out at me WRT the specific topic at hand (emphasis mine):

> The survival of ancient books isn't something that depends on one repository: that would put them at the mercy of regime changes, shifts in governmental priorities, funding. Books survive if they were copied, repeatedly. The story of ancient books being lost isn't a story of library fires: it's a story of economics, long-term cultural developments, and above all, format shifts.
> I am working with tape archival at EB range.
> If your tape software is serious about data preservation, the tape drive will read and verify each written tape block of your data as you are writing it (the Logical Block Protection feature); this has been around for a while on TS11XX and LTO drives (even in the last generations of 10K drives).
> Therefore written data is always verified by design at zero performance cost, contrary to disk.
> Migrating data every 5 years is a bit complicated at EB scale, as the cost of media is several orders of magnitude higher than the infrastructure cost: today 10 years may be a more reasonable target, but this is likely to increase further, as media amortization costs and exponentially growing amounts of data require keeping data (and drives and libraries) longer and longer, until we reach the 30-year data-duration wall.
> Then I guess this problem is for future generations of tape archive service managers: not for me to solve, just for me to warn.

We're in the multi-hundred PB range. We're to the point of doing erasure coding across tapes to ensure we can actually read every bit on the 5-10 year cadence to move to new media.
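The "erasure coding across tapes" idea can be sketched with the simplest possible scheme: a single XOR parity volume, RAID-4 style. Real archival systems use stronger Reed-Solomon-style codes that survive multiple lost tapes; this is only a minimal illustration under that caveat:

```python
# Minimal sketch: one XOR parity "tape" lets you rebuild any single
# lost data tape from the survivors. Real archives use stronger codes.
def xor_blocks(blocks):
    """XOR equal-length byte strings together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

tapes = [b"tape-one", b"tape-two", b"tape-3.."]   # equal-length data tapes
parity = xor_blocks(tapes)                        # written to a fourth tape

lost = 1                                          # tape 1 is unreadable
survivors = [t for i, t in enumerate(tapes) if i != lost]
rebuilt = xor_blocks(survivors + [parity])        # XOR survivors with parity
assert rebuilt == tapes[lost]
```

The trade-off is the usual one: each extra parity tape buys tolerance for one more unreadable medium, at the cost of extra capacity and write time.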
> I didn't downvote you, but nobody is suggesting this isn't a niche tech. Having something for the most important data (and moderate, not low, density) would be great. So you likely got downvotes by setting it up, intentionally or otherwise, as a strawman.

I don't disagree - I want this to succeed.

I'm just being realistic on the timelines for it being viable. When your media reader/writer costs orders of magnitude more than the existing technology and runs orders of magnitude slower...that's a long ramp. There aren't a lot of external pressures to bring those technologies dramatically forward at this point.
> Interesting...I didn't think that many people had seen that film.

I still have nightmares, the red outfit... The horror, the horror.
> LTO-10 drives aren't exactly cheap. If this hit the market at $50k for the drive and glass slabs at $50 each, there would be a market. Even at $100k and $200, there would be a small market.

The glass is indeed cheap, from estimates I've seen (from other companies too).
> The Library of Alexandria was irreplaceable because replicating the entire collection would have required paying thousands of scribes, who were among the best-educated people in classical times and consequently extremely valuable to society, to spend decades doing nothing but copying old books. The cost in human resources would have been too much for even the Roman emperors to bear.

As I commented on the earlier page, the loss of the Library of Alexandria was not catastrophic. As best we can tell, Carl Sagan made up the effects of its destruction out of whole cloth, or perhaps passed on something he heard but never verified. (Learning this was disappointing to me, as Sagan was critical to public science education.) Professional historians do not believe much, if anything, was permanently lost, because there were hundreds of other libraries scattered around the Mediterranean, and copying manuscripts was common, if expensive.
> Yes, I think a lot of folks miss the bit about being able to access the data after it's written. The "drive" needed to read the glass medium requires lots of expertise to set up and maintain, not to mention is prohibitively expensive.

Ah yes, the nightmare that is 4K Blu-ray has really driven that point home. There are processors right now that are incapable of playing back 4K movies, even with all the right software set up, purely because of Sony's insistence on such a ridiculously layered DRM scheme that they have effectively killed their own product. As it stands, if I want to play 4K movies, I use a console. If I want to rip 4K, I need to hack custom firmware onto my drive, presuming someone's developed custom firmware for my specific model. With console makers eager to send optical drives to the dustbin of history, and streaming so easy to use and so popular, 4K discs were already a niche market. But Sony's insistence on processor-level DRM, which Intel itself has decided to simply stop supporting, plus locking down drives so people can't even get at their OWN movies without roundabout means on top of roundabout means... THAT has flat out killed the product. It's not truly dead until companies simply stop producing 4K releases, but there's no way to sponge the writing off that stone at this point.

Same thing with optical media for home use. The medium isn't prohibitively expensive, and the disc might last your entire lifetime, but will the optical drive? Not to mention, will a computer still exist with the right interface for said optical drive?
> I recall that in Monkey Island this joke was patched out purely to save the tech support hotline the headache. It's a shame they didn't patch it back into the remake, long after anyone would have actually believed it.

Lol, I played that game and don't recall the joke. Oh well. Pretty sure that was before you could easily get game patches. Actually, how'd you even find out about that patch story? Like on FidoNet or IRC or one of the gaming magazines we used to have?
> All of this is great, but has storing the data ever actually been the problem? I'm decidedly ignorant here, so I'm asking a genuine question. My read on this was always that it was the data itself that became obsolete, as file formats aged out, software changed, etc. If you don't have software that can read the data, what good is the data?

The mechanism by which the aliens made contact in the novel Contact is the key.
> Lol, I played that game and don't recall the joke. Oh well. Pretty sure that was before you could easily get game patches. Actually, how'd you even find out about that patch story? Like on FidoNet or IRC or one of the gaming magazines we used to have?
> edit: I just googled "CRC Error at sector: 836864268543" and got no Monkey Island results. Are you sure you're remembering this correctly?

I was talking about the "floppy not found or corrupted" error rather than the CRC one, sorry! For more details, check this out!
Or the glass crystals in Zardoz.
> So, in terms of storage density units, how many cat videos are we talking about?

About 5 trillion Olympic swimming pools' worth.