So stagger your archive - primers that are naked eye, then increasing density as your reader progresses through the information. By the time they get to chip fab, you've got them reading media that has orders of magnitude better density than naked eye.

I don't know if that's a solvable problem. The only data mostly guaranteed ("mostly" because sometimes evolution be weird, yo) to be accessible on millennial time frames (assuming the medium survives, of course) is something that is naked-eye discernible.
That puts a real hard cap on how dense the storage can be. Very quick googling and a back-of-napkin calculation tell me that it's on the order of 10^12 pages of text to store a petabyte.
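The napkin math above can be reproduced in a couple of lines. The bytes-per-page figure is an assumption (roughly 1 KB of plain text, i.e. about 500 words, per printed page), not something from the post:

```python
# Back-of-napkin check: how many naked-eye-readable pages to hold a petabyte?
# Assumption: ~1 KB of plain text per printed page (~500 words).
BYTES_PER_PB = 10**15
BYTES_PER_PAGE = 1_000  # assumed

pages = BYTES_PER_PB // BYTES_PER_PAGE
print(f"{pages:.0e} pages per petabyte")  # prints "1e+12 pages per petabyte"
```

Vary the page-size assumption by a factor of a few and the order of magnitude barely moves.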
So you need to make some important compromises balancing quantity against accessibility, and you need to include instructions, maybe starting at "here's how to make electricity." By the time you get to "build a chip fab," of course, you're wondering what you're even doing, because if they don't have chip fabs this may be a significantly multigenerational effort.
So maybe you try to fall back to something that "only" needs optical microscopy, which is (by comparison to "build a chip fab") trivial. But while you can store maybe four orders of magnitude more information, that just reduces the page count per PB from 10^12 to 10^8. That's a big deal, obviously...but we're measuring global data in the range of 10^1 zettabytes. Even assuming only 1% of that is unique, you're still adding four zeroes back on.
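Carrying the same napkin one step further, a sketch of where "four zeroes back on" comes from. The microscopy gain, the ~10 ZB global-data figure, and the 1% uniqueness fraction are all the post's own assumptions, just made explicit:

```python
# Extending the estimate: optical microscopy buys ~4 orders of magnitude in
# density, but the global data volume puts the zeroes right back.
import math

pages_naked_eye = 10**12            # pages per PB, from the earlier estimate
microscopy_gain = 10**4             # assumed density improvement
pages_microscopy = pages_naked_eye // microscopy_gain   # 10^8 pages per PB

global_data_zb = 10                 # ~10^1 ZB, per the post
unique_fraction = 0.01              # assume only 1% is worth keeping
pb_to_store = global_data_zb * 10**6 * unique_fraction  # 1 ZB = 10^6 PB

total_pages = pb_to_store * pages_microscopy
print(f"~10^{round(math.log10(total_pages))} microscope-readable pages")
```

That lands on the order of 10^13 slides, which is indeed a lot of slides.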
That's a lot of slides.
Something has to give: either accessibility or volume.
(Also you need a thousand-language Rosetta Stone and the assumption that there hasn't been such a stark discontinuity in human history that no workable vestige of a present-day language has survived).
Time to come clean, John... was that pun intentional or accidental?

Writing remains a bottleneck in the system....
Of course the other problem is how much do you bootstrap? I mean, do you start with fire and the wheel? Smelting copper? Electricity? Quantum mechanics? I would think the ideal archival material would have many layers, starting with human-readable, where each layer also includes instructions to read the next.

It would be an interesting mental exercise thinking of a way to ensure readability that far forward. You'd need some kind of instructions to replicate the technology without using the technology itself. The instructions would have to be as durable as the archive. Maybe laser-etched metal using plain text?
they just need to perform some surgery to get the reader working over USB and then it'll be around forever. Right?
And... you can get them building the things required to build a chip fab earlier, so that by the time they get to "build a chip fab" they are guaranteed to know what you're talking about.
Sure; you can get a few orders of magnitude taken care of with microscopy, and you can certainly write down the path to that tech in a naked-eye-discernible way. That's pretty straightforward.
Even today, we've come to realize the internet isn't forever. There are a number of websites that are simply gone, and while archive.org has done much to help, it doesn't cover everything. For example, at one point people could easily submit requests to take down certain pages for good from their archive. They're a little less likely to respond to takedown requests from, say, politicians and their embarrassing remarks these days (which is for the good). In any case, that still leaves a lot of content simply missing that didn't get archived for one reason or another, and ultimately, one year or another, their funding drive will fail, putting the archive at risk. I have high hopes that in that event some benefactors will come together to make sure there's a good storage place for their archives, but then THAT needs to be maintained, somehow.

That is probably one of the worst examples.
Observations OVER TIME are incredibly valuable. No matter how great your telescope is, watching a stellar event happen today can't show you what it looked like 10,000 years ago. (In our frame of reference, obviously; the events themselves occurred potentially millions or billions of years ago.)
For anything where the change in observed data is useful, older records are invaluable. Things like sea levels: knowing the sea level today is a lot less useful than knowing the sea level over time, ideally as far back as you can go. If sea level measurements from 10,000 years ago had survived, scientists would love that.
Not scientific data, but archiving news stories (I would say newspapers, but these days a lot of content is online) on a medium that lasts 10,000 years would be worth its weight in gold to future historians.
I will concede the uses for this are extremely niche. Likely nobody is backing up their Steam library onto one of these glass slabs. However, for niches where true digitally perfect archives lasting even a century are desired, this is useful tech if they can get it to a finished product, which is a big if.
Maybe a more frugal choice of what we want to store, for a "people" that will come after us by as much as we come after the Mesopotamians, is the better perspective?
I don't want to diss the Square Kilometer Array or how great such projects are.
I am a scientist myself and we use big open access data projects, too.
BUT, I am not 100% sure the first priority for data we can store for 10,000 years is from these massive projects.
Such projects live from the data being EASILY accessible, so you need an electronic access system and active storage.
The case of "advanced so far we don't remember how we got here," but from more or less the reverse point of view, was in John Campbell's 1937 SF story Forgetfulness, and the concept of "have a vague idea of the needed tool to fix something, but no idea how to build the tool to build the tool to build the tool…" was the basis of Raymond Jones' 1950 SF story Tools of the Trade.

I'm reminded of a short story where the "gods" of Earth arrive, announcing they seeded Earth with life eons ago with their impossibly advanced technology as a sort of retirement fund... and they're here to retire, and just need us to take them in and care for them in their advanced age. They've only got a few centuries left, tops. The whole thing was very tongue in cheek. They promised to provide their technology and how to use it to us mere humans. The problem is... they literally only knew the principles behind making their "current" tech, and precisely nothing about all the bootstrap tech you need to develop on the way TO it, so their explanations and diagrams were still entirely beyond human understanding. There was also a new law that had to be created. A lot of people became resentful of being assigned as a "god care" household, and so "god abuse" became an issue.
That may be overselling it just a bit. The Square Kilometer Array telescope, for example, is expected to need to archive 700 petabytes of data each year. That would mean over 140,000 glass slabs would be needed to store the data from this one telescope. Even assuming that the write speed could be boosted by adding significantly more lasers, you’d need over 600 Silica machines operating in parallel to keep up. And the Square Kilometer Array is far from the only project generating enormous amounts of data.
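The figures in that paragraph can be sanity-checked with a little arithmetic. The implied per-slab capacity and per-machine write rate below are derived purely from the post's own numbers (700 PB/year, 140,000 slabs, 600 machines); the seconds-per-year constant is the only thing added:

```python
# Back-of-envelope sizing for the SKA example, using the post's figures.
PB = 10**15
SECONDS_PER_YEAR = 365 * 24 * 3600          # ~3.15e7 s

yearly_bytes = 700 * PB
slab_count = 140_000
slab_capacity_tb = yearly_bytes / slab_count / 10**12   # implied TB per slab

required_rate = yearly_bytes / SECONDS_PER_YEAR          # sustained bytes/s
machines = 600
per_machine_mb_s = required_rate / machines / 10**6
print(f"{slab_capacity_tb:.0f} TB/slab, ~{per_machine_mb_s:.0f} MB/s per machine")
```

So the 140,000-slab figure implies roughly 5 TB per slab, and 600 machines writing in parallel would each need to sustain on the order of tens of MB/s, around the clock, just for this one telescope.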
The Library of Alexandria was irreplaceable because replicating the entire collection would have required paying thousands of scribes—who were among the best-educated people in classical times, and consequently extremely valuable to society—to spend decades to do nothing but copy old books. The cost in human resources would have been too much for even the Roman emperors to bear.

Storage that will last 10,000 years, eh?
Say hello to my little friend....
The first thing that popped into my mind when archive + glass + 10,000y are put together is Library of Alexandria.
In 10,000 years we will either be far past the singularity and have no need for something like that, as we could recall all data in all of human history, or be using wooden clubs to fight off others and break the demon machines and everything associated with them.

A few, yes.
but be realistic. in 10,000 years, this kind of data would tell them more about our technology than anything else. and they won't need TB upon TB of sample data to learn what they need to learn about that.
if we had hundreds of thousands of pages of Mayan astronomical data, their individual scientific value would be minimal. people would be framing them and hanging them on their wall as decoration.
Right, but if you are on LTO-9 and upgrade to LTO-10 it doesn't magically mean you have no LTO-9 drives left or that the older LTO drives you have immediately stop functioning.

LTO is a terrible medium for long term archival.
The tapes may be rated for "30 years" but in ~10 years they won't be manufacturing drives capable of reading current tapes.
E.g. LTO-10 drives can't even read LTO-9 tapes.
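A small sketch of the compatibility cliff being described. The rule encoded here is how the LTO roadmap is generally reported, not an official specification: through LTO-7, drives could read two generations back; LTO-8 and LTO-9 read only one generation back; the announced LTO-10 drives reportedly read only their own generation:

```python
# Rough model of LTO read compatibility as commonly reported (an assumption,
# not an official spec): gens 1-7 read two generations back, gens 8-9 read
# one back, and gen 10 reads only its own tapes.
def readable_generations(drive_gen: int) -> list[int]:
    if drive_gen <= 7:
        back = 2
    elif drive_gen <= 9:
        back = 1
    else:
        back = 0
    return list(range(max(1, drive_gen - back), drive_gen + 1))

print(readable_generations(7))   # an LTO-7 drive still reads LTO-5 tapes
print(readable_generations(10))  # an LTO-10 drive cannot read LTO-9 tapes
```

Which is exactly why the migration discipline described below matters: the read window for any given tape generation keeps shrinking.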
I don't really care, since a copy of me in the singularity isn't me. I was disassembled in the upload and won't exist any more.
Right, but if you are on LTO-9 and upgrade to LTO-10 it doesn't magically mean you have no LTO-9 drives left or that the older LTO drives you have immediately stop functioning.
One of the things you need to do is before ditching all the LTO-9 drives, you use them to read the LTO-9 tapes and migrate that data to LTO-10.
In those petabytes of data are images of stars that look perfectly harmless today, but will go berserk as supernovae 100, 200, 1,000, or even 10,000 years from now. It might also contain faint signals of extra-solar-system objects that may or may not crash onto Earth 5,000 years from now. And when these things happen, the past observed trajectories of the objects will be worth their weight in gold.

I'm sorry, but why would you want to store the data from a telescope for 10,000 years?
All of this is great, but has storing the data ever actually been the problem? I'm decidedly ignorant here, so I'm asking a genuine question. My read on this was always that it was the data itself that became obsolete, as file formats aged out, software changed, etc. If you don't have software that can read the data, what good is the data?
Of course it's been a problem. Have you not heard of bit rot in optical discs, or moldy/demagnetized tapes in tape backups?
There have been ten thousand "hey look at this wacky shit that will obviously never, ever make it out of the laboratory" technologies. It's like a clock, every six months. Tick, a new bullshit data storage method, tock, a new bullshit battery technology. You know the first person to demonstrate using a laser to record and retrieve data volumetrically in a crystal? Van Heerden in fucking 1963.
Excellent History of the World reference. And a very hilarious scene.

"Our lord, our lord Jehovah has brought you these fifteen, crash, ten, ten commandments for all to obey!"

Glass seems to be a fragile medium to employ. Then again, so is spinning rust over the long term.
Yeah, very much this. The Complaint Tablet to Ea-Nasir is probably the last thing the hapless Ea-Nasir or his customer/victim Nanni would have wanted posterity to see, but what it tells us about Mesopotamian society is priceless, arguably far more important than all those vacuous who-killed-who inscriptions that their kings proudly left behind.

From a historian's perspective it is important to have access to a healthy amount of data from society as a whole, but also daily records from "ordinary" people. In the 18th and 19th centuries it was mainly from letters and diaries, first from the upper classes but gradually spreading downwards. Today we store our thoughts and life happenings in social media, images and videos, and AI chat logs. That is an enormous amount of data. Last year it was estimated we took 2 trillion photos.
Should we let it go, and let later generations guess what we were up to? Without digital media very little is stored for future generations. Almost no science, engineering, political transformations, population statistics. Also, no letters, no books, no photos, no art, and most buildings, clothes and furniture are semi-disposable, at best lasting a single lifetime. Without storage our time will be a black hole in history.
I am working with tape archival at EB range.

In reality you migrate to a new medium/format every 5-10 years to ensure you can reliably read back the data.
An untested copy isn't a copy.
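That principle translates directly into code: a migration step shouldn't count a file as archived until the copy has been read back and its checksum matched against the source. This is a minimal sketch of the idea, not anyone's actual archival pipeline; `migrate` and the demo file names are made up for illustration:

```python
# "An untested copy isn't a copy": copy a file, then read the copy back and
# compare checksums before trusting it.
import hashlib
import os
import shutil
import tempfile

def sha256(path: str) -> str:
    """Stream the file through SHA-256 in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate(src: str, dst: str) -> None:
    """Copy src to dst, then verify the copy by re-reading both files."""
    shutil.copyfile(src, dst)
    if sha256(src) != sha256(dst):
        raise IOError(f"verification failed for {dst}")

# Demo on a throwaway file:
with tempfile.TemporaryDirectory() as d:
    src, dst = os.path.join(d, "a.dat"), os.path.join(d, "b.dat")
    with open(src, "wb") as f:
        f.write(os.urandom(4096))
    migrate(src, dst)
    print("copy verified")
```

Real tape workflows add retries, per-tape manifests, and periodic re-verification of old media, but the core invariant is the same: no checksum match, no copy.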
Humans have been writing for about 6,000 years. The ability to read Akkadian was lost for about 2,000 of those years, but despite that we still use the Babylonian 60-minute hour. The names of the days of the week in English are 1,500 years old, but for at least a third of that time they weren't written down. The names of the months in English are 2,000 years old and are mostly in a dead language. Information that gets used doesn't disappear. Only information that doesn't get used vanishes. I doubt anyone has looked at my undergraduate thesis in decades, and it's sitting in a box file somewhere, if it exists at all. Not all information is actually worth preserving.

The most surprising result of this type of storage is its durability: most modern storage systems degrade over time, losing information, while this new storage system would guarantee 10,000 years!
No such film exists. Puts fingers in ears and goes la la la la.

Or the glass crystals in Zardoz.