Microsoft’s new 10,000-year data storage medium: glass

GFKBill

Ars Tribunus Militum
2,864
Subscriptor
I don't know if that's a solvable problem. The only data mostly guaranteed ("mostly" because sometimes evolution be weird, yo) to be accessible on millennial time frames (assuming the medium survives, of course) is something that is naked-eye discernible.

That puts a real hard cap on how dense the storage can be. Very quick googling and the back of a napkin tell me that it's on the order of 10¹² pages of text to store a petabyte.

So you need to make some important compromises balancing quantity against accessibility. You'll need to include instructions, maybe starting at "here's how to make electricity." By the time you get to "build a chip fab," of course, you're wondering what you're even doing, because if they don't have chip fabs this may be a significantly multigenerational effort.

So maybe you try to fall back to something that "only" needs optical microscopy, which is (by comparison to "build a chip fab") trivial. But while you can store maybe four orders of magnitude more information, that just reduces the page count per PB from 10¹² to 10⁸. That's a big deal, obviously... but we're measuring global data in the range of 10¹ zettabytes. Even assuming only 1% of that is unique, you're still adding four zeroes back on.
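For the curious, the napkin math above can be checked in a few lines. The ~2,000 bytes of plain text per page figure is my own assumption, not the poster's:

```python
# Toy back-of-napkin check of the page-count claim above.
# Assumption: ~2,000 characters of plain text per page at ~1 byte/char.

BYTES_PER_PAGE = 2_000          # roughly a typewritten page of ASCII
PETABYTE = 10**15               # decimal petabyte

pages_per_pb = PETABYTE // BYTES_PER_PAGE
print(f"{pages_per_pb:.1e}")    # 5.0e+11, i.e. on the order of 10^12 pages
```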

That's a lot of slides.

Something has to give: either accessibility or volume.

(Also you need a thousand-language Rosetta Stone and the assumption that there hasn't been such a stark discontinuity in human history that no workable vestige of a present-day language has survived).
So stagger your archive - primers that are naked eye, then increasing density as your reader progresses through the information. By the time they get to chip fab, you've got them reading media that has orders of magnitude better density than naked eye.
 
Upvote
9 (9 / 0)

tcowher

Ars Tribunus Militum
1,755
It would be an interesting mental exercise thinking of a way to ensure readability that far forward. You'd need some kind of instructions to replicate the technology without using the technology itself. The instructions would have to be as durable as the archive. Maybe laser etched metal using plain text?
Of course the other problem is how much do you bootstrap? I mean do you start with fire and the wheel? Smelting copper? Electricity? Quantum mechanics? I would think the "Ideal" archival material would have many layers starting with human readable that could also include instructions to read the next layer.
Example....
First layer: Information important to "primitive" people and instructions to grind glass to make a magnifier.
2nd layer: too small to read clearly without a magnifier. Similar information, but with instructions on making a microscope.
3rd layer: too small to read without a microscope, more information and instructions....
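The layered scheme described above can be modeled as a simple data structure. The layer contents here are purely illustrative:

```python
# Each layer pairs the minimum reading technology with its content,
# which includes instructions for building the reader for the next layer.

layers = [
    {"readable_with": "naked eye",  "content": "primer + how to grind a magnifier"},
    {"readable_with": "magnifier",  "content": "more data + how to build a microscope"},
    {"readable_with": "microscope", "content": "dense data + how to build a finer reader"},
]

for i, layer in enumerate(layers, start=1):
    print(f"Layer {i}: needs {layer['readable_with']}: {layer['content']}")
```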

Not sure where we could find this magical material though.
 
Upvote
7 (7 / 0)

adespoton

Ars Legatus Legionis
10,690
So stagger your archive - primers that are naked eye, then increasing density as your reader progresses through the information. By the time they get to chip fab, you've got them reading media that has orders of magnitude better density than naked eye.
And... you can get them building the things required to build a chip fab earlier, so that by the time they get to "build a chip fab" they are guaranteed to know what you're talking about.

In fact, the best way to arrange the data is to increase density as the instructions get more technical, such that they will only be able to read the instructions if they have the capacity to understand them already.

And a nice thing about glass is... you can etch eye-visible stuff near the outer surface, OR, etch your voxels such that the instructions for decoding them are themselves visible in the encoded structure. So each pass at the same slab with better technology will reveal another layer of encoded data.
 
Upvote
8 (8 / 0)

Control Group

Ars Legatus Legionis
19,268
Subscriptor++
So stagger your archive - primers that are naked eye, then increasing density as your reader progresses through the information. By the time they get to chip fab, you've got them reading media that has orders of magnitude better density than naked eye.
Sure; you can get a few orders of magnitude taken care of with microscopy, and you can certainly write down the path to that tech in a naked-eye-discernible way. That's pretty straightforward.

But it feels like there's a real step function jump from microscopy to high-density digital storage that you can't really ease into. If you've got electricity and (good enough glass manufacturing skills for) microscopy, vacuum tubes are straightforward. And, demonstrably, you can get from tubes to transistors without modern chip manufacture. But the ramp from that point is rough.

It's taken us almost a century to get from tubes to current storage density, and that's been a civilizational effort. What if it had just been a handful of universities trying to do archaeology?

And when we're talking about zettabytes of data, you can't regress current tech very far before it stops being possible.

Now, that is obviously a worst-case scenario, where the technology to read these things has been lost due to a global collapse of high-tech society. Which is a bit more likely today than it was just a couple years ago, but still. If we assume that the problem is more akin to zip drives and Betamax than it is a matter of "what is binary", then the only problem we need to solve is the durability of the medium. Given any reason to do so, "we" could reverse-engineer either and design a device to read them. They just need to exist.
 
Upvote
4 (4 / 0)
That is probably one of the worst examples.

Observations OVER TIME are incredibly valuable. No matter how great your telescope is, watching a stellar event happen today can't show you what it looked like 10,000 years ago. (That's in our frame of reference, obviously; the events themselves occurred potentially millions or billions of years ago.)

For anything where the change in observed data is useful, older records are invaluable. Things like sea levels: knowing the sea level today is a lot less useful than knowing the sea level over time, ideally as far back as you can go. If sea level measurements from 10,000 years ago had survived, scientists would love that.

Not scientific data, but archiving news stories (I would say newspapers, but these days a lot of content is online) on a medium that lasts 10,000 years would be worth its weight in gold to future historians.

I will concede the uses for this are extremely niche. Likely nobody is backing up their Steam library onto one of these glass slabs. However, for niches where true digitally perfect archives lasting even a century are desired, this is useful tech, if they can get it to a finished product, which is a big if.
Even today, we've come to realize the internet isn't forever. There are a number of web sites that are simply gone, and while archive.org has done much to help, it doesn't cover everything. For example, at one point people could easily submit requests to take down certain pages for good from their archive. They're a little less likely to respond to takedown requests from, say, politicians and their embarrassing remarks these days (which is for the good). In any case, that still leaves a lot of content simply missing that didn't get archived for one reason or another, and ultimately, one year or another, their funding drive will fail, putting the archive at risk. I have high hopes in that event some benefactors will come together to make sure there's a good storage place for their archives, but then THAT needs to be maintained, somehow.

In any case, there are sites, and Macromedia Shockwave games, and stuff like that which is simply gone for good. Heck, when Dell just deleted all their support pages full of ancient drivers, that COULD have been the end of it had not some people shown initiative in downloading a number of those drivers in advance.
 
Upvote
11 (11 / 0)

mschira

Ars Tribunus Militum
1,646
Maybe a more frugal choice of what we want to store, for a "people" that will come after us by as much as we come after the Mesopotamians, is a better perspective?

I don't want to diss the Square kilometer array or how great such projects are.
I am a scientist myself and we use big open access data projects, too.

BUT, I am not 100% sure the first priority for data we can store for 10,000 years is from these massive projects.
Such projects depend on the data being EASILY accessible, so you need an electronic access system and active storage.
 
Upvote
5 (5 / 0)

Statistical

Ars Legatus Legionis
54,747
Maybe a more frugal choice of what we want to store, for a "people" that will come after us by as much as we come after the Mesopotamians, is a better perspective?

I don't want to diss the Square kilometer array or how great such projects are.
I am a scientist myself and we use big open access data projects, too.

BUT, I am not 100% sure the first priority for data we can store for 10,000 years is from these massive projects.
Such projects depend on the data being EASILY accessible, so you need an electronic access system and active storage.

Exactly. The article was good up to that point and then just kinda spun the discussion off on the wrong stuff.

How much data would text and low-resolution images of every major news story in a century be? How valuable would that be to historians? Imagine if historians today had access to a digitally perfect copy of "The Roman Times" with daily issues spanning two centuries.

There are lots of applications which involve hundreds of GB or even a few TB of data, but not 100 PBs.
 
Upvote
12 (12 / 0)

Steve austin

Ars Scholae Palatinae
1,752
Subscriptor
I'm reminded of a short story where the "gods" of Earth arrive, announcing they seeded Earth with life eons ago with their impossibly advanced technology as a sort of retirement fund... and they're here to retire, and just need us to take them in and care for them in their advanced age. They've only got a few centuries left, tops. The whole thing was very tongue in cheek. They promised to provide their technology and how to use it to us mere humans. The problem is... they literally only knew the principles behind making their "current" tech, and precisely nothing about all the bootstrap tech you need to develop on the way TO it, so their explanations and diagrams were still entirely beyond human understanding. There was also a new law that had to be created. A lot of people became resentful of being assigned as a "god care" household, and so "god abuse" became an issue.
The case of “advanced so far we don’t remember how we got here” but from more or less the reverse point of view was in John Campbell’s 1937 SF story Forgetfulness, and the concept of “have a vague idea of the needed tool to fix something, but no idea how to build the tool to build the tool to build the tool…” was the basis of Raymond Jones’ 1950 SF story Tools of the Trade.
 
Upvote
4 (4 / 0)

fenris_uy

Ars Tribunus Angusticlavius
9,086
That may be overselling it just a bit. The Square Kilometer Array telescope, for example, is expected to need to archive 700 petabytes of data each year. That would mean over 140,000 glass slabs would be needed to store the data from this one telescope. Even assuming that the write speed could be boosted by adding significantly more lasers, you’d need over 600 Silica machines operating in parallel to keep up. And the Square Kilometer Array is far from the only project generating enormous amounts of data.

I'm sorry, but why would you want to store the data from a telescope for 10,000 years?

Yeah, the write speed is slow, but you aren't going to write everything into it. Only data that you want to store for a long time, that you don't want to be written over.

Also, the article talks about the write speed, but it never mentions the read speed, or did I miss a part?
 
Upvote
-2 (0 / -2)

zogus

Ars Tribunus Angusticlavius
7,181
Subscriptor
Storage that will last 10,000 years, eh?

Say hello to my little friend....

The first thing that popped into my mind when archive + glass + 10,000y are put together is the Library of Alexandria.
The Library of Alexandria was irreplaceable because replicating the entire collection would have required paying thousands of scribes—who were among the best-educated people in classical times, and consequently extremely valuable to society—to spend decades to do nothing but copy old books. The cost in human resources would have been too much for even the Roman emperors to bear.

In contrast, assuming this technology is ready to use, replicating Microsoft’s repository requires tying up a few engineers for some number of months to get a batch of drives built, and then write and verify all the glass media. The cost of subsequent mirror sites will go even further down, as the writing infrastructure gets amortized. Microsoft can realistically build dozens of permanent repositories around the world before the shareholders even notice the cost impact.
 
Upvote
2 (3 / -1)

laserman5000

Wise, Aged Ars Veteran
147
 
Upvote
-2 (0 / -2)

LordDaMan

Ars Legatus Legionis
11,479
a few, yes.

but be realistic. in 10,000 years, this kind of data would tell them more about our technology than anything else. and they won't need TB upon TB of sample data to learn what they need to learn about that.

if we had hundreds of thousands of pages of Mayan astronomical data, their individual scientific value would be minimal. people would be framing them and hanging them on their wall as decoration.
In 10,000 years we will either be far past the singularity and have no need for something like that, as we could recall all data in all of human history, or we'll be using wooden clubs to fight off each other and breaking the demon machines and everything associated with them.
 
Upvote
-2 (3 / -5)

eldakka

Ars Tribunus Militum
1,728
Subscriptor
LTO is a terrible medium for long term archival.
The tapes may be rated for "30 years" but in ~10 years they won't be manufacturing drives capable of reading current tapes.
E.g. LTO-10 drives can't even read LTO-9 tapes.
Right, but if you are on LTO-9 and upgrade to LTO-10 it doesn't magically mean you have no LTO-9 drives left or that the older LTO drives you have immediately stop functioning.

One of the things you need to do is before ditching all the LTO-9 drives, you use them to read the LTO-9 tapes and migrate that data to LTO-10.

Realistically you'd effectively still initially have the drives for 3 generations of LTO (assuming you are an 'upgrade every generation'-type organisation).

So when you upgrade from 9 to 10, you'd have 8/9/10 drives (LTO-9 drives can read LTO-8 tapes, so you wouldn't need to keep LTO-8 drives to read LTO-8 media, but let's go with the "can't read any previous generation" for the process). And one of the first things you do is migrate all the LTO-8 data to LTO-10 so you can ditch the LTO-8 drives as soon as you've migrated that data. So now you are writing to LTO-10 drives/tapes, while still having a few LTO-9 drives for reading LTO-9 archival data. Then when you migrate to LTO-11 in the future, you migrate all the LTO-9 tapes to 11 before you dispose of all the LTO-9 drives. etc.
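The rollover process described above can be sketched as a toy policy: on upgrading to generation N, migrate everything still on generation N-2 before retiring those drives. The function and data shapes here are purely illustrative:

```python
# Toy model of the LTO generation-rollover policy described above.

def upgrade(archive, drives, new_gen):
    """archive: {generation: set of tape IDs}; drives: set of drive generations owned."""
    drives.add(new_gen)
    stale = new_gen - 2                      # oldest generation we still hold drives for
    if stale in archive:
        # migrate stale-generation data onto new-generation tapes first
        archive.setdefault(new_gen, set()).update(archive.pop(stale))
    drives.discard(stale)                    # now safe to ditch those drives
    return archive, drives

archive = {8: {"t1", "t2"}, 9: {"t3"}}
drives = {8, 9}
archive, drives = upgrade(archive, drives, 10)
# LTO-8 data now lives on LTO-10 tapes; LTO-9 drives are kept for reading LTO-9.
```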

If the organisation doesn't follow that type of process, that's an organisational problem, not a storage medium (LTO or whatever is in use) problem. And it's the exact same problem this new media will have; as others have pointed out, it's only useful while the capability to read the data exists.
 
Upvote
3 (4 / -1)

ewelch

Ars Tribunus Angusticlavius
9,327
Subscriptor++
In 10,000 years we will either be far past the singularity and have no need for something like that, as we could recall all data in all of human history, or we'll be using wooden clubs to fight off each other and breaking the demon machines and everything associated with them.
I don't really care, since a copy of me in the singularity isn't me. I was disassembled in the upload and won't exist any more.
 
Upvote
0 (1 / -1)

raxx7

Ars Legatus Legionis
17,079
Subscriptor++
Right, but if you are on LTO-9 and upgrade to LTO-10 it doesn't magically mean you have no LTO-9 drives left or that the older LTO drives you have immediately stop functioning.

One of the things you need to do is before ditching all the LTO-9 drives, you use them to read the LTO-9 tapes and migrate that data to LTO-10.

Agreed.

I had already argued that writing something into a medium and leaving it on a shelf is a poor data preservation strategy. Copying is the way.
But what I meant is that within that bad strategy LTO is particularly bad compared to say, analog film or something like high quality writable optical media. At least you can still buy a new drive which can read a CD-R.
If you want a read a contemporary LTO-1 tape, you're looking at eBay.

That said: you don't need to transfer your data every LTO generation.
Although the drives have little to no backward compatibility, there's some overlap in drive production. E.g. you can still buy new LTO-7 drives. That means it's now time to transfer your data from LTO-5 tapes (introduced in 2010) to LTO-10 tapes.
So you get about 10-15 years out of the tapes.
 
Upvote
0 (0 / 0)
There have been ten thousand "hey look at this wacky shit that will obviously never, ever make it out of the laboratory" technologies. It's like a clock, every six months. Tick, a new bullshit data storage method, tock, a new bullshit battery technology. You know the first person to demonstrate using a laser to record and retrieve data volumetrically in a crystal? Van Heerden in fucking 1963.
 
Upvote
-12 (2 / -14)

zogus

Ars Tribunus Angusticlavius
7,181
Subscriptor
I'm sorry, but why would you want to store the data from a telescope for 10,000 years?
In those petabytes of data are images of stars that look perfectly harmless today, but will go berserk as supernovae 100, 200, 1,000, or even 10,000 years from now. It might also contain faint signals of extra-solar-system objects that may or may not crash onto Earth 5,000 years from now. And when these things happen, the past observed trajectories of the objects will be worth their weight in gold.

Incidentally, this is not a theoretical exercise. Astronomy routinely benefits from observations left in documents from before the age of telescopes.
 
Upvote
21 (21 / 0)

Resistance

Wise, Aged Ars Veteran
418
To the people talking about readability in 10 millennia while bemoaning the requirement for complex hardware and software to read it: I get what you're saying, but suggesting that words etched into stone are meaningfully better is... a flawed take. In order for that to work, you have to maintain the language, or you have to write it in several different languages and hope that enough of each of them survives into the future for the text to be decipherable.

Stone (or some other material) tablets in several different languages is probably the lowest maintenance tool for communicating to far future people, but it is ridiculously expensive per unit of data. Once you get into even a handful of megabytes it makes sense to start doing things like microscopic etchings on a sturdy material. If you want gigabytes you're going to want something that requires a specialized machine to read it.

Data preservation is always going to require maintenance, the goal is to reduce all the different costs associated with it. If you want to communicate a large amount of data to a far future civilization with minimal maintenance then, yes, part of your system will be stone tablets, but realistically if you want to do multiple gigabytes you're going to want something that requires the reader to build a relatively complicated machine to extract the data, so most of your stone tablets will be instructions on how to build the machine. A microscope and a computer seems to be pretty reasonable if you want to do terabyte scale data with minimal maintenance.
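To put rough numbers on "ridiculously expensive per unit of data," here is a sketch; all figures are my own illustrative assumptions, not anything from the thread:

```python
# How many stone tablets would a single megabyte take?
# Assumption: ~500 characters per densely inscribed tablet face, 1 byte/char.

chars_per_tablet = 500
bytes_per_char = 1
tablet_bytes = chars_per_tablet * bytes_per_char

megabyte = 10**6
tablets_for_1mb = megabyte // tablet_bytes
print(tablets_for_1mb)   # 2000 tablets for one megabyte
```

Which is why, as the post says, anything beyond a handful of megabytes pushes you toward microscopic etching and machine readers.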
 
Upvote
7 (7 / 0)

Fabermetrics

Ars Praefectus
5,768
Subscriptor
Glass seems like an interesting medium here. People are talking about the lack of persistent hardware support, but imagine a world where a glass square was the media standard. Devices could read this regardless of how the data is etched on it; software would be responsible for decoding the format of the etching. As time progresses, the data density could increase, and new readers would be both forwards and backwards compatible: if you can read a 5nm bit, you can read a 25nm bit.

The read/write process would be tied to the technology available. Perhaps single-bit reads at first, but later the glass could be used almost like a photomask on an image-sensor-like reader, loading an entire slab into a system at once. Maybe a slab used as a photomask could also allow bulk copies to be made: first a negative, which is then used to mass-produce positives for distribution.

I don't think Microsoft's system is the future as designed, but if everyone is going to talk about "I'd rather just copy my data forever," we could spend time imagining a forever where that isn't necessary. An archive of little glass slides of varying ages and etching resolutions seems possible. "These land records from 2150, they're in 20nm, but the 2589 land records the state sent us are 3nm and don't work with our system. We need to upgrade if we want to ensure there are no defects in this title; there's 400 years of easements and liens we need to review." - Clerk at the Moon Law Office of Aldrin and Sons.
 
Upvote
2 (3 / -1)

sarusa

Ars Praefectus
3,258
Subscriptor++
But it's Microslop, so... The big question here is how, if they decide to move forward, they'll ruin it with an LLM. Yes, there is some CNN involved in reading the data, but that won't be in your face enough for Microslop. Nadella wouldn't let anything like this productize without somehow ramming Copilot sideways up its rectum.

Perhaps every cartridge will ship with an LLM that will be your only way to actually access the data? Oh, and then you'll need a monthly subscription so MS can pretend it's making any actual AI revenue. So 10,000 years in the future, the aliens will find these in the smoking ruins of Earth, attempt to access the information and, 'Glork, it says we need an Enterprise Subscription to Office and one million dollars back subscription fees for this cartridge, do we have any spare change?'
 
Upvote
-16 (2 / -18)
All of this is great, but has storing the data ever actually been the problem? I'm decidedly ignorant here, so I'm asking a genuine question. My read on this was always that it was the data itself that became obsolete, as file formats aged out, software changed, etc. If you don't have software that can read the data, what good is the data?

Outside of super-specific software that was locked behind corporate or government firewalls until it became obsolete, has that ever really been an issue? I don't think there's any piece of consumer computing or gaming hardware from the last 50 years that hasn't been emulated.

Hell, a good chunk of that hardware can run as an FPGA core for my Analogue Pocket. I've got MSX and Fairchild Channel F cores on my SD card.
 
Upvote
-2 (2 / -4)

morlamweb

Ars Scholae Palatinae
1,425
All of this is great, but has storing the data ever actually been the problem? I'm decidedly ignorant here, so I'm asking a genuine question. My read on this was always that it was the data itself that became obsolete, as file formats aged out, software changed, etc. If you don't have software that can read the data, what good is the data?
Of course it's been a problem. Have you not heard of bit rot in optical discs, or moldy/demagnetized tapes in tape backups?
 
Upvote
5 (5 / 0)

Eldorito

Ars Tribunus Angusticlavius
7,929
Subscriptor
There have been ten thousand "hey look at this wacky shit that will obviously never, ever make it out of the laboratory" technologies. It's like a clock, every six months. Tick, a new bullshit data storage method, tock, a new bullshit battery technology. You know the first person to demonstrate using a laser to record and retrieve data volumetrically in a crystal? Van Heerden in fucking 1963.

Welcome to science, you appear to be new here.

He didn't demonstrate it in 1963, he theorised it. And his work was yet another data storage method built off existing ideas on using lasers and interference patterns. Everyone is standing on the shoulders of giants.
 
Upvote
21 (21 / 0)

Frank C.

Ars Scholae Palatinae
1,810
Our lord, our lord Jehovah has brought you these 15, (crash), 10, 10 commandments for all to obey! Glass seems to be a fragile medium to employ. Then again, so is spinning rust over the long term.
Excellent History of the World reference. And a very hilarious scene.
 
Upvote
4 (4 / 0)

Northbynorth

Ars Praetorian
598
Subscriptor++
From a historian's perspective it is important to have access to a healthy amount of data from society as a whole, but also daily records from "ordinary" people. In the 18th and 19th centuries it was mainly from letters and diaries, first from the upper classes but gradually spreading downwards. Today we store our thoughts and life happenings in social media, images and videos, and AI chat logs. That is an enormous amount of data. Last year it was estimated we took 2 trillion photos.

Should we let it go, and let later generations guess what we were up to? Without digital media very little is stored for future generations: almost no science, engineering, political transformations, or population statistics. Also, no letters, no books, no photos, no art, and most buildings, clothes and furniture are semi-disposable, at best lasting a single lifetime. Without storage our time will be a black hole in history.
 
Upvote
7 (7 / 0)

zogus

Ars Tribunus Angusticlavius
7,181
Subscriptor
From a historian's perspective it is important to have access to a healthy amount of data from society as a whole, but also daily records from "ordinary" people. In the 18th and 19th centuries it was mainly from letters and diaries, first from the upper classes but gradually spreading downwards. Today we store our thoughts and life happenings in social media, images and videos, and AI chat logs. That is an enormous amount of data. Last year it was estimated we took 2 trillion photos.

Should we let it go, and let later generations guess what we were up to? Without digital media very little is stored for future generations: almost no science, engineering, political transformations, or population statistics. Also, no letters, no books, no photos, no art, and most buildings, clothes and furniture are semi-disposable, at best lasting a single lifetime. Without storage our time will be a black hole in history.
Yeah, very much this. The Complaint Tablet to Ea-Nasir is probably the last thing the hapless Ea-Nasir or his customer/victim Nanni would have wanted posterity to see, but what it tells us about Mesopotamian society is priceless, arguably far more important than all those vacuous who-killed-whom inscriptions that their kings proudly left behind.
 
Upvote
13 (13 / 0)

DanNeely

Ars Legatus Legionis
16,038
Subscriptor
That may be overselling it just a bit. The Square Kilometer Array telescope, for example, is expected to need to archive 700 petabytes of data each year. That would mean over 140,000 glass slabs would be needed to store the data from this one telescope.

There are multiple collections of photographic plates with six figure numbers of images.

And while the number of slabs is going to be massive, the volume isn't bad. Only about 4 nominal cubic meters.

In contrast 30TB 3.5" HDDs would need about 8 cubic meters.

That's close enough that the physical space needed shouldn't be a major factor in comparing the options (and is much smaller than the larger plate archives). In practice the volume gap would probably be smaller, since the glass slabs would presumably have some level of packaging, equivalent to the shell around HDD platters, that isn't factored into my calculation.
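A quick arithmetic check of these volume figures. The slab dimensions are my own assumption (roughly palm-sized, a few mm thick, chosen to match the "~4 cubic meters" figure); the HDD dimensions are the standard 3.5-inch form factor:

```python
# Rough volume comparison: glass slabs vs. 3.5" HDDs for 700 PB/year.

PB = 10**15
TB = 10**12

yearly_data = 700 * PB

# ~140,000 slabs for 700 PB implies ~5 TB per slab
slabs = 140_000
slab_volume_m3 = 0.075 * 0.075 * 0.005        # 75 x 75 x 5 mm (assumed)

hdd_capacity = 30 * TB
hdd_volume_m3 = 0.147 * 0.1016 * 0.0261       # 147 x 101.6 x 26.1 mm (3.5" form factor)

slab_total_m3 = slabs * slab_volume_m3
hdd_total_m3 = (yearly_data / hdd_capacity) * hdd_volume_m3
print(round(slab_total_m3, 1), round(hdd_total_m3, 1))   # ~3.9 vs ~9.1 cubic meters
```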
 
Upvote
3 (3 / 0)

Kamishin

Smack-Fu Master, in training
1
This has already been done years ago??

Just google "king james bible etched glass storage" - a story from 2016 and before (theoretical) - in 2016 they put the Magna Carta, King James Bible and Universal Declaration of Human Rights on an etched crystalline glass coin with lasers.

It's called a 5 dimensional crystalline storage medium:

That all works out to a theoretical data capacity of 360Tb that can be stored in the dimensions of a conventional disc, like a DVD, the researchers said. The fused quartz essentially lasts forever, or 13.8 billion years at 190 degrees centigrade. It’s also thermally stable up to 1,000°C, the researchers claim.
 
Upvote
2 (3 / -1)

daduke

Smack-Fu Master, in training
81
In reality you migrate to a new medium/format every 5-10 years to ensure you can reliably read back the data.

An untested copy isn't a copy.
I am working with tape archival at EB range.

If your tape software is serious about data preservation, the tape drive will read and verify each written tape block of your data as you are writing it (the Logical Block Protection feature); this has been around for a while on TS11XX and LTO drives (even in the last generations of 10K drives).
Therefore written data is always tested by design, at zero performance cost, contrary to disk.
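As a toy illustration of the read-and-verify-on-write idea (a sketch of the concept only, not the actual SCSI Logical Block Protection mechanism, which attaches CRCs at the drive/block level):

```python
# Each block is stored with a checksum, and verification happens
# as part of the write path rather than as a separate later pass.

import zlib

def write_block(tape, data):
    """Append a (data, crc) block and immediately read it back to verify."""
    crc = zlib.crc32(data)
    tape.append((data, crc))
    stored_data, stored_crc = tape[-1]        # read back what was just written
    if zlib.crc32(stored_data) != stored_crc:
        raise IOError("block failed write verification")

tape = []
write_block(tape, b"archive payload")
```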

Migrating data every 5 years is a bit complicated at EB scale, as the cost of media is several orders of magnitude higher than the infrastructure cost. Today, 10 years may be a more reasonable target, but this is likely to increase further, as medium amortization costs and exponentially growing amounts of data require keeping data (and drives and libraries) longer and longer, until we reach the 30-year data-duration wall.

Then I guess this problem is for future generations of tape archive service managers: not for me to solve, just for me to warn.
 
Upvote
2 (2 / 0)

ParryLost

Smack-Fu Master, in training
57
I'm sorry, but wasn't this announced in 2019? https://meincmagazine.com/gadgets/201...t-silica-offers-robust-thousand-year-storage/

I mean, presumably Microsoft has improved the technology since then, but Project Silica isn't really "new," is it? I remember reading about it back then, and being excited, and it was exciting to see it show up in the news again... but then confusing, because everyone is treating it as the first announcement ever of a brand-new Microsoft project?..
 
Last edited:
Upvote
3 (3 / 0)
The most surprising result of this type of storage is its durability: most modern storage systems degrade over time, losing information, while this new storage system would guarantee 10,000 years!
Humans have been writing for about 6,000 years. The ability to read Akkadian was lost for about 2,000 of those years, but despite that we still use the Babylonian 60-minute hour. The names of the days of the week in English are 1,500 years old, but for at least a third of that time they weren't written down. The names of the months in English are 2,000 years old and are mostly in a dead language. Information that gets used doesn't disappear; only information that doesn't get used vanishes. I doubt anyone has looked at my undergraduate thesis in decades, and it's sitting in a box file somewhere, if it exists at all. Not all information is actually worth preserving.
 
Upvote
1 (3 / -2)