Let’s do a thought experiment.
Let’s assume we just never got hard drives to work all that well: head crashes are common, large capacities are only possible in servers with incredibly expensive anti-vibration setups, and there’s no way they’d ever work in portable devices. Optical media just didn’t work out either; maybe we didn’t discover the science in time, or the companies working on it failed, or bad management decisions killed off the research before anything useful came of it. And flash storage just never came down in price.
How far could we have pushed cassettes and tape if all the effort that went into other technologies had to be put into cassettes, because there simply wasn’t a good alternative for data storage? What are the limits of how fast we could move tape in a cassette? How miniaturized would the technology be by now, in 2023? I know there are contemporary tape backup systems with large capacities, but there hasn’t been any effort put into fast seek times or into making cassettes a viable tiny medium for removable storage on PDAs or mobile phones.
If we could pack data densely enough and move the tape quickly enough, how feasible would modern computing tasks like high-resolution digital video or image editing be? Assume we still had the high-speed processors and non-persistent memory of the present, just no data storage medium other than magnetic tape.
For inspiration, consider that in 1992 we had NT cassettes, about as big as an SD card, that could store nearly a gigabyte, and in the enterprise LTO-9 tape (released in 2021) stores around 18 TB. So it doesn’t sound impossible to have tiny cassettes with a lot of storage if we’d spent the last three decades working on it.
https://en.wikipedia.org/wiki/Magnetic-tape_data_storage
https://techxplore.com/news/2020-12-fujifilm-ibm-unveil-terabyte-magnetic.html
But I don’t think it is possible to achieve fast random access to different parts of the tape. The workaround would be having a ridiculous amount of RAM to cache loaded data.
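To make that concrete, here’s a toy sketch of what that cache layer might look like: an LRU cache of fixed-size tape blocks sitting in RAM in front of a slow block-read function. The block size, cache size, and read_block function are all made up, just to show the shape of the idea.

from collections import OrderedDict

BLOCK_SIZE = 64 * 1024          # hypothetical block size: 64 KiB
CACHE_BLOCKS = 4 * 1024 * 1024  # 4M blocks of 64 KiB is 256 GiB, the "ridiculous amount of RAM"

class TapeBlockCache:
    """LRU cache of tape blocks, so repeated reads never touch the tape."""

    def __init__(self, read_block, capacity=CACHE_BLOCKS):
        self.read_block = read_block   # slow function: block number -> bytes (seek + stream)
        self.capacity = capacity
        self.blocks = OrderedDict()    # block number -> data, kept in LRU order

    def read(self, block_no):
        if block_no in self.blocks:
            self.blocks.move_to_end(block_no)      # cache hit: mark as recently used
            return self.blocks[block_no]
        data = self.read_block(block_no)           # cache miss: pay for a tape seek
        self.blocks[block_no] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)        # evict the least recently used block
        return data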
I’m incredibly unqualified to even think about how one might get faster random access times, but I was imagining sci-fi solutions: looped tape, so that you’re always at most half the tape length away from the point you want to reach; the tape equivalent of multi-actuator hard drives, where there’d be multiple independent tapes in one cassette in a sort of RAID-style arrangement, except that instead of (only) striping, data could be stored on multiple tapes in different places so there’s always one tape positioned close to the data you want; or a system where multiple read heads touch the same tape at distant points.
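For a rough sense of how much those tricks buy you, here’s a quick Monte Carlo sketch (idealized geometry, tape length normalized to 1): a straight tape averages about a third of the tape per seek, a loop about a quarter, and a loop with k evenly spaced heads about 1/(4k).

import random

L = 1.0  # tape length, normalized

def linear_seek(h, t):
    # straight tape, head can shuttle either direction: worst case L, average L/3
    return abs(h - t)

def loop_seek(h, t):
    # looped tape: never more than L/2 away, average L/4
    d = abs(h - t) % L
    return min(d, L - d)

def multi_head_seek(h, t, k):
    # k read heads spaced evenly around the loop: worst case L/(2k), average L/(4k)
    return min(loop_seek((h + i * L / k) % L, t) for i in range(k))

def average(fn, trials=200_000):
    # average seek distance over random head and target positions
    return sum(fn(random.random(), random.random()) for _ in range(trials)) / trials

print("linear tape  :", average(linear_seek))                            # ~0.333 L
print("looped tape  :", average(loop_seek))                              # ~0.250 L
print("loop, 4 heads:", average(lambda h, t: multi_head_seek(h, t, 4)))  # ~0.0625 L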
It’s possible there’d be some kind of RAID micro-tape situation where data is stored on many smaller tapes that can all be seeking individually.
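Here’s a back-of-envelope sketch of that RAID-0-ish idea, with the spool count, seek time, and streaming speed all invented numbers: striping multiplies bandwidth by the number of spools, but the slowest seek still gates when the read finishes.

import random

SPOOLS = 8                # independent micro-tapes in one cassette (assumption)
SEEK_FULL_TAPE = 40.0     # seconds to shuttle one spool end to end (assumption)
SPEED = 100e6             # bytes/second streamed off one spool once positioned (assumption)

def read_striped(file_bytes):
    """Time to read a file striped RAID-0 style across all spools.

    Each spool seeks from a random position to its stripe, then streams
    its 1/SPOOLS share of the data; the read finishes when the slowest
    spool does, but total bandwidth is roughly SPOOLS times one spool."""
    stripe = file_bytes / SPOOLS
    per_spool = [random.random() * SEEK_FULL_TAPE + stripe / SPEED for _ in range(SPOOLS)]
    return max(per_spool)

# a 4 GB file: one spool vs striped across 8
single = random.random() * SEEK_FULL_TAPE + 4e9 / SPEED   # ~40 s of streaming plus the seek
striped = read_striped(4e9)                               # streaming part shrinks 8x
print(f"one spool: ~{single:.0f} s, striped: ~{striped:.0f} s")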
*edit:
I think our usage patterns would also change… I could imagine a situation where you’re watching a movie, listening to music, etc., and your data is redundantly stored, say, three times: a music playlist would seek to a song on the first tape, play it sequentially, and meanwhile seek the second tape to the next song’s location.
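Something like this toy scheduler (all the names are invented): each redundant copy can either stream or seek, so whichever copy just went idle gets sent ahead to a later song while another copy plays the current one. In a real system the pre-seek would happen in the background during playback; here it’s just printed in order.

from collections import deque

class TapeCopy:
    """One redundant copy of the library on its own spool; it can either
    stream or seek, not both, which is why idle copies pre-seek ahead."""

    def __init__(self, name):
        self.name = name
        self.position = None   # song the head is currently parked at

    def seek_to(self, song):
        print(f"{self.name}: seeking to {song}")
        self.position = song

    def play(self, song):
        assert self.position == song, "a copy must be pre-positioned before it plays"
        print(f"{self.name}: playing {song}")

def play_playlist(playlist, copies):
    """Round-robin the redundant copies: each copy plays the song it was
    parked on, then is sent ahead to the next unclaimed song in the list."""
    queue = deque(copies)
    for copy, song in zip(queue, playlist):    # park the copies on the first few songs
        copy.seek_to(song)
    for i, song in enumerate(playlist):
        current = queue.popleft()
        current.play(song)                     # streams sequentially from its parked spot
        queue.append(current)
        ahead = i + len(copies)                # this copy is idle again: pre-seek it
        if ahead < len(playlist):
            current.seek_to(playlist[ahead])

play_playlist(["song A", "song B", "song C", "song D"],
              [TapeCopy("copy 1"), TapeCopy("copy 2"), TapeCopy("copy 3")])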
Perhaps things like watching UHD movie content would be slower to start (a similar situation to pirating the movie by downloading it), but the quality could be very high because storage density and bandwidth are high.
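Plugging in rough numbers (LTO-9-class native streaming is around 400 MB/s, a UHD stream is roughly 100 Mbps, and I’m just guessing a minute for a worst-case shuttle across the cassette), the startup delay is almost entirely the seek; after that the tape reads about 32x faster than playback, so you could even dump the whole film into RAM in a few minutes.

seek_s      = 60.0                    # guess: worst-case shuttle across the cassette
tape_bw     = 400e6                   # bytes/s streamed once positioned (roughly LTO-9 class)
movie_rate  = 100e6 / 8               # bytes/s for a ~100 Mbps UHD stream
movie_bytes = 2 * 3600 * movie_rate   # a two-hour film, ~90 GB

buffer_60s = 60 * movie_rate / tape_bw   # time to read the first minute of video
whole_film = movie_bytes / tape_bw       # time to dump the entire film into RAM

print(f"playback starts after ~{seek_s + buffer_60s:.0f} s (almost all of it is the seek)")
print(f"whole film cached after ~{(seek_s + whole_film) / 60:.1f} min "
      f"(tape streams {tape_bw / movie_rate:.0f}x faster than playback)")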
Certainly for interactive tasks like editing documents it’d be total ass no matter what, but again, maybe we’d get used to “queueing up” tasks, opening more than we need in the background, etc.