Does Lemmy have any communities dedicated to archiving/hoarding data?
For Wikipedia you’ll want to use Kiwix. A full backup of Wikipedia is only around 100 GB, and I think that includes pictures too.
Last time I updated, it was closer to 120 GB, but if you’re not sweating 100 GB, an extra 20 isn’t going to bother anyone these days.
Also, thanks for reminding me that I need to check my dates and update.
EDIT: you can also easily configure an SBC like a Raspberry Pi (or any of the clones) to boot, set its Wi-Fi to access point mode, and serve Kiwix as a website that anyone on the local AP network can connect to and query… and it’ll run off a USB battery pack. I have one kicking around the house somewhere.
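If it helps anyone, here’s a minimal sketch of the serving half (the AP half is just the usual hostapd/dnsmasq config). It assumes kiwix-tools is installed so `kiwix-serve` is on PATH; the ZIM path and port are made up for illustration:

```python
#!/usr/bin/env python3
"""Boot wrapper for a Kiwix "Wikipedia in a box" Pi.

Sketch only: assumes kiwix-tools is installed (kiwix-serve on PATH)
and the access point is already handled by hostapd/dnsmasq. The ZIM
file location and port below are hypothetical.
"""
import subprocess
import sys
import time
from pathlib import Path

ZIM = Path("/media/usb/wikipedia_en_all_maxi.zim")  # hypothetical location
PORT = 8080

# A USB stick can take a few seconds to mount after boot, so poll for it.
for _ in range(30):
    if ZIM.exists():
        break
    time.sleep(2)
else:
    sys.exit(f"ZIM not found at {ZIM}")

# kiwix-serve listens on all interfaces by default, so anyone on the
# AP's subnet can browse to http://<pi-address>:8080 and search.
subprocess.run(["kiwix-serve", f"--port={PORT}", str(ZIM)], check=True)
```

Launch it from a systemd unit or an `@reboot` cron entry and the whole thing is headless.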
The English-language Wikipedia probably wouldn’t be hard, and neither would Debian Stable.
All of Debian’s packages might be a tad more expensive, though.
This might be a good place to start for Wikipedia:
https://meta.wikimedia.org/wiki/Data_dump_torrents#English_Wikipedia
And the English dump with no pictures is even smaller.
And you can use Kiwix to set up a locally hosted Wikipedia from those dumps (Kiwix packages them as ZIM files).
It depends on whether you want the images or previous revisions of Wikipedia too. The current version is about 25 GB compressed; the dump with all revisions is apparently multiple terabytes. They don’t say how much media they have, but I’m guessing it’s roughly “lots”.
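If you want to sanity-check that number yourself, the dump server reports file sizes in its HTTP headers. A quick sketch: the `latest` path and filename follow the real naming convention on dumps.wikimedia.org, but the exact figure changes between dump runs, so treat the output as illustrative:

```python
#!/usr/bin/env python3
"""Ask dumps.wikimedia.org how big the current pages-articles dump is.

Just a HEAD request reading Content-Length; no download happens.
"""
from urllib.request import Request, urlopen

URL = ("https://dumps.wikimedia.org/enwiki/latest/"
       "enwiki-latest-pages-articles.xml.bz2")

req = Request(URL, method="HEAD")
with urlopen(req) as resp:
    size = int(resp.headers["Content-Length"])

# Roughly the ~25 GB compressed figure mentioned above, give or take a run.
print(f"{size / 1024**3:.1f} GiB compressed")
```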