MrCunt
kiwifarms.net
- Joined
- Sep 22, 2021
So hey, in the interest of ARCHIVE EVERYTHING:
I'm wondering if the coders here could make a wiki scraper that archives a specific wiki's pages, with a byte limit set so it doesn't trawl the entire wiki, and ideally grabbing the revisions from before it was wokipedya-edited.
Probably not making sense, but those who know what I'm chasing know... keke. Make it a decentralized, botnet-style archive. You know what I'm saying.
Just an idea, hope I'm not the first. Scrape the history etc.
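To make it concrete, here's a rough sketch of the kind of thing I mean, assuming the target runs a MediaWiki-style API (the endpoint and the cap below are placeholders, not a real wiki): it pages through a title's revision history oldest-first and stops once a byte cap is hit, so one run never trawls the whole wiki.

```python
import json
import urllib.parse
import urllib.request

API = "https://example-wiki.org/w/api.php"  # hypothetical endpoint, swap in the real one

def take_under_cap(bodies, byte_cap):
    """Keep revision bodies, oldest first, until the byte cap would be
    exceeded -- this is the limit that stops a run from trawling everything."""
    kept, used = [], 0
    for body in bodies:
        size = len(body.encode("utf-8"))
        if used + size > byte_cap:
            break
        kept.append(body)
        used += size
    return kept

def fetch_history(title, byte_cap):
    """Page through a title's revision history via the standard MediaWiki
    revisions API (action=query, prop=revisions), oldest revisions first,
    returning raw wikitext bodies trimmed to the byte cap."""
    bodies, cont = [], None
    while True:
        params = {
            "action": "query", "format": "json", "prop": "revisions",
            "titles": title, "rvprop": "ids|timestamp|user|content",
            "rvslots": "main", "rvdir": "newer", "rvlimit": "50",
        }
        if cont:
            params["rvcontinue"] = cont  # resume where the last batch ended
        url = API + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        page = next(iter(data["query"]["pages"].values()))
        for rev in page.get("revisions", []):
            bodies.append(rev.get("slots", {}).get("main", {}).get("*", ""))
        cont = data.get("continue", {}).get("rvcontinue")
        total = sum(len(b.encode("utf-8")) for b in bodies)
        if cont is None or total >= byte_cap:
            return take_under_cap(bodies, byte_cap)
```

Note `rvdir=newer` walks the history oldest-first, so the pre-edit-war versions of a page get archived before the cap runs out.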
EDIT: Like our own network of that program the Not Socially Awkward had in that Snowden movie.
EDIT 2: PRISM. KWISM!
Last edited: