If a new encyclopaedia were to be created, it would have to copy some of the content from Wikipedia, like images and diagrams, to avoid too much duplication of effort, while still covering a wide range of topics. The text of the articles, however, would have to be rewritten from scratch.
wikiwikiweb.de is a MoinMoin site. The MoinMoin project is in the middle of its second major release, which changes a lot of things. It seems to be a hobby project for the developers, so it will get done when they have the time; very slowly. MoinMoin can support some of the wiki markup that Wikipedia uses, but it also has its own markup language, and some of the MediaWiki syntax isn't supported identically. There would need to be a pipeline where you could fork a Wikipedia article as a starting point, then replace the text and copy over photographs and other media. MoinMoin would need to be modified to support this workflow.
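A first step of such a fork pipeline would be enumerating the media files an article uses, so they can be copied over. A minimal sketch in Python, using the real MediaWiki Action API parameters (`action=query`, `prop=images`); the sample response below is hand-written to match the API's shape, not live data:

```python
"""Sketch: list the media files used by a Wikipedia article,
the first step in a hypothetical fork-an-article pipeline."""

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"  # real endpoint

def image_list_params(article_title: str) -> dict:
    """Build the query parameters for listing files used on a page."""
    return {
        "action": "query",
        "prop": "images",
        "titles": article_title,
        "format": "json",
    }

def extract_file_titles(api_response: dict) -> list[str]:
    """Pull the File: titles out of an action=query/prop=images response."""
    titles = []
    for page in api_response["query"]["pages"].values():
        for image in page.get("images", []):
            titles.append(image["title"])
    return titles

# Illustrative response, shaped like real API output.
sample_response = {
    "query": {
        "pages": {
            "736": {
                "pageid": 736,
                "title": "Albert Einstein",
                "images": [
                    {"ns": 6, "title": "File:Albert Einstein Head.jpg"},
                    {"ns": 6, "title": "File:Einstein 1921 portrait.jpg"},
                ],
            }
        }
    }
}

print(extract_file_titles(sample_response))
```

A second API round-trip per `File:` title (`prop=imageinfo`) would then resolve each file to a downloadable URL before re-uploading it to the new wiki.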
This affects more than just wiki sites, but it's a big issue with wikis. The main way content is discovered within a website is site search. Large search engines censor small websites, so if people find your site at all, they will usually only find content through site search, especially on an info site like a wiki. By default, MediaWiki's search is very bad. There are ways to add an external search system: you run another process that serves an API handling all the search requests, and the wiki's pages are indexed into that process's search index. Wikipedia uses Elasticsearch for this, which is why its search queries get spell check and "here are similar results to your query with no results". Otherwise, all you get by default is effectively a grep on article titles.
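The external-search setup described above boils down to a sidecar that keeps its own inverted index of wiki pages and answers queries over an API. A toy Python sketch of that idea (the "API" here is just a function call; a real deployment would put this behind HTTP and use something like Elasticsearch instead):

```python
"""Toy search sidecar: an inverted index that a wiki would update
on every page save and query on every search request."""

from collections import defaultdict

class SearchIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # token -> set of page titles
        self.pages = {}                   # title -> raw text

    def index_page(self, title: str, text: str) -> None:
        """Called whenever a wiki page is saved, to keep the index current."""
        self.pages[title] = text
        for token in text.lower().split():
            self.postings[token].add(title)

    def search(self, query: str) -> list[str]:
        """Return titles of pages containing every query token (AND search)."""
        tokens = query.lower().split()
        if not tokens:
            return []
        hits = set.intersection(
            *(self.postings.get(t, set()) for t in tokens)
        )
        return sorted(hits)

index = SearchIndex()
index.index_page("Wiki software", "software that runs a wiki site")
index.index_page("Search engines", "systems that index and rank the web")
print(index.search("wiki software"))  # → ['Wiki software']
```

Spell check and did-you-mean suggestions are exactly the features a toy like this lacks and an engine like Elasticsearch provides out of the box.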
Most wiki software really is archaic by modern standards. MediaWiki requires the database to be set up manually before using the site; it can't just create the tables it needs on the fly on first use. Many options require a lot of manual administration. Most wiki software does not support good site search, and none of it is designed to fork other wikis page by page as you like. MediaWiki has forking abilities, but they are not very flexible from what I know, and you still have the problems of setting up search and other manual administration.

By manual administration, I mean you have to edit the configuration file, LocalSettings.php, yourself to block random edits and get anything resembling a CMS, like you'd have for a news site with multiple writers. That file is generated by the web installer, so you'd have to keep a customised and tested copy handy if you wanted the settings pre-loaded into your Docker container when the server is launched by Kubernetes or something. Basically, there is a lot of technical debt given the changes in how sites are used and expected to be administered. MediaWiki has not kept up with these changes because its only real user is Wikipedia, which has an army of volunteers who are happy to do the repetitive grunt work that everyone else needs automated, because most people can't spend all day trying to fix one setting while also having a job and other responsibilities. I would like to see a completely new wiki software be written.
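To make the LocalSettings.php point concrete: locking a MediaWiki install down to approved writers only takes a few lines in that file. The permission keys below are real MediaWiki settings; the point is that they live in a hand-edited PHP file rather than an admin UI:

```php
<?php
// Appended to LocalSettings.php: turn an open wiki into a CMS-style site.

// Anonymous visitors can read, but cannot edit or register themselves.
$wgGroupPermissions['*']['edit'] = false;
$wgGroupPermissions['*']['createaccount'] = false;

// Logged-in users (your writers, created by an admin) can edit.
$wgGroupPermissions['user']['edit'] = true;
```

Baking a tested copy of this file into a container image is exactly the workaround described above.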