There's also the "rapture of the nerds" you mention, more often called "the Singularity." It's tied to a closely related concept, the Law of Accelerating Returns, which is a generalized version of Moore's Law. Basically, if you plot technological advancement over time, advances come faster and faster as time goes on. If this holds up -- big if, but it has so far -- then somewhere around 2030-2045 shit gets a bit crazy: the curve of technological progress or computing power or whatever specific metric you're using (there are several, and they all follow this trend around the same time) hits a vertical asymptote -- in other words, it shoots off toward infinity in finite time.
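To see why "hits an asymptote" is a stronger claim than plain exponential growth, here's a minimal sketch, assuming a made-up toy model (not Kurzweil's actual data): an exponential (dx/dt = kx) gets huge but never blows up, while "accelerating returns," where progress feeds back into the rate of progress (dx/dt = kx^2), reaches infinity at a finite time, t = 1/(k*x0).

```python
# Toy comparison, all numbers made up: plain exponential growth vs.
# "accelerating returns" (growth rate rises with capability itself).

def blowup_time(rate_fn, x0=1.0, dt=0.001, t_max=3.0):
    """Euler-integrate dx/dt = rate_fn(x); return the time x explodes,
    or None if it stays finite up to t_max."""
    x, t = x0, 0.0
    while t < t_max:
        x += rate_fn(x) * dt
        t += dt
        if x > 1e12:            # treat this as "effectively infinite"
            return round(t, 2)
    return None

k = 1.0
print(blowup_time(lambda x: k * x))       # None: exponential never blows up
print(blowup_time(lambda x: k * x ** 2))  # ~1.0: hyperbolic growth hits a
                                          # vertical asymptote at t = 1/(k*x0)
```

The point of the sketch: the asymptote only appears if each advance makes the next advance come faster, which is exactly what the Law of Accelerating Returns claims.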
This is related to the previous point about making smarter people (or making people smarter) -- because let's say you make a smarter person and set him to the task of making smarter people. Being smarter, he's inherently going to find better ways to do it, so the next generation will be smarter still. More importantly, the original humans, and even the first generations of smarter humans, rapidly get left in the dust, since they inherently can't understand the successive generations who are smarter than they are. (Compare someone with a severe intellectual disability trying to learn calculus: no amount of studying will get them there.)
This is commonly associated with AI, specifically AGI. AGI (not what we have now, but an Artificial General Intelligence, i.e., an actual artificial mind) could be used to build a better AGI. Let's call it AGI2. AGI2 would then be smarter than AGI1, and you could set AGI2 to the task of building a better AGI: AGI3. AGI3 gives us AGI4, AGI4 gives us AGI5... This process repeats, each cycle faster than the last, and normal human minds are left far, far behind very quickly (see the toy sketch below). They call any AGI smarter than a human could ever be an ASI (Artificial Super Intelligence).
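Here's a minimal sketch of that loop, with completely made-up numbers: assume each generation doubles capability and, being smarter, halves the time needed to build its successor.

```python
# Toy model of recursive self-improvement. All numbers are invented,
# purely illustrative: each AGI generation is twice as capable as the
# last and builds its successor in half the time.

capability = 1.0    # AGI1, pegged at roughly human level
build_time = 10.0   # years for AGI1 to produce AGI2
elapsed = 0.0

for gen in range(2, 10):   # build AGI2 through AGI9
    elapsed += build_time
    capability *= 2.0
    print(f"AGI{gen}: {capability:6.1f}x human level, year {elapsed:.3f}")
    build_time /= 2.0      # smarter builder -> faster next build
```

Because the build times form a geometric series (10 + 5 + 2.5 + ...), even infinitely many generations finish before year 20 -- that's the toy-model version of the asymptote from the first paragraph, and it's why humans fall behind so fast: almost all of the capability gain lands in a sliver of calendar time at the end.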