I read an article the other day in IEEE Spectrum about a study by a group called METR (Model Evaluation & Threat Research, a non-profit research institute in Berkeley that is mostly concerned with the dangers of AI and is staffed by ex-OpenAI employees, AFAIK). According to the study, LLMs have been improving at an exponential rate since 2019, and the current doubling cycle seems to be seven months: the measurable performance of current models doubles every seven months. This vibes with my fully subjective observations. Self-improvement loops in AI are also no longer the stuff of scifi stories; they're a real thing you can track through papers and actual real-life results (Nvidia's Hopper architecture, Google's AutoML). AI is used to improve AI. The paper extrapolates that by 2030, an AI could complete a software-based task that would take a skilled person a month (at 40 hours of work a week) in less than a day, at a reliability of 50%. That includes creative tasks like writing a "decent" novel, programming tasks like writing software, and business tasks like managing the logistics of a company. It would of course also include tasks like improving its own software stack.
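Just to make that extrapolation concrete, here's a quick back-of-the-envelope sketch. The starting horizon (roughly a one-hour task at 50% reliability in early 2025) and the clean seven-month doubling time are my own illustrative assumptions, not exact figures from the article:

```python
# Back-of-the-envelope extrapolation of the METR-style doubling trend.
# Assumed for illustration: ~1-hour task horizon in early 2025 at 50%
# reliability, and a constant 7-month doubling time.

def horizon_hours(months_elapsed, start_hours=1.0, doubling_months=7.0):
    """Task horizon after months_elapsed, under exponential doubling."""
    return start_hours * 2 ** (months_elapsed / doubling_months)

# Early 2025 to 2030 is roughly 60 months.
h = horizon_hours(60)
month_of_work = 4 * 40  # four 40-hour weeks
print(f"extrapolated horizon: {h:.0f} h; a work month is {month_of_work} h")
```

Sixty months is about 8.6 doublings, which lands in the ballpark of 380 hours, comfortably past a 160-hour work month; the trend, if it holds, gets you there even from a modest starting point.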
While the paper admits that this development could eventually be bottlenecked by hardware capability, it might just as well shrink the interval further, through AI-assisted hardware development or architectural efficiency gains of the kind we've already seen, plus ones that can't be guessed at yet. In conclusion, if these observable premises hold, we might be less than a decade away from a fast takeoff/runaway process set off by a "seed AI" (a relatively weak AI that is nonetheless capable enough to kick off recursive self-improvement), leading to an intelligence explosion.
If you watch the process up close, like I have for the last ten years, the acceleration is really fascinating to see: what started as a brisk walk has developed into a full-blown sprint in, I'd say, the last two years. I observed a similar thing in the late 80s with home computers. Back then, also, nobody cared (lol), and there were always people (laymen and "experts") around who'd tell you with perfect sincerity and confidence that computers were a fad that would never amount to anything for the average Joe. This occupies my mind quite a bit every few months, because it'll change absolutely everything, again. Ever been in an accident and felt time slow down as your brain kicks into overdrive, seconds stretching into hours? It feels a little bit like that, which the computer revolution never did, at all.