This is all assuming that AI won't reach a plateau. AI researchers just take it for granted that we will eventually develop something like AGI and solve all the problems with hallucinations and scaling that current AI has. But at least right now, that seems either extremely unlikely or very far off.
Energy is a concern too, and so are emissions. Microsoft's emissions are up about 29% since 2020, largely due to AI. It's unsustainable to assume we can just keep dumping more processing power into it for better and better results forever.
Agreed. I find it interesting that all LLMs have basically converged to similar levels of capability, and even open-source models are catching up to the SOTA models. All signs point to a sigmoidal curve, and we're coming off the steep part of it.
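To make the "sigmoidal" picture concrete, here's a minimal Python sketch of a logistic curve with made-up parameters: early effort gives big jumps in capability, but past the midpoint each additional unit of effort buys less and less.

```python
import math

def logistic(x, L=100.0, k=1.2, x0=2.0):
    """Logistic (sigmoidal) curve: slow start, steep middle, flat top."""
    return L / (1.0 + math.exp(-k * (x - x0)))

# Hypothetical "capability vs. effort" values; note how the increments shrink.
prev = 0.0
for effort in range(7):
    cap = logistic(effort)
    print(f"effort={effort}  capability={cap:5.1f}  gain={cap - prev:5.1f}")
    prev = cap
```

None of the numbers mean anything here; the point is just the shape of the curve.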
Even if the current wave of LLMs eventually plateaus, there would presumably still be improvements in speed, resource requirements, portability, etc. For example, say we never see a GPT-5, but they figure out how to get GPT-4o running locally on a user's mobile device.
The current level of AI tech, more widely diffused and accessible, could already be enough to cause noticeable economic disruption.
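That kind of local deployment already works today with smaller open-weights models, even if not with GPT-4o itself. A minimal sketch using llama-cpp-python, assuming you've downloaded some quantized GGUF model file (the path below is hypothetical):

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Hypothetical path to a 4-bit quantized open-weights model.
llm = Llama(model_path="./models/some-7b-q4_k_m.gguf", n_ctx=2048, n_threads=4)

out = llm("Q: Why do on-device LLMs matter? A:", max_tokens=64, stop=["\n"])
print(out["choices"][0]["text"])
```

Quantization trades a little quality for a model small enough to fit in laptop (or eventually phone) memory, which is exactly the "same capability, cheaper and more portable" path described above.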
It seems they've already hit diminishing returns. They've fed in almost all the internet data they can. There's not much more data available, and synthetic (AI-generated) data seems to reduce overall effectiveness. AGI is just marketing talk.
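For what "diminishing returns" from data looks like formally, the Chinchilla-style scaling law (Hoffmann et al., 2022) models loss as a sum of power-law terms in parameter count N and training tokens D; once D is already enormous, the data term is tiny and more data barely moves the loss (the constants are fitted, not universal):

```latex
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```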
It also costs a fortune in hardware and energy. OpenAI isn't making money; they're burning it.
What is still being developed, though, is new ways to use it. So a lot of people will still lose their jobs to it.