13.83241
EA discourse crossed my path again, so I ended up reading a bit more of what Émile Torres has to say. I hadn’t quite realized how much they blame AI doomers for the current AI race/bubble/mania.
The involvement and influence of AI doomers that they cite seems accurate. But it’s like they forgot to imagine a world without AI doomers, or a world where AI doomers decided to just stay in their weird corner of the internet rather than try to avert the development of an AGI that they believe will kill everyone.
The idea of AGI was widespread decades before AI doomers existed, and was widely portrayed as a good thing, in a naïve “technological progress is good” sort of way. Eliezer Yudkowsky started out believing that AGI would be such a boon, and founded the Singularity Institute with the mission of achieving it. Unless someone developed the idea solely from watching Terminator (which is maybe possible? but serious contemplation of AI doomsday scenarios looks nothing like nice theatrical ones where the humans survive), anyone else who developed and propagated the idea of AGI causing extreme harm would also have started out believing that AGI would cause extreme good.
Which suggests that the world where AI doomerism sat in the corner (or didn’t exist at all) is a world where the AGI race is substantially made up of AI boomers instead.
Now, there are lots of possibilities for whether we would have an AGI race in 2024 in those worlds. A lot of effort was expended to avoid an AGI race - there had been a clear push to keep things academic. Then OpenAI created ChatGPT.
I’m pretty distant from this, but it seems like the mistake was that, when faced with the dilemma of needing access to huge amounts of computing power to work on the current trend in models, OpenAI chose to restructure and seek investment rather than figure out what it could do with the computing power within its budget.
In a world where AI doomerism had no influence in the leading AI research organizations, there would have been no such dilemma. More funding would have always been better.