Neither e/acc nor the doomers are right. Both are making the same logical error: assuming that what we're calling "AI" today is, in fact, worthy of being labeled as legitimate intelligence and not a clever parlour trick.
Instead, the most likely outcome will be a further enshittification of the world by people and companies trying to "AI all the things" (a 2020s equivalent of "everybody needs an app").
And before you balk: look around. It's already happening.
It is happening. But I'm more concerned by the enshittification of life for everybody working in a field that just heard the starting pistol for the race to the bottom. Dollar-sign-eyed executives will happily take that shitty version of their former product and "employ" gig workers making piecework wages to sorta clean it up, without having to give anyone job security or health insurance or anything. It's a turbocharger for the upper crust's money vacuum.
Not sure. Something doesn't need a solution to be a problem, and pointing it out in a crowd of people gleefully patting themselves on the back about making the problem worse every day is a worthwhile pursuit.
It's sacrilegious to even imply that prioritizing the profit machine at many other people's expense isn't a requirement of any plan. But maybe the policy experts can figure out how our society could treat people like they're inherently worthwhile, so that when industry dramatically improves efficiency, we won't just casually toss the affected people out the window like a bag of moldy peaches and tell them it's their own fault. The only people who got a congressional hearing were the people running giant AI projects, begging for regulatory capture to prevent boogeyman problems. They did not hear from someone with kids to feed and stage 4 cancer who is about to lose their family health insurance because ChatGPT tanked the market for their labor overnight, and the only work they can get is gig work that makes them just enough money to not qualify for most government subsidies. Despite what many private health insurance fans say about health care being free for the poor, hospitals are only required to stabilize you. They'll stop you from bleeding out, but they sure as fuck aren't giving you a free supply of chemotherapy, heart medication, colostomy bags, dialysis, or insulin.
I'm not proposing one, but am skeptical that this reality will be seriously considered at any point before a large-scale violent uprising is on the table.
If you're blessed enough to do it (or can sweat it out): build companies that reject that line of thinking and hire those people.
The good news is that the markets are already proven, so, while it's not easy, it boils down to building the same products while restoring some semblance of their original vision (just under a new brand name).
When an AI can successfully navigate everyday human tasks without needing hand-holding: change my oil, withdraw money from a buggy ATM, change a diaper, etc.
IMHO: considering the declining quality standards (and individual psychology) of Western civilization, never.
> IMHO: considering the declining quality standards (and individual psychology) of Western civilization, never.
What does that have to do with AI progress? AI, tech, etc. are all increasing continuously, getting better year after year.
What you're qualifying as AGI will require
> Human level general intelligence
> Human level embodiment of the intelligence
> A robot with human level dexterity
These are all tech trends into which billions of dollars are being poured, and we are getting closer every day. To think that generally declining quality standards in stagnating areas of Western civilization mean "no AGI ever" is a misguided extrapolation.
> What does that have to do with AI progress? AI, tech, etc. are all increasing continuously, getting better year after year.
Who do you think is building the AI? Even more importantly, if people become overly reliant on AI (there are already early hints of this happening), human competency will decline. There's a tipping point on that curve where AI plateaus indefinitely because no one is competent enough to work on it anymore. The speed we're traveling along that curve is far faster than our progress toward an organic intelligence.
> What you're qualifying as AGI will require [...]
> Which are all tech trends towards which billions of dollars are being poured and we are getting closer every day.
The amount of money you pour into a problem is meaningless. How it's solved (and why) is far more important. Resources !== solutions. If that heuristic were true, the world would already be in a far better place.