Thought this was an Onion article!
Hey plebs! I demand you work 50% more to develop AGI so that I can replace you with robots, fire all of you, and make myself a double-plus plutocrat! Also, I want to buy an island, small city, bunker, spaceship, and/or something.
AGI requires a few key components that no LLM is even close to.
First, it must be able to discern truth based on evidence, rather than guessing it. Can’t just throw more data at it, especially with the garbage being pumped out these days.
Second, it must ask questions in the pursuit of knowledge, especially when truth is ambiguous. Once that knowledge is found, it needs to improve itself, pruning outdated and erroneous information.
Third, it would need free will. And that’s the one it will never get, I hope. Free will is a necessary part of intelligent consciousness. I know there are some who argue it does not exist, but they’re wrong.
I don’t believe a single word of this bullshit.
That might speed it up, but that certainly is not a prerequisite.
I bet it would happen faster if this guy was fired.
He can fuck all the way off.
They warn us about AGI while simultaneously attempting to sell it to us.
Or just hire 50% more engineers? Or wait 50% longer?
With “hire more” you do run up against the “9 women can have a baby in 1 month” limit, but in this case it’s likely to help.
For how many years? Cuz y’all ain’t anywhere near AGI. You can’t even get generative AI to not suck compared to your competition in that market (which is a pretty low bar) lol
With all the rounds of layoffs they’ve had, their remaining employees would need to be quite stupid to give a shit what this disloyal piece of trash says.
AGI is not in reach. We need to stop this incessant parroting from tech companies. LLMs are stochastic parrots. They guess the next word. There’s no thought or reasoning. They don’t understand inputs. They mimic human speech. They’re not presenting anything meaningful.
I feel like I have found a lone voice of sanity in a jungle of brainless fanpeople sucking up the snake oil and pretending LLMs are AI. A simple control loop is closer to AI than a stochastic parrot, as you correctly put it.
There are at least three of us.
I am worried what happens when the bubble finally pops because shit always rolls downhill and most of us are at the bottom of the hill.
Not sure if we need that particular bubble to pop for us to be drowned in a sea of shit, looking at the state of the world right now :( But Silicon Valley seems to be at the core of this clusterfuck, as if all the villains come from there or flock there…
That undersells them slightly.
LLMs are powerful tools for generating text that looks like something. Need something rephrased in a different style? They’re good at that. Need something summarized? They can do that, too. Need a question answered? No can do.
LLMs can’t generate answers to questions. They can only generate text that looks like answers to questions. Often enough that answer is even correct, though usually suboptimal. But they’ll also happily generate complete bullshit answers, and to them there’s no difference from a real answer.
They’re text transformers marketed as general problem solvers because a) the market for text transformers isn’t that big and b) general problem solvers are what AI researchers have always been trying to create. They have their use cases, but certainly not ones worth the kind of spending they get.
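The “stochastic parrot” point above fits in a few lines of code. Here’s a toy bigram model (the corpus, names, and numbers are all made up for illustration): it picks each next word purely from counts of what followed that word before, with no notion of truth or meaning. An LLM does the same basic move, just with a neural net and vastly more context.

```python
import random

# Tiny made-up "training corpus".
corpus = "the cat sat on the mat the cat ate the fish".split()

# For each word, record every word that followed it in the corpus.
bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

def generate(start, length, seed=0):
    """Repeatedly pick a word that followed the current word in the
    corpus. Pure statistics -- no understanding of the text at all."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        candidates = bigrams.get(out[-1])
        if not candidates:  # dead end: word never had a successor
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

print(generate("the", 6))
```

The output always *looks* like the corpus, which is exactly the point: plausible-sounding continuations, nothing more.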
LLMs can now generate answers. Watch this:
Let’s work 20-hour weeks then. Who wants AGI other than war pigs and billionaires?
I mean, we could get a rogue AGI taking over the world. It might not even be worse lmao
War Pigs by Sabbath was about the future.
If it’s in reach working 60-hour weeks, it’s also in reach working 40-hour weeks, it will just take 50% longer. ;)
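A quick check of that ratio, assuming (generously) that AGI needs some fixed budget of person-hours — the total below is a made-up number, and the result doesn’t depend on it:

```python
# Hypothetical fixed amount of work, in person-hours (made-up number).
total_hours = 6000

weeks_at_60 = total_hours / 60  # calendar weeks at 60-hour weeks
weeks_at_40 = total_hours / 40  # calendar weeks at 40-hour weeks

# Dropping from 60-hour to 40-hour weeks adds 50% calendar time,
# since 60/40 = 1.5 -- whatever the made-up total is.
print(weeks_at_40 / weeks_at_60 - 1)  # 0.5
```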
Let’s be real, it’ll probably happen faster on 40-hour work weeks than 60.
Lots of evidence is starting to point to it being quicker with 32-hour weeks than 40.
50%
“Work 50% longer weeks so you can make something that’ll both make me richer AND cost you your jobs!” is not the motivational speech he thinks it is.
I don’t even know what AGI is, and I read the headline as “rich, disconnected-from-reality asshole”.
Turns out I was right.
A General ignoramus
Wait, is these AI boosters’ bragging about how close they are to building God the torture that Roko’s Basilisk is inflicting on us all?
Hahaha