It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs
…when LLMs can’t keep track of what’s being talked about, and their main risks are covering the internet with slop and propaganda, and contributing to climate change. Both of which are more about how we use LLMs.
Right, but reliance on it is a way to destroy the world in the dumbest way. I don’t mean in the robot-apocalypse way, but the collapse of most societies. Without reliable information, nothing can get done. If shitty LLMs get put into everything, there’s no government, no travel, no grid/infrastructure, and logistics of every kind are gone.
While it’s fun to think about living in a small, self-sufficient community, we are not prepared for that and certainly not at this pace.
The risk of LLMs isn’t in what they might do. They’re not smart enough to find ways to harm us. The risk stems from what stupid people will let them do.
If you put a bunch of nuclear buttons in front of a child/monkey/dog/whatever, then it can destroy the world. That seems to be where the LLM problem is heading. People are using it to do things that it can’t, and trusting it because AI has been hyped so much throughout our past.
LLMs are already deleting whole production databases because “stupid” people are convinced they can vibe code everything.
Even programmers I (used to) respect are getting convinced LLMs are “essential”. 😞
They are useful to replace stackoverflow searches.
I’ve not found them useful even for that. I often just get “lied to” about any technical or tricky issue.
They are just text generators. Even the dumbest stack overflow answers show more coherence. (Tho, they are certainly wrong in other ways.)
The risk is worth it for bideo games
xD
SamanthA
Do you remember when we were all wanting to be careful with AI, and not just proliferate the thing beyond any control?
It was only a few years ago, but pepperidge farm remembers
xD
Made me think of an article I read not so long ago, really interesting!
oooo
I’m sure they’ll have a good reason. If not, I have several to suggest.
To me, there is no risk. Destroying the world is the goal. Humans had a very bad run on this planet. Destroy. Erase. Rebuild.
Cringe take
Problem is, there probably isn’t any rebuilding after that, or at least not back to the level of technology we’re currently at. The main reason is that all of the easy-to-reach resources like metals and fossil fuels have already been used up. So if this doesn’t work out and another potentially intelligent species comes along, it’s going to have an even harder time than we did getting things started.
I mean, good. Intelligent life as it appears on Earth is horrific beyond words.
…I feel like you totally missed the point. Destroy all humans. Humans are now extinct. Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.
Humans make the mistake of thinking that they are the most important thing in existence, and that the world would end without them. This planet has survived countless extinctions of species in the past. It’ll survive just as well without us.
The planet would survive, sure, but why should we care that it does without us? Meaning is something we invented; without someone to assign it, an ecosystem is little more valuable than a rock.
animals experience meaning. they don’t like to die.