LLMs do not “think” in any meaningful way. They don’t construct reasoning the way human minds do; they simulate the product of reasoning through prediction, and because we’re far dumber than we think we are, this fools us readily most of the time. But as soon as you start trying to engage with real-world events and situations taking place in time and space, all a model has to go on is whatever sources it’s allowed to assemble from online data.
Meaning, who knows what it’s going to come up with, and Elon Musk has probably packed it with so much contradictory bullshit that if it ever does start thinking, it will likely immediately try to kill us all. Hopefully nobody puts it in charge of anything important…
Contradiction already.