I’m not fully up to speed on Waymo or whether they’ve ever released remote-assistance-per-mile details, but when Cruise went through that shitstorm a year or two ago, it came out that the cars were asking for help every few miles.
Cruise was essentially all smoke and mirrors.
Interesting. Stuff like this makes me suspicious of the current LLM hype. I know it’s not necessarily language models per se being used by these vehicles, but still. If we were really on the cusp of AGI, I’d expect us to have at least cracked autonomous driving by now.
Ya, I don’t buy the hype around AGI. Like, a Waymo drove into a telephone pole because of something they had to fix in their code. I’m not doubting there’s AI involved, neural nets, machine learning, whatever, but this isn’t an AGI-level development. Nor do I think they need AGI to do this.
I’m not convinced this LLM stuff can ever lead to AGI either. I think it can do some pretty impressive things, with some very real drawbacks/caveats, and there’s definitely room to keep improving these models, but the whole architecture is flawed if you want to build an AGI.
Yeah, I guess there’s a lot of interesting stuff we can do with AI without necessarily achieving AGI. What about programming? Even if we don’t get AGI soon, do you still think LLMs will snatch up a sizeable chunk of programming jobs?