Ok there’s a whole lot of wtf going on here.
AI codebots in the cloud doing your code for you, cool, I guess.
So you need to watch them? And presumably intervene if necessary? Ok.
So then:
They decided that they’d stream a video of the AI codebots doing their thing.
At 40Mbps per stream.
For “enterprise use”.
Where presumably they want lots of users.
And then they didn’t know about locked-down enterprise internet and had to engineer a fallback to JPEG for when things aren’t great for them. Newsflash - with streaming video peaking at 40Mbps per user, things will never be great for your product in the real world.
How, in any way, does this scale to anything approaching success? Their back end now has to have the compute power to encode and serve up gigabits of streaming video for anything more than ~50 concurrent users, let alone the compute usage of the actual “useful” bit, the AI codebots.
For, say, 5 users out of a site of 200, IT departments will now see hundreds of megabits of streaming traffic - and if they’re proactive, they will choke those endpoints to a crawl so that their pathetic uplink has a chance to serve the other 195 users.
All of this for a system that is fundamentally working on maybe 5kB of visible Unicode text at any particular moment.
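The back-of-the-envelope math above can be sketched in a few lines (a minimal illustration using only the figures quoted in this comment - the 40Mbps peak rate and the user counts are the commenter's claims, not measurements):

```python
# Bandwidth arithmetic for the numbers quoted above (hypothetical figures
# taken from the comment: 40 Mbps peak per stream, 50 and 5 concurrent users).
STREAM_MBPS = 40  # claimed peak per-user video bitrate

def aggregate_mbps(concurrent_users: int, per_stream_mbps: float = STREAM_MBPS) -> float:
    """Total streaming bandwidth the backend (or an office uplink) must carry."""
    return concurrent_users * per_stream_mbps

# ~50 concurrent users already means ~2 Gbps of egress from the backend:
print(aggregate_mbps(50))  # 2000 Mbps = 2 Gbps
# ...and just 5 users behind one office uplink pull 200 Mbps:
print(aggregate_mbps(5))   # 200 Mbps
```

Compare that with the ~5kB of text actually being worked on: 5kB is 40 kilobits, so a single second of one 40Mbps stream carries roughly a thousand times more data than the entire visible payload.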
You are right but this comment gives me iPod introduction feelings. That company will be huge in some years.
Quit reading at:
But your comment made me go back and look out of disbelief. How does a person get this far down a rabbit hole?
I don’t know. Software engineering is tangential to my field but I have to wonder, is software efficiency even a consideration these days?
It seems that maybe a week of just solid thinking about what they have and what they need - WITHOUT touching a keyboard - could have put them in a better position. But move fast and break things doesn’t seem to accommodate that kind of approach.
What a glorious future AI is heralding.
AI psychosis is a real thing, and it’s apparently a lot easier to fall into these rabbit holes than most people think (unless, I suspect, like me, you have a thick foundation of rock-solid cynicism that the AI simply will never penetrate). This is probably another interesting example of it.
Do we know each other or something :).
Honestly great comment, couldn’t agree more.