By that same logic, LLMs themselves (by now some AI bro has had to vibe-code something in there) & their trained datapoints (which came from stolen data anyway) should be public domain.
What revolutionary force can legislate and enforce this?? Pls!?
By that same logic, LLMs themselves (by now some AI bro has had to vibe-code something in there)
I’m guessing LLMs are still really, really bad at that kind of programming. The packaging around the LLM, sure.
& their trained datapoints
For legal purposes, it seems like the weights would be generated by the human-made training algorithm. I have no idea if that’s copyrightable under US law. The standard approach seems to be to keep them a trade secret and pretend there’s no espionage, though.
Yes, totally, but OP says a small bit affects “possibly the whole project”, so I wanted to point out that this probably includes AIs, Windows, etc. too.