And by the same logic, training it only on data that you own should also be harmless.
And yet, I usually see all generative AI treated the same way in these discussions and in how the tech is handled. There is no differentiation at this point for legally unambiguous or ethically "clean" solutions. It is AI or not AI. And that is why I keep lamenting that arguments against the tech should be solid and future-proof instead of shortsighted and emotional.
With the current/initial filters provided, it is also only AI yes or no, with no practical information about whether it applies to the content or just the code (yet?). Unlike artists, developers embraced the tech, or at least that is my impression. It is developers themselves who create code with the tech. But with art, it is usually not artists using it to create more art, but people who could not draw at all using it to create images. Maybe that is because in software it is a common concept to reuse others' code from templates and examples, and of course your own previous work, and adapt it to new situations. In the visual arts, that is frowned upon as "tracing".