(+1)

AI feeds on hand-created art in the most literal sense.  Without hand-created art to train on, there would be no AI art.

The problem with most AI models is that they are obviously infringing on copyright by being trained on unlicensed copyrighted materials.  This is likely to bite the companies that produce these models, and those that use these models, in the ass in the near future.

I have no problem with AI algorithms so long as you train them exclusively on your own art.  I have a problem with copyright theft, where people take other people's art, remix it through an AI algorithm, and claim to own the copyright on the result.

(+3)
Without hand-created art to train on, there would be no AI art.

One particular method would not exist. There was AI art before this method; "AI" is a very poor choice of words, since it would include all sorts of machine-generated things.

If you imagine a world where training a model on human-made art were illegal, there would still be other ways to make a machine create images. So there would still be "AI art", just not the thing we currently understand by that term.

they are obviously infringing

It is not obvious. If it were, it would be forbidden from day one, everywhere.

When an art student looks at art works, gets inspiration, even imitates techniques, and then creates a new work, how can that artist claim copyright? The basis of that work was obviously taken from other works, just remixed and with some added random bits to create something new.

The only difference is that the remixing was done inside a human mind.

Maybe read this one here https://en.wikipedia.org/wiki/Derivative_work to understand that it is anything but obvious. Also, copyright is the right to copy. An AI work that used training data is not a copy.

You can't copyright the knowledge of how things look. If you teach a black box how things look, and it has the ability to understand a prompt and use that knowledge to create something, that poses a lot of interesting questions with a lot of non-obvious and probably conflicting answers.

(+2)
It is not obvious. If it were, it would be forbidden from day one, everywhere.

You are putting way too much faith in governing bodies. In reality, governments often need significant pressure before ruling on something "obvious".

(+1)

Oh, I did mean that literally. It just is not obvious how to handle LLM systems and their capabilities. It is a new concept. Anyone claiming it is obvious just has their own opinion and tries to convince other people with non-arguments. It is a type of fallacy. Probably this one: https://en.wikipedia.org/wiki/Argument_from_incredulity

But we need real arguments and reasoning to deal with LLMs in the future. The tech won't go away. And arguments of an ethical nature can be overcome, and what then? Arguments that machines take human jobs? Those never worked.

(2 edits) (+1)

There is a clear legal difference between human brains and computers.  There has to be in order for copyright to work, because every time I consume a copyrighted work, I am creating a copy in my brain.  I can legally read a short poem and have a perfect copy of that in my brain.  If I later write down the poem that's now in my brain, that's when I'm breaking copyright - not before.

There is a legal technique called clean-room design to create a functional clone of something without breaking copyright law.  It involves two teams of engineers working together.  The first team examines the original and writes a specification.  The second team creates the clone according to the specification without looking at the original.  In order for this technique to work, the following all have to be true:

  • It is legal for the first team to examine the original.
  • Examining the original taints the first team.  Because they now carry around a copy of the original in their brain, any clone they create will now legally be a derivative work.
  • The specification that the first team creates is not itself tainted.

Any argument that it should be legal to do something on a computer if it's legal to do the same thing in a human brain is either an argument against copyright itself or an argument in favor of government thought control.  I can sort of get behind the former, but our governments and courts have decided differently.  The latter is completely unconscionable.

(+1)

My argument was that it was not obvious.

And your clean-room design also has issues that are not obvious. You might not break "copyright" under certain circumstances, but you can break trademarks and other legal barriers, like patents. Also, this is just to circumvent having used a "copy" - which would likewise be circumvented by using an LLM.

So what would a clean-room approach look like for an image or a work of art? Oh, and no, you do not break copyright by writing down a poem you remembered. You break copyright by publishing it, because you distribute a "copy" and you did not have the "right to copy". But this interpretation differs worldwide.

Using a work to create another work without permission is not a copyright breach unless you use exact portions of that work. It might be a licensing issue or other legal matter, but not copyright. The clean-room approach was, for example, used to recreate the functionality of software code, because, of course, there were afterwards accusations that portions of the code had been used.

So in other words: as I said, you can't copyright the knowledge of how something looks - or what it does, in the case of software. You might get a patent, if you have a broken patent law.

Also, an LLM basically uses something like a hash value of a work. Some earlier versions might have had portions of original works inside the database, but when you read about the complaints against those systems, they were about the content of those databases and decidedly not about the output of the LLM.

I stand by my opinion. It is not obvious how to handle LLM generative AI systems - neither legally, nor in society, nor in art. Or games. The emerging consensus among players seems to be to prefer human-made art. The emerging consensus among software developers seems to be: yeah, another tool to play with that can churn out templates in a hurry and find semantic errors in a programming language.