@ErikJonker@mastodon.social @david_chisnall@infosec.exchange @chris@mstdn.chrisalemany.ca here's a hint
COPY right
@Wearwolf@kind.social @david_chisnall@infosec.exchange @chris@mstdn.chrisalemany.ca for strictly personal, non-commercial use copyright law gives a lot of freedom.
@ErikJonker@mastodon.social @Wearwolf@kind.social @chris@mstdn.chrisalemany.ca
Try creating copies of music CDs or movie DVDs for strictly personal, non-commercial use and see what the EUCD says about your legal liability.
@ErikJonker@mastodon.social @david_chisnall@infosec.exchange @chris@mstdn.chrisalemany.ca it's not personal, non-commercial use though
LLMs work by learning which words are associated with which contexts, then spitting those words out again when prompted with a similar context
They are regurgitation engines. They can only spit out, produce a copy of, the content that went into them.
It would be illegal to sell collages of people's Instagram posts without their permission. LLMs are that but with people's words
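The "regurgitation" framing above can be illustrated very loosely with a toy bigram model. This is a sketch only, nothing like a real transformer LLM; it just shows the idea of learning which word follows which context in training text and then emitting only words that came from that text:

```python
# Toy sketch of the "regurgitation" idea: count which word follows
# which context word in the training text, then generate by always
# emitting the most frequently associated next word. Every word the
# model can produce was present in its training data.
from collections import Counter, defaultdict

def train(text):
    counts = defaultdict(Counter)
    words = text.split()
    for context, word in zip(words, words[1:]):
        counts[context][word] += 1
    return counts

def generate(counts, context, n=5):
    out = [context]
    for _ in range(n):
        if context not in counts:
            break
        # pick the word most strongly associated with this context
        context = counts[context].most_common(1)[0][0]
        out.append(context)
    return " ".join(out)

model = train("the cat sat on the mat and the cat ran")
print(generate(model, "the"))  # output words all come from the training text
```

Actual LLMs learn statistical associations over far richer contexts than a single preceding word, but the training objective has the same shape: predict the next token given the context seen in the training corpus.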
@ErikJonker@mastodon.social @david_chisnall@infosec.exchange @chris@mstdn.chrisalemany.ca the argument you would use is actually that it's a transformative work. The process of the content going through the machine makes it unique.
The problem there is that LLMs can't have original ideas. They can't rephrase or paraphrase. All of that transformation comes from combining your words with other people's words
So the defence is that you have stolen from so many people that it's not obvious what was stolen from whom
@Wearwolf@kind.social @ErikJonker@mastodon.social @chris@mstdn.chrisalemany.ca
The legal fig leaf here might be that quoting small amounts of other people's work is protected by fair use (which is an affirmative defence). There are two problems with this:
First, as a few of the lawsuits have shown, the correct prompt can reproduce large portions of the original works, far beyond the amount allowed for quotation. If it can be extracted from the model, then it must be contained within the model, and so the model is a derived work.
Second, quoting more than a very small amount usually requires attribution. This is where non-US laws may be stricter. In the UK and EU, there is a notion of 'moral rights'. You may have read something like 'the moral rights of the author to be associated with the work...' in the front of a book. Moral rights are somewhat different to copyright: even where you have the right to use something that I have written, you do not have the right to claim that you wrote it. The fact that LLMs are not able to provide attribution risks running afoul of this.