Is the AI doing anything that isn’t already allowed for humans? The thing is, generative AI doesn’t copy someone’s art. It’s more akin to learning from someone’s art and creating your own art with that influence. Given that we want to continue allowing humans access to art for learning, what’s the logical difference when an AI does the same?
Did this already play out at Reddit? AI was one of the reasons I left, but I believe it’s a different scenario. I freely contributed my content to Reddit for the purposes of building an interactive community, but they changed the terms without my consent. I did NOT contribute my content so they could make money selling it for AI training.
The only logical distinction I see is that AIs aren’t human: an exception for humans does not apply to non-humans, even if the activity is similar.
Is the AI doing anything that isn’t already allowed for humans? The thing is, generative AI doesn’t copy someone’s art. It’s more akin to learning from someone’s art and creating your own art with that influence. Given that we want to continue allowing humans access to art for learning, what’s the logical difference when an AI does the same?
AI stans always say stuff like this, but it doesn’t make sense to me at all.
AI does not learn the same way that a human does: it has no senses of its own with which to observe the world or art, it has no lived experiences, it has no agency, preferences or subjectivity, and it has no real intelligence with which to interpret or understand the work that it is copying from. AI is simply a matrix of weights that has arbitrary data superimposed on it by people and companies.
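To make “matrix of weights” concrete, here is a minimal sketch (a toy numpy example, not any real generative model) of what that means: the entire “model” is a grid of numbers, and training just nudges those numbers until they reproduce whatever data was fed in.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training data": inputs X and the outputs Y we want reproduced.
X = rng.normal(size=(100, 4))
target_W = rng.normal(size=(4, 2))
Y = X @ target_W

# The entire "model" is this 4x2 matrix of numbers, nothing more.
W = np.zeros((4, 2))

# Training: repeatedly nudge the numbers to reduce the prediction error,
# i.e. superimpose the data onto the weights.
for step in range(500):
    error = X @ W - Y
    gradient = X.T @ error / len(X)
    W -= 0.1 * gradient

print(W)         # ends up approximating whatever the data encodes
print(target_W)
```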
Are you an artist or a creative person?
If you are then you must know that the things you create are certainly indirectly influenced by SOME of the things that you have experienced (be it walking around on a sunny day, your favorite scene from your favorite movie, the lyrics of a song, etc.), AS WELL AS your own unique and creative persona, your own ideas, your own philosophy, and your own personal development.
Look at how an artist creates a painting and compare it to how generative AI creates a painting. Similarly, look at how artists train and learn their craft and compare it to how generative AI models are trained. It’s an apples-to-oranges comparison. Outside of the marketing labels of “artificial intelligence” and “machine learning”, it’s nothing like real intelligence or learning at all.
(And that’s still ignoring the obvious corporate element and the four factors of fair use analysis (US law, not UK, mind you). For example, the potential market effect of an automated system that uses people’s artwork to compete directly against them.)
Outside of the marketing labels of “artificial intelligence” and “machine learning”, it’s nothing like real intelligence or learning at all.
Generative AI uses artificial neural networks, which are based on how we understand brains to connect information (biological neural networks). You’re right that they have no self-generated input like humans do, but the way they make connections between pieces of information is very similar to how humans do. It doesn’t really matter that they don’t have their own experiences, because they are not trying to be humans; they are trying to be as flexible a ‘mind’ as possible.
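For anyone unfamiliar, here is a minimal sketch of what an artificial neural network layer looks like (a toy numpy example, nowhere near the scale or architecture of real generative models): each unit just sums weighted ‘connections’ from the previous layer, loosely analogous to how a neuron integrates signals from the neurons connected to it.

```python
import numpy as np

rng = np.random.default_rng(1)

def layer(inputs, weights, biases):
    # Every output unit is connected to every input unit by a weight;
    # "learning" means adjusting these connection strengths from data.
    return np.maximum(0.0, inputs @ weights + biases)  # ReLU activation

x = rng.normal(size=(1, 8))                      # some input representation
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)  # first set of connections
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)   # second set of connections

hidden = layer(x, W1, b1)   # intermediate "associations"
output = hidden @ W2 + b2   # final result built from weighted connections
print(output)
```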
Are you an artist or a creative person?
I see anti-AI people say this stuff all the time too, because it’s a convenient excuse to disregard an opposing opinion as ‘doesn’t know art’, failing to realize or respect that most people have some kind of creative spark and outlet. And I know it wasn’t aimed at me, but before you think I’m dodging the question: I’m a creative working professionally with artists and designers.
Professional creative people and artists use AI too. A lot. Probably more than laypeople, because using it well and combining it with other interesting ideas requires a creative and inventive mind. There’s a reason AI is making its way all over media: into movies, into games, into books. And I don’t mean as AI slop, but as well-implemented, guided AI usage.
I could ask you as well whether you’ve ever studied programming or psychology, as both would make you better able to understand the similarities between artificial neural networks and biological neural networks. But I don’t need a box to put you in to disregard you; the substance of your argument simply fails to convince me.
At the end of the day, it does matter that humans have their own experiences to mix in. But AI can also store many, many more influences than a human brain can. That effectively means that for everything it makes, there is less of any one specific artist’s work in the result.
For example, the potential market effect of an automated system that uses people’s artwork to compete directly against them.
Fair use considerations only come into play when copyrighted material is directly reused; they do not apply to works that are substantially different from their influences. If you read Harry Potter and write your own novel about wizards, you do not have to credit or pay royalties to JK Rowling, so long as your book isn’t substantially similar. Without additional laws prohibiting it, AI is no different. To sue someone for infringement, you typically have to prove that the work infringes on yours, and so far there have not been any successful cases making that argument.
Most negative externalities from AI come from capitalism: greedy bosses thinking they can replace true human talent with a machine, plagiarists who use it as a convenient tool to harass specific artists, scammers who use it to scam people. But around that exists an entire ecosystem of people just using it for what it should be used for: more and more creativity.
You picked the wrong thread for a nuanced question on a controversial topic.
But it seems the UK indeed already has laws for this, if the article is to be believed: AI companies are not currently allowed to train on copyrighted material there. As far as I know, in some other jurisdictions a normal person would absolutely be allowed to pull together a bunch of publicly available information, learn from it, and decide to make something new based on the objective information found within. Generally, that’s the rationale AI companies used as well, since there have been landmark rulings in the past, with wide acceptance, that computers analyzing copyrighted information is not copyright infringement, such as the case against Google for indexing copyrighted material in its search results. But perhaps an equivalent ruling was never adopted in the UK (which does seem strange, as Google does operate there). Laws are messy, though, perhaps there is an exception somewhere, and I’m certainly not an expert on UK law.
But sadly, people don’t really come into this thread to discuss the actual details; they just see a headline that evokes a feeling of “AI bad”, and so you coming in here with a reasonable question makes you a target. I fully expect to be downvoted as well.
Oh are we giving AI the same rights as humans now? On what grounds?

I never claimed that in this case. As I said in my response: there have been lawsuits won establishing that machines are allowed to index and analyze copyrighted material without infringing those rights, so long as they only extract objective information, which is what AI typically extracts. I’m not a lawyer, and your jurisdiction may differ, but this page has a good overview: https://blog.apify.com/is-web-scraping-legal/
EDIT: In its US section, that page mentions the case I was referring to: Authors Guild v. Google.
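To illustrate what I mean by extracting “objective information” (a hypothetical toy example, not how any actual scraper or AI pipeline works): the idea is that only derived facts about a text are kept, not the text itself.

```python
from collections import Counter

# Hypothetical stand-in for a scraped page; in practice this text would be fetched.
document = "The quick brown fox jumps over the lazy dog. The dog sleeps."

words = document.lower().replace(".", "").split()
facts = {
    "word_count": len(words),          # objective facts about the text...
    "unique_words": len(set(words)),
    "most_common": Counter(words).most_common(3),
}

print(facts)  # ...while the original expression itself is not reproduced
```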
You might not remember, but decades ago Microsoft was almost split in two. Then it came to pass that George Bush “won” the election, and the breakup never happened; the case was settled instead.

In the US justice system, money talks.
Oh, I agree money talks in the US justice system, but as the page shows, these laws also exist elsewhere, such as in the EU. And even if you or I don’t agree with them, they are still the case law that determines the legality of these things. For me that also happens to align with my ethical stance, but probably not with yours.
I know they exist outside the US. I’m European. But too many terms of use say that, in case of problems, disputes go to courts in the US, and they will do everything in their power to do the same in this case.
If AI companies can pirate, so can individuals.
You know, I am somewhat of a large language model myself.
At this rate we will get access to more rights if we can figure out a way to legally classify ourselves as AI.