Most of the time this argument is fielded, including this one, it is formulated as an appeal to a general moral principle. I don't see what this principle is supposed to be: as my analogy shows, there is clearly no general moral principle against learning from copyrighted material and people's hard work without their explicit permission. The narrower interpretation, in which the claimed principle is that a machine must not learn from copyrighted material (...), is also implausible: since we have no real history of machines learning from copyrighted material in any way that is recognizable as learning, it stands to reason that a principle addressing that scenario cannot yet have become general.
The appeal is thus to a completely novel principle that you have come up with for yourself; and it seems that rather than presenting arguments for why others should adopt this principle, you are trying to present it in such a way that someone not paying close attention would be fooled into believing that it is common sense and widely accepted. An analogy with the classic "you wouldn't download a car" comes to mind.
Humans can indeed be taken apart piece by piece and put back together again. We just don't have that level of technology yet. There's nothing physically stopping it from happening though.
You can melt down a lathe and it is quite hard to reassemble it, and even if you did reform the entire thing, people would doubt whether it is the same lathe.
Humans have had parts removed and reattached. With transplants, components have been replaced entirely. There is a point at which a machine is so thoroughly destroyed that it is impossible to reconstruct without getting into ship of Theseus issues. That point is different for different things.
Sometimes debating the issue with AI fans is like arguing with children. They come up with all sorts of what they think are "clever comebacks," but really all they do is a reductio ad absurdum. It only proves that they are indeed children and fail to understand the topic altogether. In such scenarios it is best to leave them be.
Hmm. I don't think so. Whether humans are machines or not is really a matter of faith, not dictionary definitions, I would think.
It was a pithy one-liner about categories in response to a pithy one-liner about categories.
But I'd say the underlying question I'm trying to ask is philosophical: what property do humans have and machines lack that makes the former's learning from copyrighted works acceptable, and the latter's unacceptable? (eastof suggested a property below.)