Their definition and yours are not exactly the same. If you read what they have to say, you'll see that the issues are broader than whatever you personally call abuse, and they include exploitation and the distribution of images that victims find harmful. Are you a victim of abuse or exploitation? Who are you to determine what a victim finds abusive or exploitative?
I'm not saying I agree with this situation, just that I can see how it happened and why Google won't do anything in this case.
I didn't give a definition for what constitutes abuse.
My point is that the software is not working as intended. The intent is to stop child abuse, but the software they built does not identify child abuse; it identifies what it believes to be naked (or semi-naked) minors. Those are two different things.
A photo you take of your kid for the doctor to see in order to treat them is not CSAM. Neither is a photo of your naked baby.
But those same photos, if stolen from you or taken by others for different (less wholesome) reasons, would be. And, most importantly, that context is not to be found in the photo itself.