True, not only is the handicapping already over the top with most models, but even mentioning it here gets you flagged; see my honest comment at the end (-1). Amazing how people are censoring themselves now.
Conservatives (in the US meaning of the word) are exclusively the ones clutching their pearls about pornography. I don't know how you can be so confused on the subject. Three red states have even started the process of banning pornographic websites.
> OpenAI claims it focused a lot of work on DALL-E 3 in creating robust safety measures to prevent the creation of lewd or potentially hateful images.
The safety measures at OpenAI are pretty much the cliché progressive hyper-moderation of 2023. You see the same approach on Reddit and elsewhere, and the left-leaning communities are consistently the most locked down to a particular Overton window. This is not a very controversial take.
It goes well beyond just pornography and explicitly violent stuff.
Go ahead and try to sell a product to enterprises where the creation of violent and pornographic images is possible. I promise you that you'll get zero sales. This isn't censorship nor is it about "progressive" values. This is capitalism.
Again, this goes well beyond pornography and violence. It was already very aggressive, and now they claim to have invested a ton more work in making it even more "safe".
Please provide more substance when saying something like this. Kindly explain why you feel it is flawed instead of flatly rejecting it without elaboration.
The comment is not forthcoming, but the point is valid: the obvious reason they are afraid of generating adult content is the risk that people will use it to, e.g., make porn of people without their consent, CP, shock images, or anything else that would harm OpenAI's reputation and get them sued or regulated into the ground. Not because Sam Altman is some kind of evangelical moralizer.
And so continues the trend of "progressive" AI companies deliberately handicapping their models for no real good reason.