How Stable Diffusion could end revenge porn

I’ve been tracking the trials and tribulations of a project called Unstable Diffusion — this is essentially an effort to take the Stable Diffusion code and train a model that can generate “erotic imagery.” In other words, porn, though the project’s backers are careful to position the effort as appealing to other than prurient interests.

While Unstable Diffusion keeps getting deplatformed and vilified as a future enabler of synthetic child pornography, it has recently occurred to me that there could be an unexpected upside to the proliferation of open-source AI porn generators: the end of revenge porn as an effective weapon of shame and humiliation.

Obligatory disclaimer: Porn is bad, child porn is monstrous, and synthetic child porn is also monstrous. Insofar as Unstable Diffusion increases the proliferation of porn, especially among underage people, it will be a net negative for the world.

With that throat clearing out of the way, I think a point is fast approaching — maybe a year or two out, at most — when any nudes that circulate of someone without their consent are assumed by default to be synthetic, not real. These NSFW image generation models will be so common and so easy for anyone to use that the flood of fake nudes will drown out the real ones. And when that happens — when synthetic nudes are so easy to generate and so ubiquitous that everyone assumes all nudes are fake — it will no longer be possible to threaten or humiliate someone by circulating nudes of them: everyone will assume the images are fake, and the circulator will have no way to prove they are real.

So I think schoolboys will soon be generating and trading synthetic nudes of girls they know. This will be an awful development because, as I said, porn is bad and child porn (even synthetic child porn) is monstrous. But at least the reputational threat from these ubiquitous images will be greatly reduced by virtue of their presumed falsehood. And that would be a teeny tiny silver lining in what is otherwise a very dark, very NSFW thundercloud that we’ll all be living under very soon.