I don't see much AI art. Far less than I expected a couple of years back. What's going on?
Midjourney looked game-changing. When I saw that we could generate any image from a prompt, I thought a big proportion of Twitter would be AI images by this point.
But AI images aren’t everywhere for me? I am confused.
In my experience:
Twitter. AI images aren't everywhere. Other than when a new model is released, they are hardly anywhere. One has to be more careful about taking an image at face value, but as far as I can see it hasn't impacted photography much.
Facebook. I don't use it loads, but I don't see too much AI art.
Websites. I sense AI art is common as backgrounds or at scale. If you need thumbnails for something, get AI to generate them.
Porn. There is a reasonable amount of AI art in porn. Also, the best image generators won't generate pornography, so these images come from older models (suggesting there would be more if this weren't the case).
For others (I've chatted to a few other people while writing this):
Facebook. I hear this platform is full of AI "slop" (low quality AI images)
Instagram.
My mum says that she sees a lot of AI images, and that sometimes with images of knitted products, it's hard to tell what's real and what's fake.
Business Insider discusses an AI generated Instagram model with 100,000 followers.
Porn. Some of my friends who create pornography say that AI is being used to copy and resell their work.
Also some anecdotal data:
Lots of 50+ people seem to like obviously fake images on Facebook (corroborating source)
There are Reddit and Discord communities of people who work together to generate images, often having some specific shared interest
What do the patterns seem to be here?
AI art isn't obviously common on social media for younger adults.
It may be common for people aged 50+
It is probably common for filler images, for pornography, and within certain communities.
What are some theories?
What might explain why I see so little AI art while others see more?
Some things are hard to generate
Most of the images I look at are photos of current events, and AI has very little share of these. The AI images I do see of Trump or Elon have an uncanny, church-wall quality, where the figures look almost superimposed on the background.
There are hidden validation mechanisms.
Many kinds of images pass through many sets of human hands before they get to me. Most news organisations wouldn't post an AI image where it might be mistaken for real. But let's imagine a fake news image. I would only see it if it had been retweeted by someone I follow, and only if none of the people liking it had noticed it was fake. I think that is unlikely.
I can imagine if I were less careful with news sources, this might not be the case.
Perhaps I don't notice?
Is it possible that AI art is being used and I don't know? I doubt it. If this were the case, some or most of it would likely get found out eventually. I don't recall that ever having happened. I recall doctored videos and images repurposed from other events, but I don't recall an image I cared about ever having turned out to be AI generated.¹
I don't like AI art
Generally, I find a lot of AI art pretty soulless. It looks like a sort of mix of all the versions of that thing. And just as I don't read airport romance novels, I don't scroll through endless mediocre images of dogs or cakes or whatever.
Most good AI art still has a human touch
The only two kinds of AI art I see regularly are a guy who works really hard and a comics guy who uses it so he doesn't have to draw. In both cases it's a human who is working hard to make it good.


AI art is for cheap images
Perhaps AI is more present in places where people see lots of similar images. This would explain why it appears in business filler images, spammy porn sites and the feeds of people who scroll a lot. All of these people are happy to pay little or nothing for their content. AI is cheap.
Some people see a lot of it
A lot of these theories imply that I may be unusual in how little AI art I see. And from asking around, some people do see far more than I do, which suggests that some may already have feeds which are more or less entirely AI. I guess it depends on whether you are happy enough with what you see: if you are, the algorithm will keep showing it to you.
What about as that bar shifts?
What happens as the number of people who are impressed by AI art increases? There was a scandal a few years back where children were being shown AI-generated shows, the most successful of which were remixed into new shows. If I remember correctly, the most successful videos were "Peppa Pig and Ana from Frozen have a sexy, painful time at the dentist's", with music.² It was pretty hellish stuff. Children, if left to click what interests them, do not end up in a good place. I expect many adults to behave similarly.
Alongside this, we have a deeply fractured media ecosystem. A young woman watches Instagram reels with cooking content. A teenage boy swipes to an Andrew Tate short, bragging about how he controls women. A mother watches ten two-minute soaps in a row. I scroll through Twitter. We can see entirely different worlds and never know it.
I don't know where this ends up. Already it seems to me that some people see a lot of AI art and some see very little. Over time, in areas where people see many similar images, I predict they will increasingly be AI generated. But I think my feeds will be human for a while yet. And how will that change us?
I am trying to publish more, so I am curious how this landed.
1. Searching, I could find a few examples of AI images used where people might have thought they were real.
2. You might think I am exaggerating for effect, and I guess this is probably one of the worse examples, because it is the one I can remember, but I think it was horrifically bad.
DeviantArt is saturated with AI output, especially for porn images.