I don’t know why, exactly. Is it just cyclical, or is it permanent? I don’t like it much; it feels very puerile. Both the Left (including many, perhaps most, feminists) and the Right (conservatives and their ilk) seem to support this shift, so what changed?
One small example: when I was a kid in Florida (where it is very fucking hot), it was not at all unusual to see a woman in a bikini top in the grocery store in high summer. No one thought anything of it then, but now it never really happens; it would be a scandal.
It’s not just an American phenomenon, either. That’s why I linked to stories about the French also becoming more prudish, and about Canada and its renewed war on sex workers.
What’s going on? Whatever it is, it doesn’t bode well.