Red Crayon Aristocrat
Member
When I was growing up in the 90s and the 00s, feminism was usually referred to in the past tense. The story basically went: "once upon a time women were expected to stay at home, cook dinner and make babies, then the 60s happened, women burned their bras, and now they're free to have any sort of life of their choosing, the end."
And so it was, or so it seemed. There were plenty of exceptional women out there: lawyers, doctors, scientists, astronauts, politicians. If a woman felt passionate about some field, she was free to pursue it, and if she had the right stuff she could succeed, and that was all well and good. But most women didn't care; they were content to focus on traditionally feminine things: raising families, finding romance, succeeding in female-oriented fields like the fashion industry, whatever. The point is women were still seen as strong individuals, not "Stepford wives," and that was the bottom line.
What I think really sums up female culture when I was growing up is Sex and the City, which was about a group of smart, independent women, but ones who were focused on nice clothes, nice shoes, finding romance and having a Cosmo with their gal pals. Why would it be anything else? That's what most women like, right?
Then a funny thing happened, a funny thing called the 2010s, and everything changed. With the way things have been going you'd think we were living in The Handmaid's Tale, as if the decades since the 60s never happened: women are ceaselessly oppressed by the "patriarchy" (a concept I had literally never heard of in my entire life before this decade) and must fight tooth and nail for every small victory they achieve in male-oriented fields like business, science, technology, politics and so on.
What happened? And how much of that is true? Were we being fed a big fat lie all that time or are we being lied to now?
I can only speak from my experience. I'm not a woman, and I won't claim to know everything, but what I do know is that from the start of this great revival of feminism in this decade, "third wave feminism" or whatever you want to call it, my spider-sense was tingling. Something felt off. How did we go from Paris Hilton seemingly personifying female culture in the prior decade to what we have now?
My theory is this: we live in a toxically narcissistic culture, just narcissistic to the bone, undoubtedly the most narcissistic culture ever seen in human history. In today's world, if you don't have some measure of fame, no matter how small, you may as well not even exist; if you have no personal "brand" then you aren't really alive.
This is a narcissism that affects both men and women. It's one reason why, for example, I think we see so many males carrying out mass shootings: they want their 15 minutes of fame, they want people to know their name and face, and they're literally willing to kill for it. That's how narcissistic our culture has become, and I think modern feminism is how that narcissism is affecting female culture.
What it boils down to is a desperate grab for attention. I know that's harsh, and I don't mean it as a slam against all women; again, this is a problem that affects all of modern culture. But come on, in the modern western world we do not brutally oppress women and haven't for decades, if not centuries (if by "brutally oppress" you mean the kind of thing you see in the Middle East today). It's not a perfect world, but the version we are sold by modern feminists just doesn't seem to be the reality.
I'll give you an example: remember the infamous Cathy Newman interview with Jordan Peterson? The way she talked about having to fight for her position, about the challenges she faced, for a job she obviously would have faced plenty of competition for regardless of whether she was male or female? It honestly seemed like she expected to just be handed the job simply because she's a woman, and if that's not narcissism, what is?