The part quoted below reads like a bit of a straw man to me, or at least it does in its common usage. The argument gets trotted out whenever feminists (or anyone else) argue that we're still perpetuating patriarchy. Even if the West is the most liberal and progressive culture in the world, there's still work to be done in making it less patriarchal and oppressive.
Quote:
Originally Posted by CliffFletcher
Why is it so hard for progressives to acknowledge that the modern West is the most liberal and progressive culture in the world, and by far the best society in the world for women (and gays) to pursue their aspirations in defiance of traditional oppression? Is white guilt so deeply ingrained that it trumps all other progressive values?