The term "white privilege" annoys me so much.
Someone once told me that just by being white and male I was benefiting from, and contributing to, societal and systemic oppression. Those were literally her words.
"I am a woman, therefore I am oppressed"
I had to check my ears to make sure I'd heard her right. Once I verified she was serious, I had to refrain from popping a vein. She then went on about concepts like 'rape culture' and 'patriarchy', and finally I told her I had to go to work and to have a good day.
Are people getting this line of thinking from somewhere in particular, or just pulling shit out of their ass?