So when Muslims refer to the evil 'West', they are usually talking about America and Britain... exclusively. And they are usually talking about how inferior our culture and society is to theirs, or how Western people do this and Western people do that and it's all EVIL.
What I don't understand is why America and Europe are singled out.
1. America consists mostly of Christians or people of some other religion (even if only in name), yet countries like Japan, China and Sweden are largely non-religious. Why don't Muslims attack those countries? You would think a godless country would take priority over one where people associate themselves with Islam's cousin religion. But you never hear Muslims talk about those countries, which are perfectly fine, functioning societies (China needs some work, though).
2. There are countries in the Far East, like South Korea, Japan, arguably China, and some parts of India (the middle-class areas), that engage in a culture similar to the US's: alcohol, music, women who show off their beauty proudly, secular thinking, etc. Again, why does 'Westernized culture' always refer to America and Europe?
Just take a look at the Muslim-majority countries in green:
http://en.wikipedia.org/wiki/List_of_Muslim-majority_countries
The rest of the world around them engages in the same lifestyle, activities, and governing that they single out America for and call 'Westernized'.
Now I know the deep hatred of America comes from political screw-ups in the past and from resentment over the creation of Israel; I can get that. But I'm really baffled when Muslims say 'The West', singling out America and Europe for things like sexy women in commercials, alcohol, single women who raise kids alone, rape statistics, celebrity obsessions, trendy clothes or haircuts, or not following the Islamic way of life, etc.
Can someone explain to me why Muslims just ignore every other country when they refer to 'The West'?
I'm American, but I don't think the world should revolve around us.