Are white men losing influence in this country?
Errrr - which country?
Definitely. London is 53% ethnic and other big cities are close to having ethnic majorities so white people are losing their influence in society and it's not helped by the erosion of free speech that stops us voicing our concerns.
Rich white men are everything while the rest are being emasculated and put below stairs.
Yeah. Eventually our society is going to have females be the dominant ones. Then men can make abuse claims.
No as Caucasian males still control majority of the companies & the majority of wealthy people are white males as well.
They seem to be under attack, if you watch TV commercials. Whenever a TV ad wants to make somebody look stupid, neurotic, evil, lazy, etc., it's almost always a white male who plays the part. Same thing in Hollywood movies.
Well, yeah... White men are expected to hand over their influence to immigrants and illegals. It's considered "racist" for white men to put themselves and their people first, and noble for every other race of men to do the same.
White people have become fat and stupid, it's time for a new breed to rule.
Yes, the Zionist occupation government is ensuring that we white Christians are destroyed. Miscegenation, shariah courts, more ethnics in government, and silencing whites with the word "racism" are the methods of our destruction. The Rotherham abuse could have been stopped early, but the health visitors were accused of being racist simply for correctly pointing out that all the perpetrators were Pakistani men.
Look at the ethnicity of the country's leaders. No.
I do not think so.
Yes, these days one is considered a Nazi for saying white people have a right to determine their own destiny.
No, it's their minds. People make a nation, not money, and you are selling guns to the Indians again. That means more dead Americans.